Friday, November 7, 2008

Get Noticed, Get Clicked

 


There are no shortcuts to currying favour with search engines, it seems







With more than 135 million Web sites out there, you’ve got your work cut out getting your Web site in front of surfers. Search engines have long been recognised as a good source of visitor traffic, and over the years, Search Engine Optimisation (or SEO, as it’s popularly called) has developed into an active, dynamic industry in its own right.



A Little History

If you still think stuffing keywords into the meta tags of your Web pages is what will push you up the rankings in the SERPs (Search Engine Results Pages), this is a wake-up call. You’ve been sleeping, and “keyword stuffing” is about as dead as it can get. The Net has moved on, and how! The practice of keyword stuffing originated in the ‘90s when Webmasters found that search engines gave special weightage to the META tag of HTML pages (right-click on a Web page and view its source, and inspect the <meta> HTML tag near the top). This was in the days when bandwidth and hardware were still at a premium. Canny Webmasters began stuffing keywords into their meta tags, which pushed them up in the SERP rankings. Search engines quickly caught on, compensating for the skew in the results, and ever since it has been a game of tag between the search engines and the SEO industry. SEO consultants discover a way to gain an edge in the rankings; the search engines move in and plug the hole.
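To give a made-up illustration of the technique (the site and keywords here are entirely hypothetical), a stuffed tag from that era might have looked like this:

<meta name="keywords" content="cheap flights, cheap flights, cheap flights, flights, airfare, cheap airfare, hotels, cheap hotels, travel, cheap travel" />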

 

What Search Engines Do


These days, most search engines look at multiple factors in determining a Web site’s ranking in the search results. Some are known, most are unknown. The search algorithm is the secret sauce that drives the accuracy of the search results, and is therefore one of the most closely-guarded secrets. Each search engine has a different set of factors to determine rankings. Google, it is rumoured, uses more than 200 data points! The others (Yahoo! and Windows Live are the main ones which matter these days) are not far behind.



One of the known factors that play a role in determining the rankings in Google is PageRank. This is a score that Google assigns to a Web site on a scale of 0 to 10. A higher PageRank gives you higher weightage. PageRank itself is determined using “…more than 500 million variables and 2 billion terms.” (http://www.google.com/corporate/tech.html). And besides PageRank, there are hundreds of other factors that go into determining the ranking in the SERPs.
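For the curious, the simplified formula published in the original PageRank paper gives a flavour of how the score works: a page’s rank is built up from the ranks of the pages linking to it. (The real implementation is, of course, far more elaborate.)

PR(A) = (1 - d) + d × ( PR(T1)/C(T1) + … + PR(Tn)/C(Tn) )

Here, T1 to Tn are the pages that link to page A, C(T) is the number of outbound links on page T, and d is a damping factor, typically set to around 0.85.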



Black Hat Vs. White Hat


Even with so many factors to juggle, some Webmasters resort to deceptive practices in an attempt to fool the search engines and push up their ranking. This use of deception—and of techniques that attempt to fool the search algorithm—is known as Black Hat SEO, and search engines have special filters that monitor for known Black Hat techniques. If you are looking for long-term recognition and are building a quality Web site, Black Hat SEO is just not the way to go! Search engines penalise sites that knowingly or unknowingly use Black Hat techniques, and this can include suspension or even complete removal from the search engine’s listings.



Case in point: the German Web sites of BMW and RICOH were removed temporarily from the search listings for using deceptive techniques. They were only restored after they addressed the issue.







Deceptive techniques can be broadly classified into two types: those that try to fool the search engines, and those that try to fool human searchers. For example, the title of a page may read “Free Software Download.” This will be displayed in the search results if the Black Hat SEO for this page is successful. However, when the user clicks on the link, it takes them to a page where they are served more ads to click on—and with absolutely nothing to download. Search engines are aware of such techniques and compensate for them. They have become increasingly efficient—though not perfect—at detecting Black Hat SEO. As time progresses, the cost of Black Hat SEO is going to far outweigh the cost of following legitimate practices in making your Web site search-engine-friendly.









White Hat SEO, on the other hand, employs—or at least tries to use—only non-deceptive techniques. The qualification “tries to” stems from the fact that there is a certain amount of grey area between what Webmasters and SEO consultants consider legitimate optimisation and what the search engines agree on. Earlier this year, there was a spate of sites that sprang up offering paid reviews from bloggers. This seemed a good idea, especially for little-known sites that wanted to piggyback on the popularity of an established blogger. That was until Google declared that paid reviews (and their links) might get penalised or removed from search listings.



Black Hat practitioners work on the assumption that they will eventually be caught by the search engines, and are ready to move on when that happens. There is a law of diminishing returns in operation, as search engines get increasingly efficient at proactively detecting Black Hat sites. Eventually, we expect to see the practice minimised to only those die-hard tech criminals with enough resources (time, money, and determination) to use Black Hat as part of a larger scam operation.



White Hat works with the sole objective of creating a sustainable ranking in the search engine results. Again, this is not bullet-proof: search engines are constantly refining their algorithms to keep up with the latest developments that impact what users want to search for. These days, search engines not only monitor the search results page, but also monitor user behaviour for a given search term. That is, if site A was ranked number 1 on the SERP but users regularly clicked on site B, the number 2 link, then site B would move to the number 1 position. Add to this the fact that search is getting more and more personalised. Google, according to what can be inferred from patent filings, will customise the SERPs based on geographic location, similarity of search patterns with other users, membership on certain Web sites or forums that are discoverable by the Googlebot (the search engine spider software that crawls and indexes Web pages), and other factors that would, in aggregate, group you with a specific set of users who have similar interests.



For example, if you start using Google Finance, Google will start serving financial information in the SERPs. If you are, say, a regular participant at Digit’s forum (http://www.thinkdigit.com/forum/) and you run a search, Google will look at similar searches from other users in the Digit forum and try to guess what will interest you. Of course, this is a simplified example. Your actual profile will be much more complex, and grouping you with users with similar tastes will involve much more. This is, of course, very good for the user: relevance improves.



But this kind of complexity is a nightmare for SEO practitioners. You can be forgiven for thinking that even White Hat SEO is more voodoo than an art or a science. Fortunately, there are some published guidelines, and over the years, the SEO industry has matured enough to be aware of some basic best practices that can help you in achieving the rankings that you feel you deserve.



SEO Basics


Determining a specific SEO strategy for your Web site will depend on a number of factors that are beyond our scope here. These include the stage your Web site is at (whether it’s a startup or an established site), your industry, your target audience, as well as the type of content you have on your site. If you are an e-commerce site, you might need to follow certain optimisation techniques like exposing your product catalogue and making it easy for the search spiders to index the product pages. Here, we look at the basics that will give you a general idea of what you need to do to stay on the good side of the search engines. Even if you are not interested in SEO, you might still want to pay attention to these guidelines which, at the very minimum, will ensure that you don’t accidentally get blacklisted.







While all these techniques are applicable to some degree to the Big Three (Google, Yahoo!, and Windows Live), the emphasis given by each will vary. To a large extent we have kept Google in mind, considering that they hold the largest market share.



Content Relevance


Content is still king. Nothing increases your search engine rankings like relevant, on-topic, in-context content. All the other factors you need to pay attention to will carry no weight if you do not have relevant content. Even if your site is poorly designed, as long as its content is useful to your visitors, search engines can be very forgiving and push you up the rankings, especially if a lot of other Web sites link to your site, referring to it as an authority in the specific subject area.



Linking


Link Popularity is one of the main criteria in judging the relevance of a site. This is based on the assumption that if you have good, high-quality, authoritative content, more people will link to you. This might give some the idea that many dummy sites linking to your site can artificially inflate your rankings. Not any more. Search engines have wised up to this technique—known as link farming—and will actually penalise you if they detect that you’re using link farms. What counts are organic links, that is, links to your sites by other sites that are in context.



Search engines not only count the links to your site but also look at the surrounding words to identify the context in which the link is placed. For example, if you are running a motorbike forum and you are linked to from a tech blog, a plain link to your Web site in the links section will carry less weight than a mention within a blog post. The exception, of course, is if a number of bloggers link to your forum and a large percentage of them are passionate bikers with posts on biking. Secondly, the “anchor text” of the link also plays a part in determining its value. “Check this out” will have less weightage than “Check out Rajesh’s Motorcycle Forum”. The anchor text “this” is less descriptive and provides little context compared to “Rajesh’s Motorcycle Forum”, and hence carries less value.
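To put that in markup terms (the forum name and domain below are hypothetical), both of these links point to the same page, but the second tells the search engine far more about what it will find there:

<a href="http://www.rajeshs-motorcycle-forum.example/">this</a>

<a href="http://www.rajeshs-motorcycle-forum.example/">Rajesh's Motorcycle Forum</a>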



Of course, you have no control over how a blogger or another Web site will link to you, but you can at least do some groundwork (run a search in Google for link:yourdomain.com, substituting your own domain) and politely ask the Webmasters or bloggers you can reach whether they would be kind enough to edit the anchor text.



While link farming is dead, there is still some life left in “reciprocal linking”. This is where you agree with another Web site to show their links on your Web site in exchange for them linking back to yours. Note, however, that this should be done intelligently, or you might run foul of the search engines and get blacklisted. Stick to sites and blogs that are on topic and relevant to your site’s context. A link from a site on, say, Ayurvedic medicine will add no value to a motorcycle forum, and links from many such non-relevant sites may end up getting treated as link spam, attracting penalties and even blacklisting.



One other technique is to participate in other forums, write articles for other Web sites, and/or leave comments on blogs. You may use these to provide links to your site, but ensure that the links are in context and provide value to the conversation. If you simply pop into a comment thread and say “Hey, check out my motorcycle forum” and leave a link, you will quickly get flagged for comment spam. Look for articles or blogs on motorcycles (our example) and leave an intelligent comment linking to a thread in your forum where they discussed the same issue—or something similar.



Links (and therefore the sites they point to) gain value in other ways too:


 



  • the older they are,

  • if they are located higher up on a page rather than towards the bottom,

  • if they are part of the content (higher value) as against being in a collection of other links, such as a business directory listing (lower value), and

  • if the link is from an authority site on the subject.



While you have no direct control over these links, what you can ensure is that you have relevant, on-topic content that creates the environment that encourages others to link to your content.



W3C Compliance And Site Structure


The one cardinal rule that all search engines insist on is that your Web site be primarily designed for humans and not search engines. The easiest way to do this is to bring your site into W3C (World Wide Web Consortium) compliance. The W3C is the standards body that defines development standards for Web technologies, and making your site W3C-compliant will give you a boost in the rankings with at least one of the search engines (Yahoo!). There are many validation tools that you can use to find out how compliant (or non-compliant) your Web site is. A basic “validator” for individual Web pages is available at http://validator.w3.org.



Bringing your site into W3C compliance is hard. The upside, though, is that when used in combination with CSS (Cascading Style Sheets), you can get a much cleaner and better-performing site. In addition, from an SEO perspective, the actual content will move up higher in the page code and hence be treated as more valuable by the search engines.



From a pure SEO perspective, compliance is not the holy grail to climbing up the SERP. The objective behind attempting to become compliant is that it ensures that the copy is marked up so that it is “clear” to search engines. Achieving compliance also ensures better compatibility with mobile devices, with less chance of the code not working in mobile browsers. In any case, the point is to minimise errors rather than to achieve full compliance overnight.



Often, search engines do not crawl very deep into your site; you can remedy this by submitting your site and its deeper pages directly. Go to http://google.com/webmasters/tools for Google, http://siteexplorer.search.yahoo.com for Yahoo!, and http://search.msn.com.sg/docs/submit.aspx for Windows Live. Also, create a site map based on the XML Sitemap standard, which is supported by the Big Three and others. There are many free tools that can quickly generate an XML sitemap of your site (search for “XML Sitemap Generator”). Place the sitemap XML file in the root folder of your Web site (on many hosts, this is the public_html directory) and submit the URL of the XML file to the search engines. This way, the spider will crawl the entire site and index all the pages. One added benefit is that when your site comes up as the first result in the SERPs, it may also show your internal links under the main result heading, making it easier for users to reach your internal pages.
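A minimal sitemap file, following the format documented at sitemaps.org, would look something like this (the domain, dates, and priorities below are placeholders you would replace with your own):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/forum/</loc>
    <lastmod>2008-10-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>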



Formatting And Keyword Density


While we did imply at the outset that META tags are no longer what they once were, they still carry a little weight. The meta tags of a page provide some basic information about your page. For example...

<meta name="Description" content="Website for Indian Motorcycle Fans." />

<meta name="Keywords" content="india, motorcycles, hero honda, bajaj, bullet, royal enfield" />

...is an example of a good meta tag. The keywords can, of course, be expanded, but ideally should be kept to the bare minimum and only include those important words that appear in the content. In addition, ensure that the Title tag of your page corresponds to the title of the content of the page. This informs the search engine that your Web page is what it claims to be.
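To continue the hypothetical motorcycle-forum example, the title tag and the visible heading of the page should tell the same story:

<head>
<title>Rajesh's Motorcycle Forum - India's Motorcycle Community</title>
</head>
<body>
<h1>Rajesh's Motorcycle Forum</h1>
</body>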

You can also use different fonts and formatting for specific keywords. Search engines will give different weightage to words that are bolded or underlined, and different weightage to bolded or underlined links. Needless to say, this has to be in context. You can’t bold all the text! The search engine will simply take that as the standard formatting for the page.



The formatting you apply should be primarily designed to inform the human reader of the different levels of emphasis you wish to convey. Also, for images, include alternative text (the alt attribute) that describes the image rather than merely repeating the image’s file name.
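As a rough sketch (again, the file name and wording here are made up), emphasis on a keyword and a descriptive alt attribute might be marked up like this:

<p>Join the discussion on the new <strong>Royal Enfield Bullet</strong> in our touring section.</p>

<img src="bullet500.jpg" alt="Royal Enfield Bullet 500 parked on a mountain road" />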

In addition to applying formatting that can help increase the weightage of your pages, you should also keep keyword density in mind. Keyword density is the ratio of the number of times a keyword appears on a page to the total number of words on that page. There is no magic number for this; it keeps changing and may differ from industry to industry. One approach is to run a search for your keywords, look at the density in the top 10 sites in the SERP, and settle on a density somewhere midway between the highest and lowest values. You might want to do the same for the keywords themselves—see what the competition is using!
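As a quick worked example (the numbers are illustrative), if a 500-word page uses the phrase “motorcycle forum” ten times, its keyword density for that phrase is 10 ÷ 500, or 2 per cent. Whether 2 per cent is high or low for your niche is exactly what the top-10 comparison above is meant to reveal.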



Now for the bad news. While optimum keyword density may not be given a great deal of weight by the search engine, too high a figure for keyword density can attract penalties—search engines may rightly or wrongly suspect you of keyword stuffing.

Keep your keywords focused. If you are a packer and mover in India, keep your words limited to “packer mover India”; Google, in any case, clumps words with similar meanings together. You might consider using related keywords like “point-to-point shipping”, “door delivery”, etc.



Site And Page Age


This is a simple but important metric. The longer the site has been running and the longer a page has been up, the better the ranking search engines give it, on the assumption that you are not a fly-by-night operator and really have something useful to say. The same can be said for links: the longer someone links to your page, the more it gains in value.

The search engines apply, to varying degrees, an ageing delay on new Web sites. This is based on the assumption that new sites may be trying to game the search algorithm, and the delay is progressively relaxed over a period of time. From an optimisation perspective there is nothing you can do about it but wait. So if you run a fairly new site and your pages do not start appearing in the search results straightaway, you know why!



Going To The Professionals


The SEO industry being what it is, there are as many con artists as there are genuine practitioners of the craft. In most cases, simply focusing on delivering great content and following the basic rules that prevent you from getting blacklisted—along with some light PR and linking activity in relevant blogs and forums—will enable your site to gradually rise up the rankings. However, if you need results, and fast, you should turn to an SEO professional. A true SEO consultant can deliver superior results to what you could achieve on your own. However, given the prevalence of con artists, make your contract contingent on specific benchmarks: define the search keywords that you wish to rank for, specify the ranking range that you want to appear in (if they promise to deliver the number 1 ranking, politely thank them and slam the door in their face), and specify the time period in which this must happen. Only after you define at least these three metrics should you even consider inking a deal.
