Anticipating what the future may bring, even when that future is immediate, is always a risky business. And when the subject is the Internet, rather than trying to project or estimate, one is almost better off relying on the magic powers of a good crystal ball. In spite of these difficulties, businesses must work within predictable and plausible boundaries that can generate good intuition about the future, allowing them to establish strategies that prepare them for changes before they occur. But, mindful of Bill Gates, who reportedly once said that 640 KB of RAM would be all anybody would ever need, we humbly accept the possibility of making erroneous assumptions and predictions as we delineate what Human Level Communications anticipates for the fast-growing and highly competitive world of search engine marketing during 2005.
All against one, and one against all
The consolidation of search engines
2005 culminates an ongoing search engine consolidation process that has left three undisputed star players: Google in first place, followed by Yahoo! and MSN Search. These three search engines account for between 85% and 95% of all traffic that arrives at a website from a search engine or directory service, and their databases feed results to many other search engines and meta-engines that either revolve around Yahoo!, such as AltaVista, All the Web, Overture and Inktomi, or are fed by Google, like Netscape, AOL, Terra, or Lycos. Meanwhile, the giant Microsoft continues to fine-tune its own technology in the form of a beta version of MSN Search.
These three leaders are expected to fiercely compete in the areas listed below, significantly impacting search engine marketing strategies.
- Quality of Natural Results: only time and the market will tell which search engine can generate the best natural (i.e. non-sponsored) results. Google seems to have the most experience in this area, while Yahoo!, building on the knowledge gained from its acquisition of Inktomi, one of the leaders in this domain, has developed its own website indexing algorithm. MSN Search, for its part, has been crawling the Web for well over a year to build Microsoft's own database.
- Local Search: being able to use a search engine to find the pizza restaurant closest to our current location introduces an important geographic segmentation challenge. For such a strategy to succeed, the global focus of on-line businesses must be adapted to serve the needs of a more restricted geographical market. Google and Yahoo! have already taken the first steps towards solving this problem, and MSN Search will follow suit very soon (as we were finishing this article, Microsoft launched a beta version).
- Free e-mail: Yahoo! Mail and MSN Hotmail used to share the spotlight as the leaders in free e-mail accounts. Then Google pulled Gmail from its sleeve, a free e-mail service with 1 GB of storage that uses a contextual advertising mechanism based on the automatic analysis of e-mail content, alarming those committed to protecting the privacy of on-line communications. Yahoo! Mail reacted rapidly, raising its free storage quota to 250 MB and offering a new variety, Mail Plus, which for less than $20 a year doubles the storage space offered by Google. Once again, Microsoft has fallen behind in this contest, even though it has raised its storage quota to 250 MB. It will be interesting to follow MSN's reaction to Google's bold move.
- Desktop Search: like many other computing advances, the idea of a web search integrated with a local search of the user's hard disk is not new. Apple, once again, pioneered this concept with its Sherlock tool, which has long been a standard component of Mac OS, Apple's operating system. However, Google Desktop has been the first such application for Windows desktops, and once again Google has beaten the giant from Redmond on its own turf, perhaps anticipating that Microsoft could integrate MSN Search into the next release of Windows, just as it did a while back with Internet Explorer, critically wounding Netscape Navigator.
Personalized searching (Eurekster) or results interrelated by categories (Kartoo) are examples of the risky bets that new players in the search engine arena are developing to defy the leadership imposed to date by the three giants. There is still a lot to experiment with and invent on the Internet, and the example of a company that started inside a garage, Google, and that defied and beat (at least for now) the giant Yahoo!, is encouraging many start-up ventures to assume the role of David versus Goliath.
Optimize and re-optimize, it pays off
Google’s success has proven that users gravitate towards the search engines whose first page of results contains the websites most relevant to the search criteria provided. With this in mind, both webmasters and web positioning experts have tried to place their websites among the highest-ranking results for the maximum number of categories, all to attract the largest amount of traffic. The traditional bag of “tricks”, such as keyword saturation, hidden or invisible text, meta-tag saturation, link farms, mirror sites, continuous search engine registrations, etc., is a thing of the past. The algorithms employed by modern search engines have been learning every new “trick” just as quickly as the “experts” have been discovering and exploiting the weaknesses of web crawlers. This constant confrontation has driven mutual refinement. On one hand, the search algorithms are now harder to “trick” and generate progressively more pertinent results, ordered in a hierarchy that users find more useful and efficient. On the other hand, webmasters and Internet professionals have discovered that it is more important to correctly identify the target market niche, generate rich and high-quality content, provide frequent updates, use well-structured and clean code, and make the site highly usable and accessible than to write in an absurd language simply to “satisfy” Googlebot, for example.
In the area of web optimization, immediate attention should be given to the following points:
- Common sense – How useful is it to appear first in a search if, when users arrive at the page, they find repetitive, absurd language and an ill-formed, difficult-to-follow structure? Or, to put it differently: is our website designed to attract a lot of visitors, or to attract a lot of clients? Appearing first in a search engine is like having your prospective clients lined up outside your door. There is still a long way to go before those clients walk in, actually choose something and pay for it.
- Professional involvement – The time for optimization “tricks” has passed. Today, the task of optimizing a website belongs to experts: professionals capable of generating coherent navigational structures, suggesting language that is meaningful for the search engines and natural for the visitors, obtaining access links from important and reputable external websites, and providing advice beyond the technical aspects to identify the market niche for each website and the keywords that will generate the maximum rate of conversion from visitors to clients.
- Contents – The creation of original, high-quality content provides the highest return on investment. Writing articles, moderating forums or chat rooms, or developing e-books or newsletters will translate into a rapid surge of qualified traffic, not only from potential customers but also from providers, collaborators and possible partners.
- Access links – A very effective way to direct traffic to our website is to have links to it included in other important, related websites, for several reasons.
- Appearing as a recommended link inside important portals positions us as experts or leaders in our domain, not only in front of an audience of potential customers, but also in front of our competition.
- External links pointing to our website increase our PageRank, a popularity index that Google uses to express the importance of a website and one of the parameters it considers when ranking the final search results.
- Links from other websites generate more qualified traffic both directly, through users clicking on the link itself, and indirectly, through users who find us on Google thanks to the better rankings produced by our website’s increased popularity.
- Usability – Two aspects are important for a website to perform satisfactorily. Not only does it need to be well optimized for the search engines (search engine friendly), it also needs to be optimized for the visitors (user friendly). That is why usability factors are so important when designing the navigation, framework, and contents of a website.
- Feedback – One cannot afford to ignore which keywords and other search criteria generate the largest rate of conversion from visitors to clients. By analyzing this data periodically we can continually optimize our website, not necessarily to obtain more top search engine rankings, but to improve our efficiency and profitability (our real goal).
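The PageRank idea mentioned in the access-links point above can be illustrated with a short power-iteration sketch. What follows is a deliberately simplified toy version, not Google's actual algorithm; the link graph and page names are hypothetical, chosen only to show why incoming links from well-linked pages raise a page's score:

```python
# Simplified PageRank sketch: on each iteration, every page splits its
# current score evenly among the pages it links to, plus a damping term.
# Illustrative toy version only -- not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
            else:
                # Dangling page: spread its score evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical toy web: an external "partner" site links to our article.
toy_web = {
    "home": ["about", "article"],
    "about": ["home"],
    "article": ["home"],
    "partner": ["article"],
}
ranks = pagerank(toy_web)
# "article" outranks "about" because it receives an extra external link.
```

Running the sketch shows the effect described above: the page that accumulates links from other pages ends up with the highest score, which is why obtaining access links from reputable external websites improves positioning.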
From information to knowledge
This will be one of the buzzwords of 2005. The true goal for this year will be to turn a larger share of web visitors into clients. Obtaining top rankings in the search engines, even though it is one of the most worthwhile investments, still costs money. It is even more expensive to run an AdWords campaign or any other pay-per-click scheme. Moreover, the higher the traffic level, the more bandwidth will be required and, in some instances, the higher the cost of analyzing the traffic statistics. That is why the quality of the traffic will always be valued over its quantity. This paradigm shift will involve efforts in two different areas: positioning, and results analysis and measurement.
It will be time well spent to identify and select those key concepts for our niche market and the most efficient search engines and value-added portals to direct quality traffic to our website. The majority of the effort will be concentrated in the areas listed below.
- Web optimization will no longer belong to just a few. It is foreseeable that, as web optimization procedures become widespread, competition among websites in the same line of business will shift from the application of certain “tricks” (e.g., meta-tags, keyword saturation, etc.) to improvements in areas such as content enrichment, usability, a genuine interest in the end-users, and the addition of external references through access links placed in related websites.
- Quality code. XHTML 1.0 markup and CSS 2.0 style sheets are becoming increasingly popular for developing web pages. They provide flexibility and make a web design easy to customize, while significantly increasing page loading speeds and thus improving the accessibility and indexing of a website.
- Cultural and geographical content adaptation. Global websites go beyond translating their text into multiple languages: they also adapt their entire content to the local peculiarities and cultural and linguistic differences of each country or region. In the United States, for example, websites adapted to capture the attention of the Hispanic community, the largest minority in that country, have already started to surface. In the multi-cultural environment of the European Union, an even greater need exists to adapt website contents to the different languages and cultural interests of each country. And finally, one cannot ignore the emerging interest of western businesses in capturing Chinese, Indian, and Middle Eastern audiences.
- Usability and accessibility. Besides improving and facilitating the conversion of users into clients, these two principles reflect a greater social responsibility towards minorities and, generally speaking, improve the degree to which a website can be indexed by a search engine.
Web traffic analysis:
Web positioning and web traffic reports, in isolation, are no longer meaningful if they are used as the only metrics to determine the success of a website. Instead, one should focus not only on selecting the most significant performance metrics for a website (the so-called Key Performance Indicators), but also on continuously monitoring these indicators in order to establish trends and anticipate changes. To ensure success, this type of analysis should follow these guidelines:
- Each department should establish its own set of metrics and associated key performance indicators. Examples of such indicators could be the rate of satisfactory results generated by an internal product search engine, the rate of cases resolved at a help desk, the share of visits to the various website sections as shown in the web traffic reports, the rate of products added to a shopping cart, the rate of visits that concluded in a successful sale, the recommendations sent to other users, how visitors rated the different products, the number of contacts established, etc.
- Once the performance indicators have been identified for each case, a monitoring calendar should be put in place to track the various metrics periodically. Instead of receiving hundreds of web traffic statistics, each department head can then focus on tracking their own performance indicators.
- This form of tracking allows each responsible party to devise improvement plans and determine whether previous strategies are having the anticipated success. Since they monitor only their own area, corrective measures can be quickly identified and applied.
- Based on the information obtained from these indicators, the entire cycle can be started again.
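The monitoring cycle above can be sketched as a small script that distills raw visit records into a department's key performance indicators, instead of handing each department head hundreds of undigested traffic statistics. The record fields, figures, and function names below are hypothetical, invented purely for illustration:

```python
# Derive per-department KPIs from raw visit records.
# All field names and sample figures are hypothetical examples.

def conversion_rate(visits):
    """Fraction of visits that concluded in a successful sale."""
    if not visits:
        return 0.0
    return sum(1 for v in visits if v["sale"]) / len(visits)

def cart_rate(visits):
    """Fraction of visits in which a product was added to the cart."""
    if not visits:
        return 0.0
    return sum(1 for v in visits if v["added_to_cart"]) / len(visits)

# One hypothetical week of traffic records for a department.
week = [
    {"added_to_cart": True,  "sale": True},
    {"added_to_cart": True,  "sale": False},
    {"added_to_cart": False, "sale": False},
    {"added_to_cart": True,  "sale": True},
]

kpis = {
    "cart_rate": cart_rate(week),              # 3 of 4 visits -> 0.75
    "conversion_rate": conversion_rate(week),  # 2 of 4 visits -> 0.5
}
```

Tracked periodically against a monitoring calendar, a compact summary like `kpis` is what lets each responsible party spot trends in their own area and feed the results back into the next cycle, as described above.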
The consolidation of search into three primary players (Google, Yahoo! and MSN Search) has, in a way, simplified the task of positioning websites for those engines. Nevertheless, the spread of these strategies has started to erode the competitive edge that the early websites obtained from using “tricks” to attract the attention of certain search engines. From here on, the battle for successful website placement moves away from the “tricks” of would-be experts towards sounder, albeit more complex, strategies: not only the optimization of meta-tags, but also the enrichment of page contents, code correctness, generation of content for other websites, participation in forums and chat rooms, the active pursuit of partners and access links, on-line training, usability, accessibility, etc. And it is in this new environment, armed with a set of increasingly sophisticated web traffic analysis tools, that we can make well-educated marketing decisions, not only to improve our on-line presence, but to benefit our off-line world as well. As a final comment, I would like to raise the possibility that some or all of these tactics may prove ineffective by year end. However, since we have not yet found that perfect crystal ball, it seems wise and prudent to start from the basis suggested in this paper and keep a watchful eye on any new developments that may affect the course we decide to set.