Tuesday, July 17, 2012

A Way for Search Engines to Improve

Wouldn't it be nice if the search engines could comprehend our impressions of search results and adjust their databases accordingly? Properly optimized web pages would show up well in contextual searches and be rewarded with favorable reviews and listings. Pages that were spam, or whose content did not properly match the query, would get negative responses and be pushed down in the search results.

Well, this reality is much closer than you might think.

To date, most webmasters and search engine marketers have ignored or overlooked the importance of traffic as a factor in search engine algorithms, and thus have not taken it into consideration as part of their search engine optimization strategy. That might soon change, however, as search engines explore new methods to improve their search result offerings. Teoma and Alexa already employ traffic as a factor in the presentation of their search results. Teoma incorporated the technology used by Direct Hit, the first engine to use click-through tracking and stickiness measurement as part of its ranking algorithm. More about Alexa below.

How Can Traffic Be a Factor?

Click popularity sorting algorithms track how many users click on a link, while stickiness measurement calculates how long they stay at a website. Properly used and combined, this data makes it possible for users, via passive feedback, to help search engines organize and present relevant search results.

Click popularity is calculated by measuring the number of clicks each web site receives from a search engine's results page. The theory is that the more often a search result is clicked, the more popular the web site must be. For many engines the click-through calculation ends there. But for the search engines that have enabled toolbars, the possibilities are enormous.
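As a rough sketch of the idea (not any engine's actual implementation; the log format and names below are hypothetical), click popularity amounts to counting how often each result is chosen for a given query:

from collections import defaultdict

# Hypothetical click log: (query, clicked_url) pairs harvested from a results page.
click_log = [
    ("financial planner", "http://example.com/planning"),
    ("financial planner", "http://example.com/planning"),
    ("financial planner", "http://example.net/casino"),
]

# Count clicks per (query, url): the more often a result is clicked for a
# query, the more popular the web site is assumed to be.
click_counts = defaultdict(int)
for query, url in click_log:
    click_counts[(query, url)] += 1

def click_popularity(query, url):
    return click_counts[(query, url)]

print(click_popularity("financial planner", "http://example.com/planning"))  # 2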

Stickiness measurement is a great idea in theory: the premise is that a user will click the first result and either spend time reading a relevant web page, or click the back button and look at the next result. The longer a user spends on each page, the more relevant it must be. This measurement goes a long way toward fixing the problem of "spoofing" click popularity results. A great example of a search engine that uses this type of data in its algorithms is Alexa.
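In the same hedged spirit, stickiness can be approximated as the time between clicking a result and returning to the results page. A minimal sketch, assuming a toolbar reports timestamped events (the event names are made up for illustration):

# Hypothetical toolbar event stream: (timestamp_in_seconds, event, url).
events = [
    (0,   "click_result", "http://example.net/casino"),
    (8,   "back_to_results", None),    # quick bounce: low stickiness
    (10,  "click_result", "http://example.com/planning"),
    (310, "back_to_results", None),    # five minutes of reading: very sticky
]

def dwell_times(events):
    """Pair each result click with the next return to the results page."""
    dwells, pending = {}, None
    for ts, event, url in events:
        if event == "click_result":
            pending = (ts, url)
        elif event == "back_to_results" and pending:
            start, clicked_url = pending
            dwells[clicked_url] = ts - start
            pending = None
    return dwells

print(dwell_times(events))
# {'http://example.net/casino': 8, 'http://example.com/planning': 300}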

Alexa's algorithm is different from the other search engines'. Its click popularity algorithm collects traffic pattern data from its own site, partner sites, and its own toolbar. Alexa combines three distinct concepts: link popularity, click popularity and click depth. Its directory ranks related links based on popularity, so if your web site is popular, it will be well placed in Alexa.
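Alexa has not published how it weighs these three signals, so the following is purely an illustrative sketch: a weighted sum of the three concepts, with made-up weights and scores normalized to a 0-1 range:

def combined_score(link_popularity, click_popularity, click_depth,
                   w_link=0.4, w_click=0.4, w_depth=0.2):
    """Blend the three signals; the weights are illustrative, not Alexa's."""
    return (w_link * link_popularity
            + w_click * click_popularity
            + w_depth * click_depth)

# A site with strong inbound links, decent clicks, and fairly deep visits:
print(combined_score(link_popularity=0.9, click_popularity=0.6, click_depth=0.7))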

The Alexa toolbar doesn't just allow searches, it also reports on people's Internet navigation patterns. It records where people who use the Alexa toolbar go. For example, the technology can build a profile of which web sites are popular in the context of which search topic, and display results sorted according to overall popularity on the Internet.

For example, a user clicks a link to a "financial planner", but the web site content is an "online casino". They curse for a moment, sigh, click back to the search results, and look at the next result; that web site gets a low score. The next result is on topic, and they read 4 or 5 pages of content. This pattern is clearly identifiable, and Alexa uses it to help sort results by popularity. The theory is that the more page views a web page has, the more useful a resource it must be. For example, follow this link today -

http://www.alexa.com/data/details/traffic_details?q=&url=http://www.metamend.com/

- look at the traffic details chart, and then click the "Go to site now" button. Repeat the procedure again tomorrow and you should see a spike in user traffic. This shows how Alexa ranks a web site for a single day.
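That curse-and-click-back pattern is straightforward to express in code. A minimal sketch, assuming per-visit dwell time and page-view counts are already available (the thresholds are invented for illustration):

def score_visit(dwell_seconds, pages_viewed, min_dwell=30, min_pages=2):
    """Classify one visit as a positive, negative, or neutral relevance signal.

    Thresholds are illustrative; a real system would tune them empirically.
    """
    if dwell_seconds < min_dwell and pages_viewed <= 1:
        return -1  # bounced straight back: likely off-topic (the "casino" case)
    if dwell_seconds >= min_dwell and pages_viewed >= min_pages:
        return +1  # stayed and read several pages: likely relevant
    return 0       # ambiguous: no adjustment

print(score_visit(dwell_seconds=8, pages_viewed=1))    # -1
print(score_visit(dwell_seconds=300, pages_viewed=5))  # +1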

What Can I Do To Score Higher With Click Popularity Algorithms?

Since the scores that generate search engine rankings are based on numerous factors, there's no magic formula to improve your site's placement; it's a combination of things. Optimizing your content, structure and meta tags, and increasing keyword density won't directly change how your site performs in click-tracking systems, but it will help your web site's stickiness measurement by ensuring that the content is relevant to the search query. That relevance will help the site move up the rankings and thus improve its click popularity score.

Search Engines Can Use the Click-Through Strategy to Improve Results

Search engines need to keep an eye on new technologies and innovative techniques to improve the quality of their search results. Their business model is based on providing highly relevant results to a query quickly and efficiently. If they deliver inaccurate results too often, searchers will go elsewhere to find a more reliable information resource. The proper and carefully balanced application of usage data, such as that collected by Alexa, combined with a comprehensive ranking algorithm, could improve the quality of search results for web searchers.

Such a ranking formula would certainly cause some waves within the search engine community and with good reason. It would turn existing search engine results on their head by demonstrating that search results need not be passive. Public feedback to previous search results could be factored into improving future search results.

Is any search engine employing such a ranking formula? The answer is yes. ExactSeek recently announced that it had implemented such a system, making it the first search engine to integrate direct customer feedback into its results. ExactSeek still places an emphasis on content and quality of optimization, so a well optimized web site that meets its guidelines will perform well. What this customer feedback system does is validate the entire process, automatically letting the search engine know how well received a search result is. Popular results will gain visibility, whereas unpopular results will be pushed down in the rankings.
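ExactSeek has not published its formula, so the following is only an assumed model of that promote-and-demote idea: start from a content-based relevance score and nudge it with accumulated visitor feedback before sorting.

def rerank(results, feedback, feedback_weight=0.25):
    """Sort results by relevance adjusted with aggregate user feedback.

    results:  url -> content/optimization relevance score (0-1).
    feedback: url -> net visit score (e.g. summed +1/-1 signals),
              normalized by the largest magnitude so it stays bounded.
    """
    max_fb = max((abs(v) for v in feedback.values()), default=1) or 1
    def adjusted(url):
        return results[url] + feedback_weight * feedback.get(url, 0) / max_fb
    return sorted(results, key=adjusted, reverse=True)

results = {"http://example.com/planning": 0.70, "http://example.net/casino": 0.75}
feedback = {"http://example.com/planning": +40, "http://example.net/casino": -25}
print(rerank(results, feedback))
# The well-received page overtakes the better-"optimized" but unpopular one.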

ExactSeek has recently entered into a variety of technology alliances, including the creation of an ExactSeek Meta Tag awarded solely to web sites that meet its optimization quality standards. Cumulatively, these alliances combine to dramatically improve its search results.

ExactSeek's innovative approach to ranking search results could be the beginning of a trend among search engines to incorporate traffic data into their ranking algorithms. The searching public will likely have the last word, but webmasters and search engine marketers should take notice that the winds of change are once again blowing on the search engine playing field.

Did you find the information in this article useful? Feel free to pass it along to a friend or drop us a line at comments@metamend.com.

About The Author

Richard Zwicky is a founder and the CEO of Metamend Software, a Victoria, B.C. based firm whose search engine optimization software has been recognized around the world as a leader in its field. The firm employs a staff of 10 and draws business from around the world, with clients on every continent. Most recently, the company was recognized for its geo-locational (GIS) technology, which correlates online businesses with their physical locations, as well as its advances in contextual search algorithms.

articles@metamend.com

Source: http://www.webprone.com/archives/413

