
December 31, 2008

Evolving Algorithms

Once upon a time, search engines relied on keywords and a few basic on-page signals to calculate their rankings. This evolved to include other key elements, such as the number of links other people made to your page, and then evolved further to blacklist certain links in order to avoid specious link farms and exchanges.

Now, search engine algorithms are beginning to take both user behavior and dynamic aspects of the site into account. This was largely predicted by the SEO community as a natural progression of the discipline, and its day seems to have arrived.

First and foremost, bounce rate should be kept under 50 percent. If a user returns directly from your site's main page to the search results, chances are they are not finding what they are looking for. This degrades the search engine's performance for its users, so it will respond by reducing your site's rankings for those search keys. Degrading to a 70 or 80 percent bounce rate will likely decrease your rankings, whereas getting down to a 20 or 30 percent bounce rate can help ensure that you consistently make top rankings.
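The bounce-rate math above is simple to monitor yourself. Here is a minimal sketch of the calculation; the session records are hypothetical stand-ins for whatever your analytics logs actually record:

```python
# A minimal sketch of the bounce-rate calculation described above.
# The session data here is hypothetical; in practice it would come
# from your analytics logs.

def bounce_rate(sessions):
    """Percentage of sessions that viewed only one page before leaving."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return 100.0 * bounces / len(sessions)

# Example: pages viewed in each of ten visits
sessions = [1, 3, 1, 5, 2, 1, 1, 4, 2, 1]
print(f"Bounce rate: {bounce_rate(sessions):.0f}%")  # 5 of 10 bounced -> 50%
```

By this measure, the example site sits right at the 50 percent threshold the article warns about.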

Site performance is another key element in the new paradigm of search engine algorithms. High performance rates, very low down time, speedy searches, working links, and anything else you can do to improve site speed and reliability will prove that your site can handle the traffic the search engine drives to you.
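Checking response time and working links is easy to automate. The sketch below uses only the Python standard library; the URL is a placeholder, not a real endpoint of any particular site:

```python
# A minimal sketch of checking link health and response time,
# using only the standard library. The URL below is a placeholder.
import time
import urllib.error
import urllib.request

def check_url(url, timeout=10):
    """Return (status_code, seconds_elapsed), or (None, None) on failure."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, time.monotonic() - start
    except (urllib.error.URLError, OSError):
        return None, None

status, elapsed = check_url("http://example.com/")
if status == 200:
    print(f"OK in {elapsed:.2f}s")
else:
    print("Broken or unreachable link")
```

Running a check like this periodically over your site's pages gives an early warning on the downtime and broken links that the new algorithms penalize.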

Linking is an old standard way to increase rankings, but diversity and quality are becoming increasingly critical. It is far better to have a handful of high-quality links from respected sources than to have mountains of suspicious or useless links. Making an effort to promote your site only with well-respected sites and webmasters, and avoiding link exchanges like the plague, should give results that are well worth the effort.

An excellent way to promote good quality linking and interactivity with the rest of the web is to utilize the most popular social networking and bookmarking sites. Twitter, and especially its blog-broadcasting counterpart LoudTwitter, can be a great way to get exponential linking. Social linking and news sites such as Digg and StumbleUpon can also help you capitalize on Web 2.0 viral growth rates. Massive blog and social networking sites like Facebook, Myspace, and LiveJournal are becoming crawlable; you want to already have a significant foothold in those arenas when other webmasters are scrambling to catch up with the newest search algorithm tweaks. And do not overlook the power of RSS. The number of subscribers to your RSS feed is already becoming a key measure of your site's relevance and thus its ranking.

Search engines exist not to count links, but to give users what they want, and content is still what users are looking for. You do still need to do the “real work” of web sites: getting respected authorities to review your products, providing excellent multimedia offerings, and writing articles and posts.

About the Author: SEO Sapien is an SEO company. We offer affordable and guaranteed search engine optimization services. You can visit our site at http://www.seosapien.com for more information and SEO prices.
