
October 13, 2015

Is Your Website a Serial Killer?

Photo Credit: Augur Marketing via Flickr

It goes without saying that no one wants to upset the 800-pound gorilla in the room named Google.  Yet that is precisely what many business owners do when they employ serial sites to do their bidding.  Google hates serial sites with a passion.  That’s because these online clones attempt to generate search position by the cookie-cutter method, where essentially the same site is duplicated to target a number of individual keywords.  If the Googlebots catch you using serial sites, all your sites could wind up sandboxed (or at the very least lose ranking).

The important question becomes: “What is a serial site?”  The answer has changed over the past year or so.  In fact, landing pages that were once considered perfectly legitimate by Google have since been deemed serial sites. The way website owners found this out was when their landing pages disappeared from the first page of Google.  Other recent changes in Google’s algorithms have also turned out to kill site ranking, including being mobile-unfriendly, using certain programming languages and methodologies, and where and how your site employs backlinks.  On today’s Working the Web to Win blog, I am going to explore the ins and outs of the technology choices that can kill your site stone cold dead in the eyes of the world’s most popular search engine.


Photo credit: Wikipedia

Why all the Fuss?

Those who have been working the Web for any length of time have no doubt heard about Google’s algorithm changes that sport cute names like Panda, Penguin, Hummingbird and Pigeon.  The majority of these sea changes were rolled out to deal with professional search engine optimizers who for years plied their trade by using a number of no-holds-barred techniques to advance their clients’ rankings regardless of the rules.  This is what is known in the business as black hatting.

Until roughly 2010, the search engine spiders weren’t very sophisticated when it came to understanding what they read on websites.  So black hat techniques like keyword stuffing, invisible text, cloaking, redirecting, content spamming and link farms were employed with glee.  Many black hat SEO pros made a tidy sum by helping clients cheat their way to the top of the search engines. Then a funny thing happened on the way to the bank: the search engines started programming their algorithms to selectively search for black hat techniques.


Courtesy of:  en.wikipedia.org

The Panda Pounces

Panda, which struck in February of 2011, marked the first time the Googlebots were able to look at websites from a contextual standpoint. In other words, they were not only able to read the page, they could also make qualitative judgments on the validity of the text they were seeing.  Among other things, they looked for nonsense statements stuffed with keywords of the kind typically used by black hatters. They also kept a weather eye out for factual errors, invisible or micro text, duplicate text, redirects and a number of other telltale hints that a site was employing black hat techniques. While this didn’t exactly put all black hat operators out of business overnight, it did put a dent in their nefarious business practices.

Penguin Takes Flight


Courtesy of: www.flickr.com

Up next was Penguin, which first waddled onto the ice in April 2012.  Its foremost task was to curtail link farms, which had been popping up like weeds.  Google had always put a premium on backlinks, and a number of black hat operators were capitalizing on this by creating scads of bogus sites employed exclusively to provide backlinks by the boatload to clients near and far.  Once the Googlebots had been specifically programmed to search for and destroy sites that were hiring link farms to improve their ranking, it wasn’t long before the farms bought the farm.

Hummingbird Hums a Different Tune


Courtesy of: commons.wikimedia.org

With the number of people using smartphones to surf the Web on the rise, by late 2013 Google mandated that website owners make sure their content was readily accessible on every available platform.  This meant either commissioning a .mobi site or employing a responsive design that adjusted the content to fit tablet PCs and smartphones.  This edict was taken to the extreme in April 2015 when Google unleashed what became known as Mobilegeddon, where website owners were told to screen their websites to see if they passed the equivalent of an electronic scratch-and-sniff test. By submitting their URL to Google’s online test, they could tell whether a site was considered “mobile-friendly.”
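Before (or alongside) submitting a URL to Google’s mobile-friendly test, a quick do-it-yourself pre-check is to look for a viewport meta tag, which responsively designed pages almost always declare. The Python sketch below illustrates that idea; the URL is purely hypothetical, and this crude heuristic is no substitute for Google’s actual test.

```python
# Rough pre-check for mobile-friendliness: responsive pages almost always
# declare a viewport meta tag. Illustrative only; not Google's test.
from urllib.request import urlopen, Request

def has_viewport_tag(url: str) -> bool:
    """Return True if the page's HTML declares a viewport meta tag."""
    request = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urlopen(request, timeout=10).read().decode("utf-8", errors="ignore")
    return 'name="viewport"' in html.lower()

if __name__ == "__main__":
    # Hypothetical URL used purely for illustration.
    print(has_viewport_tag("https://www.example.com"))
```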


Courtesy of: commons.wikimedia.org

Pigeon Flies the Coop

Taking flight on July 24, 2014, Pigeon was tasked with increasing the value of local search.  What this algorithm tweak was supposed to do was make local searches more intuitive by providing search results based upon the geographic location of the website.  While this change benefited a number of local businesses, it also had the unsettling effect of diminishing the results of a number of businesses that worked on a national or even global scale.  It also gave more weight to online directories and portals that aggregated local listings.  Like most algorithm changes, this caused initial panic among those whose page 1 positions were usurped, followed by damage control to reclaim the lost territory.

The Geotargeted Faux Pas

This brings us full circle, back to the top of our story, since serial sites were often used to regain lost ground by creating geotargeted clones.  If you sold hotdogs online, you might commission a number of sites targeting major cities, such as PhiladelphiaHotDogs.com, DetroitHotDogs.com and DenverHotDogs.com. These sites would be virtual clones of one another with the exception of their URL and the name of the city in the content.  While initially successful, this technique was also deemed off limits, and the Googlebots were once again programmed to seek and destroy those who employed serial sites.  The fallout meant that others who were using legitimate landing pages were also scooped into the serial site net and tarred with the same brush.


Courtesy of: commons.wikimedia.org

That does not mean, however, that landing pages have to be abandoned altogether.  Let’s say that you sell apples, bananas, oranges and grapes online.  While you can set up a FruitsRUs website that gathers all these elements under one umbrella, this isn’t the most efficient way to please either the Googlebots or prospective website visitors.  In the first place, by featuring four different fruits on one site, the Googlebots will not give priority to any single one of them.  This means watered-down search results.  It also means that a potential customer who happens upon your site has to hunt for the fruit they seek.  This usually translates into a high bounce rate.

Pleasing the Search Gods

To minimize bounce rate and improve ranking, what many savvy business owners did was create four separate landing pages, each of which gave priority to a single item.  So ApplesRUs, BananasRUs, GrapesRUs and OrangesRUs were created.  While this was cost and time efficient, it was not effective. If each of these sites was a virtual clone of the others, with the exception of the URL, they would soon be deemed serial sites and sandboxed.  To avoid the ire of the Googlebots, each page can retain the FruitsRUs brand (i.e. the look and feel), but the text, graphics, videos and even the offers on each page must be unique and focus on its particular fruit.
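One rough way to gut-check whether two landing pages read like clones is to compare their copy with a simple text-similarity ratio. The sketch below is purely illustrative (the sample copy is made up, and this is not how Google actually detects serial sites), but it shows how little a find-and-replace of the product name really differentiates a page.

```python
# Back-of-the-envelope "clone check" for two landing pages' copy.
# Illustrative only; not Google's duplicate-content detection.
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0-1 ratio of how similar two blocks of page copy are."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

# Made-up sample copy for two hypothetical landing pages.
apples_copy = "Fresh apples delivered overnight. Order crisp apples today."
bananas_copy = "Fresh bananas delivered overnight. Order ripe bananas today."

score = similarity(apples_copy, bananas_copy)
print(f"Similarity: {score:.0%}")  # near-identical copy scores close to 100%
if score > 0.8:
    print("These pages read like clones; rewrite the copy for each fruit.")
```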


Courtesy of: en.wikipedia.org

Your landing pages also need to be focused on the geotargeted area (verbiage, pictures, maps, keywords and URLs), which likewise keeps them from being serial in nature. On top of that, they need to be listed in as many legitimate, authoritative directories as possible. We post our clients to the top 100 search directories whenever possible to make sure they are listed. If a website is not listed in a search directory, that site can’t show up when Google serves up directory listings. Today, Google’s Pigeon update makes this all but mandatory.

The better your landing pages focus on a single subject, the higher they will rank for those specific keywords (assuming all other ranking factors are equal). For example: if a particular page focuses on grapes, and its keywords, content, offer, pictures, videos and testimonials are all about how great your grapes are, it will rank higher. The page also needs to be well shared on the social networks and backlinked from many authoritative sites. Landing pages created this way will outrank sites that are less focused (more than one fruit) or not as well connected.
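As a quick editorial check of that single-subject focus, you can measure how much of a page’s copy is actually about the target keyword. The Python sketch below (using made-up sample copy) counts the share of words matching the keyword; it is a rough sanity check for writers, not a ranking formula Google publishes.

```python
# Crude "focus" metric: share of words on the page matching the target keyword.
# An editorial sanity check only, not a published Google ranking factor.
import re
from collections import Counter

def keyword_focus(page_text: str, keyword: str) -> float:
    """Return the fraction of words on the page that match the keyword."""
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# Hypothetical copy for a grape-focused landing page.
grape_page = ("Our grapes are hand picked. Buy seedless grapes online and "
              "get fresh grapes shipped to your door.")
print(f"Grape focus: {keyword_focus(grape_page, 'grapes'):.1%}")
```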


Courtesy of: pixabay.com

We have, in past blog posts, written about how focused content affects ranking. We have also written about the importance of having all the conversion factors on a page, and how keeping those factors above the fold of a website will increase leads, sales and more. Focused pages (single subject) will always outperform general pages (multiple subjects) if all else is equal. Obviously there are many organic page ranking factors involved (at my last count, more than 250).

We believe the factors with the greatest impact on your ranking are the quality of your content, its timeliness and relevance, how well connected it is, and whether your social engagement and page sharing are positive or negative. Make no mistake, content is king. A focused page with high quality, relevant and timely content that is being shared and is well connected will rank higher than any competing page that is not its equal in these respects.

The following is a list of must-read articles to help you achieve a page-one organic ranking.

Is SEO Still Important Today or in the Future?

What’s Up With SEO

How To Avoid Being Caught in an SEO Phishing Trap


Courtesy of: pixabay.com


How to Make Google your Best Friend

What your Webmaster Should be Telling You

Who Needs Cyber Babble

Holy Algorithm Batman Penguin Strikes Back

It’s a New Year and the Rules have Changed

Dirty Tricks can Deep Six Your Business

Teams Win in the Search Engine Game

Has Google Given Everyone the Bird with its Pigeon Update?

What Exactly Does SEO Really Mean Today?

Extreme Website Makeovers Fixes that Boost Traffic, Conversions and Rankings

If you put in the effort to understand and avoid the speed bumps that Google has erected on the Information Superhighway, there isn’t any reason you should be labeled a serial killer by the world’s most popular search engine. At least not until their next algorithm “tweak” rears its ugly head.



Carl Weiss has been working the web to win since 1995 and has helped hundreds of companies increase their online results. He is president of W Squared Media and co-host of the weekly radio show Working the Web to Win which airs Tuesdays at 4pm Eastern on BlogTalkRadio.com. Click here to get his latest book "Working The Web to Win: When it comes to online marketing, you can't win, if you don't know how to play the game!".
