
March 6, 2011

Can Google Truly Make Quality Content King of The Web? – A SPN Exclusive Article


For webmasters and e-marketers who can remember a web without Google, life was much less complicated, and a lot less tumultuous, back then. Since Google came onto the scene and became the world’s dominant search engine, things have changed drastically.

And not necessarily in a negative way: those same webmasters probably jumped for joy when they reached the top of Google for their keywords. Then they complained just as loudly when Google made one of its never-ending changes to its algorithm and they saw their rankings drop or, in some severe cases, disappear from the web altogether. In those early days, most of Google’s major updates were kept secret until the fallout left webmasters fuming or rejoicing.

However, with its recent algorithm updates, Google has openly broadcast the changes to anyone who is listening. The same openness applies to Google’s recent changes dealing with “content farms” and “low quality content” in its SERPs. Matt Cutts stated on his blog, “we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.”

Basically, what Google is trying to do with these changes is to increase the overall quality of their search results by lowering the rankings of sites which it perceives as low quality and containing little or no original content. These would be sites that have scraped content from other websites and that have displayed it, usually along with ads and/or links to affiliate products or other related sites.

At present, this only affects search traffic in the United States, but it is no small update, since 11% of queries are affected. And as some webmasters have noted, these changes are indeed improving search results.

One interesting find comes from Alexis Madrigal at www.theatlantic.com, who looked at Google’s new, improved search results for the keywords “drywall dust” and showed that there were indeed fewer “content farm” listings in the new results.

However, it’s Google’s definition of “content farms” which has many long-time webmasters concerned. As an online marketer who attributes most of his success to article marketing, I am somewhat worried by Google’s recent updates. I contribute articles on a regular basis to many online article directories, most of which are free for other webmasters to use as long as they keep my resource box and links attached. These articles get picked up and displayed on countless sites around the web. I also feature many of those same articles on my own site. I am sure there are thousands of webmasters who do the same thing and who are also worrying how Google’s new changes will affect all this duplicate content.

In most cases, my articles on EzineArticles get displayed at the top of the rankings in Google, sometimes even above the same article on my own main site. This is understandable, since EzineArticles is a much more respected authority site in the eyes of the search engines. Also, every site comes with its own unique keyword ranking “DNA”: sites are optimized for certain keywords, and any keyword-related content added to them will rank higher in the search engines, especially Google.

Years ago, I tried on several occasions to use “spin software” to make all my articles unique, but I could never bring myself to accept the resulting spins or versions of my articles. They just didn’t seem right and didn’t have the right flow. For me, writing has always been more of a pleasure than a chore and corrupting it in any way is just not worth it. Besides, I have been horrified more than once by seeing fragments of my articles mutilated on some of the aforementioned low quality sites which have scraped my content from the web.

Instead, I started writing unique articles or content which I placed on other sites. One of the main directories for this was Buzzle, which switched over to accepting only unique content two or three years ago. I have monitored Buzzle over the years and noticed that its traffic stats keep climbing steadily, probably due to all this constant, unique content being added. Other article directories have a more see-saw flow to their traffic numbers, if you compare them on sites like Alexa.

I believe this whole issue comes down to quality content and what the search engines perceive as quality. Just because something is unique doesn’t mean it’s quality content. Google has to judge the quality of the content it finds on the web, and it has over 200 ranking factors which it says can filter out the top content and present it to the searcher. The recent “content farm” update is difficult to evaluate: just because content is duplicated or appears on another site doesn’t mean it lacks quality.
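Detecting verbatim duplication, at least, is well-understood territory. As a purely illustrative sketch (a textbook technique, not anything Google has disclosed about its actual method), two pages can be compared by the word “shingles” they share:

```python
def shingles(text, k=4):
    """Break text into the set of overlapping k-word phrases."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard similarity of two texts' shingle sets.

    1.0 means identical wording; near 0.0 means little verbatim overlap.
    """
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A scraped copy of an article scores near 1.0 against the original even if a few words are shuffled, which is exactly the point: spotting duplication is the easy half of the problem, while judging quality is the hard half.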

As a webmaster, I have always placed related videos and press releases on my sites to complement my own content. I also reference other sites and data in my articles to back up an opinion or to prove a point. Although going forward, I will be very wary of placing content on my sites which is not unique.

Sometimes I find it ironic that Google, since day one, has not created any unique content; its robots crawl the web and compile that information into search results. The quality of those results largely depends on the quality of the scraped content and how well Google’s algorithm can filter out the low-quality stuff.

Supposedly, no human eyes judge this whole process, which I don’t believe for a minute. Google’s engineers are constantly monitoring the results and adjusting the algorithm to filter out what they don’t like, which brings us back to the question at hand: can Google really perfect a system where only the quality content on the web rises to the top?

Obviously, they can use such factors as bounce rates, time spent on site, pageviews per visitor, direct access, backlinks from authority sites, and bookmarks on social media/networking sites. Let’s face it: if a piece of content has 2,000 re-tweets, it must contain something of interest or quality for a lot of people. Likewise, if a piece of content or video has 5,000 comments attached to it and 10,000 Diggs or Likes, chances are good that it is of high quality.
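As a back-of-the-envelope sketch, those signals could be folded into a single score. The formula and weights below are entirely my own invention for illustration, not anything Google has published:

```python
import math

def engagement_score(retweets, comments, likes, seconds_on_page, bounce_rate):
    """Toy quality score built from the engagement signals above.

    Social counts are log-scaled so the jump from 0 to 100 retweets
    matters more than the jump from 9,900 to 10,000; a high bounce
    rate (0.0 to 1.0) pulls the score down.
    """
    social = math.log1p(retweets) + math.log1p(comments) + math.log1p(likes)
    dwell = math.log1p(seconds_on_page)
    return social + dwell - 5.0 * bounce_rate
```

On these made-up weights, a page with 2,000 re-tweets, 5,000 comments and three minutes of reading time handily outscores one that visitors bounce from after a few seconds.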

Of course, there is also the technological/mechanical side to site quality. If a site loads slowly and has countless dead links, then it can be easily ranked as low quality. Google has always maintained a user/surfer’s experience is important to what it lists in its results. Content farms and sites with little or no unique content would probably be high on Google’s list of what not to display.
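The mechanical side is the one part any webmaster can measure for themselves. A minimal link-health check, using only Python’s standard library (the timeout value is an arbitrary choice):

```python
import time
import urllib.error
import urllib.request

def check_link(url, timeout=5):
    """Return (ok, seconds) for one URL.

    ok is False for dead links, HTTP errors (status >= 400),
    timeouts, and malformed URLs.
    """
    start = time.monotonic()
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            ok = resp.status < 400
    except (urllib.error.URLError, OSError, ValueError):
        ok = False
    return ok, time.monotonic() - start
```

Run over every link on a page, this flags the “countless dead links” mentioned above, and slow responses show up in the elapsed time.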

However, judging the quality of a piece of information or writing, without human eyes viewing it, is not so easy. Unless Google has a thousand little Watsons running quietly in the background, intelligently reading and rating all that content, making only quality content king of the web will be extremely difficult for Google to do. Time will tell.


The author is a full-time online marketer who operates numerous niche sites, as well as two sites on Internet Marketing, where you can get valuable marketing tips for free. Titus Hoskins, Copyright 2011.

15 Responses to “Can Google Truly Make Quality Content King of The Web? – A SPN Exclusive Article”

    Aaron says:

    I am also concerned about the idea that bots can take care of determining quality. Frankly, the bot/algorithm does odd things, and does them all the time. For example, on my latest site I constructed the ‘bones’ of the entire site first (page structure without importing content), and forgot to add the file that keeps those pages from being crawled. The only other thing I had constructed were the metatags for each page. Literally, they were blank pages with metatags. What happened? They were all crawled and indexed. After importing 100% unique, quality content, 3/4 of those pages either dropped or were no longer indexed at all. To be fair, it is certainly possible that I have simply made some sort of technical error that is now making it hard for the bot to read the pages, but this is unlikely, as Google’s own webmaster tools say there is no problem. Moreover, there is no technical difference between the pages that remain indexed and the ones that now are not.

    At any rate, and without writing an essay: Content is king? No human intervention? I’m not entirely sure how I feel about either one of these Google ideals.

    I know the folks at Google are a clever bunch, but I have often wondered whether they really do rely on their algorithm alone for determining rankings or whether there is any kind of human intervention.
    Cheers,
    Karl

    I think I am fine with this. I am sick of seeing the same sites at the top just because some web guru knows how to play Google’s game. If they really are looking for more relevant content, then maybe I have a chance of getting further up the search results.
    Xavier

    Brad Barrett says:

    I’ve been following this in the WSJ and sent in this letter to the editor. Glad to see this issue front and center; the market wants quality search, not clever info-pirates.

    Letter to the Editor: WSJ
    2/27/11

    Google Algorithms Won’t Stop the Gaming of the System; Only People Can

    “Google Revamps to Fight Cheaters” (WSJ 2/26/11) gave me hope, but when I read that all Google is doing is modifying their algorithms, I realized that they don’t really want to clean up search. Taking the low road is more profitable. Trying to stop the gaming of the system by real companies like JC Penney and Overstock.com is not even the core issue. It’s the bogus shopping sites and content mills that steal original content and hawk products via their accomplices at Amazon. Google and Amazon have enabled this gaming of the system and profited magnificently from it. Companies bid up search words, SEO becomes a huge industry, and Amazon cleans up by providing commissions to pirates.

    If Google truly wants to clean up their search and make page 1 relevant again, here’s how to do it in 90 days:
    1. Staff a group of entry-level interns in a department to root out and blacklist content and information piracy.
    2. Create a system for legitimate companies to report to Google content pirates and sites that create neighboring URLs to divert traffic. It will take a new hire 5 minutes to know that http://www.grillgrates.org is ripping off my company http://www.grillgrate.com. Weber (the king of grills) would have a list 2 pages deep of leeches drafting on their brand!
    3. Contact the offending firms and the reporting firm with your decision and allow rebuttal.

    An algorithm can’t do that, and I doubt Google will either. Fortunately, the market is sending loud signals to Google to stop gaming the system by enabling and rewarding pirates who steal content, divert traffic, and reap commissions from affiliate programs such as Amazon’s. I’ve always viewed Google as a high-road company, but now that my livelihood depends on ‘being found’ on the Internet, my perspective is a lot different. It truly is Somalia in cyberspace, the pirates are everywhere, and they could be banished in a few clicks.

    Brad Barrett
    Cartersville, GA

    brad@grillgrate.com

    Calle Zorro says:

    Message to Google: Do something RIGHT in your algorithm and change it so that it does NOT show all of the thieving, stealing, unethical, illegal peer-to-peer download sites.

    From what I can tell, there is virtually nothing legal on these kinds of sites. Virtually everything is in breach of copyright law: videos, music, digital products, and so on. The whole scheme is a clever way to be illegal without being “illegal”. So why does Google present them at the TOP of pretty much any search result for a digital product that has value?

    If Google would simply stop presenting these types of sites in their search results, that would go a long way towards improving the world economy. I can’t imagine how many millions or even billions of dollars are STOLEN each year through these sites, and Google is serving them right up at the top of its search results for all the thieves to easily find, making it easy for people who are on the edge of honesty to be dishonest.

    I would love for other producers of digital products to get involved and add their comment/vote on this subject…and just maybe there can be enough of us that we can get Google to pay attention and make the change.

    Garage Door Repairs says:

    The last search I did for one of our major keywords still brought up 4 pages from content farms. The search was in the UK, so Google’s new algo may not be in full force here yet, but I ask you: is it working? I am not so sure!

    Himagain says:

    THIS was a good article. The comments above are also relevant.
    BUT it is the “elephant in the room” that is the problem.
    Google’s money is in pulling me to a site – any site – where an ad – any ad – can be displayed, best of all a paid display ad.
    It is compounded by selling listing access.

    Spam could likewise be stopped by a simple limitation on bulk traffic numbers through the system.
    But that is part of the giant money pool.

    Adrian Head says:

    I agree with the comments above. When I search the web using Google, the results that infuriate me are the ones where I end up opening a link to what is in effect another search site, frequently loaded with AdSense advertisements and nothing on the subject I searched for unless I click an advert or use their outbound links. This is not quality content by any stretch of the imagination.

    I also wonder about bias towards Google’s own controlled websites. They have just paid £37.7m for a UK-based comparison site, BeatThatQuote.com. One can’t help wondering how long it will be before this site outranks other, more established comparison sites. Should a search engine with the majority share of the search market be owning websites in competition with others? I don’t think so. It smacks of unfair competition. If Google wants to be seen as an independent provider of valid search results, it needs to be totally independent, with no websites of its own to direct traffic towards. Google might say its own sites are subject to the same criteria as all other sites on the web, so if one appears on page one it is there on merit, but then they have access to their algorithm whilst the rest of us are only let in on certain aspects of it!

    Brad Barrett says:

    Himagain hits the root of the problem. It’s in Google’s financial interest to allow the gaming of the system. Big money is in play. Fortunately, the market is speaking louder, and big money will likely move the playing field toward honest and true search. I hope. No one wants regulation to foul up the Internet, do they?

    Aaron says:

    I just wanted to add a big “wow.” It’s really nice to read so many people on a topic with whom I agree so wholeheartedly. Speaking of content farms, thieves, copyright infringement and page rankings, my primary niches are “make money” and “affiliate marketing.” (Yes, I know; hear me out before you projectile vomit onto your monitor.) For anyone remotely aware of these niches and their representation on the Internet, I don’t really need to say much. However, attempting to wade through the dank miasma of first-page search results in ANY search engine for these niches is a singularly depressing experience. There really are those of us out there attempting to do things ethically, and in a manner which any and all reasonable human beings would call ‘relevant.’ But I have to fight with one of two extremes, given how the Google algorithm in particular appears to work: ClickBank-laden, punch-the-monkey, banner-flashing, popup-infested garbage; or sites so conservative about making money on the Internet and affiliate marketing as to be essentially worthless. I am convinced that it simply defies the nature of the human experience to assert that an algorithm can be the ultimate arbiter of relevance, and of what is, and is not, ‘quality’ content.

    Until algorithms can read, and UNDERSTAND, a colloquialism as simple as “the forest for the trees” without being programmed for that specific colloquial instance, algorithms have no business being the first, last and ultimate decision maker in regards to page ranking, relevance or ‘quality.’

    SEO Bedford says:

    Was there Internet before Google? LOL

    Seriously now, Google has the tools and the knowledge to make quality content rise and stay at the top of results pages. It is not a simple task, but certain ranking signals, like authoritative links and retweets, can help assure that.

    No matter what update Google makes, the privacy policy is still only there as a legal formality and no one will read it. And since it is just a formality, Google interprets it however they need to when the time comes to actually use it.

    It’s really amazing what you wrote on this website about website content. It helped a lot. Thanks, mate.

    Google can make content king…they pretty much run the show.

    Google is the actual king. Whether it makes content the top gun or not, that doesn’t change the fact that Google can modify the rankings whenever it wants, as per its wishes.
