
Googlebot Wants to See the Web the Way You Do

Google never, ever does something for no reason. Sometimes it’s just a matter of waiting patiently to figure out what that reason is.

In May, Google introduced a Fetch and Render tool in Google Webmaster Tools that renders web pages the way Googlebot sees them. At the time, it was unclear why the company was introducing the tool, though the move hinted at future plans involving fetch and render.

On Oct. 27, we got a definitive answer.

That Fetch and Render tool foreshadowed new guidelines which state that blocking your CSS or JavaScript files from being crawled can hurt your search rankings and indexing. When you allow Googlebot to access those files, as well as your image files, it can read your pages correctly. When you don’t, you can distort the way the algorithms render your content, and your page rankings may decline as a result.
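The usual culprit is a robots.txt file that walls off script and style directories. A minimal sketch of a crawl-friendly robots.txt (the directory names here are assumptions for illustration, not a template Google publishes):

```
# robots.txt — hypothetical paths for illustration
User-agent: Googlebot
Allow: /css/
Allow: /js/
Allow: /images/

User-agent: *
Disallow: /admin/
```

The point is simply that nothing Googlebot needs in order to render the page sits behind a Disallow rule.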

So the tool released a few months earlier was basically a warmup: you can use it to make sure Googlebot is rendering your web pages correctly.

It’s all part of the drive toward better user experience that ultimately lies behind Google’s changes.

The Nitty-Gritty of the Changes

Google says the change was basically meant to make its indexing system behave more like a modern browser, which has CSS and JavaScript turned on. So, as always, Google’s claim is that it’s doing this for the greater good: it wants to make sure it’s reading pages just like the people who will be looking for your content.

That’s a big change from before, when Google’s indexing systems were more like text-only browsers; Google cites the example of Lynx. But the search engine says that approach no longer makes sense now that it indexes based on how a page actually renders in a modern browser.

The search engine offers a few suggestions for optimal indexing, including:

  • Getting rid of unnecessary downloads
  • Merging your CSS and JavaScript files
  • Using the progressive enhancement guidelines in your web design
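As a rough illustration of the file-merging suggestion (the file names here are hypothetical), concatenating several small scripts into one bundle means one download replaces several:

```shell
# Create two hypothetical standalone scripts
mkdir -p js build
printf 'console.log("menu ready");\n'  > js/menu.js
printf 'console.log("forms ready");\n' > js/forms.js

# Merge them into a single bundle, so one request replaces two
cat js/menu.js js/forms.js > build/bundle.js

wc -l build/bundle.js
```

Real sites would use a build tool for this, but the effect is the same: fewer requests for Googlebot and for visitors.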

What This Means

With any Google change, the real question is: what does this mean? How will it affect webmasters, and what could it do to SEO?

Clearly, the answer to that second question is that sites that do not adhere to the suggested guidelines will see their search results suffer. Make sure your webmaster fully understands what Google is asking for, and discuss what changes should be implemented and how they could affect your Google rankings.

Your aim is to create crawlable content, and that means following Google’s suggestions. Use the Fetch and Render tool to make sure everything on your site is in order: it crawls and renders your site so you can compare what Googlebot sees with what comes up in your target audience’s browsers.

The tool gathers all your resources: CSS files, JavaScript files, images. Then it runs the code to render your page’s layout as an image. Once that image appears, you can do some detective work: is Googlebot seeing the page the same way it renders in your browser?

If so, you are in good shape. If not, you need to figure out what tweaks to make so that Google sees the same thing you do. Here are potential problems that could be making your site’s content non-crawlable:

  • Your website is blocking JavaScript or CSS
  • Your server can’t handle the number of crawl requests it receives
  • Your JavaScript is removing content from your pages
  • Your JavaScript is too complex and is stopping pages from rendering correctly
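The first item on that list is easy to check programmatically. A minimal sketch using Python’s standard-library robots.txt parser (the rules and URLs below are hypothetical) shows how a blanket Disallow keeps Googlebot away from the very files it needs to render the page:

```python
from urllib import robotparser

# Hypothetical robots.txt rules that block script and style directories
rules = """\
User-agent: *
Disallow: /js/
Disallow: /css/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Googlebot may fetch the page itself, but not the files needed to render it
for url in ("https://example.com/index.html",
            "https://example.com/js/app.js",
            "https://example.com/css/site.css"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)
```

Running a check like this against your own robots.txt is a quick way to confirm what the Fetch and Render tool is telling you.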

Why These Changes, Why Now

Google always has intent behind what it does, and here’s my read on its intent with these changes: it’s making user experience a bigger factor in its search rankings. Think about it: the emphasis on page loads and on rendering marks two major steps in that direction.

That has also prompted speculation that the company could start using mobile user experience for its rankings as well. There has been rampant speculation in recent months, as mobile usage begins to overtake desktop, that Google will begin shifting its focus to the mobile web for search engine optimization.

So could this be one of the first steps on the way to those big changes? Perhaps. I always think it’s dangerous to try to get too many steps ahead of Google; the search engine likes to reverse course and throw people off from time to time. It does not like it when SEOs make changes in anticipation of its actions, preferring to dictate the course itself. And I do think the idea behind the crawlable-content changes makes sense: you have to keep up with the times.

But others could argue that keeping up with the times is exactly what Google will be doing by putting greater emphasis on mobile user experience.

The Bottom Line

Like any change from Google, this one will require adjustment and a fair bit of vigilance. I think it’s mostly a sign of things to come. User experience is really important to Google these days, and you would be wise to start looking at your mobile site in those terms. Make sure that you are doing everything you can to make your site mobile friendly, while still presenting a great desktop experience.

That way if Google does actually start penalizing based on poor mobile user experience, you will already be two steps ahead.

About the author


Adrienne Erin

Adrienne Erin writes twice weekly for SiteProNews about online marketing strategies that help businesses succeed. Follow @adrienneerin on Twitter or visit Design Roast to see more of her work or get in touch.

