
May 1, 2011

Flashy Websites Can Be Too Hot for Bots


Website designers often employ technologies like Flash, JavaScript, Ajax, and Silverlight to make their sites attractive, fast, and easy to use.

While there are great reasons to use these technologies, they can create problems when it comes to search engine optimization (SEO). Web page content that’s wrapped in a fancy package can be difficult – or impossible – for a search engine to “see.” That means the search engine crawlers may have a hard time understanding what your page is about, and so may not index all your important pages.

The search engines may also find it difficult to follow any links – internal or external – you’ve placed in web page content rendered in Flash, Silverlight, or other technologies. That matters because search engines use your internal links to discover other pages on your site, to understand how pages on your site relate to each other, and to determine which pages on your website are more important than others.

Designers sometimes incorporate the search function for a website into their designs. That can be helpful for people, but it may pose problems for search engines trying to crawl your site. If pages on your site are accessible only from a search box, the search engine won’t be able to see those pages, because search engines don’t type keywords into search boxes to find relevant web pages.

Below I’ll tell you about some of the popular technologies for creating attractive, people-friendly web pages, describe the potential issues, and tell you how to avoid them.

JavaScript Menus

Web designers often use JavaScript to make navigation menus with special mouse-over effects, animated drop-downs and other interactive features. While these design innovations can be truly useful for human beings, they can also be a real problem for search engine crawlers.

Today, Google’s crawler – fondly known as Googlebot – can actually follow many links created in JavaScript. But it can’t follow all of them. And while Google is the dominant search engine, with about 70 percent of people using it, 30 percent of your potential customers are using a search engine other than Google. Those people are even less likely to see your JavaScript links. If your business depends on people coming to your site from search engines, saying that the bots can probably follow your JavaScript links is a bit like your boss saying your paycheck probably won’t bounce.

A CSS menu can do pretty much everything a JavaScript menu can do, and without any of the issues that cause problems for search engine crawlers. Don’t forget that mobile phones, tablets and the other small computers that are increasingly popular for surfing the Web also have problems displaying JavaScript, but do fine with CSS.
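To make this concrete, here's a minimal sketch of a pure-CSS drop-down menu. The URLs and class names are hypothetical, and a production menu would need more styling:

```html
<!-- Plain HTML links: search engine crawlers can read and follow these -->
<ul class="nav">
  <li>
    <a href="/products/">Products</a>
    <ul class="dropdown">
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>

<style>
  .nav .dropdown { display: none; }           /* sub-menu hidden by default */
  .nav li:hover .dropdown { display: block; } /* revealed on mouse-over */
</style>
```

Because the links are ordinary `<a href>` tags right in the HTML, every crawler sees them whether or not it understands the CSS.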

There’s a nice CSS menu generator here: www.CSSMenuMaker.com.

JavaScript Click-Tracking Links

People who are serious about tracking the business performance of their website use some form of analytics. Seeing how visitors get to your website, and where they go after they land on it, helps you understand how to turn more visitors into customers.

Sometimes web developers use a single page with pre-set parameters to track clicks. The page captures the information about which links were clicked, and then redirects the web browser to the final page that will be shown to the person who’s surfing.

This redirect-page approach is very similar to JavaScript click-tracking, and sadly, it has similar effects when it comes to search engines. Even if the website developer uses a 301 redirect, some of the value of that link is lost. I don’t recommend this approach.

Solution? Use a free click-tracking service like Google Analytics instead of click-tracking JavaScripts. Yes, Google Analytics uses JavaScript, but NOT in the links themselves.
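To illustrate the difference, here's a sketch — the URLs and the trackClick function are hypothetical:

```html
<!-- Crawler-hostile: the real destination is buried in JavaScript -->
<a href="javascript:void(0)" onclick="trackClick('/pricing/')">Pricing</a>

<!-- Crawler-friendly: a plain href the bots can follow; an analytics
     package like Google Analytics records the click on its own -->
<a href="/pricing/">Pricing</a>
```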

Flash

Flash is an incredible technology that enables a richer user experience. Flash is often used for video, slideshows and interactive features on a website. However, search engines can’t “see” any content that’s rendered in Flash.

Many websites have everything in Flash. It can look great to human visitors, but to search engines, it looks like the website consists of a single web page – and one with very little content, at that. If the search engines think your entire site consists of a single page, they’ll think your site doesn’t have much useful content, and won’t rank your site high in search results.

Google has improved its crawler’s ability to “see” what’s in a Flash object, especially if the web designer has followed some fairly straightforward rules.

Still, it’s not certain that all text rendered in Flash will be accessible to Googlebot. At the risk of repeating myself, let me remind you that 30 percent of searchers don’t use Google. Do you really want to fence out a third of your potential customers?

Bottom line: Use Flash for decorative elements. Render your links and navigation menus in HTML, so search engine bots can see them.
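If you do embed a decorative Flash object, one hedge is to put real HTML inside the `<object>` tag as fallback content — browsers without the Flash plugin (and crawlers) see the HTML instead. A sketch, with hypothetical file names and text:

```html
<object type="application/x-shockwave-flash" data="slideshow.swf"
        width="600" height="300">
  <!-- Fallback content: shown when the Flash plugin is unavailable -->
  <p>Spring slideshow: hand-made widgets from our Portland workshop.</p>
  <a href="/products/">Browse all products</a>
</object>
```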

Silverlight

This technology, created by Microsoft Corp., enables rich media experiences similar to what you can do with Flash. Googlebot has problems seeing the text and links in Silverlight.

Just as with Flash, you’re best advised to use Silverlight for decorative purposes, and use HTML to render links and navigation menus.

Band-Aid Solutions

Some web designers apply a Band-Aid solution to the problems caused by rendering navigation menus in JavaScript, Flash, Silverlight or Ajax. They’ll create an HTML sitemap with links to all the pages, and sometimes submit an XML sitemap to the search engines.

These sitemaps will, in fact, allow search engines to see all the pages on your site. However, the search engines still won’t be able to see how many pages on your site link to any given page. That’s important information – the number of internal links to a page tells search engines how important that page is.

If your main navigation menu is in HTML or CSS, and all your major pages have the same navigation menu, then all your important pages will be linked from many pages on your site. Minor pages on your site will have just one or two links from specific pages. The variation in the number of links to each page tells search engines very clearly which are the most important pages on your site.

If, on the other hand, your navigation menu is entirely in Flash or JavaScript, and you’ve got a sitemap as a Band-Aid solution, the only internal link to each major page that search engines can see will be from the sitemap. That gives each page on your site just one link, making it appear to a search engine bot that each page is as important as every other. That’s not accurate, and means that your most important pages won’t show up as high in search results as they should.
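You can see the effect with a toy model. The sketch below (all page URLs are made up) counts how many pages link to each page — roughly the signal crawlers extract from your internal links:

```javascript
// Toy model of a site's internal link graph: each page maps to the
// list of pages its navigation and body links point at.
// (All URLs here are hypothetical.)
function countInternalLinks(pages) {
  const counts = {};
  for (const page of Object.keys(pages)) counts[page] = 0;
  for (const links of Object.values(pages)) {
    for (const target of links) {
      if (target in counts) counts[target] += 1;
    }
  }
  return counts;
}

const site = {
  '/': ['/products/', '/about/'],
  '/products/': ['/', '/products/widgets/'],
  '/about/': ['/', '/products/', '/press/'],
  '/press/': ['/'],
  '/products/widgets/': ['/products/'],
};

const counts = countInternalLinks(site);
// '/products/' gets 3 internal links while '/press/' gets only 1 --
// a clear importance signal. With sitemap-only navigation, every
// page would get exactly 1 link and the signal vanishes.
console.log(counts);
```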

Google Webmaster Tools can tell you how many pages on your site link to any other page. Log in to Google Webmaster Tools, click on Your Site On The Web, then click Internal Links.

Pages Accessible Only by Forms

Some sites have pages that can be reached only by filling out a form. For instance, one of the largest automobile insurance companies in the world used to have a simple form on its home page that asked for your postal code. You’d fill that out, click on Submit, and be directed to the portion of the insurer’s site that dealt with your region.

It sounds logical, but search engine crawlers don’t type in postal codes, and they don’t click on Submit. To the search engines, this insurer’s site looked like just a single page – and a pretty boring one, at that.

Search forms pose a similar problem. While this is a tremendously useful way for a human to find information on your site, it’s not a navigation method the crawlers can use. Crawlers don’t type words into a search box, and they don’t click on a Search button.

The solution? Keep the search form – it’s great for your human visitors. Add an HTML sitemap, and submit to the search engines an XML sitemap that links to every page you want indexed.
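For reference, an XML sitemap is just a flat list of URLs in the sitemaps.org format. A minimal example (example.com stands in for your own domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/portland-or/</loc></url>
  <url><loc>http://www.example.com/seattle-wa/</loc></url>
</urlset>
```

Every page that’s reachable only through a form should get its own `<loc>` entry here.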

How to Check If You Have a Crawl Issue

Luckily, there are a number of tools that can tell you which web pages and links pose problems for search engine crawlers. A few of my favorites:

Xenu Link Sleuth – Download this one for free. It can also create an XML sitemap for you.

SEOmoz Crawl Test – Part of a tremendous suite of site analysis features available to SEOmoz PRO subscribers.

AboutUs Website Visibility Report – You can quickly check how many pages of your site the major search engines have indexed. If the number is lower than you think it should be, you can investigate further.

More Resources

GoogleGuide description of how a crawler works

Learn how to use robots.txt correctly

Googlebot and indexing Flash content

Google Analytics

Google Webmaster Tools

Google’s guidance on making Ajax applications crawlable

Read about other classic SEO mistakes even good web designers can make.


Check out how your home page looks to search engines and people with the free Home Page Analysis. Want a deeper look at all your site’s pages? Try an AboutUs Site Report.

This article, originally published on AboutUs.org, was contributed by Michael Cottam of MichaelCottam.com.

Michael is an independent SEO consultant in Portland, Oregon, and an associate at SEOmoz in Seattle, Washington. He recently created Visual Itineraries, a lead-generation website for travel agents, and in 2001 he co-founded TheBigDay, a honeymoon travel and registry company, where he was responsible for the website and SEO. He’s on the board of SEMpdx and manages sponsorships for SEMpdx events, including the annual SearchFest conference.

7 Responses to “Flashy Websites Can Be Too Hot for Bots”

    Winnipeg Webpage says:

    This article is true if you have NO EXPERIENCE with Flash, Java, ActionScript and XML. Even if you’re not an advanced Flash developer, you should know that Flash content is only displayed within an HTML file. You can add all your content in an HTML content text tag. Novice designers can practice this method. The advanced developer will add an XML sitemap and submit all his pages to the search engines. I think Flash developers are just lazy: they optimize only the index or landing page in the HTML meta tags and don’t complete the SEO standards for all their pages. Of course, if you look in the right place you will find Flash CMS websites that even offer SEO fields for each page, article, ALT tag, photo and video!
    So keep in mind… Flash is no longer being looked at and used as a full-page player; advanced developers treat it more like a widget player. Did you guys forget we can play a Flash intro, for example, where all the text you see is pulled from a standalone XML file?
    We can change that content without editing the Flash file or having to republish it. The XML file, included in the sitemap, gets optimized by search engines without a glitch. This has been tested many times.
    So the real question is… do you really know how to optimize Flash content?
    Until you get a good grip on Flash trends and its fast-paced changes, you should stick with CSS, and of course www.CSSMenuMaker.com is a great place to start.

    Then again, what do I know?
    Flashca, Flash Media Team.

    I’ve posted a 100% Flash site. The key to SEO is promotion of the site itself. Your meta tags are important, but promotion is key to high rankings. I’ve proven this time and time again. It’s the promotion of the site that gets your site ranked high. Anybody who would like to know more, or have me assist them in getting a Flash site ranked high in Google, please contact me directly.

    Zippy Cart says:

    Flash doesn’t have to be the kiss of death for a well-optimized site, but getting that rich text as far up the HTML document as possible looks best for crawlers, so I think that erring on the side of caution isn’t bad advice. Coders who know the techniques to circumvent this can still do it and should feel free to, but this article is just bringing up the idea that for some people, avoiding Flash isn’t a bad idea.

    Very interesting food for thought. Thank you for your point of view.

    Julio Fraire says:

    It is good information to consider
    Thank you …

    Seductor says:

    Very interesting food for thought. Thank you for your point of view. Incredible, very very good my friend.

    Google Analytics I do not like; your post is very good, but the truth is that they pay very little.
