
March 15, 2010

10 Do’s and Don’ts to Avoid SEO Mistakes

With so much misinformation out there, along with a lack of knowledge about how SEO works, you could end up getting your website banned from the search engines. Learn how to avoid common mistakes with these 10 simple do’s and don’ts.

10. Don’t Use Flash for SEO. Flash websites are very eye-catching, but search engine spiders cannot read Flash content and therefore cannot index it. If a Flash-centric website is unavoidable and you need search engines to index it, you will have to offer an HTML version too.

9. Don’t Use Too Much JavaScript. Searchbots are not designed to read and understand JavaScript code. If a website embeds a few lines of text inside JavaScript, chances are that searchbots will ignore the entire block of code along with the text; JavaScript menus are a common example. Keep the use of JavaScript to a minimum, and where it is unavoidable, move it into an external JavaScript file.
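A minimal sketch of the external-file approach (the file name /js/menu.js and the fallback links are purely illustrative):

```html
<!-- The menu script lives in an external file instead of inline in the page -->
<script type="text/javascript" src="/js/menu.js"></script>

<!-- Plain HTML links as a fallback, so searchbots can still follow the navigation -->
<noscript>
  <a href="/products.html">Products</a>
  <a href="/about.html">About</a>
</noscript>
```

The plain-HTML fallback matters as much as where the script lives, since it gives spiders crawlable links even when they ignore the script.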

8. Do Implement a Robots.txt File. The primary purpose for using a robots.txt file is to gain complete control over the data indexed by the searchbots. Implement a Robots.txt file only when you want to prevent unwanted web pages from being indexed. A robots.txt file is always placed in the root folder of the website where the searchbots can access it easily.
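As a sketch (the paths shown are purely illustrative), a robots.txt placed in the site root might look like:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /print-versions/
```

Each `User-agent` line names the searchbot a group of rules applies to (`*` means all bots), and each `Disallow` line lists a path that bot should stay out of.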

7. Do Target the Correct Keywords. Targeting the wrong keywords is a common mistake among new optimizers, and even veteran SEO professionals make it. Marketers select keywords that they think describe their website, but the average searcher does not think in those same terms. Picking the right keywords can make or break your SEO campaign. A first-class keyword suggestion aid, such as the Google search-based keyword tool, will help you find keywords that are appropriate for your site.

6. Do Include Long Tail Keywords. With millions of websites competing for short tail keywords, it can take more than six months to rank in the top 20 for a competitive keyword. This is where long tail keywords come in handy. Long tail keywords are more specific and can contain the name of a particular product, brand, or city. Ranking for long tail keywords is comparatively easier, and their conversion rate is better than that of short tail keywords. Also include your keywords in title tags.

5. Do Maintain a Uniform URL Structure. If your website is dynamic, you need to tidy the URL structure of its pages. This maintains uniformity and helps searchbots understand which page they are indexing. Maintaining the URL structure of a dynamic website is easy: blogging platforms like WordPress provide a permalinks option, and customized dynamic websites can use URL rewriting in the .htaccess file to the same effect.
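As a sketch of the .htaccess approach (the script name product.php and its id parameter are hypothetical), a rewrite rule can map a clean, uniform URL onto the underlying dynamic one:

```apache
# Requires Apache's mod_rewrite module
RewriteEngine On
# Serve /products/123/ from the dynamic script product.php?id=123
RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L,QSA]
```

Visitors and searchbots then see only the clean /products/123/ form, while the server still runs the dynamic script behind the scenes.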

4. Don’t Link to Low Quality Websites. Link building is a crucial aspect of search engine optimization. Search engines treat the number of incoming links to a website as an indication of its popularity and rank it accordingly. Many beginners fail to realize that it is links from authoritative, quality websites that matter, and they mistakenly link to low quality websites in pursuit of higher rankings. This tactic can damage the website’s credibility with search engines, and in some cases the website may get banned.

3. Do Perform Competitive Intelligence. Before starting your search engine optimization program, visit the competing websites in the top results. Research questions such as:

A. How many websites are competing for the same keyword?
B. How old are the websites in the top search engine results pages?
C. How many back links do the top ranking websites have?
D. What type of social media is used by the competing sites?

2. Do Take Advantage of Google Analytics. 2009 was the year web analytics gained momentum: Google Analytics introduced advanced metrics and intelligence-report features that revolutionized free analytics tools. Companies realized the benefits of using web analytics to extract the data relevant to them. Implement Google Analytics to analyze your data and build a 2010 plan to increase traffic and conversion rates.

1. Do Create Fresh Content. Search engines are famous for penalizing websites that publish duplicate content. With plagiarism on the rise and content-checking tools such as Copyscape widely available, marketers have become more cautious. Yahoo is considered among the harshest of the search engines with regard to this penalty. Add fresh content to your website to build visitor interest and credibility with the search engines.

These 10 simple do’s and don’ts can help you avoid potentially dangerous SEO mistakes, keep your site indexed, and boost its rankings.

Debbie A. Everson is the CEO of an experienced SEO consulting and search engine optimization agency serving over 2,000 small businesses. Learn about search engine optimization, paid search advertising, social media, and email marketing. Read my SEO Blog for hints and tips. Follow @Searchmar on Twitter.

14 Responses to “10 Do’s and Don’ts to Avoid SEO Mistakes”

    remove negative results says:

    Very well done on this short but excellent outline of basic SEO mistakes. A non-professional can easily understand it and work with it. It helps with online reputation management as well.

    Jose Damaso Ramon says:

    There are too many sites which are very eye-catching but very difficult to navigate.
    Thanks a lot for your tips. They are very clear and to the point.
    SEO is very important but very difficult too if we don’t have the knowledge to do the right things for our websites to be found by search spiders and would-be prospects alike.
    Thanks again for sharing your wisdom.

    I have seen hidden text in many websites. Thanks for sharing your knowledge.

    Tony says:

    We have now discontinued use of Google Analytics because it failed to provide accurate statistics for 3 of our websites… We’re talking 80–90% out based on the stats from the webserver’s log files.

    No solution was found and no help was provided by the GA team… I guess you really do get what you pay for.

    I would advise anyone who has access to buy a copy of WebLog Expert and use their server’s log files. Compare them to GA’s results and you’ll be amazed at how inaccurate they can be.

    You’ve made statements which are in error. Though you’re right that JavaScript menus are bad for SEO, putting them into an external file and calling that file from the header is the same for SEO as if the code had been inline. The advantage of an external file is caching, so it doesn’t have to be downloaded for every page of a site.

    Also, though I am NO fan of Flash, if it’s written correctly for both SEO and accessibility it can be read by the spiders, or at least by the Google spiders. The problem is in knowing how to hire a contractor who can create your Flash site this way, and most don’t give a damn, since they’re all about the visual wow factor. I suspect the developers who know how are more in demand and more costly to hire.

    I find it very humorous that you advise using Google Analytics. Google’s algorithm now penalizes websites that load slowly. If you were to create two versions of a website, one with Google Analytics and one without, and clock them, you’d know why I’m laughing. There are tracking scripts you can install on your own server which give you all the same types of stats as Google Analytics, but without the heavy load time and without the Flash animation.

    Peter says:

    Mate, well done, great advice. I wish more people followed at least the basic rules.

    satellite uplink communication says:

    Thanks for sharing this. You have mentioned good points from the SEO point of view.

    Debbie you said:

    “Implement a Robots.txt file only when you want to prevent unwanted web pages from being indexed.”

    That practice is outdated and inaccurate.

    Blocking Googlebot with robots.txt prevents it from crawling but does not suppress SERP listings based on 3rd-party signals.

    Google advises not to block crawlers from accessing your pages, including duplicated ones.

    Such practices can lead to a PageRank sink.

    If you do not want pages or files to be indexed, the appropriate solution is to implement meta robots tags or the X-Robots-Tag header with the directives “noindex,noarchive,nosnippet”. Never add the “nofollow” directive to these, otherwise you will create “dangling” pages, which leads to a leak of PageRank.
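    In concrete terms (the file name private-report.pdf is only illustrative), the page-level and header-level forms look like:

```html
<!-- Page-level: placed in the <head> of the page to be kept out of the index -->
<meta name="robots" content="noindex,noarchive,nosnippet">
```

```apache
# Header-level, for non-HTML files; requires Apache's mod_headers
<Files "private-report.pdf">
    Header set X-Robots-Tag "noindex,noarchive,nosnippet"
</Files>
```

    Unlike a robots.txt block, these let the crawler fetch the page and then tell it not to list the page in results.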

    kittu says:

    Hey John,

    I was just wondering: if using robots.txt is an outdated method, why would Google still advise its use? Read this post dated March 30, 2010:

    I am new to the “dangling” pages concept and really confused about the implementation of the robots.txt file. Could you explain further?

    “… if you are trying to block something out from robots.txt, often times we’ll still see that URL and keep a reference to it in our index. So it doesn’t necessarily save your crawl budget”

    I would suggest you check this interview with Matt Cutts by Eric Enge and the video with Matt Cutts.

    I think this should answer your questions.

    andrew says:

    Thanks for the great tips. I have been doing SEO work for friends’ websites of late, and info like this really helps.

    Thanks again.

    Darlene says:

    Thanks for your informative SEO article. I definitely agree with points 9 and 10.

    James Carry says:

    Hey, the information is good, but I want info on how to improve Google PR and Alexa ranking, because my Alexa ranking has been decreasing continuously for the last 2 months.

    Informative article about SEO mistakes. I have a website and will try to make it mistake-free. Thanks for sharing.
