
June 16, 2011

10 SEO Mistakes to Avoid – A SPN Exclusive Article

When you’re starting off in SEO there’s a lot of conflicting information about what to do and what not to do. There’s a constant battle between White Hat and Black Hat and which is more effective.

It’s pretty easy to get frustrated at how long it actually takes to achieve a decent ranking, and that frustration can lead to taking the wrong advice. I learned the SEO mistakes listed below by trial and error.

Here are some rules to follow if you DO NOT want your site to rank in search engines.

1. Build a Flash Only Website

The days of Flash only websites are probably behind us (thank God). Having said this, as a web developer it’s amazing how often I’m asked, ‘Does that include Flash?’

Search engines cannot read content embedded in Flash files, so you shouldn’t build whole websites in Flash. It’s perfectly all right to have a Flash feature box or slideshow, so long as the rest of your site is built with HTML.

2. Hide all Your Content in Images

This is another sin of the past, although you do still see it occasionally. If you embed your navigation and page copy in images, search engines will not be able to identify this content. Use text-based navigation and semantic mark-up instead.

3. Use Excessive JavaScript/AJAX

Search engines do not understand JavaScript/Ajax. If you use too much of it, particularly in your navigation, you’re preventing your content from being indexed. Use text-based HTML/CSS navigation instead.

4. Copy Someone Else’s Content

You’re not much of a writer and no one will ever know, right? I’m afraid not. Duplicate content is one of the biggest sins you can commit as far as search engines are concerned. Not only do you prevent your own site from ranking, but you also impact the site you copied the material from.

You should either write all content yourself or hire somebody else to do it for you.

Editor’s Note: For an alternative view read Jill Whalen’s article “There Is No Duplicate Content Penalty”

5. Stuff Your Content with Keywords

Including the same keyword phrase in the Page Title, the Heading, three times in the copy, bolded, in a link and in an image alt tag is an example of keyword stuffing.

This is like telling the search engine, ‘I’m trying to manipulate you into ranking me higher’. They won’t rank you higher, they’ll penalize you instead.

Include your keyword phrase, but do it naturally. Write primarily for humans then worry about search engines.
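There’s no published density threshold, but a rough self-check is easy to script. As an illustration only (the helper below is hypothetical, not a tool mentioned in the article), this Python sketch estimates what share of your copy a single phrase accounts for:

```python
import re

def keyword_density(text, phrase):
    """Estimate the share of the copy taken up by one keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count occurrences of the full phrase as a run of consecutive words.
    count = sum(
        1
        for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return count * len(phrase_words) / len(words)
```

If one phrase dominates the result, the copy probably reads as stuffed to humans too.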

Editor’s Note: For an alternative view read Ben Kemp’s article “Website Redesign Best Practices – Part 2”

6. Use Automated Directory Submission Software

There are loads of software programs to automate the directory submission process. Don’t use them.

First of all, directory submissions are not that valuable in terms of PageRank anymore. Secondly, you’ll end up with thousands of similar Titles and Descriptions that will mark you as a cheater as far as search engines are concerned.

Submit manually to well-established directories, write unique Titles and Descriptions and try to focus on local and niche directories.

7. Participate in Link Exchange Programs

You often see websites with a ‘Links’ or ‘Resources’ page containing a list of links a mile long. These frequently have no relevance to the content of the site and have been reciprocated on an identical page on the corresponding site.

There are a number of problems with this:

* Reciprocal links are less valuable than one way links.
* Links from pages containing hundreds of other links have little value.
* It looks like you are trying to cheat the search engines.
* It looks unprofessional to users.

Reciprocal links are fine in moderation. Try to swap links only with relevant organizations, and try to give and get links from inside paragraphs of related text in articles or blog posts.

8. Use the same Page Title and Meta Description Across Your Site

If you do this, you are basically saying that all your webpages are the same. Search engines index webpages and not websites; this is why you should have a unique, accurate Page Title and Meta Description for every page. Include your keywords in the Page Title in particular but don’t go crazy with the Meta Description.
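As a sketch of what ‘unique per page’ means in practice (the site name and helper below are made up for illustration, not taken from the article), a page template could stamp each page with its own tags:

```python
from html import escape

SITE_NAME = "Acme Accounting"  # hypothetical site name, for illustration only

def head_tags(page_title, description):
    """Build a unique <title> and meta description for a single page."""
    title = f"{escape(page_title)} | {SITE_NAME}"
    desc = escape(description)[:155]  # stay within a typical snippet length
    return (
        f"<title>{title}</title>\n"
        f'<meta name="description" content="{desc}">'
    )
```

Each page feeds in its own title and summary, so no two pages share the same tags.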

9. Focus on the Wrong Keywords

Often when companies hire an SEO they’ll say they want to rank number one in Google for a ‘short-tail’, high-competition keyword phrase. For instance, an accountant might want to rank for the term ‘accountant.’

I’m afraid it doesn’t really work like that. It takes a while for any site to get off the ground, which is why it is important to focus on ‘long-tail’ keyword phrases. This is achieved by adding modifiers to the original phrase, like ‘tax accountant’ or ‘tax accountant manchester.’

By focusing on less competitive phrases you can get instant traffic while setting yourself up to compete for more competitive phrases in the future.

10. Don’t Use SEF URLs

An SEF (Search Engine Friendly) URL is really a human-friendly URL: one that contains actual words rather than a string of numbers and symbols.



The first example contains words that relate to the content of the page as well as what category the page is in. The next example contains the article id which is of little use to anyone.
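Most modern CMSs can generate word-based URLs automatically. As an illustration (a hypothetical helper, not the author’s setup), a slug can be derived from a page title like this:

```python
import re

def slugify(title):
    """Turn a page title into a human-readable URL segment."""
    slug = title.lower()
    # Collapse anything that isn't a letter or digit into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")
```

The resulting segment carries the page’s keywords, unlike an opaque numeric id.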

David Dungan works as a web consultant for the Irish online marketing and SEO company Predict Insight.

He has experience in a range of areas including web development, search engine optimization and online marketing.

30 Responses to “10 SEO Mistakes to Avoid – A SPN Exclusive Article”

    Bob says:

    6. Use Automated Directory Submission Software. A: Submit manually to well-established directories, write unique Titles and Descriptions and try to focus on local and niche directories.

    First of all, well-established directories won’t accept your site unless you pay. Second, most directories want the actual site title or domain, so writing unique titles flies in the face of most terms of use. Writing thousands of unique descriptions is a waste of time, as most listings serve as a back link only the search engine will see. Some directories give lip service to unique descriptions and then ask for the meta description for the details page anyway. Best advice: if you use an automated directory submitter, take the time to write a decent description and target the proper category.

    Mark says:

    So with regard to point number 4, are you saying that duplicating content among multiple domain names that provide the same service would cause a duplicate content penalty by the search engines? I’ve had multiple domain names with similar spellings that provide the same service for several years. On some I forwarded traffic from the duplicate domains to the one with the actual content. On others I set up a host for each domain, posted the same content on all of them, and have not seen any difference in the way the search engines treat them.

    Joris says:

    10. Don’t Use SEF URLs

    Why are SEF URLs bad?
    SEF URL:

    In this example I understand that it’s not the prettiest url, but if you drop the categories in between I think it’s better than using ?page_id=330

    Brian Kenyon says:

    Item 10 is inaccurate. Search engines do like SEF URLs. Not only do they contain anchor-text keywords that help identify what the page is about (and, taken together, give the search engines an overall idea of the topics covered by the website), but they also help users know what is behind the link they are about to click.

    Kendo says:

    Re: 5. Stuff Your Content with Keywords (Don’t)

    It’s impossible to create a web page about any topic and not do this!

    If Google penalizes this, then they are purposefully poisoning healthy search results to feed their ad machinery.

    Vuk Miler says:

    The flash comment is inaccurate. Google has, for a while now, been able to see inside Flash.

    I would agree with the first two comments. I don’t think this article has been properly thought out or researched. It’s just a list of re-hashed and well-trodden ideas. Also, the manner in which it is written (how not to achieve something) is lazy and just a little bit confusing. Very disappointing.

    Vuk Miller is correct, Google can index Flash content and has been able to for a while now, so the article is wrong when it states “Search engines cannot read content embedded in Flash files”.

    However, the SEO freedom you get with HTML content hasn’t yet been replicated with Flash. Therefore, you are always in a much stronger position when you’re optimising an HTML website than when you’re tackling a Flash site.

    Jim says:

    You say don’t use SEF URLs, but you don’t say why.

    Alex says:

    Isn’t #10 backwards? Why would you want to use a Non-SEF URL when you say that it is of no use to anyone?

    nanci says:

    1. Build a Flash Only Website

    Could you please correct this sentence to “Build a Flash-Only Website If You’re Not Able To Make It Rank”. First things first, read this
    It’s 2008 stuff. 3 years ago.
    Second thing, this is my job. I do full-Flash websites and make them rank unbelievably high. Bad ranking has got nothing to do with Flash. A full-Flash website with alternate content for devices that can’t read Flash (iPhone, iPad, etc…) is the perfect website. An HTML website, even if full of great content, with Flash boxes is just gonna be a bad user experience for those who can’t see Flash.
    I repeat, this is my job and I’m not happy to read this kind of guidelines, mostly because also potential customers can read this, so you’re saying something false and bad for my business.

    SEO Bedford says:

    Even though these are very basic mistakes, you would be surprised by the number of websites that commit them.

    I think Alex is correct, point #10 is backwards.

    Perhaps we should add #11 – always proofread your content before you add it, as mistakes will be picked up by a great many people 😉

    alex says:

    Hello, I find it great that there was a link to another opinion in the article. I definitely prefer the smoother way: build the site for the customer first, then for marketing – in a way that ensures they will find the site, of course. But I have to say it again: it’s nice to look at both sides of the coin. Sunny regards from the Canary Islands.

    Jim Welch says:

    I have a question about point number 6, don’t use automated directory submission software. You state that this gives you the same title and description across all the directories the site is submitted to, and to write different titles and descriptions for each directory. I must disagree. I have hand-submitted to more than a thousand directories, and the title MUST be the same as the one on the web site. Second, it would be impossible to re-write the description for each directory. Directories vary in the number of characters they accept for the description, from 100 to more than a thousand. I have written descriptions for each common size, and copy and paste them into the appropriate fields. By doing this, I am on page 1 of Google for one of my keywords, and page 2 for the other. Is hand-submitting in the manner I outlined different than auto-submission? Is there a better way to hand-submit to directories?

    Humzee says:

    I agree with Jim on this one. I’m not familiar with any software that does this for you but if the software is too spammy so is doing the same thing by hand. I think what we should take away from this is that those thousands of links used to work like a charm but are now considered spam…or am I wrong Jim? Thanks.

    Thailand says:

    Re keyword stuffing: why haven’t sites like the Yellow Pages been penalised? Their pages are full of the words “road”, “avenue”, “street” and “phone”, yet they still seem to rank well in the SERPs when you’re looking for a local business.

    As for duplicate content: every day the news is spread across thousands of websites worldwide, and they all rank fine. As for duplicate content from article farms, most of it is such low-end, generic rubbish that you wouldn’t want it on your site, and nobody really appreciates Google sending them to such rubbish content.

    Empress Jelaine says:

    I think you guys didn’t understand “Don’t Use SEF URLs”.

    It means that if you don’t use SEF URLs, it’s a mistake. That’s what the author is saying.

    tayyab says:

    very nice information.. thanks for sharing these ideas

    Blessing says:

    Thanks for this info, I have gotten something out of it and I am happy I came. As a newbie in blogging I want to learn from you guys who have gone ahead of me. Thank you!

    O. Bachmann says:

    @Joris, Brian Kenyon, Jim, Alex, & Darren Jamieson: #10 is correct, that is unless he’s edited the post by the time I read it. He’s saying if you DON’T want to rank well in search engines, then DON’T use SEF URLs.

    Then he goes on to explain that (SEF URL):

    is better than (non-SEF URL):

    From his post:
    “The first example contains words that relate to the content of the page as well as what category the page is in. The next example contains the article id which is of little use to anyone.”

    I apologize if this is a duplicate comment.

    Ahmed says:

    People are misreading No. 10, “Don’t Use SEF URLs”. If you have realised, these are all things you should NOT do, so it’s like a double negative: “Don’t NOT use SEF URLs” – in other words, do use SEF URLs. Really the title should be “Use Non-Friendly URLs”. Anyway, a decent article, but I think with points 5–7 you can get away with those techniques if you are careful.

    The more things change, the more they stay the same. Writing for Google is like writing a letter to your girlfriend. Express yourself to her. Tell her WHERE you’re at, WHAT you’re doing and HOW you’re doing it and you have the basics of writing SEF content. Don’t try to manipulate her but be sincere and make sense.

    As for the yellow pages not being penalised (Thailand). I think she has been penalised. I submitted over 30 Telephone Directories during 2010. The Key words being Telephone Directory, Phone Book, Contact Details for – any given area. I reached No.1 with 98% of them on Google, Yahoo and Bing. For example: Look for:
    KZN South Coast Telephone Directory
    Port Shepstone
    Durban Central
    Cape Town Central
    and so on …

    Then there’s this thing that using free technologies is bad for a website because you cannot optimize images and utilise meta tags etc. I have built 4 websites using free technologies and depended on text and content placement to manipulate these websites into the top 10 on Google, Yahoo and Bing. I slaughtered Bing and Yahoo and reached the top 10 on Google with more than 20 phrases relevant to each site’s location, services and products.

    It’s all in the writing and it’s all in the content. There are so many opinions out there. Trial and error is the only way to really develop your own.

    The last one seems incorrect to me. What’s wrong with using SEF or human-friendly URLs? In my opinion that should be the way to go.

    Tim Barker says:

    I think the author was trying to outline a few areas which could make a site more difficult to rank. I actually agree with David Dungan: these are all good points. I am also surprised that the general consensus seems to be against him. It looks like there should be a #11: relax after reading this article.

    Furthermore, we all know that a Flash website can rank #1 and that Google is starting to read them. However, it has to be agreed that it is easier to get a site ranked without Flash, and there are many other problems associated with Flash too.

    He is right about point 6 too; surely it is obvious that a million links from “created directories” are not going to be regarded as valuable. Google aren’t that daft. Although I use auto directory submitters, I only use them in a limited way, only for the highest PR directories, and I vary the titles and descriptions.

    Perhaps point 10 is the wrong way round, but I understand what he means. Which is more user friendly, a load of numbers or a name? Therefore, Google should look at it in the same way.

    I assume the point the author is trying to make is that there are some things we can avoid if we want to make our SEO life easier.

    Correct me if I am wrong, but isn’t the idea of a website, and Google’s stated mission, to try and make a good user experience? It seems that sometimes there is a battle between us and them.

    mohamed says:

    I don’t understand point number 10. Actually, I took a course that said it is a good idea to use an SEF (Search Engine Friendly) URL for SEO, and you said not to use them…
    Can you explain why we can’t use them?

    Thanks Dan,

    it’s handy to have backup for the advice that I give to clients on SEO.

    Sadly far too many website builders don’t understand that a pretty website is only 50% of building a successful website.

    Thanks again,

    Houston says:

    It is so hard to determine what hurts SEO anymore. Google has so many algorithms now that some things may help one day and hurt the next. Many problems can be overcome by other strategies…

    Chuck says:

    Guys, remember the theme of this site. These are mistakes people make in SEO. #10 is a mistake people make, as in people “Don’t Use SEF URLs.”

    The author is saying to avoid this mistake. It’s basically a double-negative, urging you to USE SEF URLs.

    Informative article, David. Although all the points mentioned are very important and shouldn’t be neglected, I think numbers 1, 3 and 10 are the most important for a website’s structure. Thanks for sharing this piece.
