
August 12, 2009

SEO for Large Websites Part III

In SEO for Large Websites Part I & Part II, we discussed the importance of starting with a goal in mind and tying it to competitive keyword thresholds, gave an overview of keyword research and selection, covered the value of establishing preferred landing pages, and showed how to reinforce keyword categories using an array of SEO tactics.

Now, we will conclude this series with the last few areas pertaining to optimizing large websites, specifically:

  • Strengthening Internal Linking
  • Using Succinct Titles and Meta Descriptions
  • Managing Site Wide Duplicate Content (Custom templates, Noindex, Follow, 301 Redirects, XML Sitemaps).

Strengthening Internal Linking – The purpose of internal linking is to reinforce specific keywords, topics or stemmed arrays of relevant information to search engines. Each link on your page tells search engines “this page is important,” so the first thing you need to do is make sure you are sending the right message.

Ironically, many webmasters omit internal links that could connect contextual content or help establish a pecking order for important content. There are plenty of posts that cover internal linking from a tactical, how-to perspective, but for the sake of context we will stick to the premise of why it is important.

From the standpoint of hierarchy, the page that is the destination for the most internal links from other pages within a website earns a higher search engine position, as search engines associate links with authority (both from other sites and within a website).

The larger a website becomes, the more important it is to distinguish each page from the others so that it can (1) develop its own authority and (2) rank for a specific array of keywords. The less work you do (for example, sharing one template instead of creating a series of topical templates to facilitate key landing pages), the less likely those pages will stand out from the others.

The idea is this: consider your trophy keywords; the landing pages for those keywords should have the highest concentration of internal links pointing to them. To accomplish this, find the first occurrence of a keyword within the body content of a page and use it to link to the page you wish to emphasize as the champion page for that keyword. Do this throughout the entire site to really see the beauty of this technique, or implement a script or plugin to accomplish it with ease.
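The first-occurrence linking step above can be scripted. Here is a minimal Python sketch of the idea; the keyword-to-URL map and the URLs in it are purely illustrative, and a production version would need to avoid rewriting text that already sits inside HTML tags or existing links.

```python
import re

# Hypothetical keyword-to-landing-page map; names and URLs are illustrative.
KEYWORD_TARGETS = {
    "designer shoes": "/products/shoes/",
    "leather jackets": "/products/jackets/",
}

def link_first_occurrence(body_text, targets=KEYWORD_TARGETS):
    """Wrap only the first occurrence of each keyword in an internal link,
    so each page sends one clear signal to its champion landing page."""
    for keyword, url in targets.items():
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        body_text = pattern.sub(
            lambda m: '<a href="%s">%s</a>' % (url, m.group(0)),
            body_text,
            count=1,  # first occurrence only; later mentions stay plain text
        )
    return body_text
```

Running this over plain body copy links only the first mention, which keeps the signal concentrated on one target page per keyword.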

It is important to seed these keywords or use keyword co-occurrence whenever possible to reinforce the internal linking effect to your top level landing pages.

Say, for example, you have 1,000 pages on your website and 10 landing pages you want to emphasize for your top-level keywords and key phrases. Depending on the keyword threshold and how competitive it is, you may wish to add editorial content to support each main landing page.

This does two things: (1) it helps flag your website as a relevant match, over and above, say, a site that uses a general shopping cart with cookie-cutter descriptions, and (2) it provides a platform of leverage for internal links and keyword stemming.

There are two solid ways to produce rankings: (1) content and (2) links. Both have advantages, but the effect is even more prominent when you use them in tandem to accomplish a series of rankings for competitive keywords.

So, with 1,000 pages to work with, you could potentially use 100 pages per landing page to support the top of the hierarchy (the silo/subfolder) and produce buoyancy for your pages.

As a website gains more pages, it gains more PageRank; not the kind in the toolbar, but actual on-page authority. Internal linking (linking to pages with relevant keywords or key phrases) is your first line of offense and defense in laying a foundation that will provide stability and rankings as time progresses.

Each page only counts if it's indexed, and duplication is not rewarded. Using supporting pages and internal links keeps your content relevant to search engines (since they can determine the concentration of ranking factor) and alleviates the need to place all of your ranking factor on one page alone.

The way to leverage on-page internal links is to find the other pages deemed significant and provide a link to the target/preferred landing page, as close to the beginning of the body text as possible. Not that sidebar or stand-alone footer links don't pass value (they pass a minute amount), but based on block segment analysis, keywords and links in the body text carry more weight for SERP (search engine result page) positioning.

Out of those 100 supporting pages, if one page mentions 5 of the keywords from your 10 primary landing pages, then build an internal link from that page's content to each of the 5 corresponding landing pages.

The idea is to map as many keywords to as many pages as possible, but use the same target page for a given keyword or key phrase (so you don't diffuse ranking factor). Once each landing page is sufficiently reinforced, its dependency on off-page ranking factor (links from other sites) is diminished, and it will (a) get or stay indexed, as well as (b) be able to pass ranking factor along by consolidating link flow to other pages (by way of the links that leave it).

Using Succinct Titles and Meta Descriptions – Your title tag is the most important element in search engine optimization; you can rank on an “exact match shingle occurrence” in a title tag alone. If the title is then reinforced by a supportive occurrence of the keyword (or a plural or synonym variation) in the H1 (header tag), in the links on the page, and in links to that page, it creates a thorough degree of relevance for that page in the eyes of search engines.

Top level categories should be succinct and descriptive. For example, “Brand-X Designer Shoes: Shoe Collection from Company-Y” would be ideal if the company sold shoes or a collection from that designer.

The idea here is, if you have a main category page that feeds subsequent listings, use the most competitive phrase in the title and reinforce it with the meta description. The description could state “Company-Y Provides Designer Shoes by Brand-X as well as Model A, B and C”; this way the meta description reinforces the title. This type of relevant redundancy works well as long as it is not abused.

Never stuff titles with more than 6-8 words and keep your descriptions to 12 words or less whenever possible.
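Putting the title and description guidance together, the head of such a category page might look like the following sketch; the brand and company names simply follow the illustrative examples above.

```html
<head>
  <!-- Title kept to roughly 6-8 words; "Brand-X" and "Company-Y"
       are illustrative placeholders from the example in the text. -->
  <title>Brand-X Designer Shoes: Shoe Collection from Company-Y</title>
  <meta name="description"
        content="Company-Y provides designer shoes by Brand-X as well as Model A, B and C.">
</head>
```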

In addition, if your page has 500 words of unique text, a header (H1) tag that matches the first word in your title (your main keyword), a relevant description, and internal links pointed to it, the only missing ingredient is a few deep links from other sites to cement its position above the large, orphaned pages of competitors selling the same wares.

Managing Site Wide Duplicate Content – I mentioned in Part I that search engines are not fans of duplicate content; the less you replicate across a site, the more distinct each page's signature becomes when search engines attempt to equate relevance.

So, is having the same 100 links on every page across every category really going to contribute to a higher relevance or quality score? What about having the same meta title on every page in a category of a large e-commerce site (as if search engines will not notice)? It is possible to SEO your CMS (content management system) to fine-tune relevance and eliminate duplication with a few custom hacks/settings.

There is one thing to be said about duplication and large sites: avoid it as much as possible. That means creating more templates or more content and making each page unique. True, domain authority can push through a great deal of duplication and keep pages in the index, but if rankings are the objective, then think of each page as an island that needs its own ecosystem to survive in the SERPs.

So, to provide methods of sustenance, you can employ tactics such as custom templates, a noindex, follow tag in the page's meta data, 301 redirects, or XML sitemaps to provide irrigation to pages starving from link attrition.
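The noindex, follow tag mentioned above is a one-line addition to a page's head. It asks search engines to keep a thin or duplicate page out of the index while still crawling its links, so the page can continue to pass internal link flow:

```html
<!-- Placed in the <head> of a thin or duplicate page: keeps the page
     out of the index while still letting its links be followed. -->
<meta name="robots" content="noindex, follow">
```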

Custom templates – If you have a large site, such as an e-commerce site, try to create themed categories in which each logically supports the next.

*This is your top level category, hence anything linked from it will rank well on the merit of the page strength funneled to it from the main site. The main site navigation would link to it with the keyword “shoes”. That way, if you offered shoes, watches, jackets, etc., each would have its own main landing page / top level category.

The top level category should have the fewest links leaving it and the most inbound links, both from other pages internally and from other sites.

Then, by adding brand modifiers or descriptors in the next tier, you gain allinurl relevance, meaning that your URL (uniform resource locator), a.k.a. web address, reflects naming conventions that topically reinforce what that page's contents are.

Combined with the keyword/naming conventions used in the title and description, this is a winning combination for relevance that borders on redundancy, but really drives the point home to both users and search engines alike.

If you wanted to further stem that tier, you could add, for example, mysite/products/shoes/black-highheel-pumps-size6.html

You also have the option to truncate the .html extension, based on mod_rewrite preferences (rewrite rules configured at the server level). The point is, the more information you reinforce through consistent naming, the easier it is for search engines to sort your pages and assign a relevance score.
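Truncating the .html extension can be done with a small Apache mod_rewrite sketch like the one below (assuming mod_rewrite is enabled and .htaccess overrides are allowed on your server); it internally serves the matching .html file for any extensionless URL:

```apache
# Minimal .htaccess sketch: serve /products/shoes/black-highheel-pumps-size6
# from black-highheel-pumps-size6.html without exposing the extension.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)$ $1.html [L]
```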

You could also take it a step further by using a category-specific template that refers only to the top level category in the site's secondary navigation (usually the text links in the left sidebar) to provide additional shopping options to users.

Then, instead of having the template bleed ranking factor by linking to everything (like the main page of the site does), consolidate links to only the categories that are part of the same product type.

For example, the primary navigation on the internal shoes pages would be:

Shoes | Pumps | Heels | Flats

Don’t forget to use breadcrumb navigation as well to aid user awareness. Then, in the footer (using footer links) under each product, you could position the other brands of shoes to reinforce internal links, or even add links to other categories such as

Jackets | Socks | Hats – to provide a robust user experience…
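A breadcrumb trail for a product page in this structure might look like the sketch below; the category names and URLs are illustrative, and each level links back up the silo to reinforce the hierarchy:

```html
<!-- Breadcrumb on a deep product page; every ancestor links upward. -->
<div class="breadcrumb">
  <a href="/">Home</a> &raquo;
  <a href="/products/shoes/">Shoes</a> &raquo;
  <a href="/products/shoes/pumps/">Pumps</a> &raquo;
  Black High-Heel Pumps
</div>
```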

The idea here is to create self-contained segments that reinforce topicality, then use links from each category to push link flow deeper into the site where it can become buoyant over time. Using a series of rotating .php includes with pre-assigned related links is one alternative you can use to avoid duplicate content, as is drawing from a series of 2-3 different meta descriptions or opening paragraphs.
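One way to implement that rotation (sketched here in Python rather than PHP, with purely illustrative copy) is to pick a variant deterministically from the page URL, so each page always serves the same description to crawlers while neighbouring pages cycle through different copy:

```python
import hashlib

# Illustrative pool; in practice these might be .php include files or
# database rows of pre-written related-link blocks and meta descriptions.
META_VARIANTS = [
    "Shop our designer shoe collection by brand, size and style.",
    "Browse designer shoes with free shipping on every order.",
    "Find pumps, heels and flats from top shoe designers.",
]

def pick_variant(page_url, variants=META_VARIANTS):
    """Hash the URL to choose one variant per page: stable for repeat
    crawls of the same page, varied across sibling pages."""
    digest = hashlib.md5(page_url.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the choice is a pure function of the URL, a crawler never sees the same page flip-flop between descriptions, which random rotation would cause.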

Nobody said getting deep pages indexed was easy, and many simply rely on the brute-force tactic of links alone; yet 60% of the ranking factor comes from on-page SEO and on-page continuity. If you understand that premise, you can get pages indexed and ranking with just a few links each from the right sources and bury your competition.

Sitemaps – One of the best uses of sitemaps is to create a sitemap of your sitemaps and then link them from the key pages that pass the most link flow.

If you had a top level category called shoes (using the example above), then the sitemap linked from that top level category should be a sitemap containing all of the pages related to shoes. Each brand, subcategory, etc. should be accessible from that sitemap (XML or HTML).

As a result, those pages are food for search engine spiders, and as long as they link back to (a) the top level category and (b) the homepage, the cycle is complete. The idea is to have no page more than 4 folders or clicks away from the homepage; the more specific you are with your content, the better it can rank on its own (making the homepage less significant to its ranking factor).

Wikipedia applies this for hundreds of thousands of keywords and ranks in the top 2 results for virtually any keyword from any topic as a result. If you only apply 25% of their method, you can do extremely well in contrast to someone simply using a flat site architecture and relying on navigation alone to push ranking factor into thousands of “me too” shopping cart / product pages or hum drum blog posts or supporting articles.

I will have to save the value of using 301 redirects for the video follow-up to this post, since each topical point here leads to 10 more granular tactics that could be implemented; there is enough material for an e-book or webinar.

I hope this has provided you with a few ideas or concepts for implementing a self-supporting site architecture. For every tip provided, there are 10 more, such as leveraging tags, canonicalization issues, etc. So, this is a wrap for this 3-part post.

Granted, there are multiple intricacies and layers to every metric discussed; each could be taken from the level of minute detail to a thesis-level dissertation.

However, the takeaway here is: if you apply principles of optimization across a broad array of metrics, you increase the likelihood of attaining a higher relevance score for each page and developing a broader range of authority for your domain.

Optimization is meant to increase search engine exposure and positioning, yet positioning only matters if the content is geared for the user to take action.

Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of Seo Design Solutions (an SEO company). He has been actively involved in internet marketing since 1995 and brings a wealth of collective experience and fresh marketing strategies to individuals involved in online business.