April 2, 2009
Bruce started by posing a quick question to the audience about their SEO budgets. Most attendees at this session have a budget of USD 4,000 or less for SEO this year. [That’s shocking!]
There are things you can consistently do that will help you perform well in the search engines, says Bruce. The basics are the same as always but redefined. The data points are moving constantly. Bruce sees a lot more competition every week for SEO. Google has changed their algorithm 450 times in a year [I noticed that this nugget prompted mass tweets including one from me]. Google Universal Search has changed everything, Bruce says.
Bruce showed the Search Engine Relationship Chart from around 2003 and then today’s version of the same chart. It showed interdependencies between engines and paid results. The chart is now a LOT simpler. Bruce made the point that the chart has been excellent link bait for his company over the years.
SEO has evolved. Before Google, links weren’t important; now they’re a vital part of the algorithm. If Google finds link spam, it automatically downplays the links. This is the type of thing that can’t be reverse engineered. The spam filter is highly intelligent. Bruce is a big believer in knowledge transfer.
SEO knowledge is paramount. Bruce says web forums are the biggest source of misinformation on the web [another mass Tweetathon follows this comment]. The TV show HOUSE is SEO, he says. The patient is dying at the start of the show, and the staff experiment until the patient is healed. That’s SEO, says Bruce.
Key factors of SEO:
1) On-Page Factors – tags (title, description, keywords, headings), body copy, along with a clear subject-matter focus.
2) Expertness – inbound links, outbound links, internal links. Focus on controlling PageRank movement.
3) Copywriting – this is structural content (sentences versus bulleted lists), Flesch-Kincaid reading levels (complexity), clarification words.
4) Engagement Objects – video, MP3s, images, maps, books, news, blogs, etc. – everything that appears in Google Universal Search results, but on your own pages.
5) Architecture / Siloing – you need to theme-align your content and internal link structures by the search query used. If your alignment matches, then you’re seen as more of an expert for that query. Take the content on your web site and rework it to adapt to the way people search. That’s siloing.
6) Spidering – a slow server discourages spiders; make your sitemaps and XML files crawlable. You can have 500 XML sitemaps and daisy-chain them, but you also need HTML spider-friendly sitemaps for crawling purposes.
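The “daisy chain” Bruce mentions is typically done with a sitemap index file that points to the individual XML sitemaps, per the sitemaps.org protocol. A minimal sketch (the URLs below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index file: each <sitemap> entry points to one child sitemap -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap is capped at 50,000 URLs, which is why large sites end up chaining many of them together under one index.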
Google has been spidering pages for Universal Search for some time, so add video, add images and name them to make them indexable. If Google has 200 variables in the algorithm and has tweaked it since Universal Search, that’s an important connection. Alter your SEO tactics accordingly.
Expertness via PageRank
PageRank is a representation of your site’s reputation on the internet. It does NOT guarantee rankings.
PR(A) = (1 − d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where T1…Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is the damping factor.
Scale goes from 0 to 10
The toolbar value means very little – it’s a bad idea to depend on it.
Lower PageRank pages may have more value to you if they have fewer outbound links on them. YOU can control the PageRank of your site using internal linking strategy, provided you understand how PageRank works.
Points accumulated for your site determine your PR on a 0-10 scale
Points are by theme and even by anchor text.
If the top number (points allocated to higher ranking sites) changes, your toolbar PageRank will change. And it means nothing.
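The formula above can be sketched as a simple iteration. This is a minimal, hypothetical example (the three-page site and the damping factor d=0.85 are assumptions, not from the talk) showing Bruce’s point that internal link structure controls where PageRank flows:

```python
# A minimal sketch of the classic PageRank iteration, using the
# non-normalized form PR(A) = (1-d) + d * sum(PR(T)/C(T)).

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    pr = {p: 1.0 for p in pages}  # start every page at the same value
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum PR(T)/C(T) over every page T that links to p
            inbound = sum(pr[q] / len(links[q])
                          for q, outs in links.items() if p in outs)
            new[p] = (1 - d) + d * inbound
        pr = new
    return pr

# Hypothetical three-page site: home links to both subpages,
# and each subpage links only back to home.
site = {
    "home": ["products", "about"],
    "products": ["home"],
    "about": ["home"],
}
ranks = pagerank(site)
```

Here "home" ends up with the highest score because it receives the full PageRank of both subpages, while each subpage receives only half of home's. Rearranging the internal links would shift that flow, which is the lever siloing exploits.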
Competitive in Algorithm
If you have 500 points but 180 points are for links pointing to you using your target keyword, your true point value is 180 points and the rest are meaningless. Your competitor might have 360 points in total but 200 points for keyword links; therefore, their point value is higher and more meaningful. They have better PageRank. Therefore you want fewer sites with a higher PageRank to link to you rather than many sites with a lower PageRank. The point is that the quality of links matters, NOT the number of links.
If you wish to rank well for search queries related to academic research, you need to build using architecture suited to that (e.g. search results are biased depending on search query). Research type queries generally result in a different set of SERPs than say people typing in shopping-related terms. Use this information to lead your site architecture and design decisions.
The overall themes of your site will also impact your position in the SERPs, based on clustering technology.
Behavior and Intent Results
When you conduct a search on Google you’ll often see different results at different times. This is because search results are now biased based on your search history. See “more results” for a better understanding. So for example, coffee drinkers and programmers would be shown different results for the search query “java”. They are also looking at the intent of the query and the geographical location (IP) of the searcher. So rankings are no longer an accurate representation of your SEO performance.
GsiteCrawler for creating sitemaps
SEODigger is a tool to research competitor’s SEO terms
Compete.com is a site overview tool
iSpionage is a tool to research competitor’s PPC terms
PixelSilk is an SEO-friendly Content Management System (CMS)
Enquisite provides search analytics
Altruik provides repetitive instrumentation for automated SEO tactics
SEOToolSet is Bruce Clay’s own set of SEO tools developed in-house