
August 18, 2008

Search Engine Optimization Concepts

Sometimes another form of language and imagery is required to communicate vast ideas for common consumption. This is one such instance, where an array of algorithmic functions is given a conceptual vehicle to map their role in creating a top 10 ranking.

Since the universe is a great model, we will borrow concepts from planetary forces and universal phenomena such as gravity and orbits, along with the idea of the internet as a giant nebula of computers sharing virtual space and performing data retrieval. Bear in mind, this is only an interpretation.

If you visualize your website as a dot and each keyword as its own orbiting sphere of concentric influence, you must create a series of eventful transactions (creating content over time, building links, maintaining a supportive format for internal linking) that classify and elevate your website as a candidate for relevance at the center of that keyword's sphere of influence.

By viewing each keyword as a cumulative objective (like peeling away the layers of an onion), once you reach the center your site radiates a beacon in all directions whenever a broad match or more general search is conducted with those keywords as queries.

When you first start a website, the site is empty, so your relevance is also at zero. As you continue to add content (like filling a glass with liquid), the site begins to take on a particular persona based on the content it contains.

As a result, your site creates its own orbit and gravitational pull, where the content and subjects traversed within the context of your pages become beacons that attract similar orbiting keywords, concepts and semantic variations.

The analogies presented here elaborate on the function of search engine algorithms, which essentially consolidate each site into a type of conical tube that best describes the subject matter it contains. Then, based on the vacuum created when someone actually invokes those qualities (through a search), the engine bridges the gap, projects through all of the other spheres of influence (across the web), extracts the most relevant pieces of data and reformats them for retrieval.

To appear as the most relevant result, your site must have enough supporting relevance from external clusters (other sites) that contain the same type of data cloud (a collective summary of a site's content). That cloud can change forms (like a liquid to a gas, a gas to a solid) and act as a homing signal, creating continuity within its own orbit (on the micro level) while serving as a piece of the puzzle for the main theme in the index (a part of the whole, like a planet in a solar system that belongs to a galaxy).

The job of search engines is to index, retrieve and create order from all of the orbits created by each website's signature as it occupies the cloud of online space (data shared across multiple servers), gauge a metric, and then determine how useful that metric is to a query, which, like a vacuum, seeks the most relevant result.

Think of a search as pure potential that does not exist until executed. On assembly, its purpose is to seek out the most likely orbit (website), using the nebulous data cloud (the web) to bridge the gap, traversing aggregate links and other sites (assessing their relevance scores) until it finds the most suitable supporting environment.

If you use this visual map as a blueprint, then you understand that rankings result from how strong the broadcast signal is. Signal strength in this capacity is based on relevance, continuity and popularity. To produce a ranking of such magnitude (for a competitive keyword), you must move from the outermost bounds of the keyword's sphere of influence into its center to attain the top ranking result.

The stages involved are research, planning, execution, testing, refinement, collaboration and strategy.

Excluding any of these steps does not mean you cannot still achieve top 10 placement for your website; it is just that if reproducing the phenomenon matters to you, then the proper chronology and strategy behind the tactics should concern you. For example, topical relevance, site synergy / persona, authority and orderliness all impact placement for your website.

1) Research – Research the appropriate keywords to develop the appropriate gravity in your own site to attract search engine crawlers (which index the content and assign the site an approximate relevance score).
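
As a rough illustration of the research step, the sketch below counts how often candidate keyword phrases appear in a page's text. The function name, sample text and phrase list are all hypothetical, the matching is naive substring matching, and real keyword research also weighs search volume and competition, which are not modeled here:

```python
import re
from collections import Counter

def keyword_density(text, phrases):
    """Count occurrences of each candidate phrase in page text.

    Returns {phrase: (count, count / total_words)}. Matching is
    naive substring matching, so short phrases can over-count.
    """
    normalized = re.sub(r"[^a-z0-9\s]", " ", text.lower())
    counts = Counter()
    for phrase in phrases:
        counts[phrase] = normalized.count(phrase.lower())
    total_words = len(normalized.split())
    return {p: (c, round(c / max(total_words, 1), 3)) for p, c in counts.items()}

page = "SEO concepts explain search engine optimization. SEO builds relevance over time."
print(keyword_density(page, ["seo", "relevance", "ranking"]))
```

The density figure is only a crude proxy for the "gravity" of a page around a phrase; a phrase that never appears cannot attract anything.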

2) Planning – After finding your semantic base, create topical fields of information (multiple keyword-rich pages) within the site, using a content management system or supporting site architecture. If search engines cannot retrieve the data (because of crawling errors from bots, poor site architecture or otherwise), your site is automatically excluded from participating with the other forms of life contained in digital space (the web).
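
One concrete check on the crawlability point: Python's standard library can parse a robots.txt file and tell you whether a crawler may fetch a given path. The robots.txt content and URLs below are made up for illustration, and the file is parsed offline to avoid a network fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for an example site.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)  # parse() accepts an iterable of lines

# Paths blocked here are invisible to compliant crawlers, and therefore
# excluded from the index no matter how strong the content is.
print(parser.can_fetch("Googlebot", "/articles/seo-concepts.html"))  # True
print(parser.can_fetch("Googlebot", "/private/drafts.html"))         # False
```

A page that a bot cannot reach contributes nothing to the site's orbit, which is why architecture checks belong in the planning stage rather than after launch.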

3) Execution – Build links, either through internal linking (if you have an authority site, that is enough) or through external links from other sites (data clusters with relevance) that have similar signatures sharing your site's topical relevance.

The closer the relevance between factors such as (a) whether the site is from the same industry or niche, (b) whether there is related content between the sites, (c) which keywords are used to link to and from the two sites (the vacuum) and (d) where each site resides in the overall relevance model for each of the overlapping keywords they share, the greater the target site's opportunity to receive a jolt of authority through osmosis and synergistic infusion. The bottom line is the quality of the links, but topical relevance adds even more weight to the orbit of the site receiving the link.

4) Testing – Test the results: use a search engine to determine how close to the center of your keywords your site has become. If, for example, you targeted 10 keywords, dedicated ample time, effort and energy to building a coherent series of pages, created strong internal linking, and then found at least 5 link sources for each page with a wide array of IP diversity (in other words, not from the same series of sites), then you should have created enough orbital relevance, and made a strong enough impression on the ethers in the data cloud (your site's algorithmic counterpart), to appear as a relevant result.
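
The testing step can be mechanized in a small way: given a list of result URLs (hard-coded here; in practice it would come from a results page or an API you are permitted to use), report the position your domain occupies:

```python
def rank_position(results, domain):
    """Return the 1-based position of `domain` in an ordered results list,
    or None if the domain does not appear at all."""
    for position, url in enumerate(results, start=1):
        if domain in url:
            return position
    return None

# A simulated results list; real rank tracking would fetch this data.
serp = [
    "https://competitor-one.example/seo-guide",
    "https://your-site.example/seo-concepts",
    "https://competitor-two.example/rankings",
]
print(rank_position(serp, "your-site.example"))  # 2
```

Tracking this position per keyword over time tells you whether each sphere of influence is actually contracting toward your site.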

5) Refinement – If your website did not make the grade (is not ranking yet), you can always (1) wait for all of the factors to settle a bit more and then re-evaluate the keyword saturation, (2) build more content and shore up your main subject or subjects, or (3) look for other sites with authority that can augment your site's reputation online (in the nebulous data cloud known as the web).

6) Collaboration – Moving a website closer to the center of a series of keywords requires understanding. Each time you add a page or a link, it changes (1) the site's internal topical relevance and orbit / signature and (2) how other sites and keywords within the web react to it, based on the search engine's algorithm. In other words, by viewing each keyword as a cumulative apex, you can set in motion a series of events that creates a chain reaction to close the gap for those keywords.

More competitive phrases may take many months to a year or more; other phrases in the nebula (the web) may take only a few hours or a few days to acquire. The point is, your website's orbit is entirely up to you. How others link to it, and the collective blueprint it leaves on its environment in the nebula, is another part of the equation that you must sculpt over time by releasing consistent information on a topic to reinforce the basis of its existence.

7) Strategy – The strategy is simple: rank for as many keywords as possible that either funnel relevant traffic to your site or overlap with a series of other keywords that have latent potential down the road. The web is all about multiple layers overlapping and linking through internalization and expression. These two dynamic attributes are responsible for moving from site to site or from page to page within a website. Beyond the immediate goal (which is to create relevance), the real goal is to create a hub where your site overlaps with thousands of keywords, then refine each branch to delve deep into the long tail while also hitting the high notes with the most sought-after two-word phrases.
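
The hub-and-branch idea, where broad two-word phrases anchor clusters of overlapping long-tail variations, can be sketched as a simple grouping (the keyword list is hypothetical):

```python
from collections import defaultdict

def cluster_by_head(keywords):
    """Group long-tail phrases under their leading two-word head term.

    A sketch of the hub concept: each broad head phrase collects the
    long-tail variations that extend it.
    """
    clusters = defaultdict(list)
    for phrase in keywords:
        head = " ".join(phrase.split()[:2])
        clusters[head].append(phrase)
    return dict(clusters)

keywords = [
    "seo tips",
    "seo tips for beginners",
    "seo tips for small business",
    "link building",
    "link building strategies",
]
for head, phrases in cluster_by_head(keywords).items():
    print(head, "->", phrases)
```

Each cluster then becomes one branch of the hub: a page (or section) targeting the head phrase, supported by pages for its long-tail extensions.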

By using concepts such as topical relevance, gravity, orbit, continuity, the query-based vacuum, the nebulous data cloud and information retrieval in a way that anyone can understand, a systemic method for creating relevance emerges as a by-product, which means multiple top 10 rankings to claim as trophies that increase website traffic.

Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of Seo Design Solutions, an SEO company. He has been actively involved in internet marketing since 1995 and brings a wealth of collective experience and fresh marketing strategies to individuals involved in online business.