December 11, 2015
Every website owner is optimistic about their framework, SEO and even content. But sadly, the first position in Google’s SERPs eludes most of them, and very few even reach the first page.
For a business to start and grow, its website needs to be created, launched and then indexed by Google. Once the search engine starts crawling, it isn’t luck but proper skill that will see you through. Even after hiring the biggest SEO firms, fantastic Web designers and other professionals, companies often find themselves falling behind in the rankings. If you knew what to fix, things would certainly be easier. Google changes its algorithms constantly, so a predefined approach won’t really work.
I had similar experiences in the past until the ranking demons were finally busted. It isn’t that my website is always on top, but it rarely falls below the third spot. Here I have enumerated a few missteps that some webmasters and owners still make when trying to beef up traffic and, hence, climb the ranks. Avoid these and you are well on your way.
But before starting off, I would like to shed some light on how Google works with our websites, pages, links and, certainly, rankings. In simple terms, Google is inclined to help the general user. Websites that are valuable to the audience, boast regularly updated content and offer relevant information get the needed traction.
The following are the elements most likely hindering your growth in Google’s rankings. Fix these and the results will certainly be extraordinary.
Duplication is a SIN
Useful content is original, relevant and definitely not duplicated. Google has an eye for detail, and it looks at blog pieces and even minor strands of plagiarized content. The resulting penalties are big and proactive, so quality must be maintained at all times. However, some websites have landing pages and product pages that share similar content, mainly due to their orientation. Google might still consider this a fraudulent approach unless we make use of ‘canonical’ URLs. A canonical URL tells Google that the duplicated content exists for a definite reason and certainly not for stacking pages against SEO norms. The original draft can always be checked via Copyscape to determine the authenticity of the piece.
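In practice, a canonical tag is a single line in the page’s head section. Here is a minimal sketch, where the domain and path are hypothetical placeholders:

```html
<!-- Placed in the <head> of the duplicate or variant page, pointing to the preferred version. -->
<!-- "example.com" and the product path are made-up placeholders for illustration. -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/">
```

Product-page variants (say, a color or size filter producing its own URL) would all point to the one page you want Google to rank.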
‘Thin Content’ and Penalties
I recently saw a friend of mine get heavily penalized by Google, and the cited reason was thin content. One must know that content lacking quality is no better than copied content. A few years back, Google prioritized link building, but the present scenario, with the recent Panda update, has shifted the entire balance in favor of content. For someone producing in-depth, well-presented and detailed content, Google is definitely an ally. The best way to stay ahead is to think about the audience first and the rankings second. Always try to make the website relevant to its cause, with the content as the stand-out performer.
Neglecting the ‘Humans First, Search Engines Second’ Plan
We must all target specific keywords before trying to be indexed. These keywords are important because they help Google determine the relevance of your content. However, in the past, over-optimization of anchor text made Google quite skeptical about the keywords included in an article. Black hat SEO abused keyword inclusion, and the term ‘stuffing’ came into existence. The present scenario is the exact opposite: Google now prefers content with just the right number of keywords, used only to establish relevance.
Backlinks are still sacred when it comes to search rankings. Google can easily assess the relevance of your content by looking at the number of backlinks it has. For Google, backlinks are endorsements that go down well, but only when they come from trusted sources. This technique was also abused in the form of rampant guest blogging, directory submissions, spam comments and what not. I cannot describe all of these as abusive, but the inclination toward SERPs rather than users is what makes them look bad. Bad directories are avoided these days; a website needs to work harder for backlinks by creating premium content. If the quality is high, editors will include your work in their directories, offering backlinks in return for the valuable addition. Guest blogging has been at the forefront of link building for a long time but, with some websites being of low quality, Google opted to make stringent changes in this regard. For guest posts to be accepted by high-authority domains, exceptional write-ups must be drafted.
Page Titles are Often Non-Optimized
Balance plays an important role in SEO. Keywords need to be used with precision, without going overboard. Google gives a lot of importance to page titles because they suggest the relevance and value of the posted content. Hence, it is logical to optimize the title with your most important, or core, keyword. This will improve rankings and even increase CTRs, because the higher relevance will be justified. Researchers at Searchmetrics suggest that page titles might be losing relevance ever so slightly; the ranking impact may go down a tad, but they are still your best bet for higher click-through rates.
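A title tag along these lines (the keyword and site name are made up for illustration) puts the core keyword up front and stays short enough to avoid truncation in the SERPs:

```html
<!-- Core keyword first; roughly 50-60 characters so Google doesn't cut it off. -->
<title>Blue Widgets Guide: Pricing, Reviews and Tips | ExampleStore</title>
```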
Insecurity is a Factor
Site security is now a ranking factor. Sites served over https:// steal the show: the encrypted connection preserves traffic referral data, so Google can better determine where visitors come from and, hence, relevance and rankings. And, of course, safer sites go down well with customers because they can rely on them for content and subscriptions.
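One common way to move a site over is a server-level redirect that sends every plain-HTTP request to the secure version. This is a minimal sketch for nginx, assuming a hypothetical example.com domain with a TLS certificate already configured elsewhere:

```nginx
# Redirect every http:// request to its https:// counterpart with a permanent 301.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```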
Some websites keep themselves hidden until they are ready to go live, using small directives such as a robots.txt file or ‘noindex’ meta tags. If these directives aren’t removed in a timely fashion, Google won’t be able to crawl or index your website.
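A robots.txt file that hides a site under construction can be as short as this; the point of this section is that it must be loosened before launch, or Google will keep ignoring you:

```
# robots.txt at the site root — tells all crawlers to stay out of everything.
User-agent: *
Disallow: /
```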
Pathetic User Experience
Search rankings do have an impact, but what sells is the user experience. If a ranked website isn’t upgrading itself, customers will visit and leave without reading or even subscribing. Bounce rates will be high, and that isn’t what Google favors. Google tracks signals, one of which is the amount of time visitors spend on a site. Factors that shape the user experience include ease of navigation, faster loading speeds and mobile responsiveness, to name a few. Companies need to look at the site from a visitor’s perspective and implement changes accordingly.
Bad Agencies Pull you Down
There are too many charlatans out there who offer a lot but often fail to deliver. Some SEO agencies still rely on black hat techniques because they aren’t well informed. We have seen firms put up taglines like ‘Guaranteed First Page’, but experienced individuals know that with Google nothing is guaranteed. Hence, selecting the right agency is important. Check out an agency’s testimonials and tangible results before making a commitment.
Richard Smith is a mechatronics degree holder and a tested marketing professional with a decade-long experience in SEO, SEM and Google’s algorithms. He specializes in drafting customer-centric posts for entrepreneurs, helping them reach the pinnacle and stay there for longer periods of time. In his leisure, Richard makes gadgets that are extremely automated and perform the basic chores for him. Follow him at Google+ and www.100status.com