
October 31, 2014

What You Can Learn from KISSmetrics’ SEO Strategy

In this post, I will provide a comprehensive technical inbound audit of KISSmetrics. This example is extremely detailed, and it covers a wide range of inbound topics. But before the inbound marketing audit, let’s cover the basics…

Who is KISSmetrics?
KISSmetrics is a Web analytics solution that helps companies make smarter business decisions and boost ROI.

Headquartered in San Francisco, KISSmetrics is backed by a syndicate of angels and early stage funds.

Why Audit KISSmetrics?
Like everyone working in SEO and inbound marketing, we all overlook or miss things (even basic things). The goal of this post is to help KISSmetrics, to help others learn from their inbound successes (they’ve done remarkable things in this arena), and to give a third-party perspective on a great example.

To be fair, “audit” seems like a terrible word, but in this case, think of “audit” as “being helpful.” The purpose of this audit is to give an outsider’s view to a great company. All in all, I hope this helps KISSmetrics. Neil and team are outstanding.

Disclaimer
Before I jump into the technical details of the inbound audit, it’s important to note that I have no affiliation with KISSmetrics (I am also not currently a user of their product – so you don’t have to worry about affiliate links). KISSmetrics also did not ask me to complete this audit. As a result, I don’t have access to any of the site’s analytics or webmaster tools accounts, and I don’t have access to the site’s content management system (CMS).

In a typical audit, I would begin by sifting through the data and then narrowing in on problem issues. So… if I make completely inaccurate observations, I blame the data from the third-party tools (i.e., Searchmetrics, Ahrefs, SEMrush, Moz, etc.). Don’t get me wrong – these tools are awesome, but they generally work better from inside the company.

The goal of this audit is to help KISSmetrics. The aim is never to critique in a negative, harmful way, but to help promote KISSmetrics through inbound marketing by giving them the perspective of an objective third party. With those disclaimers out of the way, let’s begin.

*If you have any questions: @elioverbey

Since this post is extremely long, here is a detailed navigation to help you keep track of where you are throughout this audit. As you will see, this audit spends the majority of its time on optimization (top of funnel), but lower areas of the funnel will be covered as well. The audit will work through four main areas as seen below: Optimization (Attract), Conversion, Customers, and Delight.

KISSmetrics Web Structure
Optimization / Attract
Search Visibility
Robots
Robots Meta Tag
Accessibility
Performance
Site Architecture
Authority Flow
Click Depth
On Page Factors
HTML Markup
Structured Data
Head Tags
Open Graph
Twitter Cards
Titles
Meta Descriptions
Images
URLs
Duplicate Content
External Links
Off Page Factors
Backlinks
Backlink Distribution
Backlink Source
Anchor Link Analysis
Image Links
Conversion / Leads
Calls to Action
Landing Pages
Forms
Close / Customers
Contacts / E-mail
Delight
Social Media
Engaging Content
Summary
Indexability / KISSmetrics Web Structure

In order to understand this audit, you need to understand how KISSmetrics is set up. At first glance, you’ll notice that KISSmetrics is constructed using quite a few subdomains:

  • kissmetrics.com/ – 18 pages
    • *tweets.kissmetrics.com – 2,587 pages
    • blog.kissmetrics.com – 1,038 pages
    • *status.kissmetrics.com – 70 pages
    • grow.kissmetrics.com – 7 pages
    • focus.kissmetrics.com – 10 pages
    • support.kissmetrics.com – 347 pages
    • middleman.kissmetrics.com – 1 page
    • styleguide.kissmetrics.com – 7 pages
    • uptime.kissmetrics.com – Redirects to status.kissmetrics.com
    • demo.kissmetrics.com – 100 pages

An * (asterisk) indicates that the subdomain is blocked by the robots.txt.

In total, KISSmetrics has 4,185 pages across 11 domains and subdomains.

As you glance at the site structure of KISSmetrics, you notice that their content is divided into subdomains. Due to the structure of KISSmetrics, this inbound audit will focus solely on two areas of their site: the root domain (kissmetrics.com) and their blog (blog.kissmetrics.com). The other subdomains do not contain information pertaining to this inbound audit (e.g., styleguide) and will not be included in the analysis.

Inbound Marketing
In its most basic form:

“Inbound marketing focuses on earning, not buying, a person’s attention.” – Brian Halligan
The principle of inbound marketing has been around for years, but the term was coined by HubSpot (well done, Dharmesh and Brian). Inbound marketing is all about creating great content that pulls people in.

Since HubSpot is the go-to on inbound, this audit will use their structure as a model. As this audit continues, you’ll notice how the funnel naturally works.

[Image: kissmetric_attract]

Optimization / Attract
The first step in inbound marketing / SEO is attracting the right audience. Content is the single best way to attract new visitors to your website. In order to be found by the right prospective customers, KISSmetrics must not only create optimized content, but foster an environment in which content can thrive.

Search Visibility

In order to set the framework for the entire audit, it’s important to analyze the site’s performance. When looking at KISSmetrics’s traffic, there are 3 important questions:

  • Is the traffic growing?
  • Has the site been penalized?
  • What is the percentage of organic traffic?

To determine KISSmetrics’s traffic performance, I used two tools: the Searchmetrics Suite and one of my new favorites, SimilarWeb. Finding accurate third-party data is incredibly difficult, but after using Google Analytics on a network of sites, I’ve found that these two programs capture third-party data most accurately.

As you can see from the graph below, there are some red flags for KISSmetrics (the tools join all traffic from all subdomains together):

[Image: kissmetric_chart]

As the graph above shows, the site’s visibility decreased dramatically after May 18, 2014. This date is important because it also corresponds to when Google updated Panda 4.0.

Immediately, there is cause for concern on KISSmetrics’s site. As you begin to look over the third-party data, there is a strong correlation between the data (although third party) and Google’s Panda update…

There are three possible explanations for this drop in traffic:

  1. The data is inaccurate
  2. The site was algorithmically penalized due to duplicate content
  3. The site was algorithmically penalized due to guest bloggers

The first possibility: the data from the third-party tools is inaccurate. After all, it seems highly improbable that KISSmetrics was hit with an algorithmic penalty, right? Neil Patel (co-founder of KISSmetrics and SEO expert) wrote an article for Search Engine Journal about the Panda 4.0 update the day after Panda 4.0 was released.

In the post, Neil describes the update, who was affected, and how to fix it. Does it not seem like KISSmetrics (his company) would be prepared for Panda? Since he knew so much about Panda, he would have protected his assets.

Besides that, KISSmetrics writes lengthy (2,000+ word average), unique content that is well received (only 12 of their articles don’t have any comments – meaning, the other 1,000+ have engagement via comments).

That is one possibility. But the other possibilities are cause for more concern:

The second possibility: KISSmetrics was hit by Panda 4.0, and here is the probable reason:

For some odd reason, the KISSmetrics blog exists on two protocols: http and https. This means the same content (although unique and engaging) can technically be seen as duplicate.

Let’s look at an example. As of today, the most recent post from KISSmetrics is: Website Testing Mistakes That Can Damage Your Business (this post was chosen due to recency, but this same problem exists throughout). You can find this article in two places:

  • http://blog.kissmetrics.com/website-testing-mistakes/
  • https://blog.kissmetrics.com/website-testing-mistakes/

As you can see from the image below, the same page exists in two places (unfortunately, this is duplicate content and cause for concern):

[Image: kissmetric_video]

You’ll notice that in the image above, the normal protocol (http://) is on the left, and the secure protocol (https://) is on the right (the browser hides the http:// prefix on the left by default). I also checked the source code of this same page and found three problems that could have contributed to the drop in traffic on May 19th (Panda 4.0). I highlighted them in the diagram below:

[Image: kissmetric_sourcecode]

  1. Once again, I highlighted the protocol so you can see the same page in two locations.
  2. Both of these pages are INDEX, FOLLOW. If one of them were NOINDEX, there wouldn’t be a huge problem, but KISSmetrics is telling the search engines to index both of these pages. This is another red flag.
  3. The third red flag is the canonical. The canonical tag tells search engines which URL should be treated as the original version of a page. In this case, each page canonicals to itself: the http:// page points to the http:// URL, and the https:// page points to the https:// URL. Basically, each page is claiming to be the original.
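
To make the problem concrete, here is a simplified sketch of what the head of each version contains, based on the source code shown above (the markup is abbreviated; the live pages obviously contain more than this):

  <!-- http://blog.kissmetrics.com/website-testing-mistakes/ -->
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="http://blog.kissmetrics.com/website-testing-mistakes/">

  <!-- https://blog.kissmetrics.com/website-testing-mistakes/ -->
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="https://blog.kissmetrics.com/website-testing-mistakes/">

Because each version is indexable and declares itself as the canonical URL, search engines are effectively told to treat both copies as originals.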

Finally, Google has noticed that the blog exists in two locations. As you can see below, a query for the blog brings back both protocols (http and https).

[Image: kissmetric_blog]

This means that the KISSmetrics blog was probably hit by an algorithmic penalty on May 19th.

Even if the third-party data is wrong (and KISSmetrics was not hit by a penalty), they should fix these problems immediately. KISSmetrics should decide which URL is primary, and then 301 the secondary version or point its canonical to the primary.

Even more, as it currently stands, the site is dispersing its links between https and http. If they fix this problem, they could see a rise in rankings due to the consolidation of their links.
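
For example, if https were chosen as the primary version, a sitewide 301 could be handled at the server level. Here is a minimal sketch, assuming the blog runs on Apache with mod_rewrite (I don’t know their actual server setup, so treat this as illustrative only):

  # Redirect every http request to its https equivalent with a 301
  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://blog.kissmetrics.com/$1 [R=301,L]

Alternatively, the same consolidation can be achieved by pointing the canonical tag on both versions to a single protocol.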

The third possibility:
KISSmetrics was hit by Panda 4.0 because of guest bloggers.

I am not implying that KISSmetrics was penalized because “they didn’t stick a fork in their guest bloggers.” But, KISSmetrics could have been penalized by duplicate content from their guest bloggers (quite a few of KISSmetrics posts are written by guest bloggers). Guest writers could have written for KISSmetrics and then re-published that content on their own blog.

The KISSmetrics publishing guidelines contain nothing stating that a writer cannot re-publish his or her own content. Although the duplicate content issue could be covered in other forms of communication with the writers (a contract, etc.), it is a possibility.

Penalty Conclusion
Although I am hoping for the first possibility (I wouldn’t wish a penalty upon anyone), these are issues that KISSmetrics should thoughtfully consider. Again, I cannot stress enough the unreliability of third-party data, but in any case, these are issues that should be addressed immediately. Hopefully, Neil can chime in on this issue. I’d really love to know if these have been issues for KISSmetrics (plus, Neil is extremely transparent in order to be helpful).

Robots

A robots.txt file is used to restrict search engines from accessing specific sections of a site. Here is a copy of KISSmetrics’s root domain robots.txt file:

[Image: kissmetric_sitemap]

On their blog, KISSmetrics uses another robots.txt file:

[Image: kissmetric_robot]

There are a few improvements that KISSmetrics could make in the construction of their robots.txt.

First, on the blog, KISSmetrics should consider blocking ‘/wp-content/plugins/’. Sometimes developers put links in their plugins.

Second, on the blog, KISSmetrics should rethink blocking ‘/wp-includes/’ in robots.txt. There are better solutions than the robots.txt file for keeping these pages out of the index (i.e., NOINDEX).

Finally, KISSmetrics should add a working sitemap to their blog. The current sitemap (Sitemap: http://blog.kissmetrics.com/sitemap.xml.gz) does not work.

[Image: kissmetric_404]

Fixing this will help search engines crawl and index the blog’s pages.
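
Putting the three suggestions together, a revised blog robots.txt might look roughly like this (a sketch only – the correct sitemap URL would need to be confirmed once it is fixed, and /wp-includes/ would instead be handled with meta robots tags):

  User-agent: *
  Disallow: /wp-content/plugins/

  Sitemap: http://blog.kissmetrics.com/sitemap.xml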

Robots Meta Tag
Each page on a site can use a robots meta tag to tell search engine crawlers if they are allowed to index that page and follow its links.

WordPress is a great open source platform – one that KISSmetrics built their blog on – but duplicate content is one thing you have to be very mindful of. Using the robots meta tag will help prevent duplicate content. Content duplication issues include tags, categories, and archives.

Quite a few of KISSmetrics’s pages already have a meta robots tag. In the case mentioned above (wp-includes), it would benefit KISSmetrics to use a “noindex” robots meta tag on a per-page basis, rather than blocking the entire directory in robots.txt.

One thing KISSmetrics should consider NOINDEXing is blog subpages (the paginated archive pages). As you can see below, these sub-pages are indexed:

[Image: kissmetric_viewsourceblog]

Since these pages take up crawl bandwidth, don’t have unique content, and historically have a low CTR, KISSmetrics should consider (based on their analytics) a “NOINDEX, FOLLOW” tag on these archive-based pages.

KISSmetrics has successfully implemented this change on subpages of categories, topics, and topic sub-pages, and should consider adding the same meta robots tag to the other archive-based structures. The “NOINDEX, FOLLOW” will remove the subpages from the index, but will still allow links to pass.
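
For reference, the tag on those paginated archive pages would simply be the standard robots meta tag placed in the head of each subpage (on a WordPress blog this is usually set through an SEO plugin rather than edited by hand):

  <meta name="robots" content="noindex, follow">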

Accessibility
This section covers best practices for both search engines and users. Many of the accessibility issues that affect search engines were mentioned above under indexability; from here, we’ll cover accessibility mainly as it relates to users (personas), while keeping the benefits for robots in mind.

Performance
According to Google:

You may have heard that here at Google we’re obsessed with speed, in our products and on the web. As part of that effort, today we’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests… While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page.

It’s tempting to dismiss site speed as an important SEO ranking factor, but if Google says it matters (even a small percentage), then it matters. Even if you dismiss speed as an optimization factor, it is also an inbound factor that can’t be ignored. Who likes a slow site?

KISSmetrics’s site speed is fairly slow, and KISSmetrics could easily make a few tweaks to make their site much more efficient. As you can see below (via GTMetrix), the site speed for the root domain is:

[Image: kissmetrics_analytics]

The problems on the root domain above are quick fixes. The blog (blog.kissmetrics.com) did much better, scoring an 88 percent and a 79 percent.

Reducing the number of files needed to load the site, and thereby reducing the number of HTTP requests, will make KISSmetrics’s site load more quickly. Currently, the root domain makes 64 requests when loading the page (which is somewhat surprising, considering the homepage is only a few images and fewer than 70 words).

There are usually three parts to fixing this:

  • Reduce the number of JavaScript files
  • Reduce the number of CSS files
  • Reduce the number of images

On the homepage, the Time to First Byte is efficient (300ms), but the page load is somewhat slow – anywhere from 1.95 seconds (Pingdom) to 2.95 seconds (GTMetrix). They received a 55/100 on Google’s PageSpeed Insights (42 on mobile) and a 75/100 on Pingdom. KISSmetrics’s homepage has room for improvement.

Another concern is that their blog pages make over 100 HTTP requests, but surprisingly, even with all of those requests, the total page weight was still excellent: 933 KB.

The site could be improved with the following:

  • Eliminate render-blocking JavaScript and CSS
  • Minimize HTTP requests – KISSmetrics’s pages will load more quickly with fewer requests. Minimizing these requests involves reducing the number of files that have to be loaded, such as JavaScript, CSS, and images.
  • Combine JavaScript and CSS into external files linked from the header. This allows the external files to be cached so that they load faster (there are 10 CSS files and 4 JS files loaded separately on the blog) – see the sketch after this list.
  • Implement server-side / browser caching – This serves a cached static version of a URL so that the dynamic page doesn’t have to be regenerated each time the URL is requested.
  • Load JavaScript asynchronously.
  • Use a CDN – such as Amazon CloudFront. The CDN will allow users to download assets more quickly (as far as I can tell, they are not using a CDN).
  • Finally, use 301s only when necessary. A 301 forces an extra request to a new URL, which takes longer to load (93 of their pages use a 301).
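
As a rough illustration of the combining and asynchronous-loading items above, the markup could look something like this (the file names here are hypothetical):

  <!-- one combined, minified stylesheet instead of 10 separate CSS files -->
  <link rel="stylesheet" href="/assets/combined.min.css">

  <!-- one combined script, loaded asynchronously so it does not block rendering -->
  <script src="/assets/combined.min.js" async></script>

Combined with browser caching and a CDN, most repeat visits would then be served from cache rather than from the origin server.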

Site Architecture
The site architecture defines the overall structure of a site, and it has a number of important SEO implications. For example, when a page receives external authority, the site architecture defines how that authority flows through the rest of the site.

Additionally, since search engine crawlers have a finite crawl budget for every site, the site architecture ultimately dictates how frequently pages are crawled (or if they’re crawled at all).

Authority Flow
To understand how authority flows through the site, I performed an analysis on the site’s internal links.

Based on that analysis, here is the distribution of the site’s links (these values have been rounded to the nearest percentage):

[Image: kissmetric_authority]

As you can see above, 96 percent of the pages have a relative authority value of less than 0.1 (values closer to 0 indicate the least authority; values closer to 1 indicate the most). Most of the site’s internal authority is held by just 3.8 percent of the pages.

The root cause of this distribution is the site’s navigation. All of the pages that have large numbers of internal links (1,800+) are linked from the footer or header:

/infographics (1,834)
/marketing-guides (1,830)
/webinars (1,822)
/topics (1,820)

Since these links appear sitewide (i.e., on every page), these pages receive many internal links, while the other pages receive very few.

Even though the article categories on the site break down the articles into substantial sections, the internal links do not seem to pass through to individual articles (which is the case with almost every navigation).

The related post section on each article works well to accomplish this purpose (an example is below), but not every post includes the related post widget.



Eli Overbey is an SEO analyst by day and blogger the rest of the time. To find other similar in-depth audits or to contact him about doing an audit for your site, visit elioverbey.net.
