Everything That's Terrible About My Content - A Case Study

Over the past year I’ve written a huge amount of content to grow traffic to my blogs, grow my social following and drive direct business enquiries. Having the opportunity to regularly write for many of the top digital marketing blogs, like Moz, Search Engine Land and Search Engine Journal, has been a huge catalyst for all of the above as well.

Having said this, not all of my content has had the impact that I’d hoped for. For me, this isn’t always a bad thing because it helps me gauge what my readers are after so that I can improve my content going forward.

If you’re churning out tons of content every month, you’d be mad not to take a step back and question what’s working. That’s exactly what I do each month, or each time I target a specific type of audience, in order to maximise my return on investment. Whether I measure that return through link acquisition, social shares and network growth, email subscribers or direct business leads, a structured content analysis process is what helps me realise it.

I’m going to share the process I follow when analysing content on my own blogs and on the blogs that I write for. This has helped me dramatically increase traffic to my blogs and ensure that every minute of time I spend on creating new content has the maximum possible potential to deliver ROI.

The Case Study

Using Find My Blog Way as a case study, I’m going to show you both how I carry out my content analysis and what I look for to identify opportunities. This data is invaluable to any content marketer as it allows you to truly understand what your readers want, and even more importantly, what motivates them to convert.

Step 1: Gathering URLs

The first stage is to gather all of the data around your content. To get a full understanding of what type of content is working, I need to pull a list of all the URLs on the site. There are a number of ways this can be done, one of which is using Screaming Frog SEO Spider.

Screaming Frog SEO Spider

All you need to do is plug in your homepage URL (as shown in the above image) and let Screaming Frog crawl through the site. You’ll then be presented with a long list of all the URLs in your site that can be exported to a .csv file.

A free alternative to Screaming Frog is Xenu Link Sleuth, which does pretty much the same thing. It’s handy if you have a fairly large site and only want to use a free solution.

Now, my preferred method is to use my site’s XML sitemap to gather a list of the URLs. The advantage here is that I can select only my article pages (which I keep in a separate sitemap), which makes analysing the actual blog content a lot easier. If you go with the first option of Screaming Frog/Xenu then you’ll have to organise the data into only the URLs you wish to analyse.

Step 2: URL Metrics

Now that you’ve got a list of your site’s URLs, it’s time to use one of my favourite tools, URL Profiler. If you haven’t tried this tool out, go and download it now. You get a free one-month trial with all the features enabled, so it’s well worth a download.

Once installed, URL Profiler allows you to upload your list of URLs, then link up your Majestic SEO, Moz, Ahrefs, Google Analytics, My-Addr, Copyscape and uClassify accounts in order to bring in a ton of extra metrics surrounding the URLs.

Even if you don’t have access to any of these tools, you still get some great content analysis features.

URL Profiler

You can add your list of URLs into URL Profiler by right-clicking the right-hand panel and selecting to either paste URLs from the clipboard, import from a .csv or import from an XML sitemap.

In terms of the options you select, this will depend largely on what other tools you’re integrating. For my analysis of Find My Blog Way, I selected Majestic SEO, Social Shares, Robots Access (to check if any of my URLs are blocked by the robots.txt), Google Analytics, HTTP Status, PageSpeed, Mozscape and Readability.

Another thing that I’ve done is add .entry-content to the Content CSS Selector field. This field allows you to enter the CSS class/ID that holds the main content of your webpage; in my case, this is the entry-content class. This feature is great because it ignores a lot of the noise on your webpage (i.e. sidebars, comments, etc.) when analysing the actual content.

Once you’ve got everything set up, click Run Profiler and leave it to do its thing. If you’ve got a lot of URLs then you’ll probably have to leave it a while. Another thing worth mentioning is the ability to add proxies to the tool. If you have access to any private proxies then you can add them in the settings tab (if you’re looking to buy some proxies, check out this post by Jacob King first).

Step 3: Manipulating the Data

Once URL Profiler has finished, you’ll get a CSV file packed full of data surrounding your website. At first glance this can be a little overwhelming, but don’t worry about that because it’s better to have more than you need than not enough.

Raw Data 

Taking this a step further, you can extract extra information using some XPath. You can check out my full scraping tutorial here for more information.

Now that you’ve got the data in Excel, it’s time to make some sense of it. If you’ve used Excel pivot tables before then you’ll start to see how this analysis can work, but even if you haven’t, you’ll be able to follow my instructions to get going.

First things first, it’s time to create the pivot table.

Creating the pivot table

Simply go to Insert>Pivot Table and then select all of the columns of your worksheet. For more info, check out this guide.

Now that you’ve got the pivot table set up, you’ll be able to start your analysis. Within my analysis of Find My Blog Way, I looked at the following things:

  1. Post category vs social shares/inbound links/article comments
  2. Day of publishing vs social shares/inbound links/article comments
  3. Headline length vs social shares/inbound links/article comments
  4. Article length vs social shares/inbound links/article comments
  5. Type of article vs social shares/inbound links/article comments

Here’s how I did each of them:

Post Category Analysis

I wanted to understand which type of post was performing the best on the blog for generating backlinks, social shares and comments. Using this data, I’m able to plan my editorial calendar around the categories that get me the best return – simple when you think about it, but it’s worrying how many people overlook this.

To do this, I needed to extract the category type of each post on the blog and add it in as a new column to the data sheet. Now, I don’t fancy going through all of the site and copy/pasting this info in, so this is where scraping comes in!

Post category analysis

All I did was find where the category is displayed within an article, right-clicked (within Chrome) and selected inspect element. On the line of code that is highlighted within Chrome Developer Tools (i.e. the line of code displaying the category name), I right-clicked and pressed Copy XPath. This is the XPath I used:

//span[@class='post-category']/a/text()

All that needs to be done now is to add a new column to the Excel worksheet, enter the full XPathOnUrl formula (with a little help from the SEO Tools plugin for Excel) and apply it to all of my URLs. Simple!
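To give you an idea of how that looks in a cell, here’s a minimal sketch (B2 is assumed to be the cell holding the article URL, mirroring the date example later on; depending on your version of the SEO Tools plugin you may need to add /text() back onto the XPath, though XPathOnUrl usually returns the node’s text by default):

=XPathOnUrl(B2,"//span[@class='post-category']/a")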

If that doesn’t sound so simple, follow the image below:

Extracting post categories with XPath

Once you’ve gathered the categories of each of the posts, it’s time to compare them against the various other metrics within the pivot table.

Go to the Pivot Table worksheet that I showed you how to set up earlier and then refresh the data. You can do this by clicking on the pivot table, navigating to the Options tab and then pressing Refresh.

The pivot table should now update and include the new data within it. All that needs to be done now is to add the Category to the Column Labels box and select Majestic URL Ref Domains (if you’re comparing against links) within the Values box. You’ll then need to click the drop-down arrow on the Majestic URL Ref Domains, select Value Field Settings and then choose to summarize the value field by Average. This gives you the average linking root domains across each post category.
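If you want to sanity-check the pivot output with a plain formula, AVERAGEIF gives you the same per-category average. The column references and category name below are placeholders for wherever those fields sit in your own sheet (here column D is assumed to hold the post category and column K the Majestic URL Ref Domains count):

=AVERAGEIF(D:D,"Link Building",K:K)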

You should now have some data that looks something like this:

Categories vs links data

I now like to visualise the data in a graph so it’s easy to break down. Here are the findings from Find My Blog Way:

Average Referring Domains vs Post Category

The chart shows that when it comes to linking root domains, articles surrounding social media, link building and SEO tools seem to be the clear winners.

Publishing Day Analysis

Understanding which day is the best to publish your content on can make such a big difference to the overall results of your content.

Something as simple as posting on one day as opposed to another can be the difference between 500 social shares and 50 social shares – believe me, I’ve learnt this the hard way.

To find this information out, I needed to extract some extra data from the blog. As with the post categories, the date that an article was published is displayed within each post, which gives me the opportunity to use some more XPath to extract it.

Publishing date analysis

To pull in the published date, I used the following XPath formula in Excel (B2 is the cell reference for my article URL):

=XPathOnUrl(B2,"//meta[@property='article:published_time']","content")

Once I’d pulled in all of the published dates for the posts, I used the =WEEKDAY() formula within Excel to extract the day of the week that each date related to (it displays as a number between 1 and 7, i.e. Sunday = 1, Monday = 2, and so on up to Saturday = 7).
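One wrinkle worth flagging: the scraped published_time usually comes back as an ISO text string (e.g. 2014-03-12T09:00:00+00:00) rather than a native Excel date, so you may need to trim it down before WEEKDAY will accept it. Here’s a sketch, assuming the scraped date sits in cell T2 (a placeholder reference):

=WEEKDAY(DATEVALUE(LEFT(T2,10)))

WEEKDAY’s default return type matches the numbering above (Sunday = 1 through Saturday = 7).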

Now that each article has a corresponding numeric value assigned to it that informs me of the day it was published, it’s time to refresh the pivot table.

This time, I wanted to compare the average number of social shares on my posts across each day of the week. This would give me a guide towards the best day of publishing for gaining social signals.

To do this, I added the Day of Publish field to the Column Labels box within my pivot table sheet, and the Social Stats Total field within the Values box (again, making sure that I selected the average figures within the Value Field Settings).

Here’s how that looked:

Pivot table fields

And gave me the following data:

Pivot table data

All that was left to do was to copy/paste the values over to a new worksheet and replace the numeric column titles with the days of the week, as sketched below.
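A quick way to do that replacement is with a CHOOSE formula rather than typing the day names out by hand. U2 below is a placeholder for wherever your numeric day value sits, and the ordering follows WEEKDAY’s default (Sunday = 1):

=CHOOSE(U2,"Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday")

Here’s what I found: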

Day of publish vs Average social shares

The key takeaway here was that Wednesday and Thursday were by far the best days for me to publish content on in order to gain social shares. I was also able to break this analysis down further to show the effects across each social network.

Here’s what I found with Google+:

Day of publish vs Average Google +1s

Again, Wednesday and Thursday were the best days here, but Wednesday seemed to be the clear winner.

The story was slightly different with Twitter though:

Day of publish vs Average Tweets

The best publishing days for tweet acquisition were Thursday and Tuesday, whilst Wednesday was one of the worst days for this.

These insights are so valuable to me because they help me to better achieve the goals I’m setting and allow me to tailor my content to the specific channels I’m targeting.

For example, if I’m writing some content focused on building a successful Twitter campaign, I can use the data I’ve built to ensure it gets the maximum coverage possible across Twitter.

Headline Length Analysis

Don’t underestimate the importance of a good headline.

You can create a fantastic article, but if nobody reads it then it’s lost. Your headline is the shop window to your content and you need to make sure that every aspect of it is optimised to perfection.

The length of a headline is a particularly important aspect (but not the only one). I’ve found that either reducing or increasing the length of a headline (depending on the audience) can dramatically improve click-through rates.

So, with this in mind, I wanted to find out what the optimal headline length was for Find My Blog Way. Here’s how I did it…

Headline length

Within my blog, the headline of my post is automatically used as the HTML title tag, so URL Profiler had already gathered the titles from each article. If your blog doesn’t do this then you can easily pull it in via some XPath on the H1 tag.
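If that’s the case for you, a one-liner like the following should do the job (assuming, as before, that B2 holds the article URL and that each post has a single H1):

=XPathOnUrl(B2,"//h1")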

The next step was to create a column to the right of the list of titles that could count the number of words within each headline. To do this, I used a built-in function within the SEO Tools plugin for Excel. The formula looked like this:

=CountWords(S2)
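If you don’t have the SEO Tools plugin to hand, the same count can be had with native Excel functions (S2 being the headline cell; note this returns 1 for an empty cell, so blank rows need filtering out):

=LEN(TRIM(S2))-LEN(SUBSTITUTE(S2," ",""))+1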

Now all that’s left to do is to refresh the pivot table and add in the values for comparison.

Pivot table data

Within this comparison I wanted to understand the relationship between the length of the headline and the number of comments left on an article. To do this, I added my Headline Word Count field to the Column Labels box and the Comments field to the Values box (selecting average within the Value Field Settings).

Once I had my data, I plugged it into a graph and the results were as follows:

Headline Length vs Average Comments

It looks like the best headline length for the blog (to acquire comments) is either 7 or 12 words; however, the sample size for articles with 12-word headlines isn’t great. Taking this into account, the optimal length is 7 words.

The story is pretty much the same for social shares, with headlines that are a length of 7 words achieving the most social shares on average. Links, on the other hand, tell a slightly different story. In this case, the longer headlines seem to get linked back to more frequently, with headlines of either 10 or 11 words being linked to the most.

Article Length Analysis

Like the length of your headline, the total length of your articles can have a huge impact upon their success.

Another thing that I haven’t touched upon (but I’ll leave you to play with this) is that URL Profiler can also pull in the Google Analytics data from your website so that you can conduct further analysis around time on page, bounce rate, conversions, etc.

Now, URL Profiler does bring in the total characters from within your articles, and as I mentioned before, you can tell the tool which CSS element to scrape in order to only pull in the stats around the content area of your articles.

I don’t mind using these stats for analysis, but I much prefer to have the word count. To do this, I scraped the same CSS element to pull in the entire contents of each article and then used the =CountWords() function to total up the number of words in each post.
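Combining the two techniques from earlier, the word count can be pulled in with a single formula. This is a sketch under the same assumptions as before (B2 holds the article URL, and the main content lives in a div with the entry-content class; adjust the selector to your own theme’s markup):

=CountWords(XPathOnUrl(B2,"//div[@class='entry-content']"))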

Here are my findings…

Word Counts vs Linking Root Domains

The big winner for links is the longer-form content, which isn’t a complete surprise as these tend to be my ultimate guides that are pretty extensive. For the shorter content, it seems that posts between 1,037 and 1,336 words are pretty much perfect from a links perspective.

Word Counts vs Average Social Shares

Social shares seem to increase as the word count increases. Long-form content is a clear winner on the blog (hence why I put so much effort into it!).

Article Type Analysis

The last thing that I wanted to look at was the type of content that was performing well. By this I don’t just mean looking at the category of an article, but instead I wanted to delve a little deeper into the theme of the content itself.

To do this, I looked for specific types of words that were mentioned within the headline and then grouped the articles into the following post types:

  • List posts (articles that outline things in a numbered list).
  • How to posts (articles that are giving specific how to advice).
  • What posts (articles that mention what in the headline).
  • Question posts (articles that ask a specific question).
  • Why posts (articles that begin with why).
  • Ultimate Guides (any of the ultimate guides I’ve written).
  • Review posts (articles that are reviewing something).
  • Tutorial posts (articles focused on giving tutorials around something).

In particular, I wanted to find the themes that motivated people to share the article via social media so that I could create content that would get the largest online reach possible.

Now, I’m not going to go into all the details of the formulae that I used to identify each post type. Instead, go check out this post from Annie Cushing.
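That said, here’s a heavily stripped-down sketch of the general approach, covering just three of the post types above (S2 is the headline cell; SEARCH is case-insensitive, and the order of the tests matters because the first match wins):

=IF(ISNUMBER(SEARCH("ultimate guide",S2)),"Ultimate Guide",IF(ISNUMBER(SEARCH("how to",S2)),"How to",IF(LEFT(S2,3)="Why","Why","Other")))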

Here were the findings from this analysis:

Post Type vs Average Social Shares

Interestingly, it looks like my Ultimate Guides are a clear winner on social media, along with the tutorials that I write. As well as this, posts that have the word Why within the title seem to perform really well. An example of this could be “Why Google Plus is a Great Traffic Generation Source”.

Further Analysis

These are just a few things that I’ve chosen to look at, but the possibilities really are endless. Just ensure that you’re objective with your research and that you’re comparing the performance of your content against the key elements that make it up.

Every website is different, thus the analysis should be tailored to each individual case. Some blogs have internal voting systems that allow users to rate posts that they read. If this is the case, this data can be used to better understand what your readers respond well to.

Alongside this, there is all the data that you can pull in from Google Analytics to measure against. Metrics like total pageviews, bounce rate, time on page, scroll depth, conversions, etc. are great starting points for your analysis. Likewise, there are loads of different content elements that can be analysed, for example, paragraph length, time of publish, text/image ratio and many more.

If you’d like to take a deeper look at the data that I gathered around Find My Blog Way then you can download the spreadsheet below:

http://www.matthewbarby.com/goodies/findmyblogway-content-analysis.xlsx

TL;DR

  • Use URL Profiler to extract big data around your content.
  • Tailor the data further to fit your website through some custom scraping.
  • Compare key performance metrics against content elements.
  • Remain objective with your analysis and use the findings to improve your ROI.
  • You should be consistently testing and improving your content. If you’re not then you’re missing a huge opportunity.