Through my place of work I get to participate in many different business processes, and I am lucky enough to contribute to the betterment of the field. There are always discussions on how to better define and measure performance indicators, and on how to make sense of the data afterwards.
One of the recurring obstacles in my work is the question of how to measure engagement on content pages, especially with regard to how much content is shared through different traffic sources and viewed on ever smaller personal devices, which in turn has led to higher bounce rates.
The importance and benefit of knowing your customer and being able to serve them engaging content is paramount in a world where personalization and optimization are in focus. So let us talk about how to better measure engagement on content pages!
At first glance this might not seem like a problem, or it might seem like something covered by the core functionality of the different analysis tools. In practice, that is not the case. Without any real indicator of content engagement, people have tried different solutions with varying degrees of success.
We often encounter customers optimizing their sites based on metrics such as bounce rate. As it turns out, pages without a clear call to action tend to have a high bounce rate, and a high bounce rate is bad, right? Not necessarily. On a content page without a clear call to action, the bounce rate is expected to be high, which means a bounce can represent a happy customer just as well as an unhappy one.
Time On Page
So what if we measure the time spent on the page? Surely that would indicate whether the content is engaging or not? Sadly, it does not. Google Analytics, for example, uses the time between the first pageview and the next to calculate time on page. That means that if there is no second pageview, time on page will always fall in the 0-10 second bucket, which, in conjunction with the high bounce rate, renders it useless.
Other possible solutions
But what about scroll tracking, to check whether the user sees the content we want them to see? While scroll tracking gives us an indication of the user's activity, it does not really shine a light on the level of engagement. Just imagine a user who immediately scrolls to the bottom to check the length and highlights of the article. That would give us misleading data, and one could argue it does not really provide any insight.
The “Read More” button indicates when a user has shown interest by expanding the rest of the article. This can be an indicator of engagement, because by interacting with the button the user signals that they at least want to see how much more information there is. But where should you put the “Read More” button? And if you want more than one benchmark, should you add more buttons? I have yet to see this work as intended.
Ping Intervals / Active Time
Another approach I have come across is to send a ping at a fixed interval, or on “active time”. A script on the page sends pings to the analysis tools so that we can effectively determine the time spent on the page, and with active time we get an even more accurate idea of the engagement per page. But how can we compare the level of engagement across all content on the site without a common standard or benchmark? Well, let me tell you about that!
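To make the “active time” idea concrete, here is a minimal sketch of how active time could be accumulated from a stream of activity events (for instance focus/blur or visibility changes). The event shape and function names are illustrative assumptions, not the API of any particular tracker.

```typescript
// Hypothetical sketch: accumulate "active time" from an ordered stream
// of activity events. Each event marks the moment the user became
// active or inactive; the names are assumptions for illustration.
type ActivityEvent = { timestamp: number; active: boolean };

// Sum the milliseconds spent in the "active" state, given an ordered
// event stream and the current time.
function activeTimeMs(events: ActivityEvent[], now: number): number {
  let total = 0;
  let activeSince: number | null = null;
  for (const e of events) {
    if (e.active && activeSince === null) {
      activeSince = e.timestamp; // an active interval starts
    } else if (!e.active && activeSince !== null) {
      total += e.timestamp - activeSince; // close the active interval
      activeSince = null;
    }
  }
  // If the user is still active, count the open interval up to "now".
  if (activeSince !== null) total += now - activeSince;
  return total;
}
```

A script could then ping the analysis tool whenever this total crosses a threshold, rather than on wall-clock intervals, so idle tabs do not inflate the numbers.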
Our article engagement solution
The problem we set out to solve was this: “How can I measure and compare engagement across all my content?” It was important to deliver an easy-to-read, informative report, so that the threshold for understanding and utilizing the insight was not too high.
We started by defining a benchmark that would work for articles of different lengths. To accomplish this we ended up with a script that defines the content area and then isolates the text content, so that we can estimate a total reading time for each article.
The small algorithm uses simple HTML selectors to find the text and assumes that the average user reads 250 words per minute. Using the estimated reading time, we were able to set two benchmarks in time as percentages of the total.
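The reading-time estimate itself is simple arithmetic. Here is a sketch under the article's stated assumption of 250 words per minute; the function names and the 25%/50% benchmark percentages (the ones used for the engagement levels later in the article) are written out for illustration.

```typescript
// Assumption from the article: the average user reads 250 words/minute.
const WORDS_PER_MINUTE = 250;

// Count words in the isolated text content of the article.
function countWords(text: string): number {
  return text.trim().split(/\s+/).filter(Boolean).length;
}

// Estimate the total reading time of an article, in seconds.
function estimatedReadingTimeSec(wordCount: number): number {
  return (wordCount / WORDS_PER_MINUTE) * 60;
}

// Derive two time benchmarks as percentages of the total reading time
// (25% and 50%, matching the engagement levels defined in the article).
function timeBenchmarks(wordCount: number): { medium: number; full: number } {
  const total = estimatedReadingTimeSec(wordCount);
  return { medium: total * 0.25, full: total * 0.5 };
}
```

For a 500-word article this yields an estimated reading time of 120 seconds, with benchmarks at 30 and 60 seconds, so every article gets thresholds proportional to its own length.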
Set article scroll length
This already gives us a better version of the ping-based time on page, because it is relative to the length of the content. But wait, there is more! To get an even better metric for engagement, we combined the time benchmarks with scroll milestones based on the previously defined content area.
We now get the benefit of scroll tracking without the misleading data from someone who just scrolls down to check the length.
After defining these benchmarks we created two metrics to indicate engagement and named them Medium Engaged and Fully Engaged. They are defined as follows.

Medium Engaged:
- The user must have stayed on the page for longer than 25% of the total estimated reading time.
- The user must have scrolled 40% of the total content height.

Fully Engaged:
- The user must have stayed on the page for longer than 50% of the total estimated reading time.
- The user must have scrolled 75% of the total content height.
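Put together, the classification logic looks roughly like this. Both the time and the scroll criterion must hold for a level to count; the function and type names are illustrative, not the actual script's API.

```typescript
// Sketch of the combined time + scroll engagement classification.
type Engagement = "none" | "medium" | "full";

function classifyEngagement(
  timeOnPageSec: number,
  scrollFraction: number, // 0..1 of the total content height
  readingTimeSec: number, // estimated total reading time of the article
): Engagement {
  // Fully Engaged: > 50% of reading time AND 75% of content height.
  if (timeOnPageSec > readingTimeSec * 0.5 && scrollFraction >= 0.75) {
    return "full";
  }
  // Medium Engaged: > 25% of reading time AND 40% of content height.
  if (timeOnPageSec > readingTimeSec * 0.25 && scrollFraction >= 0.4) {
    return "medium";
  }
  return "none";
}
```

Note how this handles the quick scroller from earlier: someone who scrolls straight to the bottom after ten seconds passes the scroll criterion but fails the time criterion, and so is not counted as engaged.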
When the user satisfies both criteria for a level, our script sends a Content Engagement event to the different analysis tools. This gives us a report where we can view engagement across articles and segment it into how many readers were medium engaged and how many were fully engaged.
We now had a pretty good idea of the engagement across the articles on any given site, so the next step was to handle content with videos or a clear call to action.
We did this by also measuring downloads, offsite links and video completions. Combining this with the data from the engagement tracker gives each article a reliable basis for comparison.
Insight and possibilities
We set out to find a better way to measure engagement across articles and blog posts, so that they could be compared regardless of length, and I will argue that we have found it. Even though the script is still in its infancy, it is already showing its potential where we have implemented it.
We also wanted a script that lived outside the analytics world, so we added functionality that tailors each success event to the length of the article. Since the script is implemented through a TMS, you can easily attach this success event to other marketing tools like Facebook, Adform, Google Ads, etc.
We are also seeing that some content is more readable than others. It is therefore important to organize our content so that we can set an expected level of engagement. For example, a heavy legal article will take more time to engage with than a light man-on-the-street piece. Maybe tweak the words per minute?
We at Arena Data Consulting work with highly advanced, customized solutions every day, so you might think the script is designed with big corporations in mind. Luckily, that is not the case. The engagement measurements are easily implemented regardless of scale, and will give you insights on even a small WordPress blog.
I want this!
Want to implement our engagement script on your own page? No problem! Click the link below to our GitHub repository with code and documentation.
I want this, but I need some help!
Need help converting customers on your website? We work to gain insight into all kinds of solutions and specialize in the most advanced ones.