Growth Hacking: Experimenting With Blog Measurements

While every organization is different, almost everyone has a blog. Blogs are used to build online credibility and trust between the brand and the end consumer. When looking critically at how (and what) content makes a blog really perform, a free Google Analytics account can tell you which blog posts are actually being visited.

One of my good friends, Sagan Morrow, has been building up her lifestyle blog to the point that she now sustains it as her full-time business. She recently wrote a blog post that went “viral” on Pinterest. This post generated over 60 times her average pageview count and, because of this, sat at the top of her analytics. Something was bugging her, though: she wondered why her bounce rates were higher than those of some of her blogging community friends (Pinch of Yum, Melyssa Griffin, and By Regina). This made her ask, “Was there something wrong with the content?”


You may have heard people say that “a bounce happens when a user arrives at your website and leaves without doing anything,” and while this is theoretically correct, it’s a little more technical than that. Let’s say we’ve freshly installed Google Analytics on a new website. When a user arrives at any webpage (let’s call it Page A) for the first time during a new session, Google Analytics starts a timer, and it continues to run until one of two events happens:

  1. The user leaves the webpage (Page A) without conducting any on-page* interactions. If this event is true, Google Analytics considers this user “bounced” and the data from the timer is not tracked.
  2. The user navigates to another internal webpage (Page B) by clicking on an internal link found on Page A. If this event is true, Google Analytics considers this user “engaged” and the timer’s data is used to calculate metrics such as Time on Page and Session Duration.

*Note: Whenever a user clicks something, or performs some sort of action, that’s considered an on-page interaction.
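For concreteness, here is a minimal sketch of what an interaction hit looks like with Universal Analytics (analytics.js). The `ga` stub below exists only so the snippet runs outside a browser; on a real page, the Google Analytics snippet defines `window.ga` for you.

```javascript
// Minimal stub of the analytics.js command queue so this sketch runs
// anywhere; on a real page the Google Analytics snippet defines window.ga.
var hits = [];
function ga(command, hitType, category, action, fields) {
  hits.push({
    hitType: hitType,
    category: category,
    action: action,
    nonInteraction: !!(fields && fields.nonInteraction)
  });
}

// A standard event hit counts as an on-page interaction,
// so the session is no longer considered a bounce:
ga('send', 'event', 'Video', 'play');

// A hit flagged as non-interactive does NOT prevent a bounce:
ga('send', 'event', 'Scroll', 'ping', { nonInteraction: true });
```

The `nonInteraction` flag matters later in this story: it lets you record an event without artificially suppressing bounces.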

In Sagan’s case, the viral post was gaining many pageviews, but Google’s definition of a bounce made her bounce rate misleading: it looked as though nearly all of her visitors did not care for her article.

The primary goal of a blog post is that a user reads it, which is difficult to measure because reading is not an on-page interaction. But just because a reader doesn’t do anything on that page doesn’t mean the page isn’t of value to you and your business.


How do we go about tracking off-page interactions? It’s very difficult to do, and chances are the data isn’t going to be completely accurate, but there are ways to get an idea of what’s actually happening. With Sagan’s blessing, I attempted a couple of experiments with her blog metrics.


My hypothesis was, “If I trigger an on-page interaction using a 10-second timer, it will provide an accurate Time Spent on Page metric and a lower Bounce Rate metric.” In layman’s terms, I wanted to force an on-page interaction after the user had been on the page for more than 10 seconds.

In order to do this, I had to define a “timer event” in Google Tag Manager and then tell Google Analytics to capture that event after 10 seconds. Once I set up the timer, I let the experiment run for three weeks to see what would happen.
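Under the hood, the setup amounts to a dataLayer push after a delay, which a GTM Custom Event trigger then forwards to Google Analytics as an interaction event. The sketch below is illustrative, not Sagan’s exact configuration: the event name `timer10s` and the category/action labels are my own, and GTM’s built-in Timer trigger achieves the same thing without custom code.

```javascript
// Builds the event object a GTM Custom Event trigger would match on.
// The name "timer10s" and the labels are illustrative, not GTM built-ins.
function buildTimerEvent() {
  return {
    event: 'timer10s',
    eventCategory: 'Engagement',
    eventAction: '10 Seconds on Page'
  };
}

// Pushes the event into the given dataLayer after delayMs milliseconds.
function startEngagementTimer(dataLayer, delayMs) {
  return setTimeout(function () {
    dataLayer.push(buildTimerEvent());
  }, delayMs);
}

// On a real page: startEngagementTimer(window.dataLayer, 10 * 1000);
```

In GTM, a Custom Event trigger matching `timer10s` would then fire a Universal Analytics event tag, and that event hit is what stops the session from counting as a bounce.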

My hypothesis was spot on; Sagan’s bounce rates dropped from 90% to ~5% and her Time on Page shot up by 50%. I was excited to tell her that the experiment had generated positive results, and Sagan was grateful to see the Time on Page measuring correctly (or so I thought). However, now that we had made adjustments, something still wasn’t sitting right with her. After another coffee chat, she raised the question, “Is my Bounce Rate now too low?”

That was a fantastic question. A super low bounce rate usually indicates that Google Analytics isn’t set up correctly. Since I was the one who was playing with the metrics, it made me wonder, “Is the Bounce Rate actually the right metric for her? Or should it be the Exit Rate? Or Both?”


While a Bounce occurs when a user arrives and leaves on the same page without interacting, an Exit occurs when a user leaves your website after conducting some type of on-page interaction or navigating to at least one additional page (e.g. Page A > Page B > Exit). Overall, the main difference between the Bounce Rate and the Exit Rate is whether or not the user conducted some kind of action.
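A tiny worked example makes the distinction concrete. This sketch uses simplified sessions (page paths only, ignoring interaction hits, which in real GA would also prevent a bounce) to compute both rates for a single page:

```javascript
// Illustrative sessions, each an array of page paths in visit order.
var sessions = [
  ['A'],            // entered on A, left immediately: a bounce (and an exit on A)
  ['A', 'B'],       // entered on A, exited on B
  ['B', 'A'],       // entered on B, exited on A
  ['A', 'B', 'A']   // entered on A, exited on A
];

// Bounce Rate: single-page sessions that ENTERED on this page,
// divided by all sessions that entered on this page.
function bounceRate(page, sessions) {
  var entrances = sessions.filter(function (s) { return s[0] === page; });
  var bounces = entrances.filter(function (s) { return s.length === 1; });
  return bounces.length / entrances.length;
}

// Exit Rate: sessions that ENDED on this page,
// divided by all pageviews of this page.
function exitRate(page, sessions) {
  var pageviews = 0, exits = 0;
  sessions.forEach(function (s) {
    s.forEach(function (p) { if (p === page) pageviews += 1; });
    if (s[s.length - 1] === page) exits += 1;
  });
  return exits / pageviews;
}

bounceRate('A', sessions); // → 1/3 (one bounce out of three entrances on A)
exitRate('A', sessions);   // → 0.6 (three exits across five pageviews of A)
```

Note how the same traffic produces very different numbers for the two metrics, which is exactly why the right metric depends on the page.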


After thinking more about which metrics she should care about, I realized it should be both Bounce Rate and Exit Rate, but it would depend completely on the page. The problem with the “10s timer event” was that it fired regardless of which page the user was on. This would severely skew bounce rates for visitors landing on non-content pages (e.g. the home page).

Ultimately, I drafted a new hypothesis: “If I track a user’s scroll depth as a percentage of the page’s height, and only on pages that contain blog posts, then the further a user scrolls, the stronger the indication that the user is reading the content.”

Scroll depth measurement isn’t a new concept in the analytics world. There are many articles explaining how best to implement it, but many of them use ga.js (a.k.a. the old Google Analytics) and hard-code the tracking directly into the website. This poses a problem because Google upgraded to Universal Analytics in late 2014, bringing with it a wide range of new benefits.

To throw another wrench into things, I believe all tracking codes (including Google Analytics) should be fired through Google Tag Manager. This makes it much easier for Digital Marketers (such as myself) to build customized events into Google Analytics. Without it, you’d need a developer to hard-code everything throughout the website. Not only is this tedious, but developers hate it.

Eventually, I stumbled upon Scroll Tracking Implementation in Google Tag Manager V2 – the exact article I needed. Not only does it explain the implementation, it provides the source code free of charge, saving a lot of coding hours for experimenting!

After going through the implementation steps, making a few adjustments to the code (I wanted to fire a 10% scroll), and conducting a few QA tests, I launched the code and let the data flow. Another three weeks had gone by and the results were in:
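The linked article’s code handles the browser plumbing (scroll listeners, throttling, and so on); the core idea reduces to firing each percentage milestone once per pageview. Here is a simplified sketch of that threshold logic under my own naming, not the article’s exact code:

```javascript
// Returns a handler that fires onThreshold(t) once per pageview for each
// percentage threshold the reader has scrolled past. Names are illustrative.
function createScrollTracker(thresholds, onThreshold) {
  var fired = {};
  return function handleScroll(scrolledPx, pageHeightPx, viewportPx) {
    // Percent of the page that has been seen, counting the visible viewport.
    var pct = Math.round(((scrolledPx + viewportPx) / pageHeightPx) * 100);
    thresholds.forEach(function (t) {
      if (pct >= t && !fired[t]) {
        fired[t] = true;
        onThreshold(t);
      }
    });
  };
}

// Usage: push each milestone into the dataLayer so a GTM Custom Event
// trigger can forward it to Google Analytics (simulated with an array here).
var events = [];
var track = createScrollTracker([10, 25, 50, 75, 100], function (t) {
  events.push({ event: 'scrollDepth', eventLabel: t + '%' });
});
track(0, 2000, 600); // 30% of the page visible → fires the 10% and 25% marks
```

The 10% milestone was the one I wired up as an interaction event; deeper milestones can be sent as non-interaction events so they inform readership without flattening the bounce rate.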

  • Bounce Rates on non-blog post pages increased by 208%, returning to normalized levels, and Exit Rates increased by 7%.
  • Bounce Rates on blog post pages decreased by 13% and Exit Rates didn’t change.

Sagan and I both believed this was another successful experiment: she can now see exactly how far her users are reading, and I got her bounce rates back to “normal.” What I did not expect, however, was that the Avg. Time on Page metric increased by 181%, with some blog posts reporting over an hour spent viewing the page. This, in my opinion, is completely incorrect, especially given that her session timeout setting is 30 minutes.


My original intention for this article was to provide you, the reader, with ways to better track blog readership. Yet, after weeks of testing and experimenting, there are still questions to be asked and metrics to measure. I have a few theories about why her Time on Page increased, but if I continued experimenting, this article would never see the light of day.

The key takeaway of this article is that, as a marketer, you should always experiment with your marketing – this is also known as “Growth Hacking.” We’re living in a world of big data, and there’s no way of analyzing it all. However, by picking and choosing small scale problems, you can easily experiment and work your way to a larger goal (in this case, identifying and measuring a blog in the best way possible).

If you’re interested in understanding Google Analytics for your blog or small business, feel free to email me.