
Revisiting Engaged Time in a Mobile World

Chris Breaux & Justin Mazur

At Chartbeat, we spend a lot of time evangelizing about the “time between the clicks” and trying to measure the “attention economy.” Behind that message is the simple premise that from a visitor’s interactions with a page, you can accurately infer whether the visitor is actively engaged. This determination of engagement is a cornerstone of our philosophy and allows us to measure time in a meaningful way. As such, it’s important that we continue to challenge our own methodology to ensure it accurately reflects ever-changing user behavior.

How we currently measure if a user is engaged

So, how do we currently determine whether a user is engaged? On all of our clients’ websites, we have JavaScript running in the background, which tracks each time a person interacts with their browser (mouse moves, scrolls, etc.). Each second a person has a page in focus, we ask: Did the person perform an action in the past 5 seconds? [1] If yes, we count the person as engaged for the current second; otherwise, we don’t. We then total up the amount of time that this person engaged with the page.

If a person only moves their mouse one time, they will be recorded as having 5 seconds of engagement, even if they sit idle on a page for the next 2 hours. This is what makes our methodology different from counting things like time on page, which is generally estimated based on when the page was opened—and would significantly overcount this person's activity.
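To make the rule concrete, here is a minimal Python sketch of the 5-second window described above. The function name and the sample timestamps are illustrative, not our production code; the real measurement happens in the JavaScript running on our clients’ pages.

```python
ENGAGEMENT_WINDOW = 5  # seconds of credit after each interaction

def engaged_seconds(event_times, total_seconds_in_focus):
    """Count each in-focus second in which an action occurred
    within the previous ENGAGEMENT_WINDOW seconds."""
    engaged = 0
    for second in range(total_seconds_in_focus):
        # Was there an action in the window (second - 5, second]?
        if any(second - ENGAGEMENT_WINDOW < t <= second for t in event_times):
            engaged += 1
    return engaged

# One mouse move at t=10, then two idle hours with the page in focus:
print(engaged_seconds([10], 7200))  # -> 5, not 7200
```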

For example, each plot in the visual below marks in gray when users performed actions during a 300-second period. The first and fourth plots show users that would be considered engaged for most of the interval, whereas the third plot shows a user who is mostly unengaged until around the 3-minute mark.

Figure 1. Recorded actions of five users: when users performed an action during a 300-second period.

Breaking engagement down

Since our engagement methodology is predicated on tracking these user events, one question you might ask is: “Are all events created equal?” We may hope to gain a more nuanced view of user behavior if we try to understand the intent conveyed by different types of events. Intuitively, one may think that a scroll event might convey the intention to continue reading on the page, whereas a mouse move may only be a signal that the user is present at her device. Perhaps, then, we should give “more credit” to a scroll event, in the form of a longer window of engagement, than to a mouse-move event.
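As a thought experiment, here is one way the “more credit” idea could be sketched in Python: each event type gets its own engagement window. The window lengths and event-type names below are assumptions for illustration only, not Chartbeat’s production values.

```python
# Illustrative, assumed window lengths per event type (seconds).
EVENT_WINDOWS = {"scroll": 15, "mousemove": 5, "keydown": 5, "click": 5}

def engaged_seconds_weighted(events, total_seconds_in_focus):
    """events: list of (timestamp_seconds, event_type) pairs."""
    engaged = 0
    for second in range(total_seconds_in_focus):
        for t, kind in events:
            window = EVENT_WINDOWS.get(kind, 5)
            if second - window < t <= second:
                engaged += 1
                break  # this second is already counted as engaged
    return engaged

# Under these assumed windows, a scroll earns 15 seconds of credit,
# while a mouse move earns only 5.
print(engaged_seconds_weighted([(0, "scroll")], 300))     # -> 15
print(engaged_seconds_weighted([(0, "mousemove")], 300))  # -> 5
```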


To better understand the difference between these event types, we started looking at user events “in color.” The following plot shows the same five users as before, but with each type of activity encoded in a different color. Already, we can guess that there might be different patterns of behavior for different users. Mouse moves (marked in light blue) are often continuous, whereas scroll events (blue) are often, but not always, more intermittent. We can attempt to more concretely quantify some of our anecdotal intuition by looking at the distribution of events and of the gaps between events.

Figure 2. Recorded actions distinguished by type: user actions during 300 seconds, broken down by type of action.

One interesting metric is return probability, the likelihood that a user will perform an action within some specified amount of time, say 15 seconds. We can use return probability as a proxy for engagement in the following sense: say that we are following a user, and we have seen that her last interaction with the page was 10 seconds ago. If we had foresight that the user was going to perform another action within the next 15 seconds, we should be more confident that the user is currently engaged.
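As a rough illustration, return probability can be estimated empirically from the observed gaps between consecutive events. The Python sketch below assumes we have a list of such gaps for a given event type; the helper name and sample data are hypothetical.

```python
def return_probability(gaps, elapsed, horizon=15):
    """gaps: observed gaps (seconds) between consecutive events.
    Estimates P(next event within `horizon` more seconds,
    given the user has already been silent for `elapsed` seconds)."""
    still_waiting = [g for g in gaps if g > elapsed]
    if not still_waiting:
        return 0.0
    returned = sum(1 for g in still_waiting if g <= elapsed + horizon)
    return returned / len(still_waiting)

gaps_after_scroll = [2, 3, 8, 12, 19, 25, 40, 90]  # made-up sample data
print(return_probability(gaps_after_scroll, elapsed=10))  # -> 0.6
```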

Figure 3. Return probability by type of last event: after scroll events there is a greater chance of the user returning.

In the graph above, we can see that a user whose last interaction was a mouse move 10 seconds ago has about a 45% chance of returning in the next 15 seconds. After a gap of 20 seconds, a user is twice as likely to return if her last action was a scroll than if it was any other event type. A bit more than 20 seconds after a scroll, there is still a 50/50 chance of returning in the next 15 seconds; for all other event types, this break-even point comes less than 10 seconds after an event. This graph provides compelling data to support our intuition that scroll events ought to be given more credit.
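The break-even point can be read off the same empirical curve. This sketch reuses the hypothetical return_probability helper and sample data from the previous snippet to find the elapsed time at which the chance of returning drops below 50%.

```python
def break_even_point(gaps, horizon=15, max_elapsed=60):
    """Smallest elapsed time at which return probability falls below 50%."""
    for elapsed in range(max_elapsed + 1):
        if return_probability(gaps, elapsed, horizon) < 0.5:
            return elapsed
    return max_elapsed

print(break_even_point(gaps_after_scroll))
```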

Addressing mobile engagement

A second question, and one that ends up being closely related to the first, is this: “How does this methodology work for mobile engagement?” One school of thought would say that on mobile devices, a user should almost always be thought of as engaged on pages that are in focus. However, we find frequent examples of mobile sessions with long periods of inactivity. When we look at aggregate usage statistics, the ratio of engaged time to time on page is almost as bleak for mobile (roughly 18%) as it is for desktop (~13%).

In light of our event-based analysis above, it’s clear that an analysis of mobile usage is going to look quite different from an analysis of desktop usage. The first thing to note is that we don’t listen for mobile-specific touch events, as doing so can have negative effects on client sites.


We still capture the full range of user interactions because essentially all mobile browsers fire desktop mouse events at the end of a touch tap in order to better accommodate legacy web pages. However, there is no reason to think that the engagement implications of a “mouse down” arising from a touch on mobile should be the same as a mouse down on desktop.

Indeed, we see that almost 90% of the time an event is fired on a mobile device, we record a scroll event. In contrast, on desktop, scroll events comprise less than 40% of events. If we find that scroll events should be considered premium events for mobile as well, we might conclude that we are being overly conservative with our measurement of engaged time for mobile devices.
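For illustration, the event-mix comparison boils down to simple counting. The event logs in this sketch are made up, with proportions chosen to roughly echo the figures quoted above.

```python
from collections import Counter

def scroll_share(event_types):
    """Fraction of recorded events that are scroll events."""
    counts = Counter(event_types)
    return counts["scroll"] / sum(counts.values())

# Hypothetical event logs, not real Chartbeat data.
mobile_events = ["scroll"] * 90 + ["mousedown"] * 10
desktop_events = ["scroll"] * 38 + ["mousemove"] * 52 + ["click"] * 10

print(f"mobile scroll share:  {scroll_share(mobile_events):.0%}")   # 90%
print(f"desktop scroll share: {scroll_share(desktop_events):.0%}")  # 38%
```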

Figure 4. Actions by platform: the predominant actions on desktop and mobile are very different.

Indeed, one potentially surprising observation is that the return probability for scroll events is relatively similar on mobile and desktop. Perhaps, then, we can use event types alone to explain much of the difference in engagement patterns between desktop and mobile devices.

Going forward

Engaged time as a concept works because it tracks something that is finite: time. I could click through a hundred pages over a lunch break or leave any number of tabs open all night, but my attention is strictly limited to 24 hours a day. By tracking the amount of time actually spent consuming a piece of content, we paint a more accurate picture of its intrinsic value.

Keeping with the spirit of painting a more thorough picture of user behavior, we’ve attempted to sketch out a refinement to the science of measuring engagement on the web. Intelligently ranking the signals from different events has the potential to be a useful technique for better capturing increasingly diverse patterns of use.

about the author / Chris Breaux

At Chartbeat, Chris Breaux harvests data to study Internet audiences and their behavior. In the past, he’s worked on modeling pandemics and human longevity, and he maintains a keen interest in algorithmic game theory. In his free time, you can find Chris barreling down ski slopes, solving puzzles, or laying down vocal bass lines. A New Orleans native at heart, he maintains that the secret to happiness is a good crawfish étouffée. Chris received his M.S. in Computer Science from Stanford University.
@breaux_cm / christopher@chartbeat.com

about the author / Justin Mazur

Justin Mazur received a Ph.D. in mathematics from Indiana University, where he studied Algebraic Geometry. At Chartbeat, he is a data scientist and engineer helping to build tons of cool products related to ads and real-time analytics, like CampaignIQ, an automated tool for creating natural language insights and research about ad campaigns.
@justindmazur / justin.mazur@chartbeat.com

References