Understanding Content Engagement: Key Events to Track and Metrics to Choose

By John Hughes

24 September 2025

Part 3 of 5

Uncovering the KPIs that really matter for measuring content success.


Introduction

In the third part of this blog series about content measurement, we will discuss various types of events to track and how they can be used to indicate success. While we reference Google Analytics 4 and Google Tag Manager practices, the principles we outline are universally applicable to the relationship between content and web analytics.

To be clear, we are talking about non-transactional content in this blog post. We may tackle transactional service measurement in a future post, but the kinds of things you would track around a transactional service should be more easily identifiable anyway. It should be obvious what a transaction is, and therefore what you might call a conversion (or ‘key event’ in Google Analytics parlance).

The problem we need to solve is that for non-transactional digital services, there is not always a clear idea of what a ‘conversion’ might look like. How should we think about success for a digital service that is non-transactional? What makes a good metric or KPI? How can we measure it?


The REAN model

We have discussed REAN before as a model for understanding marketing channels and user journey measurement. It is an acronym for Reach, Engage, Activate, Nurture, the four stages of the relationship between a digital service and its users. Only the Engage and Activate stages matter in the context of this post, as these are driven by the actions of users (the Reach and Nurture stages, conversely, describe the actions of the organisation).

Engagement describes how a user interacts with the digital service overall. Activation describes more specific interactions that build a relationship with a customer. In ecommerce terms, you might describe browsing a catalogue as engagement and purchasing as activation. However, activation is less specific when it comes to a non-transactional service. In these cases, the line between engagement and activation blurs, because activation is often a subset, or a combination, of events that would normally be considered mere engagement. We therefore prefer to distinguish between interaction events and activation events to keep the delineation clear.

Interaction events and activation events

Interaction events – those engagement events that simply describe user interactions with the digital service, but don’t on their own describe completed success.

Activation events – those engagement events that either solely or in combination meet a threshold to describe success.

Direct measurements and proxy measurements

Analysis is also complicated by the way some event tracking is named. For example, we talk about tracking file downloads, but typically we are in fact just measuring link clicks. The link may lead to a file download, but there is no actual measurement that the file downloads without errors, that the user opens the downloaded file, and so on. The true nature of what has been measured can thus be obscured by language, and insights may be biased.

We can describe some events as direct measurements, while others are simply proxy measurements. For example, we might measure a form submission in one of two ways: when the submit button is clicked, or when the ‘form received’ message is shown to the user. The first is a proxy measure, in the sense that we don’t yet know whether the form will pass validation. The second is a direct measure, as the completion message is only shown to users who have successfully submitted the form.

Enhanced measurement in Google Analytics 4 is guilty of using proxy measurements for file downloads and form submissions in this sense. However, it would be unfair to call Google the driver of this habit; the industry as a whole is at fault. As an aside, though, I would say Google is at fault for enhanced measurement not always tracking even these proxy measurements accurately, and I would always recommend switching off most enhanced measurement and controlling tracking through Google Tag Manager instead.

Two-dimensional thinking about event tracking

Taking these two concepts into account, we can consider event tracking as a matrix of events that we want to track. Some events are proxy measurements and some are direct; some are interactions, and some (alone or in combination) are activations.

This gives us a table with four quadrants, with each quadrant giving a different signal strength for success.

              Proxy                     Direct
Activation    Medium signals            Strongest success signals
Interaction   Weakest success signals   Medium signals
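
To illustrate: a click on a file download link would typically sit in the proxy/interaction quadrant, while a validated form submission that you count as success would sit in the direct/activation quadrant.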

We will come on to some examples of the kinds of events and how you might categorise them later in this blog post.

Why measuring content engagement matters

Content engagement metrics go beyond simple page views and time on page; they provide a more nuanced understanding of user behaviour. Tracking engagement metrics allows you to:

  • Evaluate the effectiveness of your content strategy.
  • Identify which types of content resonate most with your audience, especially when coupled with a content segmentation model.
  • Discover pain points in user interactions.
  • Make data-driven decisions for future content creation and optimisation.

Pre-existing Google Analytics 4 measures

Engagement rate

Engagement rate refers to the rate at which GA4 has tracked engagement within a session on your website. A session counts as engaged if any of the following criteria are met:

  • The session is longer than ten seconds.
  • The session has a key event.
  • The session has more than one page view.

Engagement rate is calculated as the percentage of engaged sessions out of all sessions.

= (engaged sessions / sessions) %
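
For example, 8,000 engaged sessions out of 10,000 total sessions gives an engagement rate of (8,000 / 10,000) = 80%.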

The problem with engagement rate is twofold. Firstly, it measures the session, not the page. Secondly, it attributes its data to the entrance page of the session. If you imagine a user lost inside an informational service, unable to find the content they need, they will likely see more than one page, and thus be counted as engaged. Even if they eventually find the content they need, the engagement is attributed to the first page URL they saw, not the page that helped them.

Bounce rate

Bounce rate is the exact inverse of engagement rate, a slightly different definition to how bounce rate was defined in Universal Analytics, but similar enough overall. A ‘bounce’ occurs if a user leaves the website on the first page view, in less than ten seconds, having not triggered a key event, i.e. all the criteria for engaged sessions are unmet.

Therefore, bounce rate is calculated as one minus engagement rate as a percentage.

= (1 – engagement rate) %
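
Using the example above, an 80% engagement rate implies a bounce rate of (1 – 0.8) = 20%.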

This means bounce rate is as meaningless as engagement rate for measuring content performance. A bounce indicates one of two things. Either the user found the information they were looking for (e.g. a phone number) in less than ten seconds, which is plausible for some types of content. Or they immediately felt this was not the website they were looking for and had no desire to review the content further, which says less about the quality of the content than about whether the traffic source bringing them to the site was misleading, or whether the user experience is terrible, neither of which is usually the fault of the content.

Enhanced measurement

Google Analytics 4 supports some automated measurements, which it calls enhanced measurement. You typically need to switch these on in the GA4 admin area to have access to them.

Enhanced measurement includes:

  • Page view – tracked each time a web page is loaded.
  • Scrolls – tracked each time a user reaches 90% down the page height.
  • Outbound clicks – tracked each time a user clicks on a link to an external website.
  • Site search – tracked each time a user sees a search results page on your website.
  • Form interactions – tracked each time a user starts or submits a form on your website.
  • Video engagement – tracked each time a user starts an embedded YouTube video, when the video passes 10%, 25%, 50%, 75% duration, or when the video completes.
  • File downloads – tracked each time a user clicks a link to a common file format such as PDF, CSV, XLSX, etc.

The problem we find with enhanced measurement is that it frequently doesn’t work. It simply fails to reliably track all of the clicks it claims to. We find ourselves frequently recreating the enhanced measurement criteria in Google Tag Manager and using that instead, which does pass muster under our rigorous testing.

This also solves the other problems with enhanced measurement – each measurement is just not specific enough. For example, scrolling to 90% could be misleading on sites with large footer areas. Furthermore, much of enhanced measurement tracks proxy measurements, but the language around things like file downloads implies more certainty.

Average session duration

Average session duration is a great metric for understanding how long a user engages with a digital service. It gives a good broad-brush measure of engagement, and changes to content can therefore show up as changes in average session duration.

However, it is not a useful metric for understanding individual pages, as it is a session-scoped metric: it lacks the context of the individual pages. It can also be influenced by many other factors, such as page load speed, the device profile of readers, and so on. Additionally, an increase in average session duration might indicate a more engaged reader, but it might equally indicate a confused reader hunting for the answer to their need.

Average engagement time

Average engagement time is similar to average session duration, but differs in that it measures the time a website is in the foreground of a user’s device, meaning it is more contextually accurate than average session duration.

However, many of the same drawbacks apply. It is a session-scoped metric, meaning it is less useful per page. It is influenced by things such as page load speed and UX as much as by the content, and a longer engagement time could indicate either good or bad things about the content.

Tracking custom events and interactions

Here are some ideas for custom events and interactions that you might track. At Storm we have used all of these at different points, but remember to track only the things that will be useful in the context of the digital service you need to measure.

Replacing enhanced measurement

We mentioned earlier that we have found Google’s enhanced measurement to be unreliable, and hence inaccurate, so in new GA4 installations we now always delegate click-based events to Google Tag Manager.

This includes:

External link click (GA4 event: click)

Any click on a link where the Click Hostname in Google Tag Manager is not the same as the Page Hostname. For this logic, we typically create an Auto-event Variable to identify the element hostname of the click, or a Regex Look-up Table for more complex logic. More information about Google Tag Manager variable types.

To replicate enhanced measurement exactly, track these parameters:

  • link_classes
  • link_domain
  • link_id
  • link_text
  • link_url
  • outbound (true)
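
As an illustration, here is a minimal TypeScript sketch of the hostname comparison logic, as you might implement it directly on a page; in Google Tag Manager itself you would use the variables described above. The gtag call assumes gtag.js is loaded, and the parameter names mirror the list above.

  // Minimal sketch: fire a GA4-style 'click' event for outbound links.
  // Assumes gtag.js is loaded on the page.
  declare function gtag(...args: unknown[]): void;

  document.addEventListener('click', (e) => {
    const target = e.target as Element | null;
    const link = target?.closest('a');
    if (!link || !link.href) return;

    const url = new URL(link.href, window.location.href);
    if (!/^https?:$/.test(url.protocol)) return; // skip mailto:, tel:, etc.
    if (url.hostname === window.location.hostname) return; // internal link

    gtag('event', 'click', {
      link_classes: link.className,
      link_domain: url.hostname,
      link_id: link.id,
      link_text: link.textContent?.trim(),
      link_url: link.href,
      outbound: true,
    });
  });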
Form interactions (GA4 events: form_start, form_submit)

Best: Ask developers to send dataLayer events when the form is interacted with. Use these dataLayer events to trigger form_start once per page (assuming one form on a page). Additionally, send a dataLayer event when a submitted form successfully passes any validation rules, and use this to trigger form_submit.

Alternative: Use CSS selectors in Google Tag Manager to identify when form elements get focus and use this to trigger form_start once per page (assuming one form on a page). Also use CSS selectors to track clicks on the form submit button and use this to trigger form_submit. This is closest to how enhanced measurement tries to track forms, but risks tracking invalid submissions (e.g., with validation errors).

To replicate enhanced measurement exactly, track these parameters:

  • form_id
  • form_name
  • form_destination
  • form_submit_text (form_submit only)
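
Where developers can add dataLayer pushes, a minimal sketch of the ‘Best’ approach above might look like this — the form selector is a hypothetical example, and the browser’s built-in validity check stands in for your real validation rules:

  // Minimal sketch: developer-side dataLayer pushes for form tracking.
  // '#contact-form' is a hypothetical selector; align event names with
  // whatever your Google Tag Manager triggers listen for.
  declare global {
    interface Window { dataLayer: Record<string, unknown>[]; }
  }

  const form = document.querySelector<HTMLFormElement>('#contact-form');
  let started = false;

  if (form) {
    // form_start: first interaction with the form, fired once per page.
    form.addEventListener('focusin', () => {
      if (started) return;
      started = true;
      window.dataLayer.push({ event: 'form_start', form_id: form.id });
    });

    // form_submit: only pushed once the form passes validation, making this
    // a direct measurement rather than a proxy one.
    form.addEventListener('submit', () => {
      if (!form.checkValidity()) return;
      window.dataLayer.push({
        event: 'form_submit',
        form_id: form.id,
        form_destination: form.action,
      });
    });
  }

  export {};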
Video engagement (GA4 events: video_start, video_progress, video_complete)

There are a number of great video tracking recipes that enable tracking into GA4 in a more robust manner than enhanced measurement allows.

Use these recipes and instructions to set up video tracking for YouTube, Vimeo, or HTML5.

To replicate enhanced measurement exactly, track the stated parameters for your chosen recipe.

File downloads (GA4 event: file_download)

Use a Regex Look-up Table to process the Click URL of any click and match if there is a file extension in the Click URL for a filetype a user might download. Use this to trigger the file_download event.
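
As an illustration, the matching logic might look like the following — the extension list here is an assumption; adjust it to the file types your users actually download:

  // Minimal sketch: test whether a Click URL points at a downloadable file.
  // The extension list is an assumption; extend it to suit your content.
  const downloadPattern =
    /\.(pdf|docx?|xlsx?|pptx?|csv|txt|zip|rar|7z|gz|mp3|mp4|mov|avi|wav)($|\?)/i;

  const isFileDownload = (clickUrl: string): boolean =>
    downloadPattern.test(clickUrl);

  // Usage: isFileDownload('https://example.com/report.pdf') returns true.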

To replicate enhanced measurement exactly, track these parameters:

  • file_extension
  • file_name
  • link_classes
  • link_id
  • link_text
  • link_url

Note that you may not want to replicate enhanced measurement exactly, but it is useful to keep your data model largely aligned with what GA4 users might expect.

Extending enhanced measurement with other link click types

GA4 enhanced measurement only looks at clicks on external links, but you may be interested in other kinds of links too.

We often look to track:

Contextual internal links (event name: internal_click)

These are links in the main content section of the HTML, perhaps used as signposts or CTAs. You need suitable semantic HTML to be able to distinguish them from navigation clicks on the menu or footer.

We have used CSS Selectors to identify links within the main content area that have the same Click URL Hostname as the Page Hostname and used this to trigger this event.

Use the same parameters as the click event, except ‘outbound’ should be set to ‘false’.

Contextual external links (event name: external_click)

These are similar to the internal_click event, limited to the main content section in the same way, but with the opposite logic: the Click URL Hostname is not equal to the Page Hostname.

Note that these will be double counted as external link clicks in enhanced measurement, so you may want to add trigger exclusions to prevent this.

Use the same parameters as the click event.

Email link clicks (event name: email_click)

These are link clicks where the Click URL starts with ‘mailto:’.

Use the same parameters as the click event but make sure not to track personal email addresses (which will potentially be in the link_url and link_text parameters). We typically replace the full email address with the first 3 characters followed by 3 asterisks then the domain part of the email address, which should be enough to identify which email link was clicked, without risking the storage of personal data.
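
A minimal TypeScript sketch of that masking rule, with a hypothetical helper name:

  // Mask the local part of an email address, keeping the first 3 characters
  // and the domain, e.g. 'john.hughes@example.com' -> 'joh***@example.com'.
  const maskEmail = (email: string): string => {
    const [local, domain] = email.split('@');
    if (!domain) return email; // not an email address; leave unchanged
    return `${local.slice(0, 3)}***@${domain}`;
  };

  // Usage: strip the 'mailto:' scheme before masking the link_url value.
  console.log(maskEmail('mailto:john.hughes@example.com'.replace(/^mailto:/, '')));
  // -> joh***@example.com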

Telephone link clicks (event name: tel_click)

These are link clicks where the Click URL starts with ‘tel:’.

Like email addresses, track the same parameters as the click event but update the link_url and link_text parameters to mitigate the risk of personal data being tracked into GA4.

Understanding if content has been read

It is likely that, for most kinds of non-transactional content, one of the key things you want to understand is whether the content on the page has been read. The problem is that there is no way to be certain that it has. The best we can do is to consider what signals would need to be present for a user to have read it, and then assume that, if those signals are present, they probably did read it.

This post by Stephanie Coulshed gives excellent insight into content designers’ thinking on this for advice and guidance content, and the approach seems suitable to extend across other kinds of written content too.

We have covered internal and external link click events above, but Stephanie also looks at two other signals:

Time on page > 30 seconds (no event tracked)

This is a Google Tag Manager timer trigger that fires once the user has been on the page for 30 seconds.

Scroll past 75% of the page (no event tracked)

This is a Google Tag Manager scroll depth trigger that fires once the user has scrolled to at least 75% of the page depth.

These two triggers individually do not fire any events but in combination the following event is recorded:

Page read (event name: page_read)

Fired on a page once only, when both the Time on page > 30s AND the scroll past 75% triggers are true.

The parameters to track with this event should be collected automatically (i.e. page_location and page_path).
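
In Google Tag Manager, the cleanest implementation is usually a trigger group containing the two triggers above. If you instead wire this up in page code, a minimal sketch of the combined condition might look like this:

  // Minimal sketch: push page_read once when BOTH signals have been seen.
  declare global {
    interface Window { dataLayer: Record<string, unknown>[]; }
  }

  let timeOk = false;
  let scrollOk = false;
  let fired = false;

  const maybeFire = () => {
    if (fired || !timeOk || !scrollOk) return;
    fired = true; // once per page only
    window.dataLayer.push({ event: 'page_read' });
  };

  // Signal 1: the user has been on the page for 30 seconds.
  setTimeout(() => { timeOk = true; maybeFire(); }, 30_000);

  // Signal 2: the user has scrolled past 75% of the page height.
  window.addEventListener('scroll', () => {
    const depth =
      (window.scrollY + window.innerHeight) / document.documentElement.scrollHeight;
    if (depth >= 0.75) { scrollOk = true; maybeFire(); }
  }, { passive: true });

  export {};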

User satisfaction

The GDS digital guidelines recommend tracking user satisfaction for transactional digital services, but this concept can also be useful for non-transactional services.

Adding a simple feedback form at the foot of each content page can allow users to give feedback about how useful they found your content.

We might normally expect this to take the form of a Likert scale (from ‘not useful’ to ‘very useful’) mapped to scores of 1 to 5. Additionally, you might collect qualitative feedback, but we are most concerned here with tracking the Likert scale. Qualitative feedback also carries a risk of containing personal data, so it should not be sent to GA4, but instead to an email address or an appropriate database for which you are the data controller from a GDPR perspective.

In my experience, users infrequently go to the trouble of providing user feedback unless they have had a sufficiently negative experience, meaning that this method is less useful in identifying high-performing content, but conversely it is useful in discovering problem areas.

By tracking the feedback score, you will be able to identify pages where users report being dissatisfied with content and take appropriate action.

User satisfaction score input (event name: user_satisfaction)

Tracked when a user submits a user satisfaction likert score on a web page.

The parameters should include:

  • page_location
  • user_feedback_score
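
As an illustration, a dataLayer push along these lines could drive the GA4 tag — the form markup and selector are hypothetical:

  // Minimal sketch: push the Likert score when the feedback form is submitted.
  // '#feedback-form' and the 'usefulness' input name are hypothetical.
  declare global {
    interface Window { dataLayer: Record<string, unknown>[]; }
  }

  const feedbackForm = document.querySelector<HTMLFormElement>('#feedback-form');

  if (feedbackForm) {
    feedbackForm.addEventListener('submit', () => {
      const score = feedbackForm.querySelector<HTMLInputElement>(
        'input[name="usefulness"]:checked'
      );
      if (!score) return;
      window.dataLayer.push({
        event: 'user_satisfaction',
        user_feedback_score: Number(score.value), // 1 (not useful) to 5 (very useful)
        // page_location is collected automatically by GA4
      });
    });
  }

  export {};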

Other events you might consider

No two content pieces are the same, either in what they contain or in how users might be expected to interact with them. The events above are great general success indicators, but some content may contain other functionality that aligns with its purpose. Such functionality should also be tracked.

Some things that you might consider include:

  • Interaction with accordions, carousels or similar on-page controls.
  • Button clicks.
  • Newsletter subscriptions.
  • Survey submissions.
  • Listens to embedded audio.
  • Zoom/control/interact with image functions.
  • Zoom/control/interact with map functions.
  • Clicks on social media links.
  • Clicks on “bookmark this page” or “send to a colleague” functions.

Proxy for time on page

We discussed above how average session duration can be a misleading metric. GA4 doesn’t have a native “average time on page” metric, but it is possible to construct an approximation using the following method.

Time on page every 30 seconds (event name: page_timer_proxy)

This is a Google Tag Manager timer trigger that fires on the page every 30 seconds, sending the associated event.

Send the parameter page_location with the event.

To calculate an approximate time on page for all page views for a given page, sum the event count for this event for that page URL, multiply by 30 seconds, and then divide by the page views for that page URL in the same time period.
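
For example, if a page URL recorded 600 page_timer_proxy events against 120 page views in the period, the approximate average time on page would be (600 × 30) / 120 = 150 seconds.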

If you really require more accuracy, you could fire the event more often than every 30 seconds, but be mindful of the potential impact on browser performance of firing too many events too frequently, and of collecting too high a volume of data.

Choosing the right KPIs for your content strategy

Not all metrics are equally relevant for every content strategy. Here are some tips for shaping your measurement plan based on your specific needs:

1. Define your business objectives and user goals

Start by clearly defining what makes your digital service a success (your business objectives) and what user needs your digital service meets (the user goals). Consider how the business objectives and user goals align with each other. Are you looking to onboard new users, or to increase engagement from existing users? Do you have specific goals to move users from other channels, such as a call centre, to digital self-serve? How you define your business objectives and user goals will guide your choice of metrics.

2. Identify key performance indicators (KPIs)

Based on your business objectives and user goals, identify the KPIs that will help you measure success. For example, if you want to increase the number of users that read your content, you might consider measuring scrolls and time on page, alongside clicks on call to action links.

3. Focus on user behaviours

Choose metrics that provide insights into user behaviours, such as scroll depth, engagement with functionality like video and maps, and link clicks. Understanding how users interact with your content will help you make better decisions than page views and sessions alone could.

4. Use multiple metrics

Relying on a single metric can be misleading. Use a combination of metrics to get a comprehensive view of user engagement. For example, combining time on page, scroll depth, and button clicks can provide a more accurate picture of content performance.

5. Regularly review and iterate

As you collect data about user behaviours, you will naturally be able to compare changes in performance against benchmarks and identify which metrics correlate with improved digital service performance overall. This allows you to refine your measurement and reporting strategy, enabling faster and more effective data-driven decision making. Continuously monitoring and refining your metrics will help you stay aligned with your objectives.

Practical applications: case studies

Let’s explore a few practical applications of these metrics through hypothetical case studies:

Case study 1: enhancing blog engagement

A content marketing team wants to increase engagement on their company blog. They decide to track the following metrics:

  • Scroll depth to see how far readers get into each post.
  • Time on page to measure overall engagement.
  • Button clicks for CTAs encouraging newsletter sign-ups.
  • A “Conversion” key event, triggered if either a CTA button is clicked or both the scroll and time on page measures pass a minimum threshold.

By analysing these metrics, the team discovers that readers often drop off after the first few paragraphs. They decide to restructure their posts to include more engaging openings and strategically place CTAs. As a result, they see a significant increase in both scroll depth and newsletter sign-ups.

Case study 2: improving video content

An e-learning platform aims to enhance the effectiveness of its video tutorials. They track:

  • Video view duration to see how long users watch.
  • Completion rate to measure how many users finish the videos.
  • Feedback submissions to gather user insights.

The data reveals that many users drop off midway through longer videos. Based on this insight, the platform decides to shorten the videos and add more interactive elements. User feedback also highlights areas for improvement, leading to more engaging and effective tutorials.

Case study 3: optimising advice and guidance content

A local authority wants to track engagement with advice content. They track:

  • Number of opens/closes for each accordion item.
  • Scroll depth to see how far readers get into each post.
  • Time on page to measure overall engagement.
  • Clicks on links that signpost citizens to other helpful content.
  • User satisfaction feedback to measure effectiveness.

The team finds that certain guidance is frequently accessed but has low satisfaction ratings. They revise this guidance to be more comprehensive and user-friendly. Additionally, they notice that some guidance is rarely engaged with, indicating that users may not find it relevant. They reorganise the advice content based on user behaviour, leading to increased engagement and higher satisfaction scores.

Conclusion

Effectively measuring content engagement requires tracking a variety of events and interactions. By understanding the key metrics and selecting the right ones for your scenario, you can gain valuable insights into user behaviour and optimise your content strategy. Remember to define your business objectives and user goals, identify relevant KPIs, focus on user behaviours, use multiple metrics, and regularly review and iterate your approach.

With these strategies in place, you’ll be well-equipped to measure and enhance content engagement, ultimately driving better outcomes for your digital presence.

To discover how Storm can help you improve content measurement through Google Analytics 4, speak to our team.

"We prefer to distinguish them into interaction events and activation events to try to keep the delineation clear."