ROMAN9

tech comm – elearning – information experience



If tech comm content is an asset, can analytics measure its value?

Analytics, like school league tables, should be approached with a healthy degree of scrutiny. As more of our technical content, such as help, white papers and product guides, goes out onto the Internet, the pressure to measure and quantify its value using analytics increases.

From post-sale necessity to valued asset

Let’s start with some good news. As companies wrangle growing quantities of diverse content that has to look great on any device, technical content is part of that challenge. That’s a good thing. We now apply weighty, finance-derived words such as assets and collateral, previously reserved for the domain of marketing, to a wider range of content. We recognise that content which was previously destined exclusively for post-sale audiences is now increasingly used up front, before a purchase is ever made. This raises its perceived “strategic” value and contributes to blurring the line between technical communication and content strategy.

What can we measure, and what does it tell us?

For example, suppose you host user help on your web site. Add an analytics tracking code to every help page, and you can start measuring traffic. Simples. But what does that traffic tell you? Is it confirming what you already know, or delivering new, actionable insights?
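At its simplest, that “tracking code” just fires a small beacon with the page details on every load. Here is a minimal sketch of what one might look like; the `/collect` endpoint and the payload field names are hypothetical placeholders, not any specific vendor’s API.

```javascript
// Sketch of a page-view beacon for help pages.
// The "/collect" endpoint and payload fields are illustrative
// placeholders, not a real analytics vendor's API.
function buildPageview(path, title, referrer) {
  return {
    event: "pageview",
    page: path,          // e.g. "/help/getting-started"
    title: title,        // the help topic's title
    referrer: referrer,  // where the visitor arrived from
    timestamp: new Date().toISOString(),
  };
}

function sendPageview(payload) {
  // navigator.sendBeacon is a standard browser API for
  // fire-and-forget hits that survive page unloads.
  if (typeof navigator !== "undefined" && navigator.sendBeacon) {
    navigator.sendBeacon("/collect", JSON.stringify(payload));
  }
}
```

A snippet like this on every help page is all it takes to start counting traffic – which is exactly why the counting is the easy part, and the interpretation is the hard part.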

“Data is your eyes, not your brain.” — Colleen Jones of Content Science, author of Clout: The Art and Science of Influential Web Content

If you have an established relationship with your help desk or tech support team, you probably already know the main “gotchas” and stumbling blocks experienced by your customers, and you’re either providing supporting material to help resolve those issues, or you’re lobbying for an improved user experience to remove the “gotchas” in the first place. Traffic gives you a benchmark against which to measure the changes you make to your content. Over time, you can track the effect of changing titles, improving landing pages, adding detail, removing clutter, and moving high-value elements to more visible areas of a page.

Lana Gibson, of the UK’s Government Digital Service (GDS), wrote a great post in February on analysing analytics data and the influence of content changes: GOV.UK page performance: are we fulfilling our content goals? The clue to the real value, however, is in the last part of Lana’s blog title – content goals. The GDS measures against specific goals it has set, based on the aims of GOV.UK.
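That benchmark-then-re-measure loop boils down to a before/after comparison per metric. As a sketch, with illustrative metric names and made-up figures (not real data):

```javascript
// Sketch: compare a help page's metrics before and after a
// content change. Metric names and values are illustrative.
function percentChange(before, after) {
  return ((after - before) / before) * 100;
}

// Produce a per-metric percentage-change report, rounded to 1 dp.
function compareMetrics(before, after) {
  const report = {};
  for (const key of Object.keys(before)) {
    report[key] = Number(percentChange(before[key], after[key]).toFixed(1));
  }
  return report;
}
```

For example, `compareMetrics({ pageviews: 1200, exitRate: 0.62 }, { pageviews: 1450, exitRate: 0.48 })` reports pageviews up about 20.8% and exit rate down about 22.6% – numbers that only mean something when read against the content goals you set beforehand.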

This highlights one of the challenges for tech comm: once your technical content is on the web, it is no longer used exclusively in a post-sales context by people with the same types of issues that come into your help desk. You have to take that into account when analysing the data, or your interpretation will be skewed. Also, does your analysis of the analytics data take the wider organisation’s content goals into account? Can you accurately define, or measure, its “value” unless it does?
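One practical way to act on that caveat is to segment hits into rough audiences before drawing conclusions. The heuristic below (logged-in visitors are post-sale, search-engine referrals are pre-sale) is purely an assumption for illustration – your own signals will differ:

```javascript
// Sketch: split help-page hits into rough audience segments.
// The heuristic here (logged-in => post-sale customer,
// search-engine referral => pre-sale researcher) is an
// illustrative assumption, not a general rule.
function segmentHit(hit) {
  if (hit.loggedIn) return "post-sale";
  const searchEngines = ["google.", "bing.", "duckduckgo."];
  if (hit.referrer && searchEngines.some((s) => hit.referrer.includes(s))) {
    return "pre-sale";
  }
  return "unknown";
}

// Tally hits per segment so each audience can be analysed separately.
function segmentCounts(hits) {
  const counts = { "pre-sale": 0, "post-sale": 0, unknown: 0 };
  for (const hit of hits) counts[segmentHit(hit)]++;
  return counts;
}
```

Even a crude split like this stops a flood of pre-sale browsers from masking (or inflating) the signals coming from your actual customers.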

Please do share your thoughts on, and experiences of, analytics in tech comm in the comments. If you’d like to wade in on school league tables too, please do. Both are on my mind right now. Lastly, I wanted to give a hat tip to Indi Young, whose talk on Practical Empathy from UX Lausanne last year made me challenge some of my thinking on analytics to date. It’s 45 minutes long – grab yourself a cuppa and enjoy.

Indi Young – Practical Empathy from UX Lausanne on Vimeo.