User testing our interactives
Post by James Bayliss, Digital Content
"Design is not just what it looks like and feels like. Design is how it works." – Steve Jobs
We know that the most important part of any interactive is the user engagement. Whether it's an interactive map, calculator or quiz, it's essential to us that a user knows how to interact with it and that it's easy for them to do so. User experience (UX) testing is therefore very important to developing our work. In late 2014 the digital content team were approached by the Treasury to design an interactive to support the new Pension Wise website that could tell a person their life expectancy and therefore how long their pension might need to last. I took this as a great opportunity to experience an external customer's approach to interactive design and testing.
Stranded in the Strand
Fast forward to mid-March 2015, and I find myself walking around the Strand, central London in a mildly perplexed, confused state failing badly at locating System Concepts, the hosting company for that day’s user experience testing session.
I eventually discovered that the venue was not actually where Google said it was, but instead inside the entrance to the Savoy Hotel!
Carrying out the user testing
The day's target demographic (and the main target of the interactive) was people aged 55 and over, nearing retirement and therefore with a keen interest in their pension options. Six user interviews were undertaken in total, each lasting one hour, aiming to get feedback both on mock pension advice letters each interviewee had 'received' and on the life expectancy calculator interactive we had been developing. The six selected people – three men and three women – scheduled for interview that day had all undertaken preliminary face-to-face interview guidance sessions a few weeks prior, where they were shown mock-ups and static designs of the new Pension Wise website.
Each interviewee was initially given freedom to explore the interactive without influence or instruction. After a while they would be asked fairly general questions to prompt discussion and response (for example, "How does this make you feel?", "What do you think the graph is showing?" and "What is life expectancy?"). Reactions varied widely: from the pointed "This is absurd. I would switch off. Awful", to an inquisitive "How do they calculate this?" (or "how the hell do they think this up?!" from one particular participant), to "What is SPA?" (state pension age), "I don't understand it at all" and the declarative "I don't want to live to one hundred".
Interestingly, all participants either guessed their life expectancy or underestimated it by between five and seven years, even though a number of them would not necessarily consider ascertaining their own life expectancy and did not dwell on this particular issue. This may well be because people tend to compare themselves to their parents' and grandparents' lives, perhaps without taking into account improvements over time in general health and lifestyle. This is an example of our interactive challenging people's perceptions of the real world ("people's perceptions don't always add up"). It was very noteworthy, and reinforced the end intention of this interactive: to make users really consider how long they might need to make their pension and other retirement resources last. There was certainly a sudden, noticeable 'sit-up-and-think' point with many of the day's testers.
Playing with the UX testing suite
The testing suite consisted of two small adjacent windowless rooms; one each for the assessment team and the interviewer/interviewee set-up. I discovered during the day that the assessors – who had come from a range of different organisations (e.g. GDS, HM Treasury and Behavioural Insights) – would come and go depending on their own work commitments, rarely staying for all interviews.
Much like our new User Testing Lab in Newport and other such environments used in the recent past for testing ONS web material, such as ONS's Alpha site (The research behind the alpha), the resources in the assessors' room consisted of a single large split-screen TV mounted on the wall, with speakers attached so we could hear conversations held in the adjacent interview room. The set-up in this second room was even more IKEA-esque in its simplicity: a PC on a table, three cameras whose views were shown on the TV next door, and the obligatory small bookshelf with a plant on top. The three cameras aimed to provide an all-encompassing view of the interview: one positioned in front, one behind, and a table-mounted camera giving a clear view of the documentation discussed and critiqued during each interview.
Learning from the UX day
Generally, the interactive was well received; participants understood the importance of knowing the information it presented to them and how they could take it forward to benefit them, with some offering recommendations as to how the life expectancy calculator could be improved still further. Based on their feedback the interactive was improved by de-cluttering the user interface of some unnecessary labelling, repositioning and enlarging much of the remaining information aimed at the user, and improving the user input functionality and colorimetry. Below you can see how this particular interactive changed between the UX testing day in early March and its release on Visual.ONS at the end of that month.
Since then it has been very popular, having been viewed over 200,000 times. In fact, we have plenty of evidence that we reached far beyond our target audience, as it has even been syndicated by Arabic, Chinese, Greek and Polish websites (although I’m not sure they realised it’s based on UK data!).