As Visual.ONS closes down and moves to the main ONS website, we wanted to reflect on the challenges we’ve faced in communicating statistics and data to a wider audience. We have already looked at what we’ve learned from three years of experimentation, but how have we faced and dealt with communication challenges?
Sometimes a car crash is not actually a “car crash”. Sometimes – in the language of mortality statistics – it’s a “land transport accident”. Is a petrol or diesel car a “conventional fuel vehicle” or just a “car”? If there’s a huge reduction in net migration to the UK, how do we determine whether or not it’s a “record fall”? And when we look at the UK economy since 2008, can we see which regions of the country have improved the most? How about which cities? Or which postcodes?
These are the sorts of questions we have faced over the last three years of producing articles, features and interactive tools for Visual.ONS. Many of our challenges fall into these categories:
- working out which statistical terms and classifications can be simplified for our users (and which can’t!)
- finding out which of the many facts are the most important to our users, and deciding whether it’s right to make those more prominent than the others
- determining whether the data we hold can support the analysis we want to do
Why are we doing this?
The UK Statistics Authority’s strategy for 2015 to 2020, Better Statistics, Better Decisions, sets out to improve public understanding of statistics, attract new users, and make the stats and analysis “more accessible, engaging and easier to understand”.
That’s where we data journalists come in. Everyone in our current team was recruited from outside the ONS; the team includes print, digital and broadcast journalists who are used to communicating complicated issues to diverse audiences.
We call the kinds of people we’re trying to reach “inquiring citizens” – people who may not be statistically literate, but want the ONS to help them discover the truth and make informed decisions in their lives.
As the Financial Times’ Undercover Economist Tim Harford said, “the case for everyday practical numeracy has never been more urgent.” As statistics are used – and sometimes abused – in public debate, we want to help people to do their own fact-checking.
So what have we learned to do so far?
Speak to people in their own language
One of the first challenges when writing for a non-expert audience is choosing the right words. When it comes to statistics, there are valid reasons why obscure terms are used. Maybe terminology needs to remain consistent over time, or there’s a complicated internationally-agreed methodology. Sometimes, writing “car crash” isn’t right, because this doesn’t count someone being hit by a tractor.
We can’t sacrifice statistical accuracy in favour of simplicity. Take the word “deprivation” – it’s often thought of as a synonym for poverty, but the measure we use takes into account access to public services like health and education as well as income. So when writing about people in “areas of multiple deprivation”, we discussed whether we could say “poorer people”, or “people in poorer areas”.
These are the conversations we have every week. And don’t even get us started on medical conditions and causes of death, the classifications for which are so varied and obscure that sorting through them is enough to give even the most determined spreadsheet geek a headache!
Embrace uncertainty and all the possibilities it brings
Harford urges readers to “embrace imprecision”. In statistical language, we have a term for this – “confidence intervals” – a range of values within which we can be reasonably sure the true figure lies, allowing for chance and sampling error. There is no easy, everyday phrase that fully conveys our confidence (or lack of it!) in numbers, so this can be hard to explain to people who expect journalism to convey certainty.
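For readers curious about what sits behind the phrase, a confidence interval for a survey proportion can be sketched in a few lines of code. The numbers below are made up for illustration and are not real ONS data; the formula is the standard normal approximation for a 95% interval.

```python
import math

# A minimal sketch of a 95% confidence interval for a survey proportion,
# using the normal approximation. The counts are hypothetical.
def confidence_interval_95(successes, sample_size):
    p = successes / sample_size
    # 1.96 is the z-value for 95% confidence
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    return p - margin, p + margin

# Hypothetical survey: 520 of 1,000 respondents answered "yes".
low, high = confidence_interval_95(520, 1000)
print(f"{low:.3f} to {high:.3f}")  # ≈ 0.489 to 0.551
```

The interval, rather than the single headline figure of 52%, is what “embracing imprecision” asks readers to hold in mind.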
One of the ways we try to explain uncertainty – or another vague concept, “likelihood” – is to show it to people using data visualisation. In a recent piece about the likelihood of people smoking, we grappled with “odds ratios”, which show the likelihood of smoking when someone is exposed to a particular variable. This analysis lets us show which factors have the biggest impact on the likelihood of smoking – a concept that is hard to convey in words, so we used data visualisation to help explain it.
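For the statistically curious, the calculation behind an odds ratio is simple enough to sketch. The counts below are invented for illustration, not taken from the smoking analysis:

```python
# A minimal sketch of an odds ratio from a 2x2 table of counts.
# All numbers here are hypothetical, not real ONS data.
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Ratio of the odds of an outcome in the exposed group
    to the odds of the same outcome in the unexposed group."""
    odds_exposed = exposed_yes / exposed_no
    odds_unexposed = unexposed_yes / unexposed_no
    return odds_exposed / odds_unexposed

# Hypothetical example: 30 of 100 people in group A smoke (odds 30/70),
# versus 15 of 100 in group B (odds 15/85).
print(round(odds_ratio(30, 70, 15, 85), 2))  # ≈ 2.43
```

An odds ratio of about 2.4 means the odds of smoking in the first group are roughly two and a half times those in the second – exactly the kind of comparison that a chart can make immediate where a sentence cannot.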
Use real examples to create narratives rather than making assumptions
Readers may also expect us to be able to explain why a particular trend is happening, when often the data won’t be able to explain the why. In these cases, newspaper or broadcast journalists covering a story may seek out an expert in the topic to discuss potential reasons, but that is probably not what people expect from ONS. We enjoy far greater levels of public trust than the media and we want to keep it that way.
One way we can avoid making assumptions about the data, while still teasing out potential reasons, is to use case studies. Recently we have started including the “human side” of statistics, using personal experiences to illustrate the stories behind the numbers. In our article Who is most at risk of suicide? we worked with the Samaritans charity to tell Kristian’s story. He had considered suicide, but received help, preventing him from becoming another statistic.
Looking to the future, we want to continue the great work done by our team so far and continue thinking hard about new ways to communicate statistics to the public. Our hope is that the move from our own site to the main site helps to consolidate our experiments in communication, and helps the public when it comes to making important decisions.
By Callum Thomson and Sophie Warnes, Data Journalists, Digital Publishing