Here’s an interesting thing: data journalism is now the mainstream. It has been threatening to become so ever since I wrote this piece, and now it’s everywhere.
If 2014 was the year that new outlets developed their own brands of data journalism (think 538, The Upshot, Vox…), and 2015 was the year it got into its stride, then 2016 is the year it takes over a whole election. And it’s not just the US: look at the winners of the Data Journalism Awards; this is a global movement.
It’s become so ubiquitous that many organisations, from newsrooms to big companies, think that just by putting numbers into infographics they are practising data journalism. That is not the case. For data to become data journalism, it has to do something else: it has to matter.
So, there is a definition of data journalism that I would stand by. Data journalism is:
Using data to tell stories in the best possible way, combining the best techniques of journalism: including visualizations, concise explanation and the latest technology. It should be open, accessible and enlightening.
And just because data is being used in news stories in 2016 doesn’t mean it always fulfils that definition.
Take one part of the equation. Open data journalism means making your data sources available for others to build on what you’ve started. It’s enlightened self-interest, really: if you make data available, you often get better stories in return. Yet for many organisations this is still a difficult pill to swallow: go through this list of news data sources I compiled in 2014 and you’ll find many of them have not been updated for months. So often, the open part of the data journalism equation is deemed less important than the others. (Check out our data team’s GitHub data page here.)
The thing is, data journalism can change the world, and you don’t have to look far to find examples. In fact, just this week there are two striking ones: one from somewhere you might expect, the other from somewhere you might not.
Hell and High Water, developed by ProPublica and the Texas Tribune, is a powerful exercise using the latest and best techniques of data journalism. It reveals an untold story: how Houston, Texas, the home of America’s petrochemical industry, has become a “sitting duck for the next big hurricane”.
It uses mapping to explain how the channels work, the risks of a disaster and what it would mean for the United States. It’s a sobering read, made much stronger by straightforward but sophisticated graphics and visuals.
But there’s another example from the last seven days too, so bear with me. Fox News, broadcasting the Republican Party debate from Detroit, used graphics to fact-check candidates while they were debating.
It was incredibly powerful. If the moderators had just quoted the facts, instead of visualising them, the reaction would have been very different. Humans are visual creatures. I can quote numbers at you until I’m blue in the face, but a graphic can make the case for me in half the time.
As the Washington Post noted, it changed the nature of the debate. And in an election campaign where numbers are bandied about by the candidates on a constant basis, that is a big deal.
At face value, there really couldn’t be two more different examples of data journalism in action. But each, in its own way, did something important.