Data-Driven Future Provides Opportunities and Perils for Journalism

Reading time: 2 min

A recent article in Say Daily by ReadWrite editor-in-chief Owen Thomas testifies to the growing importance of data in the new online journalism landscape. This conversation has become more salient as tech continues to eat up traditional media revenue streams while providing exciting new ways to tell stories, and big names in big data like Google, Amazon and Mozilla are partnering with journalists and journalism institutions. The piece covers “data journalism” as both a type of reporting and a means of creating a sustainable and engaging model for journalism. The latter argument is interesting, if only because it falls outside what most people would call “data journalism,” or even the increasingly popular term “data-driven journalism.” “Data ought to inform the entire operation that creates the product, not just the newsgathering,” writes Thomas.

The point is correct, but feels a bit tired in 2013. It rests on the assumption that media organizations are not already savvy users of data and other forms of technology and information, an assumption that doesn’t ring true. Well-applied analytics and strong audience engagement are why so many articles have been reborn as “the list post,” why an increasing number of news stories are told in infographic form, and why some news outlets are getting rid of standard-length reporting altogether in favor of shorter live-blogs, while others focus on long-form pieces because the data shows that is their readers’ preference. Many newsrooms hold online live chats about their most popular stories, connecting audiences, reporters and newsmakers: again, all based on viewer data and feedback. Data is already making the news world rethink how to tell stories, all the way from local coverage of fires in California to the “black budget” of our National Intelligence Program.
The challenge to editors, producers and executives, then, is not listening to or using data — that’s already happening. It is approaching data with editorial judgment: determining when to use it and when to ignore it for the sake of journalism, journalism innovation and the good of the organization’s reputation and goals. Gleaning trends from news data is useful, but trusting the data too much risks giving it the dubious power to privilege some news stories over others, essentially turning it into a sort of automated managing editor. While this is good for pushing big stories like Syria to the top of the page, it risks making smaller stories all but disappear.

“What we need as an industry,” writes Thomas, “is not a slavish adherence to data, but an interest in it as a proxy for the real human beings who make up our audience.” This is a sentiment everyone in news can agree with. When analytics reduce the news and audiences to numbers (even well-defined and descriptive numbers) and patterns, perhaps the biggest risk of all is for news organizations to fall into a model of chasing virality and engagement, stymieing storytelling creativity instead of incentivizing it.

Allison McCartney is an editor at the PBS NewsHour focused on education and informational graphics, and a freelance designer. She has a bachelor’s degree from Washington University in St. Louis, where she studied Middle Eastern history and art. You can follow her on Twitter @anmccartney.