Data journalism needs to be more than external data sets

Paul Bradshaw has a good column at Poynter about how the increasing availability of data will force journalists and news organizations to change:

Data journalism takes in a huge range of disciplines, from Computer Assisted Reporting (CAR) and programming, to visualisation and statistics. If you are a journalist with a strength in one of those areas, you are currently exceptional. This cannot last for long: The industry will have to skill up, or it will have nothing left to sell. …

So on a commercial level, if nothing else, publishing will need to establish where the value lies in this new environment, and where new efficiencies can make journalism viable. Data journalism is one of those areas.

Journalists should read and heed everything Bradshaw writes. But it’s important to make sure the discussion of data doesn’t get too narrowly confined to external data, without considering how journalism itself fits holistically into the data-centric future.

The big challenge for news organizations isn’t just how to better ingest, analyze, and present existing external (if sometimes hard-to-access) data sets. Inculcating a new skill set industrywide may be non-trivial as a matter of scale and institutional-cultural inertia, but at least that skill set is fairly well defined.

Rather, the trickier and less-addressed challenge for news organizations is how to turn the raw materials and finished products of non-database journalism into data.
