Know the Media—Data Journalism


This past week we got online hits in The Atlantic, Vanity Fair, The Boston Globe and Deseret News (and also a fun “Today I Learned” subreddit thread) for research we created two years ago. Guess how many hours of work it took to land those placements? Zero. In fact, the client is no longer actively engaged in media relations (which might explain why it isn’t mentioned in the articles).

So what? At Hodges we spend an inordinate amount of time researching what journalists are looking for and what they need to do their jobs. Successful media relations hinges on matching their interests with our clients’ stories (or finding ways to create stories that will match). One way more PR pros are going about this, to the media’s ire or delight depending on our standards, is by leaning into data journalism.

What is data journalism?

The short of it is this: more and more journalists are looking to expose and explain stories found by carefully collecting, analyzing and presenting the growing amount of data the information age has gathered up. The oft-cited examples of the trend’s growing importance are the rise of Nate Silver’s FiveThirtyEight and Ezra Klein’s Vox (though neither is without reasonable critiques). The latter has a good primer on how it defines data journalism, written by Melissa Bell.

Putting this into practice

The groundswell of data journalism is too great to ignore, and more importantly, the opportunity is too great for us and our clients not to seize. Take the Lost-Hour Economic Index, the study behind all those delicious hits up top. It generated an insane amount of coverage when it was first released.

Here are the combined elements of that program that made it so successful:

  • A timely news hook—daylight saving time is a much-loathed but often assigned news hole at media outlets
  • A rigorous methodology—our clients wisely invested in a third-party econometrics firm and the methodology they recommended
    • The original study with methodology was made available to journalists—the top-flight media prefer and often demand the “clean” data for veracity and to interpret themselves
  • It was novel research—it was easy to understand and tied to the client (sleep tips and information for overcoming DST were the natural tie back to the client’s goals)
  • The data was deep enough to do both a national story and market-by-market stories
  • We pitched it like crazy

Not every client has the resources to undertake this sort of research, but at the same time, data journalism isn’t going anywhere, so we need to meet journalists halfway. Especially at top-tier media outlets, journalists don’t care about trumped-up lists or unscientific surveys when they have publicly available databases with data sets of hundreds of thousands or millions of records. But pulling insights out of data, when done right, is a laborious process, so PR pros have an opportunity either to beat them to that data or to create new research. Either way, we should try our best to meet or exceed their high bar for accuracy and credibility.

The most recent example that fits this strategy is the New York City comptroller’s release of a study that found it “costs NYC $1.8 Million to Clear One-Inch of Snow.” Now there is a factoid ready-made for snowpocalyptic news cycles.

Next time you have a pitching assignment for a client, take a moment to think like a data journalist. What research could your client sponsor that would be novel, rigorous and resonant with journalists and their audiences, even two years after its release? Does your client have any deep data it can share, provided (and this is a big provided) it’s not obviously proprietary and couldn’t be used to expose personally identifiable information? At the very least, start keeping track of what types of data media are using and how they are delivering it to their audiences, so you can be attuned to opportunities and threats for your clients.
