Email Newsletter Success Metrics

One byproduct of all the recent articles about the growth of email newsletters that aim to “cut through the daily clutter” is an amazing amount of clutter about email newsletters. Here’s what you need to know today.

Localizing the NYT Data Viz on Police Racial Disparity

One of the most important attributes of data driven journalism is that it scales, and the primary goal of my OpenRural, Open N.C. and data dashboard projects has been to democratize data so that we start seeing the same types of reporting and presentation in small community papers that we see in the big national news sites. So when I saw Thursday’s New York Times graphic on the race gap in America’s police departments, I immediately thought that something similar could be done pretty quickly that would look at North Carolina towns.

Rebecca Tippett at UNC’s Carolina Demography service was able to pull and clean the data within about three hours. She posted the data in CSV format to her blog, along with a nice explanation.
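The core calculation behind a graphic like this is simple: for each town, compare the white share of the police force with the white share of the residents. Here is a minimal sketch of that math in Python. The town names and numbers below are made-up illustrations, not the actual Carolina Demography figures, and the shape of the data is an assumption about how the posted CSV might be organized.

```python
# Sketch of the disparity calculation behind the NYT-style graphic:
# the gap, in percentage points, between the share of police officers
# who are white and the share of residents who are white.

def white_gap(officers_white, officers_total, residents_white, residents_total):
    """Percentage-point gap: white share of police minus white share of residents."""
    pct_officers = 100.0 * officers_white / officers_total
    pct_residents = 100.0 * residents_white / residents_total
    return pct_officers - pct_residents

# Hypothetical rows in roughly the shape of the posted CSV
towns = [
    ("Exampleville", 9, 10, 5500, 10000),  # 90% white police vs. 55% white residents
    ("Sampletown",   4,  8, 6000, 12000),  # 50% vs. 50%: no gap
]

for name, ow, ot, rw, rt in towns:
    print(f"{name}: {white_gap(ow, ot, rw, rt):+.1f} points")
```

A positive number means the police force is whiter than the town it serves, which is the pattern the Times graphic highlighted.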

Being a words guy rather than a picture guy, I used data visualization software Tableau to put together a prototype of something similar to what The Times had done. It is absolutely nowhere near as good as what they did, but I copied their concept, color scheme and fonts. And about two hours later I had something that told the same story.

Click on the image to see the interactive version of an embeddable graphic that can easily — and at no cost — be dropped into any news site or blog (except this one, because I’m still hosting it on a JavaScript-averse platform).

Click to view the interactive version on Tableau Public.

The graphic alone doesn’t tell the whole story. Tippett pointed out when I showed her the chart that most of the Latinos in Siler City aren’t even eligible to join the city’s police force — 40% are not adults, and 80% of adult Hispanics there are not citizens.

And many of these police forces are very small, which makes it easy for them to end up with huge percentage disparities in the racial breakdowns of their police and residents. Tiny Biscoe, for example, has only nine police officers. Wagram has two — half of whom are white and half of whom are “other.”
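The small-force problem is easy to quantify: in a department with n officers, each single officer moves the department's racial breakdown by 100/n percentage points, so tiny forces swing wildly with one hire. A quick sketch (the 2,000-officer figure is just an illustrative big-city size, not a real department):

```python
# Each officer in a department of size n represents 100/n percentage
# points of its racial breakdown, so small forces produce big swings.

def points_per_officer(n_officers):
    return 100.0 / n_officers

print(points_per_officer(2))     # a Wagram-sized force: one officer = 50 points
print(points_per_officer(9))     # a Biscoe-sized force: about 11 points
print(points_per_officer(2000))  # a hypothetical big-city force: a rounding error
```

That is why a two-officer department can show a 50-point disparity that means almost nothing statistically.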

The other potential problem with the data is that it’s seven years old. But so is the data used by The Times.

This is just an example of how we might continue to democratize data. This graphic could be emailed to an editor of each news outlet in North Carolina, along with a list of suggested questions that local reporters could ask to quickly make the data more relevant.

Suggested Questions to Localize This Data Driven Story

  • “This data is seven years old. Does it still look accurate to you? Can you provide me with some more recent data of the racial and ethnic breakdown of the police department?”
  • “Why do you think your department has a higher percentage of white officers than the residents?”
  • “How does the racial disparity between the police department and local residents affect the way your department works?”
  • “Walk me through the hiring process for new officers. How does a candidate’s race factor into hiring decisions, if at all?”
  • “How do you publicize vacancies in the department? Do you do anything to recruit minority applicants?”
  • “What percentage of your officers live in the city? How important is it that officers come from within the city? Why?”
  • Also, seek opinions of others — both insiders such as city council members and community leaders as well as people on the street. Consider using social media such as Facebook or Twitter to ask people what they think about the data and these questions. This is the start of a conversation, not the end. Be sure to get a diversity of perspectives — age, gender, geography and certainly race and ethnicity.

The Challenge: News Deserts

But even if we acquire, clean and produce data along with some simple story guides, data driven journalism may still not find its way into smaller newspapers if nobody is there to receive our help. At many papers, this would still be seen as enterprise reporting. As an editor with a staff you can count on one hand, do you send a reporter out prospecting for answers to these somewhat uncomfortable questions? Or do you have them write up the day’s arrests? Or preview this weekend’s chamber of commerce golf tournament?

North Carolina also has broad news deserts — whole counties that have no reporters shining light in dark places, holding powerful people accountable and explaining an increasingly complex and interconnected world. Siler City, for example, is in a county of 65,000 people with a single newspaper that reaches only 12 percent of them. The News & Observer provides scant coverage of the county.

What other story templates would you like to see? What would make them easier to use?

Why ‘Robot Reporters’ Are a Good Thing

First of all, let’s not allow the alluring alliteration to distract from what we’re really talking about — not robot reporters, but robot writers.

Mashable’s Lance Ulanoff asked me what I thought about the news that Durham’s Automated Insights would be writing automated business stories for the Associated Press.

This trend excites me about the future of journalism. I’ve been talking with folks about it for about five years, since I first saw similar work that was being incubated by Northwestern’s journalism school. That effort grew into the company Narrative Science, which has been writing earnings preview stories. The Los Angeles Times uses an algorithm to write earthquake stories. The Washington Post has looked into using Narrative Science for high school sports stories.

The Guardian learned how hard it is to build a robot writer, but the automated stories I’ve seen written by both Automated Insights and Narrative Science are pretty good. And 46 media and communications undergrads couldn’t distinguish a computer-written story from one written by a human.

The trend in automation should free up the best writers and best reporters to add the how and why context that still needs to be done by humans. If I were a beat reporter at a newspaper I’d be working as fast as I could to convince my editor to let a computer write the scut stories I have to write and free me up to do more explanatory and accountability reporting, or to craft beautifully written narratives.

One significant risk is that for the last decade we’ve seen “good enough” journalism growing in popularity. News organizations that continue to have a strategy of harvesting profits rather than investing in growth will no doubt cut reporters if machines can write commodity news at a lower cost.

If I were a young journalist looking for my first job, I’d be looking for news organizations that are sustaining a small margin and growing both expenses and revenues — the ones that are using both bots and humans.

The trend toward automation will result in an emphasis on the news value of impact. Mass customization is going to change the nouns in the leads of stories from the third person to the second — “investors” will become “you.”

The trick is how to make money off this. News organizations that continue to see themselves as manufacturers of goods will probably increase the volume of digital commodity content they publish and continue to drive down ad rates.

But smart content companies are evolving from a manufacturing industry to a service industry, and trying to create, explain and capture the value they provide to each client by getting the right information to the right people at the right time.

What we see now as data is as unsophisticated as what many of us thought of as data when Google first made organizing all of it its mission. We think of data now as numbers in tables: scores, money, temperatures. But we’ll soon see data as behavior and content metadata, and we will see automated stories that incorporate the user’s data and the data of her social network as well.

That level of concierge news service, though, is going to come at a price for users. If we’ve seen the democratization of media, this automation trend has the potential to create a world of media haves and have-nots — the haves will pay premium subscription fees to get highly personalized news from bots. The have-nots will get generic news (maybe written by bots as well).

The one thing from which I think everyone will benefit is an increase in the quality and frequency of narrative writing, and of explanatory and accountability reporting.

To aid that transition I’m working on the idea that we can use digital public records to build a newsroom dashboard system that will alert beat reporters to possible story ideas. Automated Insights and Narrative Science are scaling commodity news stories. I want to see if we can lower the human reporters’ opportunity cost of pursuing enterprise stories that land with much bigger and much longer lasting impact.

If you want a pithy quote from a journalism prof. on the effect that robot writers are going to have on the job market for journalism students, here it is: “My C students are probably screwed. My A students are going to do better than ever.”

How to Cover Live Events: Create an Experience

Whenever I’m trying to figure out a new way to tell a story, there’s a quote from one of the inventors of virtual reality that always pops into my brain: “Information is alienated experience.” Or, like my middle school English teacher used to say, “Show, don’t tell.”

So when you go out to cover an event, don’t bring back a product, a widget, a good, a 10-inch inverted-pyramid story. Use multimedia and interactivity to bring your audience along for the ride. Make them feel like they’re in the room with you. Cover the event live, and then repackage your live coverage to attract the search engine audience.

For these types of stories you should consider:

  • Live tweeting.
  • Streaming video via UStream.
  • Capturing/publishing audio via SoundCloud.
  • Publishing full or editing video on YouTube.
  • Re-using in a regularly scheduled weekly podcast.
  • Posting to the website using Storify.

Live Tweeting Tips

Before the event:

  • Do background research so you have some idea of what you can expect to happen. Because news is when the world doesn’t behave as you’d expect. So first you have to know what to expect. Know the players and the rules. Look at old stories, get a copy of the meeting agenda or the speech text if possible. Check out the group’s website and social media accounts. Check out national press, too, if warranted.
  • Make a Twitter list of everyone you expect to be in attendance at the event — that includes “official” event participants as well as observers.
  • Make another list of anyone you think might be interested in the topic. You might find these are folks who already follow you on Twitter, or folks that have re-tweeted similar stories, or just people who’ve been talking about similar topics.
  • Make sure you know any hashtags related to the event. If there isn’t one, make one up and tell your followers to use it during the event. For the most important hashtags you expect to be used during the event, create a saved search on Twitter.
  • Prepare a few tweets in advance of the event. For example, if you know that someone is going to reference a particular news article or book, have a link to that article ready to send out when the person mentions it. If you use HootSuite, you can save drafts of your tweets. Twitter just added native pre-scheduling of tweets to its clients.

During the Event

  • Get there early. You never know what might go wrong — you can’t find parking, there’s no WiFi, no place to put your camera, room is full, flock of rabid seagulls attacks…
  • Tell people that you’re getting ready to start live tweeting an event. Tell them where you are. Tell them about how long you’ll be at it. You’ll be using your personal account for credibility and intimacy.
  • Listen for key quotes, and either paraphrase them or attribute them: “Obama: ‘My role here is done.'” or “Obama says he is resigning from office.”
  • Listen for key facts that provide context: “27% of new students hail from Antarctica.” or “Construction on Gryffindor began in 2011 and was scheduled to cost $3.1B.”
  • Listen for news: “Board voting now on whether to oppose Amendment One.” or “Board’s vote on Amendment One unanimous. Everyone’s opposed.”
  • Provide both “play-by-play” coverage as well as analysis: “Somewhat unexpected to hear all sides agree on that issue. Where was the opposition?”
  • Use hashtags. Hashtags can be used to help your audience find other tweets on the topic, but they can also help your tweets find an audience that cares about the topic but doesn’t yet know you’re covering it. Finally, you can use hashtags for commentary (but be careful with that).
  • Ask questions of your audience during the event. Questions can either seek information: “Dept. of Labor says it’s still interviewing witnesses. I’d like to interview you, too. Did you see the Vortex collapse?” … or they can seek opinion. “Rides inspected 3x/day. Fairgoers- Is the Dept. of Labor doing enough to keep you safe?”
  • When you receive responses, re-tweet the interesting ones. Think of yourself as the host of a call-in talk show. Re-tweeting adds interesting voices to the live event and puts yourself in the position to mediate a conversation between your followers. That’s creating an experience rather than just a story.
  • Invite your readers to ask you questions: “Petraeus taking questions now. What do y’all want me to ask him?”
  • If you make a mistake, correct it. If it’s an egregious fact error — “Thornburg found guilty of murder!” — delete the original tweet and send out a correction: “CORRECTION: Thornburg found NOT guilty of murder!” Correct anything that alters your audience’s clear understanding of the event. Misspellings and things like that probably don’t merit corrections. If someone has re-tweeted a fact error that you made, be sure to @mention them in your correction so they’ll be more likely to see it and pass it along, too.
  • Tweet photos and (brief) video. Give people a sense that they are going “behind the scenes” with you — that you’re taking them to a place they can’t go.
  • Interview participants and observers. Tweet a photo and a quote of the person. Be sure to @mention them.
  • When you end your live coverage, tell your audience that you’re wrapping up… and that they can go to your website or print publication soon to see your full wrap-up of the event.

After the Event

  • Use Storify to pull together your quotes. Embed the Storify on your site.
  • Follow up the next day with a moderated live online discussion with one of the event’s participants. Or just allow your readers to ask you questions. Two good tools to use for live discussions on your site are CoverItLive and ScribbleLive.

Live Audio & Video

With UStream, you can turn your iPhone into a broadcast truck. It doesn’t matter whether the event has a huge following or not; imagine that suddenly you don’t just work for the newspaper of record in your community but also its C-SPAN.

If you want to host your own video talk show, try Google Hangouts On Air like Investigative Reporters & Editors has done.

You know you can use your phone as a camera, but you can also use it to record audio interviews and use SoundCloud to publish the audio on The Chronicle or to a podcast. If you want to dramatically improve the quality of your audio, try one of these little microphones that plug into your iPhone. You can also use this free iPhone app to do some pretty nice audio editing right on the phone.

None of these audio and video tools are going to win you an Oscar. They’re the tools you use when the story doesn’t merit a trained videographer.


General Tutorials

Tips for Speeches

Tips for Meetings

Tips for Festivals/Celebrations

Tips for Live Q&A Events on Twitter

Correct Audience Numbers for The Columbia Tribune

I made an error in the piece I wrote for PBS Media Shift Idea Lab yesterday. I misrepresented the audience for The Columbia Tribune. The paper’s general manager, Andy Waters, kindly brought it to my attention and I want to offer a correction here.

Numbers that Waters sent me from The Media Audit show that the news organization reaches nearly 80 percent of the 130,000 adults in its market. I used a smaller potential audience base and arrived at a much lower penetration rate when I wrote up the post. That number was no good, and I should’ve known better than to use it. I simply took the print circulation and divided it by the Census estimate of residents in Columbia.

You could go all night quibbling about audience measurement methodologies, but whatever faults The Media Audit numbers may or may not have, they are certainly better than the way I tried to calculate it.

And I think this is a particularly important measurement to correct because of the number of unsourced posts that you can find on the Internet saying that the Tribune lost anywhere between 25 and 40 percent of its online audience when it implemented online subscriptions. I’m not fact-checking those claims one way or the other, and even if true they may not be important. I’m repeating them here only to provide context for the correction and to hopefully spur some critical thinking about any audience claims you see — including mine.

Anyway. The Media Audit numbers that Waters showed me indicate that out of a base population of 130,634 adults 18 or older, 103,260 of them – or 79 percent – say they read the Tribune either in print or online. The print edition (weekday/Sunday) reaches 62 percent of the market at least once a week, and the website reaches 52.5 percent of the market at least once a month.
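For anyone who wants to check the arithmetic behind the corrected figure, it's one division:

```python
# Corrected penetration figure from The Media Audit numbers
adults = 130_634   # adults 18+ in the Tribune's market
readers = 103_260  # say they read the Tribune in print or online

penetration = round(100 * readers / adults)
print(penetration)  # 79 (percent)
```

Contrast that with my flawed method, which divided print circulation by the Census count of all Columbia residents — a different numerator against a different (and differently defined) denominator.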

I hope that gives a better picture of the kind of environment in which the Tribune’s OpenBlock experiment is taking place. And the point I was trying to make I think remains valid — that Columbia, Mo., is WAY different than Chicago or Charlotte or San Francisco.