Risks We Accept
Could a content management system help journalists avoid retractions?
On February 17th, SB Nation, which is part of Vox Media, published a 12,000-word story that was bizarrely sympathetic to the convicted rapist Daniel Holtzclaw. The organization removed the story hours later. “There is no qualification: it was a complete failure,” wrote the SB Nation editorial director Spencer Hall.
At Deadspin, Greg Howard reported on what happened and who was involved:
All media operations have flaws inextricable from their structure and the culture upon which they are built, which leave them open to very specific failures. Deadspin, for example, is both smug and defiantly skeptical; at our worst, we’ll assert we’ve proved a negative when we haven’t. Buzzfeed is an advertising shop with a journalism wing; at its worst, the company will pull its own articles at the behest of brands. SB Nation is a virtual monoculture, built to profit from writing done with minimal infrastructure devoted to it and deeply unserious about ambitious things. It’s not surprising that at its worst, SB Nation would badly mishandle a story like this one.
I was an editor last decade, and I was involved in several situations where a story should not have run. I’ve hit the “publish” button when I shouldn’t have. It always happened when I was in a fog, and had let myself get pulled in too many directions. Nearly every major publication has had big messes to clean up online in the last few years. It’s SB Nation today; it’ll be someone else next month.
A lot of the work I’ve done over the years has been in launching or re-launching online publications. Sometimes clients ask me what to look for when they hire a digital editor. They expect me to say “knows HTML,” but I instead suggest that they hire someone sensitive to the twin threats of litigation and public online humiliation. Passion for story, tuned ear, roster of writers on tap — all necessary. But web editors have to work fast to feed the beast, for an audience that is already pissed off. Every day carries risk without enough time to plan. The best editors are also very good risk managers.
One of the reasons I like working in technology is that when you make a mistake you usually aren’t fired. Technical failures are stitched up and sent back out onto the field. Editorial failures more often result in prompt, Civil War-style amputation, with screaming and saws. In the SB Nation case a freelancer was cut loose and an editor was fired, and the entire SB Nation “longform” product was put on hold.
In nearly 20 years around content management I’ve never had anyone ask me: “Someday, we may produce some real, career-destroying garbage. Can you help us avoid that?” That’s not the kind of conversation you have with your software vendor.
Besides, there’s not much software can do here. Computers can cut text up into chunks, but a computer can’t understand that a story functions as an apologia for a reviled criminal. And computers make moronic censors, as anyone who has been blocked from a “banned” website at work knows too well.
What about, you know, checklists? Maybe a window pops up when someone clicks “Publish,” with a list of questions like: “Does this piece reinforce hegemonic racist norms? YES/NO.” But people are sly. They will quickly learn how to fill out the form and move on. You can’t CAPTCHA racism, sexism, or errors of fact.
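For what it’s worth, here is roughly all that such a gate could amount to in code. This is a hypothetical sketch (in TypeScript, with invented names, not any real CMS’s API), and it makes the problem visible: every answer is self-reported, so the gate verifies clicks, not stories.

```typescript
// Hypothetical pre-publish checklist gate. Nothing here inspects the
// story itself; it only records that somebody clicked through a form.
interface ChecklistAnswers {
  reinforcesHegemonicNorms: boolean;
  factsVerified: boolean;
  legalReviewed: boolean;
}

function canPublish(answers: ChecklistAnswers): boolean {
  // The "check" is entirely self-reported. A writer in a hurry learns
  // the right pattern of clicks on day one and never reads it again.
  return (
    !answers.reinforcesHegemonicNorms &&
    answers.factsVerified &&
    answers.legalReviewed
  );
}

// Defeating the form takes ten seconds:
canPublish({
  reinforcesHegemonicNorms: false,
  factsVerified: true,
  legalReviewed: true,
}); // true, no matter what the story actually says
```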
Adding checkpoints also adds complexity. In general, everyone is trying to reduce the hierarchy and the number of checkpoints that stand between people and publishing.
This part of the Deadspin article stayed with me:
“I couldn’t believe what I was reading,” [senior editor Elena] Bergeron tells me. Bergeron, who’s black, was so shocked by how bad the story was that she had to read it twice. She says she asked copy editors and a producer if they read the piece thoroughly while working on it, and that they told her they had and thought it was unpublishable, but didn’t think it was their place to say anything to [head of the SB Nation Longform vertical Glenn] Stout.
Here’s a feature that no CMS I know of has, one that would be fascinating to build: a “Very Concerned!” button attached to every article in the editorial view, perhaps with a small textbox for notes.
Anyone — designer, intern, editor — could click that button. Once they click, an email goes straight up the chain (to the top of the company) flagging that someone, somewhere is concerned. But other than that it’s anonymous. Make it incredibly easy to tell the boss that something bad is happening without sticking your neck out.
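For the sake of argument, the server side of that button could be a single endpoint. What follows is a minimal sketch, assuming an Express app and an SMTP transport; the route, addresses, and config are all invented, not features of any real CMS. The design decision that matters is what the handler refuses to do: it never asks who clicked.

```typescript
import express from "express";
import nodemailer from "nodemailer";

const app = express();
app.use(express.json());

// Assumed SMTP settings; in practice these would come from secrets/config.
const mailer = nodemailer.createTransport({ host: "smtp.example.com", port: 587 });

// Anyone with access to the editorial view can hit this route. Deliberately,
// the handler never reads the session and never logs who made the request:
// the email says *what* is concerning, not *who* is concerned.
app.post("/articles/:id/very-concerned", async (req, res) => {
  const note = String(req.body?.note ?? "").slice(0, 2000);

  await mailer.sendMail({
    from: "cms@example.com",
    to: "leadership@example.com", // straight to the top of the company
    subject: `Very Concerned: article ${req.params.id}`,
    text: note || "(no note attached)",
  });

  res.status(204).end(); // no receipt, no audit trail, no name
});

app.listen(3000);
```

The hard part isn’t the thirty lines of code; it’s convincing people that the anonymity is real.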
I can immediately see many things that could go wrong, and I’m sure you can too. The boss could filter the email to spam. People could misuse the button in a thousand different ways. An editor could keep a story very close to his team and say, “Look, I know you want to hit the ‘Very Concerned!’ button, but trust me, we’re going to win a national award on this one; we just have to get through an edit cycle and solve some of these problems.” Humans are smart about getting what they want.
But give this one a little more rope: What if you created a special readers’ program of, say, two hundred people who read your publication? Make sure they are as diverse as hell — race, gender identity, sports teams, location, age, education. Recruit them quietly. Pay them something small but meaningful: $100/month to read ten or so stories each. They’d read them anyway; here, they get money for reading them early and carefully.
This is now your “readers’ council.” Give them unlimited access to drafts of articles and ask for feedback and notes, and give them the same “Very Concerned!” button you gave your editors. Make it all totally anonymous — no way for the editors to reach them, or to know who they are. Now, instead of waiting for the Internet to take you to task, a group of strangers can take you to task, quietly, on a regular basis.
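The anonymity could even live in the data model instead of in a policy document. Another hypothetical sketch, with invented names: council members hold opaque tokens that prove membership, and submitted feedback keeps no link back to the token that sent it, so there is nothing for an editor to trace even if they go looking.

```typescript
import { randomUUID } from "crypto";

// A token is issued once, out of band, when a reader is recruited.
// The system knows a token is valid; it never knows whose it is.
const validTokens = new Set<string>();

export function enrollReader(): string {
  const token = randomUUID();
  validTokens.add(token);
  return token; // delivered to the reader over a channel editors can't see
}

interface Feedback {
  articleId: string;
  note: string;
  veryConcerned: boolean;
}

const allFeedback: Feedback[] = [];

// Membership is checked, then the token is dropped on the floor.
// The stored record contains nothing that points back at a person.
export function submitFeedback(token: string, f: Feedback): boolean {
  if (!validTokens.has(token)) return false;
  allFeedback.push(f);
  return true;
}
```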
I can imagine editors reading this description and throwing their phones to the ground in total disgust. The last thing they need is input from a bunch of amateurs while pieces are in progress. This would slow things down and gum up the works. Plus they’d end up seeing all their scoops tweeted out. What I just proposed is a disaster. I’m not even sure it hasn’t been tried. Maybe there’s a big red button built into the Vox CMS — I don’t know at all.
Worse, this program (that editors wouldn’t want, that advertisers wouldn’t care about) would cost around $500,000 the first year. You need to recruit the readers; you need to build the software to manage the program; you need administrators; you need to cut the checks. Plus there would be ongoing support costs for as long as it lasted.
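The stipends are the only line item you can compute from what I’ve described; the rest of this back-of-envelope uses figures I’m assuming for illustration, just to show how quickly you reach half a million.

```typescript
const readers = 200;
const stipendPerMonth = 100; // dollars per reader

const stipends = readers * stipendPerMonth * 12; // $240,000 a year

// Everything below is an assumed placeholder; the breakdown isn't from
// any real budget, it just shows where the remaining ~$260K could go.
const softwareBuild = 150_000; // recruiting tools, anonymous tokens, draft access
const administration = 110_000; // part-time admins, payments, support

const firstYear = stipends + softwareBuild + administration;
console.log(firstYear); // 500000
```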
Much of the content-creation software I’ve seen focuses on tasks — making it possible to compose and publish quickly, often to multiple platforms. There are also very large, complex analytics platforms that serve advertisers. But proportionately little of the work I’ve seen focuses on managing the risk of publishing (beyond basic workflow and permissions), even though publishing is an insanely risky business and seems to be getting riskier.
To build a “readers’ council,” a powerful editor-in-chief would have to really want it, enough to pull a half-million dollars out of the budget and spend it on risk management instead of growth — and who knows whether it would work at all? You could never demonstrate a return, because you’d be trying to prove a negative. In an industry in perpetual crisis, no one can afford that. So add this off-the-cuff idea to the long list of systems we aren’t building right now, if ever. When it comes time to build publishing systems, we prioritize whatever will help us grow and hope to hold off crisis as long as possible.
Story published on Feb 28, 2016.