Opinionista

Journalism 101: Helen Zille needs to go back to the basics

Gwen Ansell is a freelance writer, writing trainer and media consultant. She is the author of Soweto Blues: Jazz, Politics and Popular Music (Bloomsbury, 2004); has written for multiple South African and international publications; is a Research Associate of the Gordon Institute of Business Science, University of Pretoria, with which she has co-published on the cultural and creative industries; and runs writing-related training programmes for academic and business writers as well as journalists. She blogs at http://sisgwenjazz.wordpress.com

There are a few basic steps we teach journalists to take before they try to cover science. The first is to go back to the original research report, not a second-hand press boil-down written by a hack who doesn’t understand the field. Helen Zille used the boil-down.

We’ve all read them: the dramatic little “research says” snippets used by newspapers to fill space. “Asparagus may cause cancer,” we’re warned, or “Humans will grow webbed feet to deal with global warming”. Such crude reductionism about research is often blamed on what’s called the “juniorisation” of newsrooms (and that, in turn, is often a coded attack on demographic changes that critics of the press don’t like). So, in an odd way, it’s almost comforting to find that an extremely senior former journalist such as Helen Zille (no slouch when it comes to attacking the research credentials of Richard Poplak) made all the same basic errors when she tried to write about research.

In one of her attacks on Sanef’s failed case against the EFF’s press-baiting, Zille told us that “a famous academic experiment… proved an important point about the prevailing biases in certain publications”.

She didn’t name the experiment, possibly because it’s normally referred to as the “Grievance Studies Hoax”. Late last year, Helen Pluckrose, James Lindsay and Peter Boghossian (respectively, a magazine editor, a mathematician and a philosopher) claimed that by getting fake papers accepted in certain journals, they had proved (in Zille’s words) that “a writer can get any nonsense published as long as it is steeped in the narrative of victimhood and the ‘lexicon’ of identity”.

Except that’s not what they proved at all.

Perhaps we can blame Zille’s error on the only source she cites for her knowledge of this “experiment”: the Irish Times. As we’ve said, journalists may not be great at reporting science. But there are a few basic steps we teach them to take before they try to cover it. The first is to go back to the original research report, not a second-hand press boil-down almost certainly also written by a hack who doesn’t understand the field.

The second lesson is to look at what actually happened. The three researchers set out to write deliberately outrageous articles in order to prove what they saw as the “corrupt” nature of fields of study related to identity politics – “social snake oil”, they called them. By this stage, a critical journalist might be alert to the possibility of confirmation bias: the researchers had already decided what answer they wanted.

The 20 hoax articles the three initially drafted were so bad they were all instantly rejected. So they threw money at it. Lindsay still refuses to name the donors who financed the intensive lexical work needed to mimic the structure and terminology of successful articles in the targeted journals and to strengthen the lies about methodology and results. He did admit to Vox magazine that the task consumed “90 hours a week”. And after all that investment, only seven of the original 20 hoax articles got published, two of those in practitioner journals, which employ different acceptance paradigms.

That’s too many, of course. But all it suggests (journalists who don’t understand science should steer clear of “proves”) is that there’s a problem with reviewing procedures in scholarly journals and that peer review has no defence against straight-up liars (or, as Zille termed it, “intentional sophistry”). That’s something long conceded and discussed in scholarly communities across “hard” as well as social science disciplines. Science magazine, for example, reported in 2016 that 40% of economics experiments failed replicability tests; the technical Journal of Vibration and Control retracted 60 papers last year because of corruption in the peer-review process. None of that needed additional research to establish; it’s already well known.

Pluckrose, Lindsay and Boghossian did not undertake a comparative study across disciplines. They targeted fields of study they disliked, using only a tiny sample. Thus they could not possibly “prove” that what they termed grievance studies were more corrupt than any other field of study. Their work was sloppy scholarship, widely slammed as soon as it appeared.

When we train journalists, we encourage them to look at the sources of the research they report on. Who’s doing it, why, and with money from what sources? The funding for the “Grievance Studies Hoax” papers remains opaque, and the project’s motivation was explicitly to attack identity politics. We’re a democracy, and Zille can embrace whatever ideological stances she chooses. But such bad journalism? Really, Helen, from someone who presumes to judge award-winners such as Poplak, we’d have expected better… DM
