Editorial: On PACE
In 2011, amid many thoughtful avenues of research into a paralyzing syndrome that is as near to an off-switch on life as one can imagine, an $8 million government-funded trial in the United Kingdom bulldozed a highway. The first results of PACE, the largest treatment trial ever conducted on ME/CFS—popularly, and unhelpfully, known as “Chronic Fatigue Syndrome”—announced a way out for, perhaps, a third of sufferers. As the British newspaper The Independent put it in a headline: “Got ME? Just get out and exercise, say scientists.”
This was not greeted as good news; indeed, to say sufferers were—to use a Britishism—gobsmacked would be an understatement. The claim that ME/CFS could be ameliorated by graded exercise therapy (and cognitive behavioral therapy), as the study suggested, ran directly against many patients’ experience of the condition. Moreover, the apparent success of these treatments in the trial suggested that the condition, or rather the prolonged experience of the condition, was mostly cognitive—‘in their heads,’ so to speak—rather than the result of any number of possible biological and organic infections or physical responses to the environment.
So what happens when the largest trial of its kind produces a result that overturns patient expectations and understandings? The institution of medicine trusts the trial; that’s the power of science, after all: the capacity of an appropriately designed study to disentangle perception from probability, cause from correlation, personal bias from objectivity. The PACE trial was “rigorously designed,” according to CNN, and it was published in one of the most prestigious medical journals, The Lancet, so its conclusions not only had the power to affect the way the condition was treated, they had the power to set the agenda for further research, potentially foreclosing other approaches.
As a result of PACE, the UK’s National Health Service, the US Centers for Disease Control and Prevention, the Mayo Clinic, and Kaiser Permanente all ended up recommending cognitive behavioral therapy and exercise for ME/CFS. There is now a PACE-like trial in children—MAGENTA. PACE has become the paradigm for understanding a condition affecting millions of people.
The PACE trial and David Tuller’s investigation of it, discussed below, raise other significant points:
- Study design is now one of the most pressing issues in scientific integrity. It is not just that the problems with PACE could have been—and should have been—seen beforehand; it is that poor study design appears to be a major factor behind the much larger reproducibility and replication crisis in science. This has implications for journalism too: Unless journalists start asking whether the studies they report on can actually answer the questions they claim to answer, science reporting will be little more than free PR. It is also important to point out that PACE was government-funded, and that it is time to ditch the idea that independently funded research is intrinsically more reliable than industry-funded research; all research needs to answer the design question.
- Patients need to be much more closely involved in the research process. You would think, following the transformational role played by patient groups in changing the way HIV/AIDS trials were designed, that patients would be seen as partners in research: they have, after all, an inside knowledge of their own suffering—histories that can be richly mined by researchers who are not lifetime experts in the disease. Patients come to research with a fundamentally different perspective to scientists; as Julie Rehmeyer puts it, their question is, “How does this new research fit or fail to fit with my experience?” Reaction to PACE from patients was that it did not—and that kind of reaction should not have been summarily dismissed.
- Scientific journals should be more interested in engaging with ideas and less reflexively defensive of the ones they publish. We can understand The Lancet editors’ loyalty to the researchers who publish with them, but openness to criticism in science cannot be construed as betrayal, even when the stakes are high.
- We can only imagine the challenges Tuller would have faced getting his mammoth investigation published in a conventional media outlet, so it is important to point out that a long-form, multi-part story on an academic blog (Virology) can change the world; and that is, in a word, awesome.
David Tuller, a journalist who had earned a doctorate in public health, and who ran the University of California Berkeley’s joint program in public health and journalism, believed the sufferers were onto something when they said there was something badly wrong with the way the trial was designed and conducted.
And the thing about patients who suffer either from a rare disease, or from a more common and inexplicable one such as ME/CFS, is that they are usually a formidable resource—a network of distributed experts who have sifted and weighed the scientific research with the kind of avidity you would expect, given that their lives depend on it. In pharmacology, rare disease patient groups are highly respected and are seen as partners in research rather than just subjects and consumers of studies.
Tuller dug in the weeds and published his results in a four-part series on the blog Virology. The gravity of his investigation may be gauged by one of the experts he quotes, Ronald Davis, Ph.D., Professor of Biochemistry and Genetics at Stanford University: “The PACE study has so many flaws and there are so many questions you’d want to ask about it that I don’t understand how it got through any kind of peer review.”
Because we believe that study design is a critical issue in science, and because statistics is central to understanding study design, we felt it was important to look at PACE from this perspective.
But we were also spurred by science writer Julie Rehmeyer, who wrote a powerful essay for our series “Epistemically Challenged” (over at Sense About Science USA) about her own experience of ME/CFS, and how it changed her view of science. As Rehmeyer is the most recent recipient of the American Statistical Association’s Excellence in Statistical Reporting Award (an honor we think Joseph Pulitzer would have considered equal to his eponymous prizes given his love of statistics), we took her criticism of PACE as another important alarm.
Rebecca Goldin’s 7,000-word analysis of PACE’s design ends this way: “The best we can glean from PACE,” Goldin writes, “is that study design is essential to good science; the flaws in this design were enough to doom its results from the start.”
The reaction to patient criticism and Tuller’s story by the PACE researchers and The Lancet has been to deflect rather than to dissect; and the story has become one of questionable demands for patient-level data and social media provocations and threats.
While these are not trivial issues, they are something of a distraction from the fundamental one, which is that the way PACE was designed and redesigned means it cannot provide reliable answers to the questions it asked. There is really not a lot that can be said to mitigate that; it’s a terminal prognosis.
But there is one more important thing to say. Sometimes, justice is blind to those who serve its cause. Perhaps the service is not so easily rendered in the kind of heroic narrative that tempts Hollywood—the easy clarity of right triumphing over wrong. It takes digging in the weeds to understand who did what and why that was important—and we tend to prize stories that come effortlessly. David Tuller may not get a Pulitzer Prize for investigating the PACE trial on a blog; but his service to—and we do not exaggerate—millions of sufferers around the world makes it hard for us to think of another work of journalism so deserving of commendation.
— To read Rebecca Goldin’s 7,000-word analysis of PACE’s design, click on this link.