We believe that Joseph Pulitzer, the newspaper publisher, nailed it back in 1904 when he wrote in the North American Review: “You want statistics to tell you the truth. You can find truth there if you know how to get at it, and romance, human interest, humor and fascinating revelations as well. The journalist must know how to find all of these things—truth, of course, first.”
But we believe that today the nail needs to be hammered home: In a world where numbers and data increasingly govern the choices we make—and the choices others make for us—finding the “truth” in statistics is more challenging than Pulitzer could ever have imagined. It means quantifying uncertainty; it means assessing probability; it means honoring complexity when the clamor rises for a simple “yes” or “no.” We believe that this challenge must be embraced if we are to hold power accountable.
This is not going to be easy. Translating the complexity of numbers is tough because, at some point, numbers resist being made more vivid to someone who is not really interested in them. We will do our best to make the mysterious comprehensible and, we hope, eloquent; but if knowledge is a journey, understanding cannot be achieved by teleportation. You have to traverse the steps between cause and effect, between premise and conclusion, if you really want to understand why you should accept one claim over another.
Who we are
We are a mathematician and a journalist. Half a century ago, C.P. Snow identified two cultures, the sciences and the humanities, and we—the editors—could be said to represent each. We could expound at length on the insights we have gained from a ten-year collaboration; but our conclusion is this: we all live in one world and with one scientific method. Math and statistics and probability are central to understanding how the scientific method produces reliable and valuable knowledge; they are, if you will, the keys to the motor of inference and the journey of causality. These are things we believe everyone should have the opportunity to understand because they are vital to understanding how the world works.
We have been and remain a non-profit. We do not accept industry support or sponsorship. We’re not chasing customers or page views or ad clicks or unreasonable financial goals. We are not trying to build a content mill. We do not want to spend our lives on stuff that doesn’t really matter for the sake of publishing a story that will make the site look fresh. Our resources are finite, and time is best spent trying to get one thing right rather than getting many things somewhat right—or worse, wrong. The only criterion for publishing a story is that we believe its content is interesting and valuable—even if only to a handful of people.
We are very—very—grateful to the foundations and individuals who believe that what we do is best done as social science, in the sense of pursuing a social good. We welcome your support in this endeavor, be it big or small. We’re also grateful to all those who have volunteered time and effort to bring all this to fruition.
What we say to journalists …
If you are a journalist whose story gets dissected, sorry—it’s not personal. We know you’re under pressure. We know that there’s something absurd and disingenuous in trying to explain complex issues in 800 words (or fewer) if all you’ve been given is a few hours and a press release about a new study on a topic you’ve never covered before—and your editor wants to maximize traffic with a tantalizing headline, possibly because everyone else is running the same story. Some of us have been there. It’s exhausting and tough and often thankless. It’s also increasingly pointless. What value does a rewritten press release about a study that may or may not be valid have beyond clickable content?
The problem is not you; it’s journalism—and the predictable narrative formulas that have hardened news into an exercise in epistemic conventions. These conventions are increasingly inadequate to the task of assessing the value of quantitative information. You cannot hold data accountable if you are required to run away from analysis and flit from “he said” to “she said” and back again with no pause to ask for evidence, or to consider whether the evidence means what people claim it means. That’s what we’re trying, in a small way, to change. Contact us before you write your story and maybe we can help you. While not everyone can be Nate Silver, we believe that every journalist can benefit from having access to a Nate Silver.
What we say to scientists…
If you are a scientist whose study gets dissected, sorry—it’s not personal. We know you’re under pressure to publish positive findings in high-impact journals and to bring in grant money to keep your career, your lab, your department and your university afloat. It’s exhausting and tough and largely thankless, and we understand how the shifting culture of academic research has made “impact” in the professional and news media imperative. But scientific knowledge is meaningless if every study must get a media hug for effort. And how is scientific knowledge in the public realm scientific if it’s never going to be challenged or verified?
We know that you know that the state of scientific research is troubling, certainly in the life sciences. For the past decade, we’ve witnessed the rise of what seems like an entire field of science devoted to explaining how science goes wrong and how much published research is false. The National Institutes of Health have drawn our attention to an epidemic of irreproducible studies and tied it to deficient statistical reporting. Peer review mostly glosses over statistical analysis, and journal editors have limited resources to check and verify.
Our hope is to provide you with statistical resources. You can seek statistical help before you design your experiment, and again when interpreting your results. This is why we believe you should get to know the American Statistical Association and its members, and why we are excited to be in partnership with them.
Bringing together the two cultures
In itself, error is fundamental to the practice of science: it is not as if valid results advertise the route to their discovery in advance. But if the discovery and correction of error becomes secondary to the promotion of research—or if claims become too difficult or too big to check—then flaws will expand into faults.
To us, the threat is redolent of the financial meltdown of 2008: if moral hazard is practiced in science—if there are no risks to taking risks in method and interpretation—science, as a practice, will be seen by the public as unscientific, as unreliable, as not fully understanding the things it tells us are true, and, perhaps worst of all, as not caring. What terrible decisions will be made when such skepticism sets the tone of public debate? And who will be to blame?
The shift, too, from a computational to a data-intensive paradigm of scientific research raises as many questions about methods and measurements as it does promises of discovery. Who will analyze and critique the billions of data points produced by the Internet of Things? Who will critique the data scientists? How will journalism make sense of this new quantitative dimension to reality, this new empiricism? We see STATS as a very modest step toward creating, or at least toward creating the sense of need for, a new journalism: a journalism of statistical analysis, built through collaboration between journalists and statisticians.
Finally, we look forward to hearing from you: to the questions you ask, the comments you add to what we do, and what your fellow readers will say. We believe that you will make us, this site, and this project smarter and more useful.
Trevor Butterworth & Rebecca Goldin, Ph.D.