When news outlets report that new research studies prove something, they’re almost certainly wrong.
Studies conducted in fields outside of mathematics do not “prove” anything. They find evidence — sometimes, extraordinarily strong evidence.
It’s important journalists understand that science is an ongoing process of collecting and interrogating evidence, with each new discovery building on or raising questions about earlier discoveries. A single research study usually represents one small step toward fully understanding an issue or problem.
Even when scientists have lots of very strong evidence, they rarely claim to have found proof because proof is absolute. To prove something means there is no chance another explanation exists.
“Even a modest familiarity with the history of science offers many examples of matters that scientists thought they had resolved, only to discover that they needed to be reconsidered,” Naomi Oreskes, a professor of the history of science at Harvard University, writes in a July 2021 essay in Scientific American. “Some familiar examples are Earth as the center of the universe, the absolute nature of time and space, the stability of continents, and the cause of infectious disease.”
Oreskes points out in her 2004 paper “Science and Public Policy: What’s Proof Got To Do With It?” that “proof — at least in an absolute sense — is a theoretical ideal, available in geometry class but not in real life.”
Math scholars routinely rely on logic to try to prove something beyond any doubt. What sets mathematicians apart from other scientists is their use of the mathematical proof: a step-by-step argument written using words, symbols, and diagrams to convince another mathematician that a given statement is true, explains Steven G. Krantz, a professor of mathematics and statistics at Washington University in St. Louis.
“It is proof that is our device for establishing the absolute and irrevocable truth of statements in our subject,” he writes in The History and Concept of Mathematical Proof. “This is the reason that we can depend on mathematics that was done by Euclid 2300 years ago as readily as we believe in the mathematics that is done today. No other discipline can make such an assertion.”
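To give a sense of what such a proof looks like, here is a sketch of a classic example (not taken from Krantz’s book): the ancient argument that the square root of 2 cannot be written as a fraction of whole numbers.

```latex
% Classic proof by contradiction that \sqrt{2} is irrational.
\begin{proof}
Suppose $\sqrt{2} = p/q$ for whole numbers $p$ and $q$ that share no common factor.
Squaring both sides gives $p^2 = 2q^2$, so $p^2$ is even, which forces $p$ to be even.
Write $p = 2k$. Then $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ must be even as well.
Now $p$ and $q$ share the factor 2, contradicting the assumption that they share none.
Hence no such fraction exists: $\sqrt{2}$ is irrational.
\end{proof}
```

Once such an argument is checked, no new data or further study can overturn it, which is the sense in which mathematical proof is absolute while scientific evidence is not.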
If you’re still unsure how to describe the conclusiveness of research findings, keep reading. These four tips will help you get it right.
1. Avoid reporting that a research study or group of studies “proves” something — even if a press release says so.
Press releases announcing new research often exaggerate or minimize findings, academic studies have found. Some mistakenly state that researchers have proven something they haven’t.
The KSJ Science Editing Handbook urges journalists to read press releases carefully. The handbook, a project of the Knight Science Journalism Fellowship at MIT, features guidance and insights from some of the world’s most talented science writers and editors.
“Press releases that are unaccompanied by journal publications rarely offer any data and, by definition, offer a biased view of the findings’ value,” according to the handbook, which also warns journalists to “never presume that everything in them is accurate or complete.”
Any claim that researchers in any field outside mathematics have proven something should raise a red flag for journalists, says Barbara Gastel, a professor of integrative biosciences, humanities in medicine, and biotechnology at Texas A&M University.
She says journalists need to evaluate the research themselves.
“Read the full paper,” says Gastel, who’s also director of Texas A&M’s master’s degree program in science and technology journalism. “Don’t go only on the news release. Don’t go only on the abstract to get a full sense of how strong the evidence is. Read the full paper and be ready to ask some questions — sometimes, hard questions — of the researchers.”
2. Use language that correctly conveys the strength of the evidence that a research study or group of studies provides.
Researchers investigate an issue or problem to better understand it and build on what earlier research has found. While studies usually unearth new information, it’s seldom enough to reach definitive conclusions.
When reporting on a study or group of studies, journalists should choose words that accurately convey the level of confidence researchers have in the findings, says Glenn Branch, deputy director of the nonprofit National Center for Science Education, which studies how public schools, museums, and other organizations communicate about science.
For example, don’t say a study “establishes” certain facts or “settles” a longstanding question when it simply “suggests” something is true or “offers clues” about some aspect of the subject being examined.
Branch urges journalists to pay close attention to the language researchers use in academic articles. Scientists typically express themselves in degrees of confidence, he notes. He suggests journalists check out the guidance on communicating levels of certainty across disciplines offered by the Intergovernmental Panel on Climate Change, created by the United Nations and World Meteorological Organization to help governments understand, adapt to, and mitigate the impacts of climate change.
“The IPCC guidance is probably the most well-developed system for consistently reporting the degree of confidence in scientific results, so it, or something like it, may start to become the gold standard,” Branch wrote via email.
Gastel says it is important for journalists to know that even though research in fields outside mathematics does not prove anything, a group of studies, taken together, can provide evidence so strong it comes close to proof.
It can provide “overwhelming evidence, particularly if there are multiple well-designed studies that point in the same direction,” she says.
To convey very high levels of confidence, journalists can use phrases such as “researchers are all but certain” and “researchers have as much confidence as possible in this area of inquiry.”
Another way to gauge levels of certainty: Find out whether scholars have reached a scientific consensus, or a collective position based on their interpretation of the evidence.
Independent scientific organizations such as the National Academy of Sciences, American Association for the Advancement of Science, and American Medical Association issue consensus statements on various topics, typically to communicate either scientific consensus or the collective opinion of a convened panel of subject experts.
3. When reporting on a single study, explain what it contributes to the body of knowledge on that given topic and whether the evidence, as a whole, leans in a certain direction.
Many people are unfamiliar with the scientific process, so they need journalists’ help understanding how a single research study fits into the larger landscape of scholarship on an issue or problem. Tell audiences what, if anything, researchers can say about the issue or problem with a high level of certainty after considering all the evidence, together.
A great resource for journalists trying to put a study into context: editorials published in academic journals. Some journals, including the New England Journal of Medicine and JAMA, the journal of the American Medical Association, sometimes publish an editorial alongside a new paper, Gastel notes.
Editorials, typically written by one or more scholars who were not involved in the study but have deep expertise in the field, can help journalists gauge the importance of a paper and its contributions.
“I find that is really handy,” Gastel adds.
4. Review headlines closely before they are published. And read our tip sheet on avoiding mistakes in headlines about health and medical research.
Editors, especially those who are not familiar with the process of scientific inquiry, can easily make mistakes when writing or changing headlines about research. And a bad headline can derail a reporter’s best efforts to cover research accurately.
To prevent errors, Gastel recommends reporters submit suggested headlines with their stories. She also recommends they review their story’s headline right before it is published.
Another good idea: Editors, including copy editors, could make a habit of consulting with reporters on news headlines about research, science, and other technical topics. Together, they can choose the most accurate language and decide whether to ever use the word “prove.”
Gastel and Branch agree that editors would benefit from science journalism training, particularly as it relates to reporting on health and medicine. Headlines making erroneous claims about the effectiveness of certain drugs and treatments can harm the public. So can headlines claiming researchers have “proven” what causes or prevents health conditions such as cancer, dementia, and schizophrenia.
Our tip sheet on headline writing addresses this and other issues.
“‘Prove’ is a short, snappy word, so it works in a headline — but it’s usually wrong,” says Branch. “Headline writers need to be as aware of this as the journalists are.”
This post was originally published by The Journalist’s Resource and is reprinted here under a Creative Commons license.
Additional Resources
The Rise of Science-Based Investigative Journalism
Meet the Watchdog Scientists Battling Dubious Scientific Research
4 Tips for Avoiding Math Errors When Reporting
Denise-Marie Ordway joined The Journalist’s Resource in 2015 after working as a reporter for newspapers and radio stations in the U.S. and Central America, including the Orlando Sentinel and Philadelphia Inquirer. Her work also has appeared in publications such as USA TODAY, the New York Times, and Washington Post.