Physicians, patients, and journalists should all be wary of news stories
based on medical meeting presentations that have not yet undergone peer review
and scientific publication, according to two physicians who have long
campaigned for better medical reporting.
"The most direct way to improve the media coverage of scientific
meetings would be to have less of it," wrote Steven Woloshin, M.D.,
M.S., and Lisa M. Schwartz, M.D., M.S., in the June 5 Medical Journal of
Australia. Both authors are associate professors of medicine at the
Veterans Affairs Outcomes Group of Dartmouth Medical School in White River Junction, Vt.
Presentations at scientific meetings are often big news, said Woloshin and
Schwartz, although that often reflects the combined interests of all
involved. Meeting organizers like the attention that news stories bring,
researchers garner academic resume builders, the public enjoys hearing about
the latest discoveries, and reporters please their editors. Commercial
pressures are inevitable, too.
"Journalists think they're above industry influence, but if you're
not aware of industry-funded research and how positive results generate
profits, you're wearing blinders," said Gary Schwitzer, director of the
health journalism program at the University of Minnesota School of Journalism
and Mass Communication in Minneapolis, in an interview.
Woloshin, Schwartz, and Schwitzer were faculty members at the annual
Medicine in the Media program, sponsored by the National Institutes of
Health, in June. The program is designed to help journalists better evaluate and
report on medical research. Schwitzer also publishes "Health News
Review," a Web site that evaluates medical news stories.
Woloshin and Schwartz looked at 174 newspaper stories and 13 television or
radio stories reporting on study presentations given in 2002 and 2003 at
meetings of the American Heart Association, International AIDS Conference,
American Society of Clinical Oncology, Society for Neuroscience, and
Radiological Society of North America.
The news stories often omitted basic facts about the studies, they said.
About one-third didn't mention the study size, for instance, and more than
half failed to state the study design or described it so unclearly that even
clinical trial experts could not tell what it was. About 40 percent of the stories
failed to quantify results at all, and another 21 percent quantified the main result
but gave only a relative change without a base number or rate. Two-thirds
presented only surrogate outcome measures (like blood pressure or tumor size)
rather than patient outcomes. Only a minority made any mention of study
cautions, like side effects or other risks, small study size, or the possible
lack of applicability of animal studies to human beings.
The preliminary nature of meeting reports was rarely noted, said Woloshin
and Schwartz. In fact, 173 of the 187 stories failed to state that the
findings were unpublished, had not gone through peer review, or might change
as the study continued.
Aside from having fewer such stories, all parties could take steps to
improve coverage of meeting reports, said the authors. Researchers could
include appropriate caveats in their presentations and in interviews. Meeting
organizers could do the same with their press releases, as well as including
data tables and absolute risks of outcomes. Presenters and organizers could
both indicate the preliminary nature of the work and the need to wait for peer
review. Reporters and editors should be aware of these trouble spots, too, and
press for more detail when it is needed, and write with more caution.
Of course, not all early research reports present findings that turn out to
be inaccurate as the research continues or as study data undergo more analysis.
"Often they let people know about negative trials," she said. "However,
physicians should approach preliminary news reports with skepticism. Often,
it's better to wait until we see more data."
Physicians should also steel themselves for the all-too-common moments when
patients arrive waving news clippings based on research not yet peer reviewed,
added Schwitzer. Doctors need to point out that such stories are reporting on
findings that are still preliminary and that evidence for benefits or side
effects isn't all in yet, he said.
Researchers, journalists, and clinicians should all understand basic
aspects of study design to avoid misconstruing results. Studies and the news
stories based on them should make clear the strength of the evidence
presented, its relation to prior work, and its relevance for readers. Reports
of intervention studies should note both clinical benefits and downsides, and
possible alternative treatments.
Providing that information should go a long way toward more accurate and
more useful coverage of medical meetings, said the authors.
"Media Reporting on Research Presented at Scientific Meetings:
More Caution Needed" is posted at <www.mja.com.au/public/issues/184_11_050606/wol10024_fm.html>.
"Health News Review" is posted at <www.healthnewsreview.org>. ▪