
XMRV prompts media thought: ask for the state of play

The following is a comment found on Sciblogs regarding the interpretation of scientific information... one study doesn't policy make, much less one press release.....


Previously I suggested that the media might ask experts what is known rather than their opinion.*
The fuss about the potential link between XMRV and CFS over the past few months has reminded me of the need for coverage to present what the current state of play is.
One of the more frustrating things for scientists to watch is media reports jumping in too soon,** reporting each new finding in an unresolved story as if it were the last word.
It portrays each research paper as definitive on its own. Research papers are in effect an argument for a case, a case that might later prove wrong.
“Instant” blow-by-blow accounts portray science as a progression of abrupt discoveries, rather than an accumulation of smaller pieces from many different sources that lead to larger conclusions over time. It is true that occasionally there are genuinely startling findings that fly in the face of most of what was known in an area, but these are rare; much more usual are additions to what is known.
Sometimes research findings are contradicted by later studies.
One of the basic foundations of science is that experiments should be able to be repeated. Results should be reproducible.
When results cannot be reproduced, further work is needed to resolve what is true or not, to find out just what made the difference.
Often neither result is “wrong,” but rather slightly different (and valid) approaches show results that are at odds with each other. This is frequently how new factors important to a story are uncovered.
The resolution of conflicting results can be quite technical. The different results might be due to different techniques, contamination, differences in samples, and other factors. Resolving them usually requires a detailed understanding of the area of science involved.
Last year evidence was presented that Xenotropic murine leukemia virus-related virus (XMRV) was associated with chronic fatigue syndrome (CFS), in a study published in Science. Other researchers found that they could not reproduce the result. Very recently a press release suggests that some up-coming work may back the initial claim of an association between XMRV and CFS. (The research behind this has not yet been published for other researchers to view, so this cannot be confirmed yet.)
I’m not going to discuss the details of the science in the XMRV-CFS story, other than to say it is still an unfolding story and one that ideally should be presented by a virologist. The footnotes contain some links for those that are interested.***
My focus here is on media reporting, in particular how new research findings are presented.
The XMRV-CFS story is an example of a story where the current state of play matters. Reporting the findings out of this context would be misleading.
Many media stories about XMRV (now) correctly point out that this is an emerging story, which is good to see. I can’t help but feel that they’ve been bitten (at least) once earlier in this story and aren’t willing to be bitten again!
Press releases from institutes or vested interests (e.g. those offering testing services) may largely present things from their point of view, playing down or leaving out competing views. One reason that media reports that are largely rewritten press releases are looked down on is that they often do not provide the full context that the new findings fall within (if any at all).
Unresolved areas of science can become quite polarised, especially when there are conflicting results, with some researchers falling into distinct camps.
Asking what one researcher’s findings indicate is likely to represent only part of the story, possibly a quite misleading part of the story if not placed in the full context properly.
In on-going stories — which in some sense all science stories are — interviewers might ask “what is the state of play?”
Is this an issue that is still being resolved? Do other studies confirm that the new results are (likely to be) correct? What aspects are still uncertain or implied, rather than tested?
Are the new findings accepted by most specialists in the area? (Taking care to remember that it is the soundness of the evidence that matters; opinions, as distinct from evidence, don’t count for much.)
Have other specialists had an opportunity yet to view the research and think it through? This is particularly important if the news is new: it takes time to absorb the details of research findings. Rapid reporting, particularly of embargoed research, can get ahead of the time needed to absorb the new results.
Like my earlier thought about asking what is known as opposed to the expert’s opinion, trying to understand the state of play is trying to understand the new results in the context of the literature or relevant scientific community as a whole, rather than the perspective of one person or one research paper.
All of this perhaps reinforces that specialist stories are best in the hands of specialists, but at least appropriate questions might be asked.
* I have to admit that I feel awkward writing these advisories. What I am writing seems dead obvious to me, to the point that I feel in danger of coming across as an idiot writing them. Nevertheless, the same problems keep recurring in media reporting of science, so I presume these messages are needed. I can only hope that they have some value.
** Desperate consumers jumping on new results, and dubious therapists, don’t help matters either.
*** On the potential association of XMRV with CFS. These notes are intended to provide links you may wish to investigate, not as a professional summary of what is understood. As this is an unfolding story, I would encourage people not to rush to conclusions.
Recent views on the story can be found at many places on the WWW. Note the discussions in the comments of these articles. This summary of an article that surveys the relevant literature (i.e., a review paper) may be an appropriate starting point for some. (This article also reminds readers that the initial discovery of this virus in humans was in association with prostate cancer (open access summary article), something that seems to be getting lost in some reports.)
The latest fuss appears to stem from a recent press release, to my reading prompted by journalists sitting in on a conference presentation, claiming that research by the NIH and FDA backs earlier work showing a correlation between XMRV and CFS. This announcement has been reported by these journalists ahead of the research paper, meaning that nobody — virologists or other experts in the specific field included — can comment meaningfully on these claims yet. The relevant institutions have not elaborated further, citing the work being in advance of publication. (A PDF file of the slides is available; see slide 10.) This is a single slide from a 30-slide presentation on issues in screening blood from donors. It is possible that, given their focus on blood screening, the up-coming work may not have direct relevance to CFS.
If it’s not already clear, I dislike advance media reports of this nature. I am similarly wary of reporting from conferences for a variety of reasons. (Too many to give here.)


Senior Member
Logan, Queensland, Australia

Modern science degrees have become very technical. There is no theory of science. I finished my biochem BSc in 2002, and not once did I encounter anything on experimental design. It was all chemistry theory and technique. My knowledge of scientific methods and theory is more from scientific philosophy as part of my artificial intelligence degree (theory of knowledge etc). So most of the people graduating with a modern science degree don't really know anything about experimental design ... and those who finish some other degree probably know even less. I suspect only science postgrads and some statisticians (and maybe some philosophers) get a good grounding in the theory of science and experimental design, and I think even here this often fails. If it didn't, most of the biopsychosocial theory people would have seen their errors and never published anything.


Sorry to have gone all professorial at you.... I'm just getting so fed up with all the bad science flying around.... Clearly our schools need to do a lot better at teaching basic science literacy. [grumble, grumble]


Senior Member
Clay, Alabama
My comment (also on Facebook)

News is what is new. It isn't "olds." So it is understandable that a change in conclusions about an illness that affects a large percentage of the population is a matter of public concern. These changes happen through published studies, so it is understandable that a study that reveals something new would be newsworthy. I hope this blogger is not proposing that news media hold off on reporting until six studies are published that agree.

If the message is to report in context, give opposing views, tell the history, state the limitations of the study, or offer other possible explanations, then I agree. But that is true in reporting on politics, crime, etc. Everything. Responsible journalism is precise and clear, not sensationalizing, but gives coverage appropriate to the public interest and impact. Not according to the timetable of the sources. And not waiting a year until everyone agrees. It reports what is new at that time.

While it is true that reporting on science may mean unskilled reporters writing for an unskilled audience (as far as medicine goes), it isn't hard to say, "They used different methods," or "Alternative explanations for these results include....."

The problem I see is that it takes six months for a paper to be published. The public's need for information is much more immediate than the standard research study cycle. And this is even more so when we are talking about a public health threat or new research on an illness that causes suffering. And a new retrovirus? It would be irresponsible not to report on it as it develops.

While I understand reviewers are needed, in addition to an editor, can these issues not be handled within a month?

I work in the news business. And T.V. news must take an event and get it to the public in less than five hours, often. Newspapers generally have a little longer. So why can't a medical journal get something to the public within a month? I guess it is because they are serving other researchers and not the public. That's the difference.

Which is why they will reveal their unpublished info to each other in conferences and workshops, but not to the public until the medical journal, in its own time, decides to put it out.