
Nature editorial: the human desire to be right is a major problem for robust science

Discussion in 'Other Health News and Research' started by Simon, Oct 8, 2015.

  1. Simon

    Simon

    Messages:
    1,921
    Likes:
    14,516
    Monmouth, UK
    Let’s think about cognitive bias : Nature News & Comment

Nature's latest on problems in research focuses on the problem of being human.
    There follows a series of recommendations, including crowdsourcing analysis (one dataset, lots of different teams, which produces more accurate, more nuanced and less biased results); blind analysis (prize to anyone who can explain this in lay terms); and preregistering analysis plans [and sticking to them!]. Worth a read.
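    Here is one attempt at that lay explanation, as a minimal Python sketch (with made-up measurement numbers, not from any real study). In a blind analysis, the data are shifted by a hidden offset before the analyst sees them; every analytic choice (outlier cuts, estimator, etc.) is frozen while looking only at the blinded numbers, and the offset is removed just once at the end, so no choice can be steered toward a hoped-for answer:

    ```python
    import random
    import statistics

    def blind(data, seed):
        """Shift every measurement by a hidden offset so the analyst
        cannot steer analytic choices toward a hoped-for answer."""
        rng = random.Random(seed)
        offset = rng.uniform(-10, 10)  # hidden from the analyst
        return [x + offset for x in data], offset

    def analysis(data):
        # The analyst freezes all choices (here: drop the two most
        # extreme values, then take the mean) on the blinded data.
        trimmed = sorted(data)[1:-1]
        return statistics.mean(trimmed)

    measurements = [9.8, 10.1, 10.3, 9.9, 14.0, 10.0]  # hypothetical data

    blinded, offset = blind(measurements, seed=42)
    blinded_result = analysis(blinded)

    # Unblinding happens exactly once, after the method is fixed:
    true_result = blinded_result - offset
    print(true_result)
    ```

    Because the offset is a constant shift, the frozen procedure gives the same answer after unblinding as it would have on the raw data; the point is that the analyst could not peek at the answer while choosing the procedure.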

    There are several related articles:

    The editorial points to the big data fields of genomics and proteomics that got their act together after early work 'was plagued by false positives':
    Added: a couple of interesting points from the crowdsourcing analysis paper, which looks at what happened when 29 different teams analysed the same dataset: after discussion, most of the approaches were deemed valid, though they produced a range of answers:

    Crowdsourcing research can reveal how conclusions are contingent on analytical choices.

    Under the current system, strong storylines win out over messy results. Worse, once a finding has been published in a journal, it becomes difficult to challenge. Ideas become entrenched too quickly, and uprooting them is more disruptive than it ought to be. The crowdsourcing approach gives space to dissenting opinions.
     
    Last edited: Oct 8, 2015
    Helen, Esther12, Woolie and 8 others like this.
  2. Sean

    Sean Senior Member

    Messages:
    3,257
    Likes:
    17,985
    The first principle is that you must not fool yourself — and you are the easiest person to fool.

    Richard Feynman.
     
    Helen, Simon, SOC and 2 others like this.
  3. Snow Leopard

    Snow Leopard Hibernating

    Messages:
    4,613
    Likes:
    12,436
    South Australia
It is not merely a 'desire to be right'. It is the fact that careers depend on it (positive results and exaggeration).

    Who is going to fund the researcher with a history of null results? Who is going to get excited about studies that find no effect?

    We need to change how science is funded if we are to encourage less biased analyses.
     
    Helen, Sidereal, Esther12 and 6 others like this.
  4. snowathlete

    snowathlete

    Messages:
    3,312
    Likes:
    14,610
    UK
I agree, there is a range of problems, as Simon highlighted, and they are all real. But there is also the issue that people (including whole groups) are invested in certain outcomes, because those outcomes affect their income and status.

    You see all the same things in other fields: history, and so on.

    Good that these things are being debated openly.
     
    Sean, Antares in NYC, ahmo and 2 others like this.
  5. alex3619

    alex3619 Senior Member

    Messages:
    12,489
    Likes:
    35,079
    Logan, Queensland, Australia
In a broader context, this extends to the entire notion of authority and success in society. It's a social problem: expert good, doubting person bad. I would like to quote Bertrand Russell:

    [image: Bertrand Russell quote]

    Getting around this issue, in science, was in part what Critical Rationalism was intended for.

    Getting around this issue, in general, was in part the purpose of Pancritical Rationalism.

    Public relations, political and managerial techniques are often about creating the illusion of certainty.

    Doctors like to convey an attitude of authority. I suspect that is why evidence based medicine has wide appeal, but evidence based practice is disliked.

    PS Apparently the image link does not work for everyone, so here is the quote:

    "The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts."
     
    Last edited: Oct 8, 2015
    Helen, Simon, SOC and 2 others like this.
  6. Antares in NYC

    Antares in NYC Senior Member

    Messages:
    582
    Likes:
    1,648
    USA
    Excellent find, @Simon. Great article.

It does explain some of the controversies and lack of progress in research into diseases like ME/CFS and Lyme. It's frustrating to read about the power dynamics in the world of scientific research, but I'm glad this problem is now in the spotlight.

    Seems like we didn't learn much from the experience of Galileo. Just replace the church with the appropriate medical boards/scientific authorities.
     
    Helen likes this.
  7. Woolie

    Woolie Senior Member

    Messages:
    1,930
    Likes:
    14,556
Yep, it's a problem in research. But add an extra zero to that when you're talking about medical diagnosis:

    Confirmation Bias and Diagnostic Errors
    Deborah Cowley, MD reviewing Mendel R et al. Psychol Med 2011 May 20.
    Seeking disconfirmatory evidence can help to improve diagnostic accuracy.

    Psychology studies have shown that people tend to confirm their preconceived ideas and ignore contradictory information, a phenomenon known as confirmation bias. To examine the role of confirmation bias in psychiatric diagnosis, researchers gave a case vignette to 75 psychiatrists (mean duration of professional experience, 6 years) and 75 fourth-year medical students. Participants were asked to choose a preliminary diagnosis of depression or Alzheimer disease and to recommend a treatment. The vignette was designed so that depression would seem the most appropriate diagnosis. Participants could then opt to view up to 12 items of narrative information; their “summary theses” were balanced between the two diagnoses, but the narratives overall favored the dementia diagnosis.

    For the preliminary diagnosis, 97% of psychiatrists and 95% of students chose depression. Thirteen percent of psychiatrists and 25% of students chose to see more confirmatory than disconfirmatory items. In the end, 59% of psychiatrists and 64% of students reached the correct diagnosis of Alzheimer disease. Psychiatrists performing confirmatory searches were less experienced and more likely to make the wrong diagnosis (70% vs. 27% of those who sought disconfirming information). Participants were more likely to make the wrong final diagnosis if they chose to view six or fewer pieces of additional information. Making the wrong diagnosis affected treatment decisions.

    - See more at: http://www.jwatch.org/jp20110613000...as-and-diagnostic-errors#sthash.XuC92akW.dpuf
     
    A.B., Valentijn, Simon and 5 others like this.
  8. alex3619

    alex3619 Senior Member

    Messages:
    12,489
    Likes:
    35,079
    Logan, Queensland, Australia
It is well known that confirmatory strategies have a high error rate. Popper emphasized one way to address this in Critical Rationalism: it's not enough to gather data that supports a hypothesis (though this is fine while developing a hypothesis); you have to create experiments that test the hypothesis. You have to actually go looking for disconfirmatory data.
     
  9. Simon

    Simon

    Messages:
    1,921
    Likes:
    14,516
    Monmouth, UK
    It's a great quote!

    But I don't think we should assume that We on the forum are universally wise and They are the fools; the thing is, we're all human, with our own biases, desire to be right and lack of interest in disconfirmatory evidence. The best we can hope for is to be more aware of those biases.

    One of the interesting ideas in the crowdsource article I mentioned above is using independent teams as devil's advocates:
     
    Last edited: Oct 9, 2015
    Woolie, Maria1 and A.B. like this.
  10. alex3619

    alex3619 Senior Member

    Messages:
    12,489
    Likes:
    35,079
    Logan, Queensland, Australia
It is indeed interesting, and it's supposed to be, in part, what happens through peer criticism. However, doing that requires the data to be made public to other researchers. Data secrecy is the enemy of good science.

    Nobody is really wise, just more or less wise. Human nature, brain limitations, cultural limitations, and the impossibility of knowing everything guarantee that nobody gets it right in every situation. What makes wisdom different, in my view, is that some people recognize that they and others have limitations and work to mitigate that. The opposite is those who are so convinced they are right that they ignore all criticism.
     
  11. Simon

    Simon

    Messages:
    1,921
    Likes:
    14,516
    Monmouth, UK
    A bit more about another article in the Nature special issue

    How scientists fool themselves – and how they can stop : Nature News & Comment

On the problems of big data, with the risk of being able to mine any answer with enough searching:
    Solutions discussed included this:
    Another one is getting peer review - and acceptance - of studies based on the submitted methodology, before the work is done. After all, peer reviewers are supposed to focus on methodological flaws, not on whether they like the findings:
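    The "mine any answer with enough searching" risk is easy to demonstrate with a toy simulation (hypothetical numbers, pure Python, no real data): run 1,000 studies where the true effect is exactly zero, flag any study whose result lands about two standard deviations from chance, and a few percent will still look "significant". Report only those and the literature fills up with phantom effects:

    ```python
    import random

    def run_null_experiment(rng, n=100):
        """One 'study' with no real effect: n fair coin flips."""
        heads = sum(rng.random() < 0.5 for _ in range(n))
        # Flag as 'significant' if heads deviates from n/2 by about
        # two standard deviations (sd = 5 when n = 100).
        return abs(heads - n / 2) >= 10

    rng = random.Random(0)
    n_studies = 1000
    false_positives = sum(run_null_experiment(rng) for _ in range(n_studies))

    # Every study above tested pure noise, yet roughly 5% clear the bar.
    print(f"{false_positives} of {n_studies} null studies looked significant")
    ```

    Preregistration and methods-based review attack exactly this: if the analysis is fixed (and reviewed) before the data are seen, you can no longer keep searching until one of the noise-driven "hits" turns up.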
    Visual summary:


    [image: visual summary from the Nature article]
     
    Last edited: Oct 10, 2015
    Woolie, Helen, ahimsa and 2 others like this.
  12. Sean

    Sean Senior Member

    Messages:
    3,257
    Likes:
    17,985
    Sounds good. :thumbsup:
     
    Simon likes this.
  13. alex3619

    alex3619 Senior Member

    Messages:
    12,489
    Likes:
    35,079
    Logan, Queensland, Australia
    "Science is an ongoing race between our inventing ways to fool ourselves, and our inventing ways to avoid fooling ourselves."

    :)
     
    Antares in NYC likes this.
