undiagnosed
Senior Member
Interesting article about systematic and predictable errors in human judgment and their implications for medicine.
The point, once again, wasn’t that people were stupid. This particular rule they used to judge probabilities (the easier it is for me to retrieve from my memory, the more likely it is) often worked well. But if you presented people with situations in which the evidence they needed to judge them accurately was hard for them to retrieve from their memories, and misleading evidence came easily to mind, they made mistakes. “Consequently,” Amos and Danny wrote, “the use of the availability heuristic leads to systematic biases.” Human judgment was distorted by ... the memorable.

I think GPs are doing something similar all the time, primarily due to never having the time to get all of the evidence they need to make a good judgement.
I now often frustrate my colleagues by answering their questions with other questions.
To Redelmeier, the very idea that there was a great deal of uncertainty in medicine went largely unacknowledged by its authorities.
The brain does not have the processing power to reassess everything every single time, so it has evolved an extremely efficient way of basing decision making mostly on assumption.
"I think GPs are doing something similar all the time, primarily due to never having the time to get all of the evidence they need to make a good judgement."

The financial imperative to rush through as many patients as possible, especially if governed by fixed payments etc., drives this situation to a large extent. Medicine is in many ways going backwards, even while medical science is still advancing (and taking dead-end paths as well).
Artificial intelligence is all about heuristics ... it's how you write expert systems.
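To make that concrete, here is a toy sketch (my own, with invented rule names and made-up facts, not from any real system) of how an expert system encodes heuristics as condition-to-conclusion rules:

```python
# Toy rule-based expert system: heuristics encoded as rules that fire
# when their conditions match the known facts. All rule names and facts
# here are invented purely for illustration.

def rule_flu(facts):
    # Heuristic: fever plus aches suggests flu.
    if {"fever", "aches"} <= facts:
        return "suspect flu"

def rule_dehydration(facts):
    # Heuristic: dizziness plus low fluid intake suggests dehydration.
    if {"dizziness", "low fluid intake"} <= facts:
        return "suspect dehydration"

RULES = [rule_flu, rule_dehydration]

def infer(facts):
    """Fire every rule whose conditions match the known facts."""
    return {conclusion for rule in RULES if (conclusion := rule(facts))}

print(infer({"fever", "aches"}))  # {'suspect flu'}
```

Like a GP's mental shortcuts, such rules are fast and cheap to apply, but they are only as good as their coverage: facts the rules never ask about simply never influence the conclusion.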
Heuristics are fine in closed loop systems where all functions are well understood. Not so in medicine, which is not understood to the same level as most engineered systems.

But of course closed loop control is also used where not all functionality is fully understood, and closing the loop helps cope with the discrepancies. One of the dangers in engineering is arrogantly believing the system is fully understood, and the modelling of it is therefore virtually perfect. Solutions based on that model will therefore also be presumed nigh on perfect. Until real life shows up loopholes in the model, and the "solution" fails to solve. It seems there are some medical "professionals" of the same persuasion.
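A minimal numerical sketch of that point (my own illustration, with an assumed first-order plant and made-up gains): an open-loop controller that trusts an imperfect model misses the target, while closing the loop with feedback compensates for most of the model error it never knew about.

```python
# Open-loop vs closed-loop control of a simple first-order plant.
# Both controllers assume a plant gain of 1.0, but the true gain is 0.8.

def simulate(controller, true_gain=0.8, target=10.0, steps=50):
    output = 0.0
    for _ in range(steps):
        u = controller(target, output)
        output += 0.5 * (u * true_gain - output)  # first-order plant response
    return output

def open_loop(target, _output):
    # Trusts the (wrong) model completely; never looks at the real output.
    return target

def closed_loop(target, output):
    # Proportional feedback: push harder when the measured error is large.
    return target + 2.0 * (target - output)

print(round(simulate(open_loop), 2))    # 8.0  -- the model error goes uncorrected
print(round(simulate(closed_loop), 2))  # 9.23 -- feedback compensates for most of it
```

Note that simple proportional feedback still leaves a residual steady-state error; an integral term would remove it. The point stands either way: the loop copes with a discrepancy the model never captured.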
"reinforcement learning"

Reinforcement learning has major limitations, regardless of the system. It's how science used to work, but (aside from poor research practices in disciplines like psychiatry) it was abandoned in science as it can lead to bad decision making and hypotheses.
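For what reinforcement learning looks like at its simplest, here is a sketch (my own illustration, with made-up reward probabilities) of an epsilon-greedy two-armed bandit. It learns purely from rewards, which is exactly where the trouble comes from: with too little exploration it can commit to an inferior action that happened to pay off early.

```python
import random

def run_bandit(epsilon, steps=1000, seed=0):
    """Epsilon-greedy agent learning action values from rewards alone."""
    rng = random.Random(seed)
    true_means = [0.4, 0.6]   # action 1 is genuinely better (invented numbers)
    estimates = [0.0, 0.0]    # the agent's learned value estimates
    counts = [0, 0]
    for _ in range(steps):
        if rng.random() < epsilon:
            action = rng.randrange(2)                  # explore at random
        else:
            action = estimates.index(max(estimates))   # exploit current belief
        reward = 1.0 if rng.random() < true_means[action] else 0.0
        counts[action] += 1
        # Incremental mean update of the chosen action's estimated value.
        estimates[action] += (reward - estimates[action]) / counts[action]
    return estimates, counts
```

With epsilon near zero the agent rarely questions its early estimates — the analogue of a clinician who stops re-testing a hypothesis that seemed to work the first few times.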
I couldn't help thinking that the BPS brigade could do with reading it and reflecting on how their biases might be affecting their work.

And you really think they would do that?!