In order to arrive at what you do not know you must go by a way which is the way of ignorance.
The fundamental problem of original research is that it always takes place in areas of ignorance. This is unsettling enough when everyone agrees they are ignorant. When many feel they already know what is going on, without testing assumptions, original research becomes very difficult. In some cases, the science in question can only advance one funeral at a time.
There is an inherent paradox in training research scientists. You don't collect a series of academic merit badges in record time -- with high grades -- by questioning everything instructors tell you. A second problem is that people who have spent years learning preexisting science are likely to be uncomfortable when placed in a situation where there are no authorities or sources. A final hurdle comes in the transition from being graded, where avoiding criticism is paramount, to being the odd one out, advocating a position no one has previously considered.
The wonder is not that many scientists are unsuited to do original research, it is that any survive the process of training with their creativity and enthusiasm intact. Peer review for publication and funding can easily control those who do, because in any established field scientists who are not suited to original research always outnumber those who are. The result is that major innovation nearly always comes from the fringes, if not outside the field.
Pasteur is a good example of an innovator. He was not trained as a medical doctor or even a biologist, but as a chemist. (Chemists remember Pasteur for the discovery of stereoisomers in racemic acid. For contrast, Ignaz Semmelweis was trained as an MD, and his story is particularly sad.) Pasteur famously said "Chance favors the prepared mind." He carefully avoided saying what happens when preparation is inimical to innovation.
Switching fields, to show there is nothing special about biology, Roentgen discovered X-rays because a screen coated with barium platinocyanide was a few meters away from a Lenard tube when he tested it in a darkened room to check for light leaks in the covering. (He planned to use the screen later in his experiments, so he would have made the discovery eventually, but the question of why he had prepared such an unusual screen remains. There is some reason to believe he had heard of earlier anomalous results.) Though he was clearly working as a physicist, he is considered the founder of the medical field of radiology. He is a typical innovator in another respect: he was trained in mechanical engineering, not physics or electrical engineering (which scarcely existed).
Roentgen's discoveries sparked interest in phosphorescence and fluorescence. This led Henri Becquerel to test samples of uranium minerals known to glow after exposure to sunlight to see if they also emitted X-rays. He found they fogged photographic plates even when they did not phosphoresce. This was the discovery of radioactivity, something nobody had anticipated. When physicists learned that the energy these minerals were emitting did not fade away quickly, they began to look for other sources of radiation which might be powering these emissions. This led to the discovery of cosmic rays. (These were not the answer either; the answer involves E=mc^2.)
That discovery also took place via a bank shot. C. T. R. Wilson was working as a meteorologist on the summit of Ben Nevis when he became interested in the Brocken Spectre and other interactions of light with clouds. He couldn't easily get financing to study such frivolous things, so he decided to study cloud formation in the laboratory, which certainly might contribute to understanding weather. For this he needed a chamber in which air would undergo the changes in temperature, pressure and humidity which lead to the formation of clouds. Droplets forming clouds condense around charged particles. His cloud chamber became a detector of energetic charged particles, which ionized air by knocking off electrons, leaving visible trails in the chamber. One unexpected finding was that ionizing radiation was more likely to be found on mountain tops than in (most) deep mines. Cosmic rays were coming from outside the Earth.
Meanwhile, back with a scientist who did innovate, but missed a great shot at immortality, consider William Crookes. Like most innovative characters in this story, he came from a different field. He was trained as a chemist, and had a strong interest in spectroscopy. Flame spectra were not pure enough to let him identify chemical elements and measure their properties, so he developed vacuum tubes in which glows were excited by electrical discharges. (These went on to become the foundation of electronics, before being replaced by solid state devices.)
Even so, he missed several opportunities. It was J. J. Thomson who determined that the beam in a cathode ray tube was composed of charged particles smaller than atoms. More embarrassingly, there is a story that Crookes once found a stack of photographic plates, stored under one of his bulky tubes, fogged in some way he could not explain. He berated the laboratory assistant for storing them where they could be damaged, and thus missed the discovery of X-rays.
Let's get back to medicine, in which everyone here has a personal stake. What about major innovations there?
Anesthesia is a good place to start. Nitrous oxide (N2O) was discovered by a chemist, Joseph Priestley, under the name "phlogisticated nitrous air." (Not exactly great from a marketing standpoint. The term gas had been coined earlier by van Helmont, who derived it from the same Greek root as chaos, to indicate a state without fixed form or volume. The connection Priestley postulated with the theory of phlogiston was a major obstacle to understanding the chemistry of N2O.) Thomas Beddoes and James Watt later published a book on possible medical uses of this and other gases, which they called factitious airs. Beddoes was a physician, but Watt was, as might now be predicted, out of his field.
Beddoes was wedded to the idea that these gases could be therapeutic. In this he was ahead of his time, at least with respect to oxygen. (Though it wasn't called oxygen until Lavoisier named it, under the erroneous impression it was a component of all acids. We should be thankful it wasn't named antiphlogiston.) Unfortunately, other experiments with gases had less desirable outcomes, particularly with carbon monoxide. Humphry Davy survived an experiment; a young French chemist later did not.
Many Englishmen were aware of the experiments at Beddoes' Pneumatic Institution in Bristol. Some well-known students and poets even took part. The recreational properties of N2O were explored at some length. During the course of these experiments, many noted the complete absence of pain. Now we come to the inexplicable part of the story.
Clinical trials of N2O began in 1798. These were aimed at therapeutic use in treatment of diseases like tuberculosis. Despite those early reports of anesthesia, I can find no evidence of deliberate use of N2O to relieve pain in surgery prior to 11 December 1844, when a dentist from Hartford, Connecticut, used it during the extraction of a tooth. (By this time, diethyl ether had already been used as a general anesthetic, but that is another story.) This casts a poor light on innovation within medicine.
What about antisepsis? Many readers will be surprised to learn the germ theory of infectious disease wasn't presented as a real medical hypothesis until after 1850. Pasteur and Henle were two prominent proponents, but there were others. (I won't even try to separate conflicting claims of priority.) There were plenty of earlier guesses and suggestions, but the science of microbiology remained separate from medicine for a large part of the 19th century.
As strange as it now seems, microbiology was about as remote from the practice of medicine as bird watching or the study of lepidoptera. People tend to forget that Robert Koch developed his famous postulates for identifying pathogens because he was fighting the prevailing miasma theory of diseases like cholera. We are still fighting over germ theory today.
As a linguistic fossil of that period, the name malaria (Italian for bad air) remains in use for the disease caused by a microbial parasite carried by a mosquito vector. Influenza is another Italian name, probably referring to the malign influence of winter stars. This was much more plausible to many than the idea of a submicroscopic virus which could be crystallized and stored like salt. (For that matter, I'm afraid it still is. Does your newspaper include horoscopes?)
Acceptance of the germ theory was surprisingly slow, but, in a few cases, one dramatic example could change medical practice, if the right person noticed. During the American Civil War (a.k.a. the War Between the States), John Shaw Billings was medical inspector for the Army of the Potomac (Union forces). The battle of Gettysburg left about 50,000 casualties, completely overwhelming medical facilities. Thousands of wounded soldiers could not even be housed in hospital tents, and lay outside even in rain. To Billings' surprise, those outside the tents fared better than those inside. You could call this a large-scale experiment in hygiene.
Billings went on to design new hospitals, notably at Johns Hopkins Hospital, where a central building is still named for him. His designs became models for later hospitals. Hospitals became somewhat less likely to spread disease.
What about antibiotics? (Patience, they show up later.) Once infectious agents were identified or suspected, the first thought seems to have been to poison them. Early antiseptics were ferocious. One reason surgeons wore rubber gloves was to protect their hands from the carbolic acid being sprayed. (The quip about Lister's surgical practice was that he began an operation by saying "Gentlemen, let us spray.") It took a while to realize that wounds were often being damaged more by antiseptics than by pathogens. (This discovery came during WWI, which provided abundant opportunities for the study of wounds and sepsis.) The switch from antiseptic to aseptic surgery still took time.
The idea of selectively poisoning pathogens drove the early search for chemical therapies. The spirochete which causes syphilis, Treponema pallidum, was attacked with a variety of compounds designed to deliver either mercury or arsenic to the bacteria. These were also toxic to the cells of the patient's body.
Paul Ehrlich described his ideal therapeutic agents as "magic bullets" (literally Zauberkugeln, German for magic balls). The idea was that a sort of scattershot attack would magically hit only the pathogens. After mercury, many early drugs for syphilis, like Salvarsan, were arsenic compounds. Today, treatment typically uses penicillin-G or more modern antibiotics. No toxins are required.
The idea of selective poisoning lingered on in medicine. When the Bayer company developed the first sulfa drug, their chemists tried to apply their experience tweaking dyes for different colors to create a compound that would selectively poison bacteria. Dyes had shown surprisingly selective ability to stain both tissues and pathogens. (Ask a pathologist what Gram-positive or Gram-negative means.)
If I said chemists knew what was going on while doctors did not, I would be lying. The real nature of the chemical bond wasn't elucidated until after quantum mechanics was developed in the 1920s. Predicting the way molecules will bond to other molecules or tissues remains something of an art. Despite this ignorance, chemists had the ability to alter molecules in ways that were not entirely random.
There may have been patent issues behind bypassing sulfanilamide for colored compounds, but many of those same chemists appear to have been genuinely surprised when a French group found that ordinary, colorless sulfanilamide was cheaper and more effective than the red-colored Prontosil, developed at great expense.
Sulfa drugs are now classified as antimetabolites. They don't so much poison bacteria as allow them to starve to death while ingesting substances almost entirely harmless to people. Most antibiotics use entirely different mechanisms. In oversimplified terms, they merely mark pathogens and/or alert the immune system to their presence.
You can find another technological fossil in the brand-name topical antiseptic Mercurochrome (merbromin). It plays to two persistent beliefs which help sales: it is an organomercuric compound, therefore toxic, and it has a bright red color. ("No mercy to pathogens!") It was commonly used in my childhood, and I believe it is still on the market.
A third round of the poisoning theory was applied to cancer. This is still with us. Because cancer cells are derived from the patient's own body, it is virtually impossible to avoid serious side effects.
In a high-school science club, I once went on a tour of a cancer research lab. Like most young people seeing the treatment of laboratory animals, I was bothered by the suffering involved. I assumed it had to serve an important purpose, backed by good reasons. After I heard the rationale behind their experiments, I left with the strong conviction those people didn't know what they were doing. I have seen no reason to modify that opinion. To avoid being treated like those animals, I have abstained from smoking ever since.
I'll take up the role of ignorance when new therapies are being devised and tested in my next installment. What I've discussed above is primarily passive ignorance, the absence of knowledge. Active ignorance is a much more dangerous force.