Showing posts with label history. Show all posts

Thursday, August 7, 2014

Hepatitis in the summer of '69

A remarkable epidemic took place in 1969 at the College of the Holy Cross in Worcester, Massachusetts. In the autumn of that year, the college was forced to cancel the football season after the first two games due to an outbreak of hepatitis. 

The first football game of that season was against Harvard and took place on September 27th; Holy Cross lost 13-0. The team appeared sluggish to fans, and one player missed the game due to fever. Michael Neagle described what happened next in a 2004 essay:
Players began dropping out during the week leading up to the team’s next game at Dartmouth [on October 4th]. What had been described as a “flu bug” by newspapers during the week was confirmed as hepatitis the day of the game. Eight players did not make the trip because of illness. Some got sick on the drive up. More were sidelined when they fell ill during the game . . .
Holy Cross lost 38-6. There were interesting facets to this outbreak, as told in a 1972 Associated Press story:
The outbreak was somewhat puzzling because faculty members, the freshman football team, and others on the Worcester, Mass., campus before formal opening of classes were not affected. Food services were studied and did not produce suspicious leads.
Neagle describes what was eventually pieced together:
. . . [the] season was doomed after just the second day of practice. On Aug. 29, a hot summer day in Worcester, on the practice fields where the Hart Center now stands, players drank water from a faucet that was later found to be contaminated with hepatitis. Though investigators almost immediately suspected the drinking fountain as the source of the illness, it took nearly a year to determine conclusively the sequence of events that led to the contamination.

On that fateful day, firefighters battled a blaze on nearby Cambridge Street. This caused a drop in the water pressure, allowing ground water to seep into the practice field’s irrigation system. That ground water had been contaminated by a group of children living near the practice facility who were already infected with hepatitis. Once the players drank from the contaminated faucet, they too became infected.
A 1972 study by Morse et al described the epidemiology of the event:
Of 97 persons exposed, 90 were infected, 32 experienced typical icteric [jaundice] disease, 22 were anicteric but symptomatic, and 36 asymptomatic players were recognized as having significantly elevated serum glutamic pyruvic transaminase values (> 100 units). Other athletes, using the same facilities but arriving six days after the established date of exposure, were unaffected. The decision to obtain blood samples from the entire team, as soon as the initial cases were recognized, resulted in the demonstration of an unexpectedly high attack rate of 93% . . .
An attack rate of 93% is remarkable, but potentially consistent with a high inoculum that could have been delivered by contaminated water. Friedman et al returned to the event in a 1985 study. Using a radioimmunoassay to test stored serum samples for IgM antibody to hepatitis A virus, they found that
Only individuals with icteric hepatitis were found to have IgM anti-HAV in serum; those with presumed anicteric illness were shown not to be infected with hepatitis A virus. The attack rate was thus only 34%, not 93% as originally reported, and the incidence of icteric illness in those infected was 100%, not 33%.
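The two attack rates follow directly from the counts quoted above. As a quick back-of-the-envelope check (the numbers here are taken from the quoted passages; Friedman et al's 34% presumably reflects a slightly different case count than the 32 icteric cases reported in 1972):

```python
# Back-of-the-envelope check of the attack rates quoted above.
exposed = 97        # persons exposed, per Morse et al (1972)
infected_1972 = 90  # "infected" by the serologic criteria of the era
icteric = 32        # typical jaundiced (icteric) cases

# Original attack rate: all presumed infections over all exposed.
print(round(100 * infected_1972 / exposed))  # -> 93

# 1985 revision: only the icteric cases had IgM anti-HAV, so only they
# count as true hepatitis A infections.
print(round(100 * icteric / exposed))        # -> 33 (reported as 34%)
```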
What made the other players sick thus remains a mystery, though one can speculate about environmental pathogens that could have entered the practice-field faucet during the same negative-pressure event. I'm always intrigued by disease events that seem open-and-shut given the technology of one era but less so when reanalyzed with the technology of another. This is one of those events.

(image source: Wikipedia)

Monday, July 21, 2014

Software and computing: How far we've come in a very short time

When I was in graduate school in the early 1990s, DEC and SGI Unix workstations were the hottest things around. We programmed mostly in Fortran 77, and occasionally in Matlab® and Macsyma. As I recall, the different machines had up to a few hundred megabytes of RAM and processor speeds up to a few hundred MHz. I had grown up programming some of the first personal computers, which were very modest by comparison, and using these workstations made me feel as if almost anything could be achieved computationally. Such machines cost well over $10K, and the operating systems were proprietary, licensed, and expensive -- as were most of the useful applications.

Today, fast, high capacity, multi-core Linux machines are cheap. They can run Fortran and other traditional languages such as C, which are now available for free, as well as new open source packages like R and the Python language. Many of these packages are highly developed and are continuously under expansion and refinement. Libraries exist for nearly any conceivable computational problem. There are even open source analogues of Matlab® (Octave) and Macsyma (Maxima).

Very advanced methods of computation are now widely available at low cost. A major reason is the open source software movement, whose ideas now extend to making the data and research codes used in published studies available for others to use. Twenty-five years ago, when beginning research on a new problem, one commonly had to write new code from scratch. Today, one can turn to blogs or GitHub for code that can be adapted to the problem at hand, radically shortening the code development, testing, and validation cycle. Recent work emphasizing the reproducibility and transparency of computational science promises to extend such progress further still.

Such developments allow applications and methods to be shared and applied very broadly and across research fields. It's not uncommon for methods and codes developed for engineering, physics, and finance, for example, to be applied to problems in biology, medicine, and public health. Instead of writing code to translate abstract or unfamiliar equations into a local implementation, one often only has to install a library or find and download code (e.g., from GitHub or an online open source journal), possibly revise, and then apply to data. More time can be spent on thinking and communicating science, instead of coding and computing.
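As an illustration of how far off-the-shelf open source tooling goes, here is a minimal sketch (a hypothetical example, not from the original post) that integrates a classic SIR epidemic model with SciPy's stock adaptive ODE solver -- the sort of routine one would once have hand-coded in Fortran on a workstation:

```python
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    """Classic SIR epidemic model: returns dS/dt, dI/dt, dR/dt."""
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

# Integrate with an adaptive Runge-Kutta solver pulled "off the shelf" --
# no need to write one's own stepper as in the workstation era.
sol = solve_ivp(sir, t_span=(0, 60), y0=[0.99, 0.01, 0.0],
                args=(0.5, 0.1), dense_output=True)
s, i, r = sol.y[:, -1]
print(f"final susceptible fraction: {s:.3f}")
```

Here `beta` and `gamma` are illustrative transmission and recovery rates; the point is that the solver, not the model, comes for free.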

With these advances come dangers as well, but they are manageable. For example, undetected bugs in widely shared code can quickly threaten the integrity of results across multiple fields -- a frightening proposition. It is also possible to use methods and codes in theoretically incorrect or unjustified ways if one doesn't understand their basis and limitations. Such dangers are nothing new in science, and they highlight the importance of working with others who are expert in new methods.

Current trends will only increase the computational capabilities available. It's an exciting time to work in mathematical and computational methods in biomedical science.

(image source: Wikipedia)

Sunday, July 13, 2014

Anti-vaccination movement: Nothing new

I've always thought of the anti-vaccination movement as beginning in the aftermath of the bogus (not to mention fraudulent and retracted) 1998 paper associating vaccines with autism. Recently I've become more interested in the movement and have begun reviewing the associated literature. Perhaps what I've found shouldn't surprise me, but I'm astonished nonetheless: notions against vaccination have existed for a long time, dating back at least to the British compulsory vaccination laws of the 19th century.

Jeffrey Baker describes the history of anti-vaccine movements in a very informative paper on the pertussis vaccine controversy in Great Britain in the late 1970s and early 1980s. His study recounts and analyzes how a 1974 series of case reports describing alleged diphtheria–tetanus–pertussis (DTP) vaccine adverse reactions led to plummeting vaccination rates and a resurgence in disease. The study describes many dynamics taking place then that resonate with events surrounding the MMR vaccine recently. For example,
  • Reports of supposed vaccine injuries were published
  • Vaccine victim/anti-vaccine advocacy groups were formed
  • A number of physicians recommending against vaccination emerged as a group
These and other forces led to a sharp decline in public acceptance of the DTP vaccine then in use and to an increased incidence of pertussis, the likes of which had not been observed for 20 years (another similarity with the current outbreaks of vaccine-preventable disease in the United States).

Importantly, Baker hypothesizes that, although the press played a role in initiating the anti-DTP vaccine movement and attendant epidemics, it was not the only factor. He points out that the British medical profession was deeply divided, "reflecting quite real uncertainties surrounding the safety and efficacy of the vaccine in the 1970s." (Note that although the medical profession isn't presently divided on the issue of the MMR vaccine, there is some reason to think that younger doctors are less likely to believe that vaccines are efficacious and safe than more senior doctors are.) Moreover, he notes that
Parents in vaccine victim advocacy groups played an additional important role in sustaining the crisis. The ambivalence of both public and medical profession . . . are best understood against the background of Britain’s long history of skepticism regarding many vaccines dating back to smallpox.
Here, Baker alludes to the controversy surrounding compulsory smallpox vaccination in the late-19th century in the UK, noting that mandatory vaccination against smallpox virus
. . . represented one of the first intrusions of state public health policy into personal life, and consequently provoked considerable libertarian opposition.
Recent studies by Anna Kata of the tactics and tropes used online by the present-day anti-vaccination movement, and of anti-vaccination misinformation on the Internet, reveal many facets of the anti-DTP vaccine movement of nearly 40 years ago.

How could ideas opposing vaccination have persisted for well over a century? The phenomenon of groups of people opposing the best public health guidance is not limited to vaccination; other examples include the raw milk movement (which refuses to acknowledge the risk posed by bacterial contamination of raw milk and related products) and the anti-fluoridation movement (which questions the safety of fluoridating public water supplies). As I've mentioned before in this blog, it's critically important to understand such groups and how they make behavioral decisions. It may or may not be possible to change their outlooks given such knowledge, but it's hard to imagine doing so without it.

(image source: CDC PHIL image ID#14538)

Monday, March 17, 2014

Bacterial interference and the deliberate colonization of patients

Beginning in the mid-1940s and lasting until the late 1960s, the world saw a dramatic pandemic of staphylococcal infections. This post describes a curious historical episode in research aimed at controlling Staph outbreaks toward the end of that period.

One of the fundamental ideas in ecology is that, depending on the environment and properties of individuals, some types of individuals can outcompete other types. When this happens, the less successful individuals can become incompletely or completely displaced. In the 1960s, the idea of microbial competition was actively applied to clinical medicine in a fascinating series of studies, which ultimately ended in tragedy. These studies investigated an idea known as "bacterial interference": the inability of a strain of a bacterium, in this case Staphylococcus aureus, to colonize a particular site of a host following deliberate colonization of that site with another strain of the bacterium.

The notion of using bacterial interference for controlling or preventing epidemics of Staph in hospital nurseries was evaluated and several trials were carried out. How this idea came about and how the studies were done is fascinating and is described in Boris, 1968 and references therein. As the nose is one of the main ecological niches of Staph aureus in humans, newborns were deliberately colonized with an apparently apathogenic strain of Staph aureus (called "strain 502A", after the phage typing scheme then in use) by swabbing the nose and the umbilical stump shortly after birth.

The results were dramatic. Clinical and epidemiological observation revealed a striking lack of staphylococcal disease in the infant study population and in their families. As Shinefield et al 1966 summarized the situation:
It has been clearly demonstrated that artificial colonization of the nasal mucosa of newborns with one strain of Staphylococcus aureus interferes with subsequent acquisition of a second strain of S aureus. This deliberate colonization of infants shortly after birth with a staphylococcal strain of low virulence (strain 502A) has been employed to protect infants from colonization and disease with virulent epidemic strains of S aureus.
The studies on children in university hospital environments were extended to children in a community hospital setting in Light et al, 1967, and found to be effective. Boris et al 1964 applied the idea to adults.

There were reservations discussed in the literature, however. An echo of that concern can be seen in an August 3, 1968, issue of the British Medical Journal, in a short report on a NEJM paper by Light et al describing observations of bacterial interference (not involving deliberate colonization) between Staph aureus and Pseudomonas. In the report, an anonymous author referred to the trials evaluating deliberate colonizations, mentioning that
Ethical objections have been raised to this procedure, but it seems no more objectionable from this standpoint than the use of living vaccines.
Unfortunately, adverse effects soon became known, including a death from infection with the 502A strain. Writing in 1972, Houck et al reported on complications associated with bacterial interference trials. A passage from the abstract describes the death due to septicemia,
An infant of a diabetic mother developed septicemia and meningitis, probably secondary to passing an umbilical vein catheter through the colonized umbilical stump. Staphylococcus aureus 502A and Escherichia coli were isolated from blood culture before death and from autopsy cultures of blood and peritoneum. A meningeal culture grew S aureus 502A. Gram-positive cocci were identified in liver, lung, heart, and meninges. 
They also noted that 
Only two (0.5%) minor 502A infections were seen in 444 spontaneously colonized infants. The benefits of S aureus 502A programs far outweigh their hazards. Disease due to the 502A strain is more frequent when the inoculum applied to the infant is large than when it is kept below 4,000 bacteria. The fatal case emphasizes that bacteria of extremely low virulence may produce serious disease in compromised hosts and that catheterization through a contaminated umbilical stump may induce bacteremia.
Although I haven't done an extensive search for bacterial interference programs after the publication of Houck et al 1972, these activities seem to have terminated after the death.

There are so many things to ponder regarding this curious episode in the 1960s, including how the one death in a few hundred patients, interpreted by Houck et al as a hazard far outweighed by the benefits, contrasts with current thresholds for attributable risk. Another is the remark that pathogens "of extremely low virulence may produce serious disease in compromised hosts", and how that notion is similar to the practice of avoiding live virus vaccines in recovering HSCT patients during immune system reconstitution.

Recently, Mukherjee and coworkers observed that the beneficial yeast Pichia inhibits growth of pathogenic fungi, including Candida, which causes oral candidiasis (thrush) in immunocompromised and immunosuppressed patients. This is exciting; one of the study authors commented:
One day, not only could this lead to topical treatment for thrush, but it could also lead to a formulation of therapeutics for systemic fungal infections in all immunocompromised patients . . . In addition to patients with HIV, this would also include very young patients and patients with cancer or diabetes.
I think it's important to know about the history of bacterial interference interventions so that past issues can be recognized and actively avoided in related future investigations.

(image source: Wikipedia)