Kids who develop acute lymphoblastic leukemia may be the victims of a triple-whammy stroke of bad luck, according to a provocative new theory from a respected British cancer researcher.
If the explanation turns out to be correct, it would be good news in the fight against the most common type of childhood cancer. Doctors could prevent cases of acute lymphoblastic leukemia with the strategic introduction of something the world has plenty of: filth and pestilence.
Mel Greaves, a cell biologist who directs the Centre for Evolution and Cancer at The Institute of Cancer Research in London, has been trying to unravel the causes of childhood leukemia for 30 years. His “unified theory” of leukemia, which draws on research from genetics, immunology, microbiology, epidemiology and evolution, is laid out this week in the journal Nature Reviews Cancer.
Greaves’ treatise is a kind of scientific stem-winder that posits that modern humans, in our admirable zeal to prevent infectious diseases and spruce up the environs in which we raise our young, have inadvertently broken a rule imposed by millions of years of co-evolution with our microbe-filled environment.
That rule, repeated by grandmothers everywhere, tells us that, in life, we must “eat a peck of dirt.”
In the industrialized world, roughly 1 in 2,000 children will develop acute lymphoblastic leukemia before the age of 15. Their bone marrow cells grow and divide rapidly, resulting in too many immature cells that interfere with the normal production of blood cells. Thanks to improved treatments, 90 percent of patients will survive, albeit with persistent health problems and an early burden of existential dread.
Acute lymphoblastic leukemia is on the rise throughout the industrialized world. In the United States and Europe, the incidence is increasing by 1 percent a year. In search of an explanation, researchers have explored a long list of modern environmental influences, including ionizing radiation, electricity cables, electromagnetic waves and man-made chemicals.
That research just doesn’t add up, Greaves argues.
Instead, he makes the case that three steps, performed in just the right order, set the stage for acute lymphoblastic leukemia.
First, a baby is born with a random copying error in her DNA. It’s a mutation so common that it might never have drawn the suspicion of researchers: in twin studies, Greaves and his colleagues have discovered that 1 in 20 children is born with it.
The mutation results in a fusion gene called ETV6–RUNX1. But if 1 in 20 are born with it and only 1 in 2,000 end up with acute lymphoblastic leukemia, some trigger is clearly necessary for the disease to progress.
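A back-of-envelope calculation makes that gap concrete. The sketch below is a simplification, not a figure from Greaves’ paper, since it assumes every case of the leukemia arises in a child carrying the fusion gene; under that assumption, at most about 1 in 100 carriers ever progresses to the disease:

```python
# Back-of-envelope arithmetic implied by the article's two figures.
# Simplifying assumption (not from Greaves' paper): every case of acute
# lymphoblastic leukemia arises in a child carrying the ETV6-RUNX1 fusion gene.
carriers_per_child = 1 / 20    # children born with the fusion gene
cases_per_child = 1 / 2_000    # children who develop the leukemia by age 15

# Upper bound on the fraction of carriers who go on to develop the disease
progression_rate = cases_per_child / carriers_per_child
print(f"~{progression_rate:.0%} of carriers progress")  # prints "~1% of carriers progress"
```

That 99 percent gap is what the theory’s remaining two steps are meant to explain.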
That trigger, Greaves argues, is actually something that doesn’t happen: a set of ordinary infections in early childhood that would have primed the immune system to block leukemia. Left unprimed, the immune system is vulnerable to step three.
Finally, at some key point before the age of 20 (when nature appears to wipe the mutation clean), a seemingly minor infection must happen. It could be something as insignificant as a touch of flu.
This final blow may prompt a further genetic mutation, activating the malignant potential of these first two factors and igniting a process in which the bone marrow’s production of immature lymphocytes spins out of control.
Greaves’ “delayed infection” theory suggests that when it comes to teaching the immune system about infectious threats, the sooner the better. In the absence of early priming (and in the presence of the right genetic mutation), a later infection can trigger leukemia.
As support for the role of steps two and three, Greaves cited epidemiological studies that have found the incidence of acute lymphoblastic leukemia is notably higher in children who did not attend day care, were firstborn in their families, were not breastfed, and were born via caesarean section rather than vaginally. All of those, he said, are fair surrogates for lower levels of early-life exposure to germs and infections.
He then dissected a number of recorded clusters of acute lymphoblastic leukemia to find common threads. In one cluster, seven children in Milan, Italy, were diagnosed with the cancer in a four-week period. Greaves’ sleuthing turned up evidence that all seven patients had been infected in a swine flu outbreak three to six months prior to their diagnosis. Six of the seven patients were firstborn children, and none attended day care in the first year of life.
Finally, in mice engineered to have high levels of the protein that allows acute lymphoblastic leukemia to gain a foothold, Greaves and his colleagues found one highly reliable way to induce leukemia: moving the mice from germ-free housing to cages teeming with microbial life after their early childhood.
Paul Workman, the chief executive of London’s Institute of Cancer Research, said Greaves’ work suggested that most cases of childhood leukemia were likely to be preventable.
“It might be done in the same way that is currently under consideration for autoimmune disease or allergies—perhaps with simple and safe interventions to expose infants to a variety of common and harmless bugs,” he said.
Dr. Bert Vogelstein, a Johns Hopkins University geneticist whose research has helped tease out the role of random mutations in cancer, said that scientists have already come to appreciate that two genetic alterations are likely required for acute lymphoblastic leukemia to develop.
That the catalyzing alteration is the result of a bout of infectious illness “is a reasonable and intriguing hypothesis,” said Vogelstein, who called the new hypothesis “stimulating.” But “it is still speculative and much more research must be done to confirm or refute it,” he said.