In his new book, Nafis Hasan looks at how America's “war on cancer” continues and highlights the factors that could threaten its effectiveness. (Image credit: CHRISTOPH BURGSTEDT/SCIENCE PHOTO LIBRARY via Getty Images)
The United States officially began its “war on cancer” with the passage of the National Cancer Act in 1971. The goal, broadly speaking, was to spur research into cancer biology in order to improve treatments and, potentially, to find cures for the disease. But the country has been engaged in this “war” for more than 50 years, and we're no closer to success, according to Nafis Hasan, a cancer scientist and associate professor at the Brooklyn Institute for Social Research.
In his book, Metastasis: The Rise of the Cancer Industrial Complex and the Horizons of Cure (Common Notions, 2025), Hasan argues that cancer research has focused too heavily on individual treatments, which has weakened overall efforts to reduce cancer rates. For example, in the passage below, he explains how the emphasis on “somatic mutation theory” — which posits that mutations in specific genes are the primary triggers of cancer — has distracted attention from the threats of environmental carcinogens and the benefits that can be achieved through public health efforts to reduce cancer incidence and mortality.
The idea that cancer is hereditary dates back to the early twentieth century. Around 1900, after Gregor Mendel's laws of inheritance were rediscovered, biologists Theodor Boveri and Walter Sutton proposed that chromosomes were responsible for transmitting biological characteristics. Boveri later hypothesized that a tumor cell arises when the process of cell division goes awry and chromosomes are distributed inappropriately. In Boveri's view, “the problem of tumors is a problem of cells.” This may have been the first conception of the “cancer cell” as a single culprit capable of wreaking havoc on the body.
The first experimental evidence for the hereditary transmission of cancer came from the Harvard scientist Ernest E. Tyzzer, who demonstrated that selectively breeding cancer-bearing mice significantly increased tumor incidence in subsequent generations.
The concept of cancer as a genetic disease was also bolstered by the eugenics movement, which in the 1920s and 1930s conducted cancer research in the service of racial “purity.” For example, the now widely used Pap smear for detecting cervical cancer was first presented at the Third Race Betterment Conference in 1928. Research in Nazi Germany on smoking and lung cancer claimed that differences in cancer rates between Jews and “Aryans” were due to genetic factors rather than to chemical exposures in the workplace. The private sector in the United States took a similarly exclusionary view: faced with high rates of bladder cancer among its dye workers, DuPont refused to hire applicants with a family history of cancer. The pathologist Carl Weller, after documenting the hereditary transmission of retinoblastoma (an eye tumor) in children, advocated sterilizing the parents of affected children. And as early as 1956, Wilhelm Hueper, the first director of the National Cancer Institute's Environmental Cancer Section, suggested that Black workers were best suited for jobs with unavoidable exposure to carcinogens, given their perceived resistance to agents such as coal tar, ultraviolet radiation, and petroleum products.
Other researchers challenged these racial narratives about cancer rates. For example, the idea that West Africans were racially predisposed to higher rates of liver cancer lost its force when Japanese immigrants to the United States began developing the same disease, traced to aflatoxins [toxins produced by fungi that can be found in many crops] in their diet.
The discovery of the DNA double helix in the 1950s gave impetus to molecular biology, but the field did not address the issue of environmental carcinogens.
Source: www.livescience.com