Although rapid gains in life expectancy followed social change and public health measures, progress in the other medical sciences was slow during the first half of the 20th century, possibly because of the debilitating effect of two major world wars. The position changed dramatically after World War II, a time that many still believe was the period of major achievement in the biomedical sciences for improving the health of society. This section outlines some of these developments and the effect they have had on medical practice in both industrial and developing countries. More extensive treatments of this topic are available in several monographs (Cooter and Pickstone 2000; Porter 1997; Weatherall 1995).
Epidemiology and Public Health
Modern epidemiology came into its own after World War II, when increasingly sophisticated statistical methods were first applied to the study of noninfectious disease to analyze the patterns and associations of diseases in large populations. The emergence of clinical epidemiology marked one of the most important successes of the medical sciences in the 20th century.
Up to the 1950s, conditions such as heart attacks, stroke, cancer, and diabetes were bundled together as degenerative disorders, implying that they might be the natural result of wear and tear and the inevitable consequence of aging. However, information about their frequency and distribution and, in particular, the speed with which their frequency changed in association with environmental change provided excellent evidence that many of them have a major environmental component. For example, death certificate rates for cancers of the stomach and lung changed so dramatically between 1950 and 1973 that major environmental factors must have been at work generating these diseases in different populations.
The first major success of clinical epidemiology was the demonstration of the relationship between cigarette smoking and lung cancer by Austin Bradford Hill and Richard Doll in the United Kingdom. This work was later replicated in many studies; currently, tobacco is estimated to cause about 8.8 percent of deaths (4.9 million) and 4.1 percent of disability-adjusted life years (59.1 million) (WHO 2002c). Despite this information, the tobacco epidemic continues, with at least 1 million more deaths attributable to tobacco in 2000 than in 1990, mainly in developing countries.
The application of epidemiological approaches to the study of large populations over long periods has provided further invaluable information about environmental factors and disease. One of the most thorough—involving the follow-up of more than 5,000 residents of Framingham, Massachusetts—showed unequivocally that a number of factors are linked with the likelihood of developing heart disease (Castelli and Anderson 1986). Such work led to the concept of risk factors, among them smoking, diet (especially the intake of animal fats), blood cholesterol levels, obesity, lack of exercise, and elevated blood pressure. The appreciation by epidemiologists that interventions directed at the large number of people at relatively low risk may prevent more disease than interventions focused on the small number of people at high risk was an important advance. Later, this work led to the definition of how important environmental agents may interact with one another—the increased risk of death from tuberculosis in smokers in India, for example.
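The arithmetic behind this insight can be sketched with Levin's formula for the population attributable fraction. The prevalences and relative risks below are hypothetical, chosen only to illustrate why a weak but widespread risk factor can account for more cases in a population than a strong but rare one:

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: the fraction of all cases in a population that is
    attributable to a risk factor with the given exposure prevalence and
    relative risk."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical comparison: a weak but common risk factor versus a strong
# but rare one (illustrative numbers, not taken from the text).
common_weak = population_attributable_fraction(0.40, 1.5)   # ~0.17 of all cases
rare_strong = population_attributable_fraction(0.02, 4.0)   # ~0.06 of all cases
```

On these assumed numbers, the common, modest risk factor accounts for nearly three times as many cases as the rare, strong one—the essence of the population-wide prevention argument.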
A substantial amount of work has gone into identifying risk factors for other diseases, such as hypertension, obesity and its accompaniments, and other forms of cancer. Risk factors defined in this way, and from similar analyses of the pathological role of environmental agents such as unsafe water, poor sanitation and hygiene, pollution, and others, form the basis of The World Health Report 2002 (WHO 2002c), which sets out a program for controlling disease globally by reducing the prevalence of 10 risk factors: underweight status; unsafe sex; high blood pressure; tobacco consumption; alcohol consumption; unsafe water, sanitation, and hygiene; iron deficiency; indoor smoke from solid fuels; high cholesterol; and obesity. These factors are calculated to account for more than one-third of all deaths worldwide.
The epidemiological approach has its limitations, however. Where risk factors seem likely to be heterogeneous or of only limited importance, even studies involving large populations continue to give equivocal or contradictory results. Furthermore, a major lack of understanding, on the part not just of the general public but also of those who administer health services, still exists about the precise meaning and interpretation of risk. The confusing messages have led to a certain amount of public cynicism about risk factors, thus diminishing the effect of information about those risk factors that have been established on a solid basis. Why so many people in both industrial and developing countries ignore risk factors that are based on solid data is still not clear; much remains to be learned about social, cultural, psychological, and ethnic differences with respect to education about important risk factors for disease. Finally, little work has been done regarding the perception of risk factors in the developing countries (WHO 2002c).
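One recurring source of the confusion noted above is the difference between relative and absolute risk. A relative risk reduction that sounds dramatic may translate into very different numbers of people helped depending on baseline risk, as the number needed to treat (NNT) makes explicit. The figures below are illustrative assumptions, not drawn from the text:

```python
def number_needed_to_treat(baseline_risk, relative_risk_reduction):
    """Number of people who must receive an intervention to prevent one
    event, given their baseline absolute risk of that event."""
    absolute_risk_reduction = baseline_risk * relative_risk_reduction
    return 1.0 / absolute_risk_reduction

# The same headline "25 percent risk reduction" means very different things
# at different baseline risks (hypothetical numbers).
high_risk_group = number_needed_to_treat(0.20, 0.25)    # treat 20 to prevent 1 event
low_risk_group = number_needed_to_treat(0.002, 0.25)    # treat 2,000 to prevent 1 event
```

Communicating risk in these absolute terms is one way to make the meaning of an established risk factor clearer to both the public and health administrators.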
A more recent development in the field of clinical epidemiology—one that may have major implications for developing countries—stems from the work of Barker (2001) and his colleagues, who obtained evidence suggesting that death rates from cardiovascular disease fell progressively with increasing birthweight, head circumference, and other measures of increased development at birth. Further work has suggested that the development of obesity and type 2 diabetes, which constitute part of the metabolic syndrome, is also associated with low birthweight. The notion that early fetal development may have important consequences for disease in later life is still under evaluation, but its implications, particularly for developing countries, may be far reaching.
The other major development that arose from the application of statistics to medical research was the randomized controlled trial. The principles of numerically based experimental design were set out in the 1920s by the geneticist Ronald Fisher and applied with increasing success after World War II, starting with the work of Hill, Doll, and Cochrane (see Chalmers 1993; Doll 1985). Variations on this theme have become central to every aspect of clinical research involving the assessment of different forms of treatment. More recently, this approach has been extended to provide broad-scale research syntheses to help inform health care and research. Increasing the numbers of patients involved in trials and applying meta-analysis and electronic technology for updating results have made it possible to provide broad-scale analyses combining the results of many different trials. Although meta-analysis has its problems—notably the lack of publication of negative trial data—and although many potential sources of bias exist in the reporting of clinical trials, these difficulties are gradually being addressed (Egger, Davey-Smith, and Altman 2001).
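The pooling step at the heart of such syntheses can be sketched as inverse-variance fixed-effect meta-analysis. The trial results below are invented for illustration, and a real meta-analysis must also address heterogeneity and the publication biases noted above:

```python
import math

def fixed_effect_meta(log_odds_ratios, standard_errors):
    """Inverse-variance fixed-effect pooling of per-trial log odds ratios.
    Each trial is weighted by the reciprocal of its variance, so larger,
    more precise trials dominate the pooled estimate."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * y for w, y in zip(weights, log_odds_ratios)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical trials of the same treatment (log odds ratio, SE).
trial_effects = [math.log(0.80), math.log(0.75), math.log(0.90)]
trial_ses = [0.10, 0.15, 0.20]

estimate, se = fixed_effect_meta(trial_effects, trial_ses)
pooled_odds_ratio = math.exp(estimate)
ci_95 = (math.exp(estimate - 1.96 * se), math.exp(estimate + 1.96 * se))
```

Note that the pooled standard error is smaller than that of any single trial, which is precisely why combining many trials can settle questions that individual studies leave equivocal.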
More recent developments in this field come under the general heading of evidence-based medicine (EBM) (Sackett and others 1996). Although it is self-evident that the medical profession should base its work on the best available evidence, the rise of EBM as a way of thinking has been a valuable addition to the development of good clinical practice over the years. It covers certain skills that are not always self-evident, including finding and appraising evidence and, particularly, implementation—that is, actually getting research into practice. Its principles are equally germane to industrial and developing countries, and the skills required, particularly numerical, will have to become part of the education of physicians of the future. To this end, the EBM Toolbox was established (Web site: http://www.ish.ox.ac.uk/ebh.html). However, evidence for best practice obtained from large clinical trials may not always apply to particular patients; obtaining a balance between better EBM and the kind of individualized patient care that forms the basis for good clinical practice will be a major challenge for medical education.
Partial Control of Infectious Disease
The control of communicable disease has been the major advance of 20th-century scientific medicine. It reflects the combination of improved environmental conditions and public health together with the development of immunization, antimicrobial chemotherapy, and the increasing ability to identify new pathogenic organisms. Currently, live or killed viral or bacterial vaccines—or those based on bacterial polysaccharides or bacterial toxoids—are licensed for the control of 29 common communicable diseases worldwide. The highlight of the field was the eradication of smallpox by 1977. The next target of the World Health Organization (WHO) is the global eradication of poliomyelitis. When the eradication program was launched in 1988, the disease was endemic in 125 countries. After a resurgence in 2002, when the number of cases rose to 1,918, the numbers dropped again in 2003 to 784; by March 2004, only 32 cases had been confirmed (Roberts 2004).
The Expanded Program on Immunization (EPI), launched in 1974, which has been taken up by many countries with slight modification, includes Bacillus Calmette-Guérin (BCG) and oral polio vaccine at birth; diphtheria, tetanus, and pertussis at 6, 10, and 14 weeks; measles; and, where relevant, yellow fever at 9 months. Hepatitis B is added at different times in different communities. By 1998, hepatitis B vaccine had been incorporated into the national programs of 90 countries, but an estimated 70 percent of the world's hepatitis B carriers still live in countries without programs (Nossal 1999). Indeed, among 12 million childhood deaths analyzed in 1998, almost 4 million were the result of diseases for which adequate vaccines are available (WHO 2002a).
The development of sulfonamides and penicillin in the period preceding World War II was followed by a remarkable period of progress in the discovery of antimicrobial agents effective against bacteria, fungi, viruses, protozoa, and helminths. Overall, knowledge of the pharmacological mode of action of these agents is best established for antibacterial and antiviral drugs. Antibacterial agents may affect cell wall or protein synthesis, nucleic acid formation, or critical metabolic pathways. Because viruses live and replicate in host cells, antiviral chemotherapy has presented a much greater challenge. However, particularly with the challenge posed by HIV/AIDS, a wide range of antiviral agents has been developed, most of which are nucleoside analogues, nucleoside or nonnucleoside reverse-transcriptase inhibitors, or protease inhibitors. Essentially, those agents interfere with critical self-copying or assembly functions of viruses or retroviruses. Knowledge of the modes of action of antifungal and antiparasitic agents is increasing as well.
Resistance to antimicrobial agents has been recognized since the introduction of effective antibiotics; within a few years, penicillin-resistant strains of Staphylococcus aureus became widespread and penicillin-susceptible strains are now very uncommon (Finch and Williams 1999). At least in part caused by the indiscriminate use of antibiotics in medical practice, animal husbandry, and agriculture, multiple-antibiotic-resistant bacteria are now widespread. Resistance to antiviral agents is also occurring with increasing frequency (Perrin and Telenti 1998), and drug resistance to malaria has gradually increased in frequency and distribution across continents (Noedl, Wongsrichanalai, and Wernsdorfer 2003). The critical issue of drug resistance to infectious agents is covered in detail in chapter 55.
In summary, although the 20th century witnessed remarkable advances in the control of communicable disease, the current position is uncertain. The emergence of new infectious agents, as evidenced by the severe acute respiratory syndrome (SARS) epidemic in 2002, is a reminder of the constant danger posed by the appearance of novel organisms; more than 30 new infective agents have been identified since 1970. Effective vaccines have not yet been developed for some of the most common infections—notably tuberculosis, malaria, and HIV—and rapidly increasing populations of organisms are resistant to antibacterial and antiviral agents. Furthermore, development of new antibiotics and effective antiviral agents with which to control such agents has declined. The indiscriminate use of antibiotics, both in the community and in the hospital populations of the industrial countries, has encouraged the emergence of resistance, a phenomenon exacerbated in some of the developing countries by the use of single antimicrobial agents when combinations would have been less likely to produce resistant strains. Finally, public health measures have been hampered by the rapid movement of populations and by war, famine, and similar social disruptions in developing countries. In short, the war against communicable disease is far from over.
Pathogenesis, Control, and Management of Noncommunicable Disease
The second half of the 20th century also yielded major advances in understanding pathophysiology and in managing many common noncommunicable diseases. This phase of development of the medical sciences has been characterized by a remarkable increase in knowledge about the biochemical and physiological basis of disease. Combined with major developments in the pharmaceutical industry, this knowledge has led to a situation in which there are few noncommunicable diseases for which no treatment exists, and many, although not curable, can be controlled over long periods of time.
Many of these advances have stemmed from medical research rather than improved environmental conditions. In 1980, Beeson published an analysis of the changes that occurred in the management of important diseases between the years 1927 and 1975, based on a comparison of methods for treating these conditions in the 1st and 14th editions of a leading American medical textbook. He found that of 181 conditions for which little effective prevention or treatment had existed in 1927, at least 50 had been managed satisfactorily by 1975. Furthermore, most of these advances seem to have stemmed from the fruits of basic and clinical research directed at the understanding of disease mechanisms (Beeson 1980; Comroe and Dripps 1976).
Modern cardiology is a good example of the evolution of scientific medicine. The major technical advances leading to a better appreciation of the physiology and pathology of the heart and circulation included studies of its electrical activity by electrocardiography; the ability to catheterize both sides of the heart; the development of echocardiography; and, more recently, the development of sophisticated ways of visualizing the heart by computerized axial tomography, nuclear magnetic resonance, and isotope scanning. These valuable tools and the development of specialized units to use them have led to a much better understanding of the physiology of the failing heart and of the effects of coronary artery disease and have revolutionized the management of congenital heart disease. Those advances have been backed by the development of effective drugs for the management of heart disease, including diuretics, beta-blockers, a wide variety of antihypertensive agents, calcium-channel blockers, and anticoagulants.
By the late 1960s, surgical techniques had been developed to relieve obstruction of the coronary arteries. Coronary bypass surgery and, later, balloon angioplasty became major tools. Progress also occurred in the treatment of abnormalities of cardiac rhythm, first pharmacologically and later through the implantation of artificial pacemakers, which the development of microelectronic circuits has made progressively smaller and more sophisticated. Following the success of renal transplantation, cardiac transplantation and, later, heart and lung transplantation also became feasible.
Much of this work has been backed up by large-scale controlled clinical trials. These studies, for example, showed that the early use of clot-dissolving drugs together with aspirin had a major effect on reducing the likelihood of recurrences after an episode of myocardial infarction (figure 5.1). The large number of trials and observational studies of the effects of coronary bypass surgery and dilatation of the coronary arteries with balloons have given somewhat mixed results, although overall little doubt exists that, at least in some forms of coronary artery disease, surgery is able to reduce pain from angina and probably prolong life. Similar positive results have been obtained in trials that set out to evaluate the effect of the control of hypertension (Warrell and others 2003).
Figure 5.1 Effects of a One-Hour Streptokinase Infusion Together with Aspirin for One Month on the 35-Day Mortality in the Second International Study of Infarct Survival Trial among 17,187 Patients with Acute Myocardial Infarction Who Would Not Normally Have Received …
The management of other chronic diseases, notably those of the gastrointestinal tract, lung, and blood, has followed similar lines. Advances in the understanding of their pathophysiology, combined with advances in analysis at the structural and biochemical levels, have enabled many of these diseases to be managed much more effectively. The pharmaceutical industry has helped enormously by developing agents such as the H2-receptor antagonists and a wide range of drugs directed at bronchospasm. There have been some surprises: the discovery that peptic ulceration is almost certainly caused by a bacterial agent, Helicobacter pylori, has transformed the management of this disease, dramatically reducing the frequency of surgical intervention. Neurology has benefited greatly from modern diagnostic tools, and psychiatry, although little has been learned about the cause of the major psychoses, has benefited enormously from the development of drugs for the control of both schizophrenia and the depressive disorders and from the emergence of cognitive-behavior therapy and dynamic psychotherapy.
The second half of the 20th century has witnessed major progress in the diagnosis and management of cancer (reviewed by Souhami and others 2001). Again, this progress has followed from more sophisticated diagnostic technology combined with improvements in radiotherapy and the development of powerful anticancer drugs. This approach has led to remarkable improvements in the outlook for particular cancers, including childhood leukemia, some forms of lymphoma, testicular tumors, and—more recently—tumors of the breast. Progress in managing other cancers has been slower and reflects the results of more accurate staging and assessment of the extent and spread of the tumor; the management of many common cancers still remains unsatisfactory, however. Similarly, although much progress has been made toward the prevention of common cancers—cervix and breast, for example—by population screening programs, the cost-effectiveness of screening for other common cancers—prostate, for example—remains controversial.
Many aspects of maternal and child health have improved significantly. A better understanding of the physiology and disorders of pregnancy together with improved prenatal care and obstetric skills has led to a steady reduction in maternal mortality. In industrial countries, few children now die of childhood infection; the major pediatric problems are genetic and congenital disorders, which account for about 40 percent of admissions to pediatric wards, and behavioral problems (Scriver and others 1973). Until the advent of the molecular era, little progress was made toward an understanding of the cause of these conditions. It is now known that a considerable proportion of cases of mental retardation result from definable chromosomal abnormalities or monogenic diseases, although at least 30 percent of cases remain unexplained. Major improvements have occurred in the surgical management of congenital malformation, but only limited progress has been made toward the treatment of genetic disease. Although a few factors, such as parental age and folate deficiency, have been incriminated, little is known about the reasons for the occurrence of congenital abnormalities.
In summary, the development of scientific medical practice in the 20th century led to a much greater understanding of deranged physiology and has enabled many of the common killers in Western society to be controlled, though few to be cured. However, although epidemiological studies of these conditions have defined a number of risk factors and although a great deal is understood about the pathophysiology of established disease, a major gap remains in our knowledge about how environmental factors actually cause these diseases at the cellular and molecular levels (Weatherall 1995).
Consequences of the Demographic and Epidemiological Transitions of the 20th Century
The period of development of modern scientific medicine has been accompanied by major demographic change (Chen 1996; Feachem and others 1992). Increasing urbanization, war and political unrest, famine, massive population movements, and similar upheavals must have had a major effect on the health of communities during the 20th century. Nevertheless, childhood mortality fell steadily throughout the New World, Europe, the Middle East, the Indian subcontinent, and many parts of Asia over this period, although unfortunately there has been much less progress in many parts of Sub-Saharan Africa. Although much of the improvement can be ascribed to improving public health and social conditions, the advent of scientific medicine—particularly the control of many infectious diseases of childhood—seems likely to be playing an increasingly important part in this epidemiological transition. Surveys of the health of adults in the developing world carried out in the 1980s suggested that many people between the ages of 20 and 50 were still suffering mainly from diseases of poverty; nonetheless, many countries have now gone through an epidemiological transition such that the global pattern of disease will change dramatically by 2020, with cardiorespiratory disease, depression, and the results of accidents replacing communicable disease as their major health problems.
Countries undergoing the epidemiological transition are increasingly caught between the two worlds of malnutrition and infectious disease on the one hand and the diseases of industrial countries, particularly cardiac disease, obesity, and diabetes, on the other. The increasing epidemic of tobacco-related diseases in developing countries exacerbates this problem. The global epidemic of obesity and type 2 diabetes is a prime example (Alberti 2001). An estimated 150 million people worldwide have diabetes, and that number is expected to double by 2025. Furthermore, diabetes is associated with greatly increased risk of cardiovascular disease and hypertension; in some developing countries the rate of stroke is already four to five times that in industrial countries. These frightening figures raise the questions of whether, once developing countries have gone through the epidemiological transition, they may face the same pattern of diseases that are affecting industrial countries and whether such diseases may occur much more frequently and be more difficult to control.
Partly because of advances in scientific medicine, industrial countries have to face another large drain on health resources in the new millennium (Olshansky, Carnes, and Cassel 1990). In the United Kingdom, for example, between 1981 and 1989, the number of people ages 75 to 84 rose by 16 percent, and that of people age 85 and over by 39 percent; the current population of males age 85 or over is expected to reach nearly 0.5 million by 2026, at which time close to 1 million females will be in this age group. Those figures reflect the situation for many industrial countries, and a similar trend will occur in every country that passes through the epidemiological transition. Although data about the quality of life of the aged are limited, studies such as the 1986 General Household Survey in the United Kingdom indicate that the average number of days of restricted activity per year among people over the age of 65 was 43 for men and 53 for women; those data say little about the loneliness and isolation of old age. It is estimated that 20 percent of all people over age 80 will suffer from some degree of dementia, a loss of intellectual function sufficient to render it impossible for them to care for themselves. Scientific medicine in the 20th century has provided highly effective technology for partially correcting the diseases of aging while, at the same time, making little progress toward understanding the biological basis of the aging process. Furthermore, the problems of aging and its effect on health care have received little attention from the international public health community; these problems are not restricted to industrial countries but are becoming increasingly important in middle-income and, to a lesser extent, some low-income countries.
Although dire poverty is self-evident as one of the major causes of ill health in developing countries, this phenomenon is emphatically not confined to those populations. For example, in the United Kingdom, where health care is available to all through a government health service, a major discrepancy in morbidity and mortality exists between different social classes (Black 1980). Clearly this phenomenon is not related to the accessibility of care, and more detailed analyses indicate that it cannot be ascribed wholly to different exposure to risk factors. Undoubtedly social strain, isolation, mild depression, and lack of social support play a role. However, the reasons for these important discrepancies, which occur in every industrial country, remain unclear.
Economic Consequences of High-Technology Medicine
Current high-technology medical practice, based on modern scientific medicine, steadily increases health expenditures. Regardless of how health care is provided, its spiraling costs—driven by ever more sophisticated technology, the ability to control most chronic illnesses over long periods, and greater public awareness of and demand for medical care—have made it all but impossible for most industrial countries to contain the costs of their health care services.
The U.K. National Health Service (NHS) offers an interesting example of the steady switch to high-technology hospital practice since its inception 50 years ago (Webster 1998). Over that period, the NHS's overall expenditure on health has increased fivefold, even though health expenditure in the United Kingdom absorbs a smaller proportion of gross domestic product than in many neighboring European countries. At the start of the NHS, 48,000 doctors were practicing in the United Kingdom; by 1995 there were 106,845, of whom 61,050 were in hospital practice and 34,594 in general (primary care) practice. Although the number of hospital beds halved over the first 50 years of the NHS, the throughput of the hospital service increased from 3 million to 10 million inpatients per year, over a time when the general population grew by only 19 percent. Similarly, outpatient activity rose sharply, with total outpatient visits growing from 26 million to 40 million. Because many industrial countries lack the kind of primary care referral program that is traditional in the United Kingdom, this large skew toward hospital medicine is likely to be even greater elsewhere.
The same trends are clearly shown in countries such as Malaysia, which have been rapidly passing through the epidemiological transition and in which health care is provided on a mixed public-private basis. In Malaysia, hospitalization rates have risen steadily since the 1970s, reflecting utilization that has outstripped population growth. The number of private hospitals and institutions rose phenomenally—more than 300 percent—over the same period. In 1996, the second National Health and Morbidity Survey in Malaysia showed that the median charge per day in private hospitals was 100 times higher than that in Ministry of Health hospitals. Those figures reflect, at least in part, the acquisition of expensive medical technology that in some cases has led to inefficient use of societal resources. As in many countries, the Malaysian government has now established a Health Technology Assessment Unit to provide a mechanism for evaluating the cost-effectiveness of new technology.
Those brief examples of the effect of high-technology practice against completely different backgrounds of the provision of health care reflect the emerging pattern of medical practice in the 20th century. In particular, they emphasize how the rapid developments in high-technology medical practice and the huge costs that have accrued may have dwarfed expenditure on preventive medicine, certainly in some industrial countries and others that have gone through the epidemiological transition.
A central question for medical research and health care planning is whether the reduction in exposure to risk factors that is the current top priority for the control of common diseases in both industrial and developing countries will have a major effect on this continuing rise of high-technology hospital medical practice. The potential of this approach has been discussed in detail recently (WHO 2002c). Although the claims for the benefits of reducing either single or multiple risk factors are impressive, no way exists of knowing to what extent they are attainable. Furthermore, if, as seems likely, they will reduce morbidity and mortality in middle life, what of later? The WHO report admits that it has ignored the problem of competing risks—that is, somebody saved from a stroke in 2001 is then "available" to die from other diseases in ensuing years. Solid information about the role of risk factors exists only for a limited number of noncommunicable diseases; little is known about musculoskeletal disease, the major psychoses, dementia, and many other major causes of morbidity and mortality.
The problems of health care systems and improving performance in health care delivery have been reviewed in World Health Report 2000—Health Systems: Improving Performance (WHO 2000). Relating different systems of health care to outcomes is extremely complex, but this report emphasizes the critical nature of research directed at health care delivery. As a response to the spiraling costs of health care, many governments are introducing repeated reforms of their health care programs without pilot studies or any other scientific indication for their likely success. This vital area of medical research has tended to be neglected in many countries over the later years of the 20th century.
Summary of Scientific Medicine in the 20th Century
The two major achievements of scientific medicine in the 20th century—the development of clinical epidemiology and the partial control of infectious disease—have made only a limited contribution to the health of developing countries. Although in part this limited effect is simply a reflection of poverty and dysfunctional health care systems, it is not the whole story. As exemplified by the fact that of 1,233 new drugs that were marketed between 1975 and 1999, only 13 were approved specifically for tropical diseases, the problem goes much deeper, reflecting neglect by industrial countries of the specific medical problems of developing countries.
For those countries that have gone through the epidemiological transition and for industrial countries, the central problem is quite different. Although the application of public health measures for the control of risk factors appears to have had a major effect on the frequency of some major killers, those gains have been balanced by an increase in the frequency of other common chronic diseases and the problems of an increasingly elderly population. At the same time, remarkable developments in scientific medicine have allowed industrial countries to develop an increasingly effective high-technology, patch-up form of medical practice. None of these countries has worked out a way to control the spiraling costs of health care, and because of their aging populations, there is little sign that things will improve. Although some of the diseases that produce this enormous burden may be at least partially preventable by more effective control of risk factors, it is unclear to what extent such control will be achievable, and for many diseases the relevant risk factors have not been identified. In short, for all its successes, scientific medicine in the 20th century has left a major gap in the understanding of the pathogenesis of disease: the gap between the action of environmental risk factors and the basic disease processes that follow from exposure to them and that produce the well-defined deranged physiology characteristic of these conditions.
These problems are reflected, at least in some countries, in increasing public disillusion with conventional medical practice, rooted in the belief that if modern medicine could control infectious diseases, it would be equally effective in managing the more chronic diseases that took their place. When this improvement did not happen—and when a mood of increasing frustration about what medicine could achieve had developed—a natural move occurred toward seeking alternative answers to these problems. Hence, many countries have seen a major migration toward complementary medicine.
It is against this rather uncertain background that the role of science and technology for medical care in the future has to be examined.
The development of technology, medicine, and social structures has been intertwined since the creation of the research university in the eighteenth century. One of the most significant changes has been the transformation of the clinical perspective to the molecular perspective. The challenge lies, therefore, in the ethical implications of the development of biotechnologies that can change the human organism on a genetic and molecular level.
Keywords Bioethics; Biopower; Functional Magnetic Resonance Imaging (fMRI); MRI; Neuroethics; Positron Emission Tomography (PET); Sick Role
It is not easy to decide where to begin a history of science, especially when speaking about the relations between medicine, society, and technology. One could begin with the first classical physicians, Hippocrates (ca. 460 BC–379 BC) and Galen (ca. 129 AD–216 AD). Indeed, into the mid-eighteenth century, the pendulum of medical wisdom swung between these two names, since knowledge until that time had to be proven by reference to a classical text.
In the history of medicine, Galen is known not only as the first practitioner with a vast anatomical knowledge but also for performing difficult operations that required the use of sophisticated instruments. He is even reputed to have undertaken the first brain surgeries (Toledo-Pereyra, 1973). For centuries, his ideas and those of Hippocrates were treated as the defining criteria of all medical knowledge. Up to the mid-eighteenth century, much progress was made in the application of instruments, devices, and drugs, but these advances would, in many ways, have been readily recognizable to scholars working in the tradition of Galen or Hippocrates.
However, another beginning could be made in the nineteenth century, when modern science was combined with industrialization and technology came to the forefront with the emergence of electricity. Other accounts could focus on the discovery of penicillin or make the case that with the discovery of DNA, a new age dawned in which life could increasingly be directly manipulated, thus pinpointing the decisive moment in medical development to the twentieth century.
However, the event that may have been most crucial for the development of medicine, and subsequently for the use of technology in medicine, came in 1737, when the newly founded University of Goettingen persuaded the famous anatomist Albrecht von Haller to become one of the key figures of its faculty. While at the university, von Haller pioneered an important innovation in the education system by combining research and teaching within a single professorship (Lenoir, 1981a; Cunningham, 2002, 2003).
From that time forward, in ways they never had before, students lived and worked in close proximity to the creation of knowledge and the innovative application of instruments. For two elemental fields of medical knowledge — anatomy and physiology — the result was a spurt in knowledge creation; by the end of the century, physiological knowledge had expanded at such a rate that the scientific vocabulary could not keep up. Physiologists and anatomists — on the verge of creating the ultimate life science, biology — resorted to the language of the new critical philosophy of Immanuel Kant to find ways of expressing their findings (Lenoir, 1981; Stingl, 2008). It was this course of development that prepared the way for the breakthrough of modern medicine.
Birth of the Clinic
After the emergence of anatomy and physiology, the next important step was certainly the "birth of the clinic" and the emergence of the clinical gaze, as Michel Foucault (1963) called it. Following the French Revolution, two myths took hold: that of a nationally organized medical profession, and that of an untroubled, and therefore healthy, society in which disease would disappear. The effort to realize these two myths, Foucault claimed, rendered the medical doctor a politician. The doctor's gaze became a force; the doctor, considered all-wise, could see through the veil that covered the eyes of ordinary men and perceive the underlying reality. The decisive change from ancient to modern times thus lay not in a transformation of this idea of the doctor as wise but in the theory behind it.
As scientific research increased during this period, knowledge was increasingly perceived as fragile and dynamic. With the establishment of the clinic, however, an abode was created for the accumulation of knowledge and its changes. The clinic also served as a repository for the technological devices employed in modernity. When the clinic was then turned into a facility for research and education as well, it became the prime force behind medical innovation.
This turn was amplified by the emergence of genetics and biotechnology, where the anonymous laboratory became a second stage for the creation of what can be called biopolitics, a political system in which populations' bodies are subject to government control.
Nikolas Rose (2007) has argued that as of the early twenty-first century, doctors, clinicians, and researchers have essentially changed their gaze. While most people are still tied to the molar or somatic level, clinicians and experimenters view the human organism as a DNA-based biochemical system that needs to be optimized. They have, according to Rose, a molecular gaze rather than a clinical gaze.
The Clinic Versus the Laboratory
Whether the clinic or the laboratory is the main stage for the development of medical research and technology — and whether the two should be integrated into one site — has been disputed. In the history of physiology, anatomy, neurology, medicine, and psychology, the distinction between the practices of the clinic and the laboratory continued throughout the nineteenth century. Clinicians would not trust "artificial" lab results, while experimenters shunned the individualized experiences and ideas of clinical practitioners as lacking validity and universality. Pitted against each other by their own versions of objectivity and naturalism, clinicians and experimenters divided and reunited time and again.
This theme was repeated in the narrative structure of medical discourse and the technological development of medicine. In the early decades of the twentieth century, the discourse involved renowned scholars from related fields, such as Lawrence Henderson and Walter Cannon, whose experimental works in physiology became seminal. Henderson, an "occasional sociologist," is also credited with, at least in part, having introduced the idea that the patient-doctor relationship must be described in terms of "an equilibrating social system" (to apply the terminology of Vilfredo Pareto) in which the doctor helps the patient return to normal functioning within society. Whether Henderson or his younger Harvard colleague Talcott Parsons (who worked on the idea around the same time and had approached Henderson for advice on his project) was the actual author of this idea is not entirely clear, but both men used it in their lectures (Stingl, 2008). Parsons introduced the idea that a patient must be seen as occupying a social role, the sick role. Changes in technology, therefore, can be described in terms of changes in the sick role as part of the social system in which it is embedded. This took a new turn in American medicine in the 1960s, when critical scholars began arguing that progress in medical technology does not necessarily translate into better health care for everyone. Quite the contrary: it can widen the gap between social classes, with only the wealthy able to afford expensive new treatments and the poor unable to obtain older, less expensive treatments that medical progress has rendered obsolete.
Certainly, surgical technology has already progressed to a stage that not long ago was considered science fiction. The classic image of the surgeon's job as equatable to that of a "butcher with precision" has become outmoded with the evolution of less invasive surgical instruments. Contemporary surgeons may employ microsurgery and robots, as well as telemedicine, a technique in which the surgeon is not even in direct contact with the patient but controls a robot from some other location. These developments...