Basic Concepts of Health
Prior to discussing this complex system, it is important to identify three major concepts of healthcare delivery: primary, secondary, and tertiary prevention. These concepts are vital to understanding the U.S. healthcare system because different components of the system focus on different areas of health, which often results in a lack of coordination among them.
Primary, Secondary, and Tertiary Prevention
According to the American Heritage Dictionary (2001), prevention is defined as “slowing down or stopping the course of an event.” Primary prevention avoids the development of a disease. Health promotion activities, such as health education, are primary prevention. Other examples include smoking cessation programs, immunization programs, and educational programs for pregnancy and employee safety. State health departments often develop large, targeted education campaigns regarding a specific health issue in their area. Secondary prevention activities focus on early disease detection, which prevents progression of the disease. Screening programs, such as high blood pressure testing, are examples of secondary prevention activities, as are colonoscopies and mammograms. Many local health departments implement secondary prevention activities. Tertiary prevention reduces the impact of an already established disease by minimizing disease-related complications; it focuses on rehabilitation and monitoring of diseased individuals. A person with high blood pressure who takes medication to control it, and the physician who prescribes that medication, are both examples of tertiary prevention. Traditional medicine focuses on tertiary prevention, although more primary care providers are encouraging and educating their patients on healthy behaviors (Centers for Disease Control and Prevention [CDC], 2007).
We, as healthcare consumers, would like to receive primary prevention to prevent disease. We would like to participate in secondary prevention activities, such as screening for cholesterol or blood pressure, because they help us manage any health problems we may be experiencing and reduce the potential impact of a disease. We would also like to visit our physicians for tertiary measures so that, if we do have a disease, it can be managed with a prescribed drug or some other type of treatment. From our perspective, these three areas of health should be better coordinated for the healthcare consumer so the United States will have a healthier population.
In order to understand the current healthcare delivery system and its issues, it is important to learn the history of the development of the U.S. healthcare system. Four major sectors of our healthcare system that have shaped its current operations will be discussed in this chapter: (1) the history of practicing medicine and the development of medical education, (2) the development of the hospital system, (3) the history of public health, and (4) the history of health insurance. Tables 1-1 to 1-4 list several important milestones by date, illustrating historic highlights of each system component. The lists are by no means exhaustive, but they provide an introduction to how each sector has evolved as part of the U.S. healthcare system.
MILESTONES OF MEDICINE AND MEDICAL EDUCATION
The early practice of medicine did not require a major course of study, training, board exams, and licensing, as is required today. During this period, anyone who had the inclination to set up a physician practice could do so; oftentimes, clergy were also medical providers, as well as tradesmen such as barbers. The red and white striped poles outside barber shops represented blood and bandages because the barbers were often also surgeons. They used the same blades to cut hair and to perform surgery (Starr, 1982). Because there were no restrictions, competition was very intense. In most cases, physicians did not possess any technical expertise; they relied mainly on common sense to make diagnoses (Stevens, 1971). During this period, there was no health insurance, so consumers decided when they would visit a physician and paid for their visits out of their own pockets. Often, physicians treated their patients in the patients’ homes. During the late 1800s, the medical profession became more cohesive as more technically advanced services were delivered to patients. The establishment of the American Medical Association (AMA) in 1847 as a professional membership organization for physicians was a driving force for the concept of private practice in medicine. The AMA was also responsible for standardizing medical education (AMA, 2013a; Goodman & Musgrave, 1992).
In the early history of medical education, physicians established large numbers of medical schools because they were inexpensive to operate, increased their prestige, and enhanced their income. Medical schools required only four or more physicians, a classroom, some discussion rooms, and legal authority to confer degrees. Physicians received the students’ tuition directly and operated the school from this influx of money. Many physicians would affiliate with established colleges to confer degrees. Because there were no entry restrictions, as more students entered medical schools, the existing internship program with physicians was dissolved and the Doctor of Medicine (MD) became the standard (Vault Career Intelligence, 2013). Although there were major issues with the quality of education provided because of the lack of educational requirements, medical school education became the gold standard for practicing medicine (Sultz & Young, 2006). The publication of the Flexner Report in 1910, which evaluated medical schools in Canada and the United States, forced medical schools to develop standardized curricula and admission testing, both of which are still in existence today.
TABLE 1-1 Milestones of Medicine and Medical Education 1700–2013
|• 1700s: Training and apprenticeship under one physician was common until hospitals were founded in the mid-1700s. In 1765, the first medical school was established at the University of Pennsylvania.|
|• 1800s: Medical training was provided through internships with existing physicians who often were poorly trained themselves. There were only four medical schools in the United States, which graduated only a handful of students. There was no formal tuition and no mandatory testing.|
|• 1847: The AMA was established as a membership organization for physicians to protect the interests of its providers. It did not become powerful until the 1900s, when it organized its physician members into county and state medical societies. The AMA sought to protect its members’ financial well-being. It also began to focus on standardizing medical education.|
|• 1900s to 1930s: The medical profession was represented by general or family practitioners who operated in solitary practices. A small percentage of physicians were women. Total expenditures for medical care were less than 4% of the gross domestic product.|
|• 1904: The AMA created the Council on Medical Education to establish standards for medical education.|
|• 1910: The formalization of medical education is attributed to Abraham Flexner, who wrote an evaluation of medical schools in the United States and Canada indicating many schools were substandard. He made recommendations to close several schools, enact admission requirements, and set a standard curriculum. The Flexner Report led to standardized admissions testing for students, the Medical College Admission Test (MCAT), which is still used as part of the admissions process today.|
|• 1930s: The healthcare industry was dominated by male physicians and hospitals. Relationships between patients and physicians were sacred. Payments for physician care were made personally, out of pocket.|
|• 1940s to 1960s: When group health insurance was offered, the relationship between patient and physician changed because of third-party payers (insurance). In the 1950s, federal grants supported medical school operations and teaching hospitals. In the 1960s, the Regional Medical Programs provided research grants and emphasized service innovation and provider networking.|
|• 2008: There is increased racial diversity in the number of medical school graduates. Although whites continue to represent the largest number of medical school graduates, there continues to be a decline in white graduates. Asians represent the largest ethnicity of medical school graduates. Women medical graduates continue to enter the workforce in great numbers, but men still outnumber women physicians.|
|• 2011–2012: In 2011, the ACA established the Center for Medicare & Medicaid Innovation to examine new ways to deliver care to patients. In 2012, the ACA provided incentives for physicians to establish accountable care organizations.|
|• 2012–2013: The average annual cost of a public medical school for an in-state resident was $30,000; the annual cost of a private medical school was $50,000. Approximately 47% of medical students were female.|
In 2008, there was increased racial diversity among medical school graduates. Although whites continue to represent the largest number of medical school graduates, their numbers are declining. Asians represent the largest ethnic group of medical school graduates. Women medical graduates continue to enter the workforce in great numbers, but men still outnumber women physicians. In 2012–2013, the average annual cost of a public medical school for an in-state resident was $30,000, and the annual cost of a private medical school was $50,000 (Association of American Medical Colleges [AAMC], 2013).