In the early 19th century, almshouses or poorhouses were established to serve the indigent, providing shelter while also treating illness. Government-operated pesthouses segregated those who could spread contagious disease. These institutions provided the framework for the modern conception of the hospital. Initially, wealthy people did not want to go to hospitals because conditions were deplorable and the providers were unskilled, so hospitals, which were first built in urban areas, were used primarily by the poor. During this period, many of the hospitals were owned by the physicians who practiced in them (Rosen, 1983).
In the early 20th century, with the establishment of a more standardized medical education, hospitals became more accepted across socioeconomic classes and became the symbol of medicine. With the establishment of the AMA, which protected the interests of providers, the medical profession gained prestige. During the 1930s and 1940s, hospital ownership shifted from physician-owned to church-related and government-operated (Starr, 1982).
In 1973, the first Patient Bill of Rights was established to protect healthcare consumers in hospitals. In 1974, a federal law was passed that required states to have Certificate of Need (CON) laws to ensure that the state approved any capital expenditures associated with hospital/medical facilities' construction and expansion. The Act was repealed in 1987, but as of 2011, 36 states still had some type of CON mechanism (National Conference of State Legislatures [NCSL], 2013). The concept of CON was important because it encouraged state planning to ensure that the medical system was based on need. In 1985, the Emergency Medical Treatment and Active Labor Act (EMTALA) was enacted to ensure that consumers were not refused treatment for an emergency. During this period, inpatient hospital use was typical; however, by the 1980s, many hospitals were offering outpatient or ambulatory surgery, a trend that continues into the 21st century. The Balanced Budget Act of 1997 authorized outpatient Medicare reimbursement to support these cost-saving measures (CDC, 2001). Hospitalists, a role introduced in 1996, are providers who focus specifically on the care of patients while they are hospitalized. This new type of provider reflected the need to provide quality hospital care (American Hospital Association [AHA], 2013; Sultz & Young, 2006). In 2002, the Joint Commission on the Accreditation of Healthcare Organizations (now The Joint Commission) issued standards to increase consumer awareness by requiring hospitals to inform patients if their results were not consistent with typical results (AHA, 2013).
Hospitals are the foundation of our healthcare system. As our health insurance system evolved, hospital insurance was the first type of coverage to develop. As society's health needs increased, different types of medical facilities expanded. The focus shifted toward ambulatory or outpatient services, both because consumers prefer them and because they are more cost effective. In 1980, the AHA estimated that 87% of hospitals offered outpatient surgery. Although hospitals are still an integral part of our healthcare delivery system, the way they deliver care has changed. More hospitals have recognized the trend toward outpatient services and have integrated those services into their delivery.
MILESTONES OF PUBLIC HEALTH
The development of public health is important to note because it occurred separately from the development of private medical practices. Physicians were worried that government health departments could regulate how they practiced medicine, which could limit their income. Public health specialists approached health from a collectivistic and preventive care viewpoint: protect as many citizens as possible from health issues and provide strategies to prevent health issues from occurring. Private practitioners held an individualistic viewpoint: citizens would typically pay for physician services through their health insurance or out of their own pockets, and physicians would provide guidance on how to cure their diseases, not prevent them. The two contrasting viewpoints still exist today, but there have been efforts to coordinate and integrate traditional medicine and public health activities more closely.
TABLE 1-2 Milestones of the Hospital and Healthcare Systems 1820–2013
• 1820s: Almshouses or poorhouses, the precursors of hospitals, were developed primarily to serve the poor. They provided food and shelter to the poor and consequently treated the ill. Pesthouses, operated by local governments, were used to quarantine people who had contagious diseases such as cholera. The first hospitals were built in urban areas such as New York City, Philadelphia, and Boston and were often used as a refuge for the poor. Dispensaries or pharmacies were established to provide free care to those who could not afford to pay and to dispense drugs to ambulatory patients.
• 1850s: A hospital system was finally developed, but conditions were deplorable because providers were unskilled. Hospitals were owned primarily by the physicians who practiced in them.
• 1890s: Patients went to hospitals because they had no choice. Providers became more cohesive because they had to rely on each other for referrals and access to hospitals, which gave them more professional power.
• 1920s: Advances in medical technology, increased quality of medical training and specialization, and the economic development of the United States contributed to the growth of hospitals. The establishment of hospitals became the symbol of the institutionalization of health care. In 1929, President Coolidge signed the Narcotic Control Act, which provided funding for hospital construction for drug addicts.
• 1930s to 1940s: Hospitals that had once been physician-owned were now owned by church groups, larger facilities, and government at all levels.
• 1970 to 1980: The first Patient Bill of Rights was introduced to protect healthcare consumer representation in hospital care. In 1974, the National Health Planning and Resources Development Act required states to have CON laws to qualify for federal funding.
• 1980 to 1990: According to the AHA, 87% of hospitals were offering ambulatory surgery. In 1985, EMTALA was enacted, requiring hospitals to provide screening and stabilizing treatment regardless of the consumer's ability to pay.
• 1990 to 2000s: As a result of the Balanced Budget Act cuts of 1997, the federal government authorized an outpatient Medicare reimbursement system.
• 1996: Hospitalists, clinicians who provide care once a patient is hospitalized, emerged as a new provider type.
• 2002: The Joint Commission on the Accreditation of Healthcare Organizations (now The Joint Commission) issued standards to increase consumer awareness by requiring hospitals to inform patients if their results were not consistent with typical results.
• 2011: Although the 1974 federal law requiring states to have certificate of need (CON) laws for approving capital expenditures on hospital/medical facility construction and expansion was repealed in 1987, as of 2011, 36 states still had some type of CON mechanism.
• 2013: The Centers for Medicare & Medicaid Services developed the Bundled Payments for Care Improvement initiative, under which acute care hospitals and other providers enter into payment arrangements that include financial and performance accountability for episodes of care for each patient.
During the 1700s into the 1800s, the concept of public health was born. In their reports, Edwin Chadwick, Dr. John Snow, and Lemuel Shattuck demonstrated a relationship between the environment and disease (Chadwick, 1842; Turnock, 1997). As a result of their work, public health laws were enacted and, by the 1900s, public health departments were focusing on the environment and its relationship to disease outbreaks.