Essay on Public Healthcare in the US


Introduction

The healthcare system is a popular topic of study throughout the world. Its popularity is due not only to the universal human need for health care but also to the varied delivery and financing systems found around the world. These differences depend greatly on each country’s political culture, history, and level of wealth (Saunders, 2002). The tight link between the United States and the United Kingdom appears to be reflected in most developmental areas in recent years, and the two countries appear to be learning from one another as they work to develop long-term healthcare solutions. According to Formosa Post (2014), the United States has copied many of the United Kingdom’s systems for historical reasons. However, it is worth mentioning that the performance and dependability of each system are determined by major organizational disparities in healthcare system architecture. In comparison to the US healthcare system, the United Kingdom’s healthcare system, known as the National Health Service, is said to perform substantially better.

The current situation in the United Kingdom and the USA is characterized by a profound and sustained economic crisis and enormous national debt, which can be associated with the effects of the COVID-19 pandemic. Against this background, the traditional structures of health systems are being discussed more and more openly and emphatically from the point of view of cost, as factors that hinder industrial competitiveness. There are different considerations as to how these costs can be restricted. Countries with ‘welfare states’, including Britain, are increasingly orienting themselves towards the ‘role model’ of the USA. The market, competition, and personal responsibility are still neglected factors in Britain’s health care. In the USA, however, very different items are on the agenda today, for private healthcare organizations have made healthcare the most expensive in the world, creating an army of Americans who are either uninsured or underinsured against disease. The state is therefore currently called upon in the USA to put the health care system, which has gotten out of hand with its anti-social effects, back in order.

Therefore, the purpose of this essay is to compare and contrast the healthcare systems of the United States of America and the United Kingdom using the following indices:

    • Policy context
    • Resourcing and funding
    • Performance and provision

The policy context will explain how the healthcare system in each country has come to be the way it is. The difference in funding systems will be analyzed based on the success it has brought each country, and performance and provision will be evaluated in terms of functionality, such as accessibility and quality.

Background of the healthcare systems

Many countries’ health systems evolved in a variety of ways, with some offering private insurance, some providing universal health care, and others combining the two (Jones and Kantarjian, 2019). Historical events such as the Second World War, political beliefs, and economic conditions are said to have shaped the development of health care (Dixon et al., 2009; Schutte et al., 2018).

The National Health Service (NHS), established by the National Health Service Act 1946 and launched in 1948, is in charge of the United Kingdom’s public healthcare system. Before this, healthcare was primarily reserved for the wealthy, unless one could acquire free treatment through charities or teaching institutions (Chang et al., 2016). In 1911, David Lloyd George introduced the National Insurance Act, which required a minimal deduction from an employee’s pay in exchange for free healthcare. However, this program only provided coverage to those who were employed. Following WWII, an effort was made to establish a public healthcare system in which services were offered free at the point of need, funded through central taxation, and available to everyone. A basic tripartite structure was established, dividing hospital services, primary care (GPs), and community services. Concerns about problems caused by the separation of these three core areas of care had grown by 1974, so a major restructuring effort was undertaken to enable local governments to support all three. During the Thatcher years, the management system was restructured, and in 1990 the National Health Service and Community Care Act was passed, creating autonomous trusts to oversee hospital care.

These reforms were intended to help reduce medical expenditures and patient wait times. Total healthcare spending as a percentage of GDP has risen steadily since the 1980s. In addition, the private sector has assumed a larger role in health insurance, accounting for 16.7% of healthcare spending in 1999, up from 10.6% in 1980. Medical trend rates in 2006, 2009, 2010, and 2011 were 6.0 percent, 9.3 percent, 8.8 percent, and 9.5 percent, respectively, according to a report by Towers Watson. In 1998, the NHS Plan was implemented, which aided in the modernization of the healthcare system.

In contrast, American history has always featured a debate about which level of government should be in charge of what. Anti-Federalists, for example, pushed for local autonomy in the eighteenth century, boycotted the Constitutional Convention, and claimed that the proposed text would lead to a new (American-style) monarchy. With the election of Thomas Jefferson as president, however, a long-standing era began in which the states possessed an undeniable capacity to act with minimal federal interference (Sparer and Beaussier, 2018). The combination of state sovereignty and the Industrial Revolution’s packed cities, however, resulted in a surge in infectious diseases and epidemics (Dixon et al., 2009). As a result, the federal government decided to intervene, passing health and welfare legislation such as the Vaccine Act of 1813. According to Sparer and Beaussier (2018), this program did not last long: it was terminated in 1821 after 10 individuals died as a result of inadvertent vaccine distribution.

There was a substantial hospital bed shortage in the United States by the end of the Great Depression and World War II. Not only did hospital development slow down during this time, but many hospitals also shuttered as a result of the country’s economic crisis. Hanoch and Rice (2011) report that after WWII, the federal government channeled billions of dollars to academic medical researchers in the belief that all diseases could be conquered, which, combined with the use of advanced technologies, contributed to the United States’ excessive healthcare spending. The Hill-Burton Act, also known as the Hospital Survey and Construction Act of 1946, marked the United States’ engagement in regulating hospital bed availability by providing funds through government grants (John, 2016). Essentially, this Act mandated the development and renovation of hospital systems across the United States. The Hill-Burton program began with a standard of 3.2 community hospital beds per 1,000 inhabitants in a geographic region, and although it ended in 1974, its target of 4.5 beds per 1,000 people was met by the 1980s (John, 2016).

In the United States, private health insurance arose as a result of the Great Depression. In 1929, Baylor Hospital began offering 21 days of hospitalization per year to patients who paid a 50-cent monthly charge. The American Hospital Association encouraged the growth of this ‘prepayment’ notion. In Dallas, Texas, the first Blue Cross plan was founded in 1929 to ensure hospital coverage for schoolteachers of childbearing age. Blue Shield began in the Pacific Northwest in the early 1900s, when mining and timber camps hired doctors to treat their workers. The Blue Cross and Blue Shield Association was formed when the two organizations merged, with Blue Cross representing hospital coverage and Blue Shield representing medical services. A Blue plan, part of a network of 43 individually and regionally controlled Blue Cross and Blue Shield organizations, now covers around 25% of insured Americans.

The National Health Planning and Resources Development Act of 1974 established a network of government health planning organizations known as health systems agencies as one of the first initiatives to limit the growth of healthcare spending in the United States. These agencies were created to keep tabs on the distribution of healthcare resources and the rising expense of medical care. The Act required states to implement certificate-of-need statutes requiring hospitals to obtain a certificate of need from their host state before purchasing major equipment or starting construction. Although many states still retain some form of certificate-of-need requirement, federal financing for health systems agencies was terminated in 1986. The managed care movement of the early 1990s spawned the current private health insurance sector, which is exceedingly sophisticated and varied.

Policy context

The Department of Health is the government body in charge of the NHS. The Department of Health’s goal is simple: to improve the general well-being of the people. This is accomplished by directing, supporting, and leading NHS and social care organizations to deliver fair, high-quality health care and to provide patients with choices, while also providing value to taxpayers (UK Department of Health 2007). The National Health Service (NHS) is the name given to the overarching national health organization, which was established in 1948. In the United States, by contrast, healthcare services are classified as either public or private. Health care that is deemed a public or government function is referred to as public health care. The prevention of disease, the promotion of health, the reporting and control of communicable diseases, the control of environmental factors such as air and water quality, and the study and analysis of public health data are all areas where public or government agencies provide a level of public health care (Stahl 2004). The US Department of Health and Human Services is the principal federal agency that controls many of the sub-agencies that perform these government healthcare services. These organizations include the Centers for Disease Control, the Food and Drug Administration, the National Institutes of Health, and the Agency for Healthcare Research and Quality (Stahl 2004).

Primary care trusts (PCTs) are made up of GPs, dentists, pharmacists, and optometrists, who are in charge of coordinating the delivery of health care and the patient experience. Acute trusts, often known as NHS trusts, are secondary levels of care made up of NHS (or government-run) hospitals. The local organizations in charge of responding to and assessing emergencies are known as ambulance trusts. Care trusts are essentially social service organizations that coordinate multiple services to suit the requirements of patients who may require more complex treatment. Patients with more serious mental health issues are served by mental health trusts (NHS 2007). In the United States, by contrast, most people seeking medical help for an episodic or non-chronic ailment go to their primary care physician first. Primary care is defined as the initial interaction with medical services to provide diagnosis and treatment. GPs, pediatricians, internists, obstetricians, nurse practitioners, physician assistants, and midwives are examples of primary care providers (PCPs). Primary care professionals see patients of all ages, genders, and ethnicities suffering from a variety of medical ailments. As a result, PCPs must be well-versed in a wide range of ailments and frequently collaborate with secondary and tertiary care specialists to provide comprehensive treatment to patients (Stahl 2004).

Furthermore, even though the levels of government engagement and social responsibility in the US and UK health systems vary greatly, both systems administer healthcare similarly. Both tend to start with primary care and are organized into regional, functional, and specialized subsystems. Although these subsystems are owned and operated by the government in the United Kingdom and by private businesses in the United States, the evident deviation is in payment responsibilities.

Funding and resourcing

Even though health care funding in the United Kingdom is largely handled by the government and health care funding in the United States is largely controlled by the private sector, both are fundamentally made possible by public contributions. The extent of government engagement, and forced taxation versus voluntary contributions, are the most significant differences. The United Kingdom provides health care to all citizens through a healthcare delivery system that is otherwise similar to that of the United States, whereas the United States is burdened by the uninsured.

The NHS is primarily funded through general taxation, supplemented by National Insurance contributions. While the NHS is often described as ‘free at the point of use,’ individuals have been expected to pay for some services (such as medications and dental treatment) since 1951. Many patients are covered by exemptions, including those under the age of 16 and those over the age of 60, as well as those receiving specified state subsidies. Over time, the relative contribution of each of these sources of funding (general revenue, national insurance, and user fees) has fluctuated. For example, between 2007 and 2011, the proportion of income from user fees remained at 1.2 percent, down from a peak of 5 percent in 1960 (Hawe and Cockcroft 2013). In the United States, however, healthcare systems are financed through the following:

    • Insurance that is purchased privately
    • Insurance plans run by the government
    • Individuals themselves (personal, out-of-pocket funds)

Private insurance is available from both for-profit and non-profit insurance providers. Although the United States has a large number of health insurance firms, each state tends to have a small number. The majority of private insurance is purchased by businesses as a benefit for their employees, with employers and employees frequently splitting costs. The money an employer spends on an employee’s health insurance is not taxable income for the employee; in effect, the government partially subsidizes this insurance. The Patient Protection and Affordable Care Act (PPACA, or Affordable Care Act [ACA]), which took effect in 2014, is a U.S. healthcare reform law aimed at increasing the availability, affordability, and use of health insurance, among other things. Emmerson et al. (2002) report that, while the ACA has provided insurance coverage to over 5 million Americans and contributed to a decrease in health expenditure growth in the United States, people in the United Kingdom, on the other hand, increasingly prefer private insurance to the NHS, with Oram (2018) reporting that about 7 million people have private insurance today, compared to 2.4 million in 1990. Nonetheless, Dickman et al. (2017) contend that tax-funded healthcare systems, such as the United Kingdom’s, are more progressive than the United States’ reliance on private insurance and out-of-pocket expenditures.

In the United Kingdom, private health insurance coverage is carried by 10.6% of the population. The majority of these are corporate subscriptions included in employees’ overall remuneration packages (LaingBuisson 2017). Matusitz (2014), on the other hand, claims that the private health insurance industry in the United States insures people not only for health care but also for an improvement in health status. Nonetheless, Bradbury et al. (2018) point out that, although health care in the United Kingdom is universal, both the number of persons with private medical insurance and the total amount spent on medical care by private individuals have risen dramatically in recent years.

Furthermore, whilst the NHS, the United Kingdom’s principal provider of healthcare, spent 6.3 percent of GDP on health in 2017 (Jones and Kantarjian, 2019), the United States spent 18 percent of its national GDP on healthcare in the same year, nearly three times as much (Mishuk et al., 2019). Similarly, according to Gross and Laugesen (2018), the United States has the highest total and public health expenditures per person of any member of the Organization for Economic Cooperation and Development (OECD), with an estimated US$10,212 in 2017, more than double the OECD average of $4,246.

Turning to government-sponsored health care in the United States: although the United States health system is mostly funded by private parties, public spending makes a significant contribution (44.7 percent of health care spending), as previously stated. Government- or public-sponsored health care includes Medicare, Medicaid (which includes the State Children’s Health Insurance Program, SCHIP), and Veterans Affairs. The Centers for Medicare and Medicaid Services (CMS) is the government organization in charge of administering the Medicare and Medicaid programs. Medicare and Medicaid are the principal types of public health insurance in the United States and are a merging of formerly smaller systems, created through amendments to the Social Security Act in 1965.
