100 Years of US Medical Fascism

One hundred years ago today, on April 16, 1910, Henry Pritchett, president of the Carnegie Foundation, put the finishing touches on the Flexner Report.[1] No other document would have such a profound effect on American medicine, starting it on its path to destruction up to and beyond the recently passed (and laughably titled) Patient Protection and Affordable Care Act of 2010 (PPACA), a.k.a., "Obamacare." Flexner can only be accurately understood in the context of what led up to it.

Free-market medicine did not begin in the United States in 1776 with the Revolution. Rather, from 1830 to about 1850, the licensing laws and regulations imposed during the colonial period and the early republic were generally repealed or ignored. This was brought about by the growing acceptance of eclecticism (1813) and homeopathy (1825) as alternatives to the mainstream medicine (allopathy) of the day, which relied on bloodletting and high-dose injections of metal and metalloid compounds containing mercury or antimony.[2]

Eclectics emphasized plant remedies, bed rest, and steam baths, while homeopaths emphasized a different set of medicines in small doses (letting the body heal itself as much as possible), improved diet and hygiene, and stress reduction. The worst results these treatments produced were allergic reactions or no improvement at all. Hence it’s not surprising they came to be preferred over the ghastly bleeding and metal injections of allopathy, which killed large numbers of patients.

By 1860, more than 55,000 physicians were practicing in the United States, one of the highest per capita numbers of doctors in the world (about 175 per 100,000).[3] By 1870, approximately 62,000 physicians were in practice,[4] roughly 5,300 of them homeopathic and about 2,700 eclectic.[5] Schooling was plentiful and inexpensive, and entry to even the most acclaimed schools was not exceedingly difficult. Most schools were privately owned. Licenses to practice were either not required or not enforced, and anyone could establish a practice.[6]

Like the mythical Hollywood portrayal of the American "Wild West" as a place in which the denizens of every town were killing each other in gunfights every minute of the day, the free-market period in American medicine has also been distorted as one in which towns were mobbed by traveling quacks prescribing dangerous treatments that killed the townspeople in droves. Organized mainstream medicine concocted this myth; as previously noted, it was the allopaths, not the homeopaths and eclectics, who were killing large numbers of people through bloodletting and metal poisoning.[7] This is why it took time and effort for any caregiver to win the widespread trust of a typical community in 19th-century America. The image of the public en masse blindly lapping up snake oil dispensed from the dirty travel trunks of carnival-tent quacks is a wild legend.

Even though they were only about 13% of physicians in practice,[8] eclectics and homeopaths cut into the incomes of the allopaths. The allopaths began organizing at the state level to use the coercive power of government not only to severely restrict (if not outright ban) eclectics and homeopaths and the schools that trained them, but also to restrict the number of allopaths in practice, thereby dramatically increasing allopathic incomes and prestige.[9]

The American Medical Association (AMA) had already been formed in 1847 by Nathan Smith Davis, who had worked at the Medical Society of New York on issues of licensing and education. While the pretense was always more rigorous standards toward the supposed end of effective treatments, exclusion was the reality. Hence it was no surprise that in 1870 Davis worked successfully to prohibit female and black physicians from becoming members of the AMA.[10]

The AMA formed its Council on Medical Education in 1904 as a tool to artificially restrict education.[11] However, the AMA’s conflict of interest was too obvious. This is where Abraham Flexner and the Carnegie Foundation entered the picture. Flexner’s older brother Simon was the director of the Rockefeller Institute for Medical Research, and he recommended Abraham for the Carnegie job. Abraham’s acceptance of the role was the perfect special-interest symbiosis. Carnegie wanted to advance secularism through higher education and thus saw the AMA’s agenda as favorable to that end. Rockefeller’s benefactors were allied with allopathic drug companies and hated for-profit schools that couldn’t be controlled by the big-business, state-influenced foundations. Last of all, the AMA got a seemingly objective front in Carnegie.[12]

Not only was Abraham Flexner not an allopathic physician; he was not even a widely known authority on education,[13] never mind medical education, and he had never seen the inside of a medical school before joining Carnegie. His report was effectively already written, since it was essentially the AMA’s unpublished 1906 report on US medical schools. Furthermore, Flexner was accompanied on his inspections by the AMA’s N.P. Colwell to ensure that the preordained conclusions would be reached. Flexner then spent time at the AMA’s Chicago headquarters preparing the portion of the final product that was actually his own work.[14]

Regardless of these scandalous circumstances, state medical boards and legislatures used the report as a basis for closing medical schools. Around the time of Flexner, there was a high of 166 medical schools; by the 1940s there were just 77, a 54 percent reduction.[15] Most small rural schools were closed, and only two African-American schools were allowed to remain open.[16] By 1963, despite advances in technology and a huge growth in demand, one effect of the report was to keep the number of doctors per 100,000 people in the United States at 146, the same level as in 1910.[17] Of the approximately 375,000 physicians in practice in 1977, only about 6,300, or 1.7 percent, were African-American.[18]

While physician incomes and prestige dramatically increased, so did the caregiving workload. Wolinsky and Brune (1994) report that doctors were firmly in the lower middle class at the time of the AMA’s founding and made about $600 per year. This rose to about $1,000 around 1900. After Flexner, incomes began to skyrocket: a 1928 AMA study found that average annual incomes had reached a whopping (for the time) $6,354.[19] Even during the Great Depression, physicians earned four times what average workers did.[20] A 2009 survey put family-practice doctors (on the low end of the physician income range) at a median of $197,655 and spine surgeons (at the high end) at a median of $641,728.[21] These figures are mind-boggling to ordinary Americans, even in good economic times. In addition, the cyclical unemployment that throws workers out of jobs in almost all other industries with the arrival of recessions or depressions became nonexistent among physicians after Flexner.

However, not even Flexner could repeal the laws of economics: the physician workload in certain areas became backbreaking to impossible, to the point that some physicians no longer accept new patients. Some primary-care physicians today are booked solid for at least two months, and unless you have some sort of connection to get in sooner or pay for concierge care, your alternative for urgent care is the same as everyone else’s on a weekend: the emergency room, where you’ll wait for hours, or a walk-in clinic, where you’ll see one or two MD names posted on the building but wait for hours to be seen by a nurse practitioner.

Hospitals

Of course it wouldn’t make sense to restrict physician services without restricting hospitals. For-profits were the first to go, and where they were not outright prohibited, they faced a number of burdens that nonprofits escaped, such as income and property taxes. Nonprofits received generous government subsidies and tax-deductible contributions, and local planning agencies worked in their favor to keep for-profit competitors from expanding. This state-sponsored discrimination against for-profit hospitals took its toll: at the time of Flexner, almost 60 percent of all US hospitals were for-profit institutions; by 1968, only 11 percent were, with about an 8 percent share of hospital admissions.[22]

Eliminating most for-profit medical schools and hospitals made sense for the AMA and the rest of organized mainstream medicine, since for-profit institutions were controlled by owners or shareholders with an incentive to control costs in order to maximize profits. Nonprofits were free to pursue the political goals that organized mainstream medicine favored, especially a much lengthier and costlier education, which served as another barrier to entry to the profession. (Especially amusing was a 2004 article by two Dartmouth physicians arguing for maintaining restricted entry because of high costs.[23])

The Rise of Health "Insurance"

In the early 1900s, prepaid health plans were created for the timber and mining workers of Oregon and Washington to help offset the inherent risks of those industries. Within a free-market, for-profit insurance system, claims were closely monitored by adjusters: fees, procedures, and exceptionally long hospital stays were scrutinized and subject to challenge. A physicians’ group in Oregon that resented this type of scrutiny created a plan in which procedures were reimbursed and fees paid with few questions asked. Plans with similar structures began to dominate the market in other locations because of government-provided advantages.

By 1939 these loose cost-containment plans began to be marketed under the Blue Shield name. That same year, Blue Cross was endorsed by the American Hospital Association. Already in existence for ten years, Blue Cross had begun as a hospital-insurance plan for Dallas schoolteachers that allowed them to pay for up to three weeks of hospital care with low monthly payments.

After this, organized mainstream medicine waged an intense war on non-Blue plans. Goodman (1980) contends that some physicians lost hospital privileges and even their licenses for accepting non-Blue plans.[24] The Blues also gained government-supplied advantages not available to non-Blue plans. In many states they paid low or no premium taxes and sometimes no real-estate taxes. They also weren’t required to maintain minimum benefit/premium ratios and faced low or no reserve requirements. With these government advantages, the Blues steadily came to dominate the industry. By 1950, Blue Cross held 49 percent of the hospital-insurance market, while Blue Shield held 52 percent of the market for standard medical insurance.[25] They merged in 1982 and today cover one of every three Americans.[26]

Blues-created "insurance" was anything but true insurance.

  • Hospitals were paid on a cost-plus basis. Insurers paid not the prices charged to patients for the services performed but artificial "costs" that bore no necessary relationship to those prices.

  • Insurance of routine procedures. This converted insurance to prepaid consumption that encouraged overuse of services.

  • Insurance premiums based on "community rating." The word "community" meant that every person in a specific geographic area, regardless of age, habits, occupation, race, or sex, was charged the same premium. For example, the average 60-year-old incurs four times the medical expense of the average 25-year-old, but under community rating both pay the same premium (i.e., young people are overcharged and the elderly undercharged; see the worked example after this list).

  • A "pay-as-you-go" system. Unlike genuine catastrophic hospital insurance that placed premiums in growing reserves to pay claims, the new Blues’ "insurance" collected premiums that only covered expected costs over the following year. If a large group of policyholders became ill over several years, the premiums of all policyholders had to be raised to cover the increase in costs.

These traits spell cost-explosion disaster, so naturally they were incorporated into the federal government’s Medicare and Medicaid programs when those programs were created in the mid-1960s to address the problem of healthcare being unaffordable for the poor and elderly, a problem the state and federal governments had themselves created!

This leaves only the mystery of how health insurance became attached to employment. The answer is found two decades before Medicare and Medicaid. Wage and price controls enacted by the federal government during World War II prevented large employers from competing for labor on wage rates, so they competed on the quality of benefits, and the most effective benefit for luring labor to large employers was a generous health-insurance policy.

The federal government’s decision to allow large-employer benefits to be obtained tax-free, while effectively taxing plans purchased by small businesses and the self-employed, created a system in which medical insurance became perversely tied not only to the size of a worker’s employer but to employment itself. The price of health insurance for many self-employed workers and small businesses became unaffordable.

Health Maintenance Organizations (HMOs)

Health Maintenance Organizations (HMOs) were prepaid practices that began mainly on the US West Coast in the early 1900s. Western Clinic in Tacoma (1910) and Ross-Loos in Los Angeles (1929) were among the earliest. (Ross-Loos eventually became part of Insurance Company of North America [INA], which merged into CIGNA in 1982.) Kaiser Permanente began with a clientele of shipyard workers during World War II. After the war, it had hospitals and physicians, but no more worker clientele, so it started marketing to the wider public and by the 1970s had more than 3 million enrollees in five states.[27]

Still, HMOs had limited appeal. By 1970, Kaiser was the only major HMO in the United States, with most of its enrollees forced to join through their labor unions.[28]

Much more about HMOs will be covered in a forthcoming review in the Quarterly Journal of Austrian Economics. The purpose here is to emphasize that, despite some assertions to the contrary, HMOs are anything but free-market firms. The Health Maintenance Organization Act of 1973 made federal grants and loans available to HMOs, removed certain state restrictions if HMOs became federally certified, and required employers with 25 or more employees who offered standard health-insurance benefits to offer federally approved HMO plans.

"Obamacare," or More Accurately, ConservativeRepublicanCare

When you actually look at the bill itself, it incorporates all sorts of Republican ideas … a lot of the ideas in terms of the exchange, just being able to pool and improve the purchasing power of individuals in the insurance market, originated from the Heritage Foundation. (Barack Obama, NBC’s Today Show, March 30, 2010)

The latest chapter in US healthcare is one of the most surreal. The Patient Protection and Affordable Care Act of 2010 was signed into law by Barack Obama on March 23, 2010. Among its many provisions, the act includes expanded Medicaid eligibility, a prohibition on denying coverage for preexisting conditions, and a requirement to purchase federally approved health insurance or pay a fine.

While the content of the Act is summarized in myriad places, much more interesting are its conservative Republican origins. The Heritage Foundation’s Stuart Butler, the intellectual behind urban enterprise zones, proposed a plan for universal healthcare coverage in Senate testimony in 2003.[29] Here’s one surprising portion of that testimony, which sounds as if it had been uttered by a European socialist:

In a civilized and rich country like the United States, it is reasonable for society to accept an obligation to ensure that all residents have affordable access to at least basic health care — much as we accept the same obligation to assure a reasonable level of housing, education and nutrition.

Keep in mind that Butler is the conservative Heritage’s current vice president of domestic and economic policy. No wonder Butler seems to have found a new admirer in New York Times columnist Paul Krugman. Butler again:

The obligations on individuals does not have to be a "hard" mandate, in the sense that failure to obtain coverage would be illegal. It could be a "soft" mandate, meaning that failure to obtain coverage could result in the loss of tax benefits and other government entitlements. In addition, if federal tax benefits or other assistance accompanied the requirement, states and localities could receive the value of the assistance forgone by the person failing to obtain coverage, in order to compensate providers who deliver services to the uninsured family.

Now "Obamacare" is certainly more than just a mandate, but the mandate is certainly what has conservatives such as Rush Limbaugh and Sean Hannity, both of whom have connections to, if not sponsorship by, the Heritage Foundation, screaming bloody murder the most. There’s no doubt that these ideas influenced Mitt Romney’s healthcare plan in Massachusetts.

Romney subjected himself to a recent interview by Fox News’ Bill O’Reilly that can only be described as a disaster.[30] O’Reilly dwelled on the fact that outside tax dollars funded half of the plan, and Romney agreed, adding that the funding was approved by two conservative Republican HHS secretaries, Tommy Thompson and Mike Leavitt. In response to a question, Romney admitted that he didn’t know that emergency-room costs in Massachusetts had increased 17% over the previous two years. He repeatedly asserted that the plan solved a problem, but he couldn’t specify what that problem was, since Massachusetts had the highest per capita costs both before and after the plan.

As far as other conservative Republicans go, former Arkansas governor Mike Huckabee has repeatedly stated that he sees "some good things" in Obamacare, especially the expanded use of Medicaid.

Voters naïve enough to think they will get a complete repeal from the Republican Party appear to be in for a major disappointment. "Obamacare," with its continuance of socialized costs for private gains in American medicine, was the treatment that the conservative Republican doctor had in mind for some time. The problem is that the Democrats were the first to implement it.

Notes

[1] Flexner, Abraham. "Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching." Bulletin No. 4. Carnegie Foundation for the Advancement of Teaching, 1910.

[2] Hamowy, Ronald. "The Early Development of Medical Licensing Laws in the United States, 1875–1900." Journal of Libertarian Studies 3, no. 1.

[3] Census data. See also Hamowy.

[4] Census data. See also Hamowy.

[5] Chaillé, Stanford E. "The Medical Colleges, the Medical Profession, and the Public." New Orleans Medical and Surgical Reporter, May 1874, pp. 818–19. See Hamowy, p. 105, note 4.

[6] Hamowy, p. 73.

[7] While the founder of eclecticism, Samuel Thomson, was a farmer, homeopathy’s founder Samuel Hahnemann was an actual physician, some of whose insights ended up being incorporated into mainstream medicine.

[8] Chaillé.

[9] The other view, from sociologist Paul Starr in The Social Transformation of American Medicine (Basic Books, 1982), is that eclectics and homeopaths committed career suicide by joining with allopaths in the campaign to re-secure licensing. Starr does not see Flexner as decisive in killing medical schools. Compare Reuben Kessel, "Price Discrimination in Medicine," Journal of Law and Economics 1 (Oct. 1958), pp. 20–53: "If impact on public policy is the criterion of importance, the Flexner report must be regarded as one of the most important reports ever written."

[10] Link, Eugene Perry. The Social Ideas of American Physicians (1776–1976): Studies of the Humanitarian Tradition in American Medicine. Associated University Presses, 1992. See chapter 4.

[11] A recent article notes a supposed increase in the number of schools and students in recent years, undermined by an undersupply of Medicare-funded resident positions.

[12] Rockwell, Llewellyn H., Jr. "Medical Control, Medical Corruption." Chronicles, June 1994.

[13] His first book, The American College: A Criticism, was published in October 1908, yet he joined Carnegie that same year.

[14] Goodman, John C., and Gerald L. Musgrave. Patient Power. Washington, DC: Cato Institute, 1992. See pp. 137–61.

[15] See both Rockwell and Goodman and Musgrave (especially chart on p. 145).

[16] Beck, Andrew H. "The Flexner Report and the Standardization of American Medical Education." JAMA, May 5, 2004.

[17] Goodman and Musgrave, p. 145.

[18] Goodman and Musgrave, p. 147.

[19] Very roughly, almost $80,000 in 2009 dollars.

[20] Wolinsky, Howard and Tom Brune. The Serpent on the Staff: The Unhealthy Politics of the American Medical Association. Tarcher Putnam, 1994. See "Rags to Riches" in chapter 3.

[21] AMGA (American Medical Group Association) survey, 2009.

[22] Goodman and Musgrave, p. 156.

[23] Weeks, William and Amy Wallace. "Weakness in Numbers," Barron’s, June 14, 2004.

[24] Goodman, John C. The Regulation of Medical Care: Is the Price Too High? Washington: Cato, 1980. See also Patient Power, p. 159.

[25] Goodman and Musgrave, p. 160.

[26] See this BCBS page.

[27] Dranove, David. Code Red: An Economist Explains How to Revive the Healthcare System Without Destroying It. Princeton, 2008, p. 61.

[28] Holleran, Scott. "The History of HMOs." Arizona Republic. Nov. 1, 1999.

[29] Butler, Stuart. "Laying the Groundwork for Universal Health Care Coverage." Senate testimony, March 10, 2003.

[30] The O’Reilly Factor, episode April 12, 2010.

Reprinted from Mises.org.