
You could argue that medicine was never meant to be a for-profit business in the way that selling cars, cosmetics, and fast food are. And yet, in the United States, health care has become exactly that. The story of how this happened is complex, but decisive elements include the advent of Medicaid and Medicare in 1966 and the widespread availability of employer-sponsored health insurance.
