Is healthcare free in America?

The United States has no universal healthcare system. The government does not provide health benefits to all citizens or to visitors; any time you receive medical care, someone has to pay for it.
