Does America have free healthcare?

The United States has no universal healthcare system. Outside of targeted government programs such as Medicare and Medicaid, the government does not provide health coverage to citizens or visitors. Any time you receive medical care, someone has to pay for it, whether that is you, an insurer, or a government program.
