Does the US require health insurance?

Health insurance is no longer mandatory at the federal level: as of Jan. 1, 2019, the federal tax penalty for going without coverage was reduced to zero. However, some states still require residents to maintain health insurance and impose their own tax penalty on those who do not.
