Is health insurance mandatory in USA?

Health insurance is no longer mandatory at the federal level as of Jan. 1, 2019, when the federal tax penalty for going uninsured was eliminated. However, some states still require residents to carry health insurance coverage or pay a state tax penalty.
