Is medical insurance mandatory in the USA?

As of Jan. 1, 2019, health insurance coverage is no longer mandatory at the federal level: the Affordable Care Act's individual mandate penalty was reduced to zero. However, some states still require residents to maintain coverage and impose a state tax penalty on those who go without it.