Is it a must to have health insurance?

No. As of January 1, 2019, health insurance coverage is no longer effectively mandatory at the federal level: the Affordable Care Act's individual mandate remains on the books, but its federal tax penalty was reduced to $0. However, some states and the District of Columbia have their own coverage mandates, so you may still owe a state tax penalty if you go without health insurance.
