Is it mandatory to have health insurance in Texas?

Texas does not require residents to have health insurance under state law. At the federal level, the Affordable Care Act's individual mandate still applies to Texans on paper, but the federal penalty for going without coverage was reduced to zero beginning in 2019, so there is no financial consequence for being uninsured. Texas uses the federal Healthcare.gov exchange for marketplace plans and consistently ranks among the states with the largest marketplace enrollment totals.
