Is medical insurance mandatory in Florida?

No. A few states have enacted their own health insurance mandates, but as of open enrollment for 2022 health plans, Florida is not one of them.
