Is health insurance still mandatory in Georgia?

Technically, the Affordable Care Act (aka Obamacare) still says that you must have health insurance, but the federal penalty for going without coverage was reduced to $0 starting in 2019, so there is no longer a federal fine for being uninsured. A few states have since passed their own coverage requirements with state-level penalties, but as we approach open enrollment for 2022 health plans, Georgia is not one of them.