Is health insurance mandatory in Alabama?

Alabama Healthcare Insurance: What you need to know

No Alabama state law requires employers to offer group health insurance to their employees, but most employers do provide this benefit.