Is health insurance mandatory in Hawaii?

On January 1, 1975, Hawaii became the first state to have a mandated health insurance program in effect. The law, enacted June 12, 1974, requires employers to provide protection against the costs of hospital and medical care for their employees.
