Does every US citizen have to have health insurance?

The goal of health care reform is to make health insurance affordable and available to all Americans. The Affordable Care Act requires nearly all Americans to have health coverage, although since 2019 there has been no federal tax penalty for going without it. Most types of coverage satisfy this requirement, including insurance you get from an employer.