Does everyone in the US get health insurance?

Not everyone in the US has health insurance. The goal of health care reform is to make health insurance affordable and available to all Americans, and the law requires nearly all Americans to have health coverage. Most coverage satisfies this requirement, including insurance you get from an employer.
