Does everyone get insurance in the US?

The goal of health care reform is to make health insurance affordable and available to all Americans, and the law requires nearly all Americans to have health coverage. Most types of coverage satisfy this requirement, including insurance you get through an employer.