Is travel insurance mandatory for USA?

Do I need travel insurance for the US? Legally, no. Visitors traveling to the US for a short period are not required by law to have health insurance. However, because healthcare costs in the US are extremely high, medical insurance for visitors is strongly recommended even though it is not mandatory.
