Is health insurance mandatory for visitors to the USA?
Technically, no. If you are traveling to the US for a short period of time, you are not required by law to have health insurance. However, the cost of healthcare in the US is very high, so while medical insurance for visitors is not mandatory, it is highly advisable.