Is travel insurance required to enter the USA?
Now that you’ve obtained your visa, you may be wondering whether you also need travel insurance to enter the U.S. The short answer is no: travel insurance is not among the current U.S. entry requirements, and foreign visitors are not required to purchase it before entering the country.