Is it mandatory to have travel insurance?

Is travel insurance a legal requirement? No, you are not legally required to have travel insurance. However, some tour operators will insist you have a policy in place before they confirm your booking, especially for destinations such as the USA, where there is no public health service.

See also: How does annual travel insurance work?