Is insurance an American thing?

But throughout much of the colonial period, that’s just what Americans did. Insurance arrived on the American landscape at about the same time as the idea of a single nation—the United States—began to form, and it was ushered in by one of the country’s Founding Fathers.
