Is insurance an American thing?

Throughout much of the colonial period, Americans simply went without insurance. It arrived on the American landscape at about the same time as the idea of a single nation, the United States, began to form, and it was ushered in by one of the country's Founding Fathers.
