Do companies give life insurance?

Many employers offer group-term life insurance as an employee benefit, though other types of coverage may be offered as well. Term insurance is life insurance that remains in effect only for a set period of time. With employer-provided term life insurance, the term generally lasts for as long as the employee remains employed.
