What is health care?

Health care is the maintenance or improvement of health through the prevention, diagnosis, treatment, amelioration, or cure of disease, illness, injury, and other physical and mental impairments in people. It is delivered by health professionals and workers in allied health fields.