Is healthcare a right in USA?

Universal access to health care, without discrimination, is a human right enshrined in the Universal Declaration of Human Rights and the International Covenant on Economic, Social and Cultural Rights.
