Is healthcare a right in the USA?

Universal access to health care, without discrimination, is recognized as a human right in the Universal Declaration of Human Rights and the International Covenant on Economic, Social and Cultural Rights.