Is healthcare a right in the USA?
Universal access to health care, without discrimination, is a human right enshrined in the Universal Declaration of Human Rights and the International Covenant on Economic, Social and Cultural Rights.