What is a healthcare company?
Health Care Company means a Person that is engaged, directly or indirectly, in (a) owning, operating, or managing one or more facilities that dispense, market, or provide healthcare products or services, including, without limitation, pharmaceutical products or services, (b) purchasing, repackaging, selling or …
