Health Care
What does Health Care mean?
Health care encompasses all medical treatment needed for a workplace injury. It is a broader, more encompassing term than medical care: in addition to the treatment necessary to help a claimant reach maximum medical improvement, it includes medical aid, examinations, diagnoses, services, evaluations, and any treatment needed for healing and the prevention of disease.
A health care facility is the general term for the place an injured worker goes to receive medical treatment, such as an emergency clinic, outpatient clinic, hospital, or doctor's office. It can provide primary, secondary, and tertiary care, as well as public health services.
Under workers' compensation laws, employers may have the legal right to require employees to seek treatment only at approved health care facilities, meaning facilities licensed by the state and approved by the employer's insurance company.
Health care is provided to an injured worker for the duration of the injury or until the worker reaches maximum medical improvement. The goal of health care is to help the worker heal and return to work as quickly as possible. If you have been injured at work and need health care, talk to your employer before seeking medical treatment.