Yes. Think of it as the "law of common sense." If you become ill or injured, and the illness or injury arises out of your employment, it is a workers' compensation case. That also covers a pre-existing condition that your work makes worse. You need a doctor to diagnose what is wrong, determine whether it was caused or worsened by the employment, and give you treatment options. If the condition is work-related, the cost of treatment will ultimately be the responsibility of your employer's workers' compensation insurance. You also need a lawyer.
Answered on Sep 07th, 2012 at 2:14 PM