When to See a Doctor After a Work Injury
Work-related accidents and injuries should be reported to your employer as soon as possible, and treatment should be sought immediately, even if the injury seems minor. This means seeing a doctor right after the accident, or at the first sign of any symptom that could be related to your work.