Most workers in the United States are covered by workers' compensation if they are injured on the job. This coverage allows them to recover damages for expenses associated with a work injury, from medical bills to lost wages. This compensation is generally awarded...