Thursday 28 November 2013

Health Insurance

Health insurance is coverage that pays for some or all of a person's medical expenses; it is often provided through an employer. When a company offers health insurance benefits, it pays all or part of the medical insurance premiums for its employees. Employers are generally not required to offer health insurance to employees.


Health insurance is usually a matter of agreement between an employer and its employees. By definition, an employer-sponsored health plan is an employee welfare benefit plan established or maintained by an employer, by an employee organization, or by both, that provides medical care for participants and their dependents through insurance.
