Are Business Owners Required to Provide Health Insurance to Employees?
Business owners are generally not required to provide health insurance to their employees. However, the health care reform law enacted under the Obama administration, the Affordable Care Act, encourages employers to offer coverage. Under the law's employer mandate, originally scheduled to take effect in 2014, businesses with 50 or more full-time employees that do not offer health insurance face a penalty. Because that penalty may cost a business owner less than the cost of providing health benefits, the law's practical effect on employer coverage remains to be seen. In addition, lawmakers in several states have proposed legislation that would exempt employers in their states from the new mandatory health insurance rules.