I sometimes have people tell me they started a new job and then found out there was no health insurance. They express shock: but they have to give me insurance, don't they?
No, they don't. No law requires any employer to provide any particular benefits to employees. There are tax incentives for employers that provide benefits like health insurance and 401(k) plans, which is why so many do (that, and the executives want them). Also, under the Affordable Care Act, some large employers must pay an assessment if they don't offer health insurance to full-time employees.
You should always ask about benefits before you accept a new job. The time to negotiate is before you accept, not after you start.
Even though employers don't have to provide benefits, once they do, those benefits are regulated by law. Here are some things you need to know about benefits you might get in your new job, what happens to your benefits when you leave, and the federal laws that govern benefits.
Read more on AOL Jobs.
Thanks again to Gina Misiroglu of Red Room for putting me in touch with the AOL people!
Have a general question about employment law? Want to share a story? I welcome all comments and questions. I can't give legal advice here about specific situations but will be glad to discuss general issues and try to point you in the right direction. If you need legal advice, contact an employment lawyer in your state. Remember, anything you post here will be seen publicly, and I will comment publicly on it. It will not be confidential. Govern yourself accordingly. If you want to communicate with me confidentially as Donna Ballman, Florida lawyer rather than as Donna Ballman, blogger, my firm's website is here.