In India, is it mandatory for employers to provide health insurance to employees? If so, what are the requirements?
There is no blanket legal requirement for employers in India to provide private health insurance to all employees. However, statutory health insurance is mandatory for a large class of workers under the Employees' State Insurance (ESI) Act, 1948: establishments with 10 or more employees (20 in some states) must register with the Employees' State Insurance Corporation and cover employees earning up to ₹21,000 per month. Coverage is funded by contributions from both sides, with the employer paying a larger share of wages than the employee. For employees outside ESI's wage ceiling, employers are generally not obligated to provide health insurance, though many offer group health insurance as a voluntary benefit. Requirements can vary by state and sector, so it is worth confirming the current thresholds and contribution rates with the ESIC or a local employment law advisor.