If you run a business or work in one, does it provide you and its employees with free healthcare? I think most big employers should offer some sort of health insurance to their employees. What do you think, though? Should businesses be required to provide these kinds of benefits to their employees?