Some jobs out there come with benefits that help with a lot of things: some cover part of your health insurance, some put a little money away toward your retirement or pension, and so on. Do you get health insurance through your employer, or are you stuck paying for it yourself? Do you wish more companies offered health insurance and other benefits like it?