The United States is nearly alone among developed countries in relying on employers to provide health insurance as a benefit of employment, and many people wonder how this came to be and why the benefit is now slowly drifting away. The answer may surprise you. It is a lesson in history that explains not only where we have been but also how we wound up where we are.
There were no health insurance benefits for employees until World War II. During the war, the U.S. government did not want to penalize its soldiers, so in an attempt to keep a level playing field it banned companies from giving pay raises to civilians until the war was over. As a result, employers looked for a way to reward employees without raising wages, and they decided that paying for workers' health insurance coverage was a way to do so without breaking the pay freeze.
Initially, health insurance benefits were paid for entirely by the employer, but as time has gone on and health insurance costs have risen, more of the premiums have gradually shifted onto employees and their families. This rising cost has placed an enormous financial burden on workers, who are effectively earning less while paying for a larger share of their own health insurance coverage.
Over time, fewer employers offered free health care to their employees, and the trend snowballed into the situation the country is in today, where fewer people are covered through their place of employment. What began as a perk came to be taken for granted: instead of appreciating the benefit, employees started to expect it.
People can blame the deadlock between Republicans and Democrats for the health insurance problem, and they would not be entirely wrong, but the blame applies to both sides. There are no quick fixes, and until a compromise is reached, Americans will still be debating the future of health insurance.