Here’s what I believe about health care in the US.
1. Everyone should have it.
Every other industrialized country has decided that health care is a human right and has found a way to provide it to everyone. We should do the same. No one should die or go broke because they don't have health insurance. If Europe can do it, so can we.
2. Making a profit from health insurance isn’t right.
Doctors, nurses, and other health *care* providers should be allowed to make a profit, but nobody should be allowed to profit from providing insurance. That simply extracts more money from the system and redirects it to people who don't make anyone better. Every other industrialized country has a non-profit health insurance program. Profit motives provide the wrong incentive: there should never be a financial reward for denying needed care.
3. A public option should be available.
If a public option is allowed to compete with private health insurance, we win either way. If the government can provide the same level of care, pay doctors the same, and cost less overall, how can that be a bad thing? At least let them try. We have public and private schools; why not public and private insurance? May the best plan win!
That’s what I believe. If you happen to feel the same, please feel free to repost. If you believe something else, post what *you* believe.