Why do many Americans think that healthcare is not a right for its own taxpaying citizens, and should it not be considered as, or more, important by the American public than is education?
Many believe in the right to go to school because education is an investment in the American people and enables them to pursue their dreams. Shouldn't healthcare be equally important, if not more so, for the same reason?
tynamite
Of course they think it's a right. It's not their fault that their government is corrupt and fixes healthcare prices. Only in America could you have a system where a single person pays more for healthcare than a married couple does. People didn't vote for the Senate healthcare bill.