The political spectrum here in the States is entirely dominated by two parties, Democrat and Republican. Now, I know what more or less qualifies as "left" or "right" differs from one part of the world to the next. But am I wrong in saying that both of the (let's be honest) only important political parties are on the right side of politics?
The way I see it, one party is far right (Democrats) and the other party is nothing but Nazis who don't like being called Nazis (Republicans).




This may not be a contribution to your question, but this is hands down the worst thread of yours I've ever encountered. Not that you're obligated to conform to my standards, but did you really feel it was necessary to make a thread for this question? I'm serious, and I'm not intending to be rude.