I see it time and time again on American news, the internet, in articles, etc.: Americans cannot stand 'the left'. I'd like to know why this is. Surely the US was founded on liberal ideals, otherwise the colonists wouldn't have left the Old World for the New one!? What's with all these hissy fits against liberals? I thought I'd post here and pose the question, since so many users here are American.