As an American, I can tell you there is no far left in America. America is mostly far right and center right. On some social issues you could call it left leaning, for example legalizing recreational drugs; however, that alone doesn't make America anywhere close to being far left. In terms of economic...