America is Unfairly Marked as the Source of All Evil

I am so fucking tired of people blaming America for all the world's ills. They see poor and sick people in Africa, and whose fault is it? America, naturally! Why are people starving in Asia? America again! Get some fucking perspective, please! What if America disappeared? What if it had never existed? Take your pick of what would rule the world: Nazism or Stalinism. Whenever the countries that claim to be against us need help, who do they turn to? America!

Even today, Germany depends on some 70,000 American troops stationed on its soil for defense. When Europe had trouble in the Balkans during the 90s, it could not solve its own problems; it needed America to clean up the mess. The United States gives 60% of the world's food aid, yet I never hear anyone mention any of that! If we withdrew from the rest of the world, stopped giving aid to everyone, and said, "The hell with you guys," do you honestly think France or Germany could run things? The world would fall apart in two minutes! I am at the point now where I would love to do exactly that.
