I'm not American, and I don't understand hysterical anti-Americanism either. They say America is a sexist, racist nation that held minorities down, yet civil rights and modern feminism are both AMERICAN movements. America, as a Western nation, did WAY more for colored people than most impoverished colored countries, where women don't even have basic rights in the first place. Weren't people like MLK or Kennedy real Americans? Don't they understand that? Without the West, and without powerful Western nations like America, the world would actually be a worse place to live in. Your very technology has a Western origin. Why don't people appreciate what they have?
You people should watch Paul J. Watson's video "Some cultures are better than others". It's priceless. I'd post it here, but I rarely use the forum and I don't know how to link videos yet.
Spoiler: contrary to popular belief, the West actually ended slavery.