I mean, they make great films, but almost everything else about the US sucks. They're really undeveloped for a developed country, yet their citizens all seem to LOVE the US with all their heart. I just don't get it. They don't have universal health care, there are so many mass shootings, discrimination and sexism, the gap between rich and poor is huge, their foster care system is shit, most people there don't give a shit about the environment, Trump, the whole abortion debate, etc. etc. Do Americans not see that the US isn't a great place, and that almost any European country is more developed and fairer in almost every aspect? I'm not saying there's a perfect country, no way, but I feel like countries like the Netherlands, Sweden etc. are way out of the US's league. So why don't Americans see that, and why do they get so aggressive if you don't agree that the US is a great country? Sorry if I seem rude, I'm just really confused and kinda pissed at the US rn, pls don't be too offended.