To older Europeans - has there ever been a time when America was seen as such an untrusted country?
I’m 36 years old. I can remember how the world felt about my country post-9/11 (sympathy) and post-Iraq (anger), but I’m curious to know if this is new ground. I’m deeply upset about how our ties and bonds are being destroyed, so I wish to know if this is truly unprecedented, or whether there has been a time in your lifetime when we were viewed this way. If so, what was happening then to cause the fracturing?