This is for everybody who doesn't live in the United States....
It seems like everybody hates the United States nowadays. Whenever I look at magazines that ask people what they think of the United States, they always say that we are stuck up, etc.
So, what do you think of the United States and the people who live there? Please be honest...