Saturday, January 8, 2011

What do you think of the United States of America? What do you think of Americans? Don't worry, you won't offend me at all.

I think America, and most Americans, believe it's the best country in the world, which is fucking arrogant and untrue. America has a lot of problems that it either ignores or downplays. To become a better country it needs to accept this and try to fix them, but the majority of Americans don't want to.

That's my opinion of America in general. I know plenty of Americans who are the total opposite of this, so it's not ALL Americans, but it is a lot of them.

Thanks for this question; I don't often get such intelligent and thoughtful ones. It's a nice change.

Ask me anything
