ZombieHitler
Member
Well, in general, it seems difficult to even have an honest or direct conversation about America at all; it would seem to be quite a controversial country. I don't know if it even exists as a country anymore so much as some strange entity in people's minds, "America," this seemingly strange place that strikes an emotional chord of fear, envy, distrust, resentment, or, I imagine, a multitude of other things, like some name that must never be mentioned but always be kept in the backs of your minds. (Yes, I am being sort of melodramatic.)
Anyway,
I was wondering, in general: it's very obvious that America has always had a kind of stand-offish and complicated relationship with the State (despite its name), but is the negative perception of America partially due to its history? I don't mean recently, or in the last couple of decades, but its history in the sense of how it came to be, its origin and such. In many Americans' minds, their country is synonymous with the notion or ideal of "freedom," is it not? Saying that would make many snicker, but they really mean it, don't they?
I then wonder: is the perception of America, that they fought for and won their independence, true?
And if so, was it justifiable? Was Britain really the oppressor, and was their desire to "free" themselves really necessary? And is that what actually took place?
And if so, is this generally where much of the negative perception of America comes from? Is having broken off and fought for independence generally viewed as a wrongdoing or an overall "bad" thing to do?
(Also, looking past slavery and the Native Americans, please...)
Or perhaps actions like those negate any positive perception of America altogether.
But I ask because, as a Westerner, the whole "freedom" and "independence" thing really strikes a deep chord with me, and I am just wondering what all the preconceived notions and thoughts are surrounding the whole idea of those things. Is it all considered contrived or something?
I get the sense that many on the left are of the mind that you should, unquestioningly or by default, give all power and rights to the State, and they blame all wrongdoing or problems in the world on this alone.