Has America turned Fascist?
Before posing my question, I want to make it clear that I am not anti-American; in fact, I ask precisely because I am not against America. Many social commentators have noted, however, that America has witnessed a polarization of power in which corporate and business interests are held paramount over those of the people.
Furthermore, the government seems merely to serve the interests of the moneyed classes and the vastly powerful "military-industrial complex", as Eisenhower called it.
This goes hand in hand with an ongoing erosion of civil liberties, in tandem with the growth of the surveillance society, justified of course as protection from some terrible threat, real or imagined.
Has America arrived at its own version of modern fascism as its authoritarian hand strengthens?