nakedempire


The American Empire in a Changing World




Friday, August 3, 2012

"American Internal Colonialism"

From Foreign Policy Journal
By Devon Douglas-Bowers


"Colonialism is a word associated with the 19th and 20th centuries, with an outside force (usually European) coming into a country and destroying and uprooting the culture and people, with the main goal being the extraction of resources for the gain of the ‘mother’ country. It is defined as “the policy or practice of acquiring full or partial political control over another country, occupying it with settlers, and exploiting it economically.”[1] Yet this definition of colonialism can be expanded from examining the external to examining the internal. For what may be the first time in US history, internal colonialism is occurring as the very facades of democracy and the economic system begin to fall apart and the elites begin to colonize internally."

read more 
