The U.S. is Not a Democracy, It Never Was
One of the most steadfast beliefs regarding the United States is that it is a democracy. Whenever this conviction wavers slightly, it is almost always to point out detrimental exceptions to core American values or foundational principles. For instance, aspiring critics frequently bemoan a “loss of democracy” due to the election of clownish autocrats, draconian measures on the part of the state, the revelation of extraordinary malfeasance or corruption, deadly foreign interventions, or other such activities that are considered undemocratic exceptions. The same is true for those whose critical framework consists in always juxtaposing the actions of the U.S. government with its founding principles, highlighting the contradiction between the two and clearly placing hope in its potential resolution.
The problem, however, is that there is no contradiction or supposed loss of democracy because the United States simply never was one. This is a difficult reality for many people to confront, and they are likely more inclined to immediately dismiss such a claim as preposterous rather than take the time to scrutinize the material historical record in order to see for themselves. Such a dismissive reaction is due in large part to what is perhaps the most successful public relations campaign in modern history. What will be seen, however, if this record is soberly and methodically inspected, is that a country founded on elite, colonial rule based on the power of wealth—a plutocratic colonial oligarchy, in short—has succeeded not only in buying the label of “democracy” to market itself to the masses, but in having its citizenry, and many others, so socially and psychologically invested in its nationalist origin myth that they refuse to hear lucid and well-documented arguments to the contrary.