There was a time not too long ago – the Revolutionary War and WWII eras especially – when America spoke with a sense of moral authority, not only because the world believed the rhetoric and the script, but because Americans did. Thus America received the benefit of the doubt, even when self-interest was evidently mixed with a just cause. But has America lost that legitimacy? If so, how did this happen, and does it even matter?
Does political power really need legitimacy? If so, then can the situation be reversed?
And in any case, why should we care? The Bush administration's shredding of the Magna Carta and the Constitution draws what response from you, gentle reader? Numbness? Fear? Revulsion? Apathy?
Take a gander at Taki Mag's "Authority Issues: Is There Sovereignty Beyond the State?" by Thomas E. Woods, Jr. and relish a decent treatment of this, the "forbidden subject".
QUOTE We need government to uphold the norms of morality, I am told by people who specialize in the unintentionally funny. If the moral condition of society has reached a level at which we would look for relief to the kind of men who can succeed in a political system like ours, then the patient is terminal. On the other hand, had the churches not turned their attention these past 45 years to lettuce boycotts and excited pronouncements about the wondrous prospects of the modern world, they might have done more to arrest the moral decline for which we are now told we need the state. UNQUOTE