65racecoupe said: Imperialism
1. the policy of extending the rule or authority of an empire or nation over foreign countries, or of acquiring and holding colonies and dependencies.
2. advocacy of imperial interests.
3. an imperial system of government.
4. imperial government.
5. British. the policy of so uniting the separate parts of an empire with separate governments as to secure for certain purposes a single state.
http://dictionary.reference.com/search?q=imperialism
England, Spain, Portugal....These were imperialist countries.
An imperialist country has colonies around the globe, and imperialist countries try to expand their empires. The United States has never been an imperialist country, at least not for long. The westward migration in the 1800s was not imperialism.
The U.S. has more pull around the world than anyone, and we do protect our interests, but show me the colonies we actually hold. We are far from isolationist and far from imperialist.
You actually think that the situation in Iraq makes the U.S. an imperial nation?
OK, I was looking for your own interpretation, but Mr. Webster is adequate, I guess....