By the late 1800s, America had become a leading economic power and had come to see itself as a dominant nation that deserved respect. As history shows, from the Babylonians and the Roman and Persian empires to the British Empire and Nazi Germany, such power commonly leads a nation toward authoritarian rule and the conquest of weaker nations. America followed suit in the late 1800s through the Spanish-American War of 1898: it helped Cuba gain its independence, took control of Guam and Puerto Rico, and paid Spain for the rights to the Philippine Islands.

But is the American empire really the same as the powerful empires before it that bullied their way into controlling other nations? In the sense that America did force its will on other world territories to gain control of them, it is the same. What I believe separates the American Empire from every other empire in history, however, is that its justification for conquering and controlling territories rests on humanitarian and righteous causes. Do I believe American self-interest is also involved? Yes, I do. But the emergence of the American Empire into the 21st century has mostly involved an interest in bettering other nations and improving the horrid conditions their citizens endure.

While I do not condone any country meddling in the affairs of another, I do believe an elite nation has a responsibility and an obligation to help nations that are not self-sufficient or that suffer under the control of inhumane leaders. Where to draw the line on when a nation should intervene in another country's affairs is certainly subjective, but that should not become a justification for taking no action.