What year did the US become imperialists?

The United States began its imperialist pursuits in the late 19th century, most notably with the Spanish-American War of 1898. At the end of that conflict, the US acquired territories including Puerto Rico, Guam, and the Philippines. This marked the start of a period of U.S. imperialism that continued into the mid-20th century.