History of Europe

When did the US officially declare war on Germany in World War 2?

The United States officially declared war on Germany on December 11, 1941. After the Japanese attack on Pearl Harbor on December 7, the United States declared war on Japan on December 8; Germany then declared war on the United States on December 11, and Congress responded with its own declaration the same day.