

In a formal document, the United States has assigned a prominent role to the "Western Hemisphere," the term it generally uses for the Americas.