July 22, 2020, 8:10 a.m. -  WalrusRider

The United States was the first country in the Americas to gain independence from a colonial power, which probably contributed to US citizens adopting the term "American" for themselves.
