The Decline Of The Western Empire

Since the end of the Cold War with the Soviet Union, the United States government has held an uncontested position of world dominance, a de facto one-world government under which it has subjected the planet to its dictates. This is now changing; recognizing that change is the first step toward creating a brighter future, but in order Read More…

The post The Decline Of The Western Empire appeared first on The Last American Vagabond.
