The Decline and Fall of Biden's America

Biden becoming president is not what is weakening America. The fact that Biden ever became president is proof America was already in decline. Biden, like Trump, is a symptom of a much more pernicious disease.