It strikes me that the most likely root cause of all the woes of the Western world is the American Empire.

It's as though we're living in Rat Utopia, but the rats are our governments. They have not a care in the world. No one is coming to attack them, whatever money they need they can print, and they get to spend their existence inventing novel theories of morality to impose upon their people.

If governments had to obey the laws of nature, they could never oppress their most productive citizens while raining treasure down upon their criminals and bums. It's hard to appreciate just how unnatural the world we live in really is.

I'm increasingly thinking that what the world truly needs is the fall of the American Empire. We need borders to move again, we need countries to be conquered again, we need to see royal families go extinct. We need to return to an era when leaders have THE FEAR: the knowledge that if they lose focus on the war-fighters, the productive, and the families, they too will become prey.

I have personal reasons why I don't like Russia, but I have to acknowledge that a world without predators is a world without a future.

@cjd No, it's the British Empire, because there is plenty of evidence that America is still secretly a British colony.

informerarchives.com/james-mon

British, American: these are just words. We're talking about the same empire, and the point is that once it collapses, countries will have to go back to being actual countries.