Has the West Fallen?
In recent years, we’ve all heard the claim that Western civilization is collapsing under “globalist agendas.” But is that true—or is it just paranoid speculation, born from biased observation and narrow-minded thinking? Let’s break it down.
It’s definitely a popular talking point. Politicians weaponize it; people parrot it constantly. But like most things, the truth isn’t black and white. First, we have to ask: has the West even fallen? Look at Europe, and you could argue yes. But then you turn to America, where conservative influence is rising, and suddenly the narrative isn’t so simple.
I speak as someone of Indigenous and Irish descent, raised in the 90s, in what felt like an “untouched” version of Western society. Has there been change? Absolutely. The demographic makeup of my country has shifted—artificially, in my view—largely driven by left-leaning globalist policies. The left has dominated my country politically, and it shows.
Now, before you dismiss me as a “racist right-wing bigot,” let me be clear: I am actually aligned with the political left. I value diversity. I love my Black brothers and sisters, my Indigenous roots, my Irish heritage. I grew up surrounded by Portuguese families and Asian friends. I care deeply about inequality and the struggles of marginalized people.
But here’s where I draw the line: religion. I hate it—with a passion. Not in the sense of violence, but in the sense that I don’t want it anywhere near me or my family.
And this is where things get complicated. Over the past few years, I’ve struggled to identify with the political side I once called home. My country’s institutions, public spaces, and even culture have increasingly been influenced by religion, particularly Islam. And yes—I am an Islamophobe. But don’t confuse that for racism. I am equally hostile toward Christianity, Judaism, and every other religion. I hate the entire structure of faith-based thinking.
Why? Because history speaks for itself. Religious conquest and genocide are not abstract concepts—they’ve scarred humanity for millennia. Islam, for instance, has its own bloody record: the Armenians, the Kabyle of Algeria, the Jews of Judea, and many more. The Quran is often misread as a peaceful counterpart to the Bible, but the real danger lies in the Hadiths—most Western liberals don’t even know what those are. Read them, and you’ll see what I mean.
But let’s be fair—Christianity is no better. Let’s talk about Mary’s age. Let’s talk about Aisha’s. Why do we normalize pedophilia when it’s tied to prophets or so-called “holy men”? Excuses like “different times” don’t cut it. A prophet claiming divine purity and moral superiority has no business embodying the same corruption as a medieval king.
So, back to the main question: has the West fallen?
No. Not yet. But it could within the next 20 years if we stay on this path. What we need is not hatred, not racism, but reform grounded in science and reason. Religion cannot be allowed to gain ground—not an inch. Because if it does, our children will be just as indoctrinated as our grandparents. Not stupid, but shackled by dogma.
If we want a future worth living in, we need to fight for it. And that means stripping religion of its influence once and for all.