This is a legitimate question for you guys and no, I'm not trolling. This also isn't about politics per se, but male-female relations and Islam. I'm hoping this thread doesn't degenerate to petty racial/religious bickering. Lol.
So here goes. I read a lot of talk in the manosphere about two seemingly separate issues:
1) That Western society is in decay. Feminism has ruined the culture, "equalism" has led to rampant degeneracy, women are poisoned with the idea that they should be strong, independent, sexually liberated and even promiscuous, and bytchy. Men are often told to be the opposite or to submit to women. We all know the details...lots of angry slvts and betas and the breakdown of the family.
2) That Islam is the religion of (at least) an invading people, that our lax borders and political correctness have allowed Muslims to pollute Western countries, that it's not a religion of peace but war, that Muslim people are a$$-backwards and have no place here.
Now, I agree a lot with #1 and some of #2. There's a lot of backwardness in Islamic countries, real medieval stuff that makes no sense to me. And certainly the West is a dung pile.
The question is: Isn't Islam (let's say, a "liberal" version of it) fairly in line with a lot of red pill values?
When I see or hear about non-fanatical, non-jihadist Muslims, I see God-fearing people who put family first, and put the man at the top of the family without exception. I see modest and respectful women, and hard-working young men who maintain a steadfast faith (so far as I can tell) that requires daily prayer and even requires them to tithe a percentage of their earnings to charity. I see tight-knit communities, here in NYC at least, suspicious of outsiders but probably honor-bound to welcome a stranger in need no matter what. In more conservative Muslim countries, women stay at home, raise children, cook, often don't vote or even drive. Men must win bread for their families, and women must care for them. There are no slvt-walks, no half-naked selfies, no tattoos, no girls' nights out.
Sound familiar?
It sounds a lot like the red pill utopia men here and elsewhere are often pining for. Sure, it's not 1950s America™, but it's a foreign version of some patriarchal ideal everyone wishes we could return to, if only we could turn back the clock.
So why are Muslims often reviled by red pill men? I don't mean the rapists and scoundrels. Show me a million people in any culture and I'll find you 50,000 animals that need to be locked up.
Or, to put it another way, if Islam did come to dominate America and the West, would we be better or worse off than we are today? Looking forward to hearing your thoughts.