All Christianity seems to do is damage the world

Think about it. While Americans are distracted by gay marriage, which wouldn’t even be a big issue if it weren’t for Christianity, the NSA is spying on them and barely anyone cares, the country still has massive unemployment, and the government would be broke if it weren’t printing dollars, which only makes those dollars less valuable. In this case, Christianity, or more precisely Christian propaganda, is putting the future of America in a very bleak spot.

Thanks to Christian influence on Hinduism and Indian society, nobody talks about AIDS, which invariably helps AIDS thrive and spread. Even local celebrities who try to speak out about AIDS are silenced.

In Europe, women were burned at the stake over what were ultimately arbitrary accusations of witchcraft; in Salem, they were hanged. In Salem in particular, you could be executed simply for defending your honor and insisting on your own innocence.

In the Philippines, Catholicism is effectively the law of the land, and part of that means no condoms. Can you imagine what that must mean? People there cannot get contraceptives or abortions, even in cases of rape, and the result is added abject squalor.

In general, in an environment where homosexuality is shunned and discouraged, if not outright illegal, and you are homosexual, you can’t tell anyone about your feelings, or you end up practicing self-denial just to stay on the safe side.

In America, evangelists brainwash whomever they can, and right-wing Christians apparently have no regard for the environment, under the rationale that Jesus will supposedly arrive soon and end this world, so they see no reason to care for it. We all know where that’s leading.

And can you imagine what telling kids that they’re going to burn in hell does to them?

Through these and other observations, I have come to think that all Christianity seems to do is endanger the world and mankind, and their futures, through the power of deceit and oppression. And unless we all wake up to this and start opposing it right now, it will continue to dominate the planet, and potentially destroy it, leading to the ruination of mankind.

Same with Islam in the Middle East and parts of Southeast Asia.


3 responses to “All Christianity seems to do is damage the world”

  1. “Thanks to Christian influence on Hinduism and Indian society, nobody talks about AIDS, which invariably helps AIDS thrive and spread.”

    I’m pretty sure this has less to do with Christianity and more with the fact that India is an extremely conservative society. That has nothing to do with Christianity, since India was incredibly conservative before encountering it, and it was not until the British Empire that the caste system relaxed and the practice of burning widows alive on their deceased husbands’ funeral pyres was outlawed.
