Why is Christianity the dominant religion in the US?
A discussion I had on Facebook led me to think about precisely why so many Americans tend to believe that Christianity is the "right" choice, the best and original religion. I was raised in such an environment, where the Bible was accepted without question, even though it is the product of a Middle Eastern religious sect.
Christianity spread by virtue of its proximity to a thriving civilization where travel was much easier than in more remote regions. It also opened its doors to the common people and promised a wonderful eternal life; good marketing, really.
These factors and a few others contributed to its spread across Europe and Asia, though as it grew larger, schisms continued to divide it again and again.
Because Christianity originated in the cradle of Western civilization, it was the default religion of your ancestors, and so it became common in America after this land was pillaged and stolen from its owners.
And thus you view it as "correct" and "the only true religion," when it is simply a matter of history and chance. Had Buddhism, Taoism, Hinduism, or any other religion become popular with the common people of first-century Rome, America could very well be embracing one of those faiths today.