Does religion need to be taught?
This may seem like a silly question. After all, there is a book of Christ that millions of people around the world study, dissect, and interpret. There are professors of religion with PhDs, and there are schools devoted to training the religious leaders of the future. So it would seem: of course! Religion needs to be taught and studied like any other discipline.
To me, though, doesn't religion seem more credible if you come upon it yourself? In other words, if you simply absorbed the world around you and, without anyone else's input, reached your own conclusions, conclusions that helped you explain the unexplainable and brought you comfort and peace, isn't that just as good as organized religion, if not better?
I grew up in a house where religion and God were not taught. That is to say, God was neither purposefully introduced to us nor kept secret. I had religious friends and nonreligious friends, and I do not remember anyone ever trying to convert me one way or another. But I did have a spirituality: I distinctly remember trying to figure out why babies died, and why bad people sometimes got away with doing bad things. And all by myself, without ever discussing it with anyone, I came up with my own answer: reincarnation. Babies died because, in a previous lifetime, they had been bad people. And bad people? Well, they would get their due justice in their next lifetimes. It was simply an inherent belief, and I imagine that many people who identify with certain organized religions feel that this is wrong. But why? Why should my beliefs receive any less credit than your own?