Hey guys, I know we shouldn't judge, but reading through history, does anyone else feel like the Western Christian churches (the Catholic Church and the schisms that resulted from its corruption) are a scar on the history of the Christian faith?
I mean, the way they manipulated both the people and Christianity to serve their political and monetary needs makes me shudder. I compare it to the steadfastness of our Church: it wouldn't even occur to us to intertwine religion and state, yet they did so at almost every chance they got, such as the sale of indulgences, or King Henry VIII creating his own church to legalize a divorce. Of course, the corruption of the Catholic Church brought about honest reforms, but they came with unholy mistakes such as Martin Luther's removal of the altar.
I understand no one's perfect, but when I studied Western Christianity I became somewhat disgusted, and I don't find it surprising one bit now that Europeans are mostly becoming atheists. Does anyone else feel this way? Perhaps I'm getting worked up over nothing, but I just feel the need to share my concern.
God be with us always.