Sunday, August 14, 2011

Why do American evangelicals think they are the only true Christians in the world?

When they look at history, and when I see questions like "What is the difference between a Catholic and a Christian?", I wonder: I thought evangelical schools, and U.S. schools in general, taught students that the first Christian church before the Reformation was the Roman Catholic Church. Or do American schools teach that Protestantism is the true Christianity?
