Dangers of a Man-Centered Gospel

Over the past few years, we have seen many professing Christians recant their faith, claiming that it no longer makes sense. Those who have done so have given their reasons, but we need to ask why such departures are becoming rampant. Is the problem with God or with men? Is faith failing, or is pressure mounting at an alarming rate? My argument is that part of the reason this is happening is a focus on man and his abilities rather than a pure focus on Christ. Why do so many young people raised in a reasonably Christ-believing environment opt out once they interact with the world years later?
