The Bible is not, never has been, and never will be the center of the Christian faith. Even though the Bible (at least in some form) has been ever present since the beginning of Christianity, it is not the central focus of the Christian faith. That position belongs to God: specifically, to what God has done in and through Jesus. The Bible is the church's nonnegotiable partner, but it is not God's final word. Jesus is.