Why is it that fewer and fewer Christians care about what Scripture teaches, about being biblical in their lives, beliefs, and practices? Some don't even care whether their 'biblical' beliefs are actually in the Bible. Someone once told me, "Well, it isn't in there, but I believe it anyway." Now we've seen several people in the past month on here tell us that correct biblical doctrines don't matter at all. Have Christians always been this weak in their Bible understanding? Or is this a modern laziness, a straying away from sound Bible teaching?