This is my first post in a long...long time, other than in the more secular sections of the site. No offense intended. My question is, what's happened to Christianity? I know some would say it's alive & going strong...but is it? Really? Especially in this spoiled country? I guess what I'm aiming at is this...where is the faith? The kind of faith that Paul, Peter, John, & others at the beginning of the church had? The kind that made folks give up the cares of this world & totally devote themselves to Christ. I know I don't have it...though I believed I did at one time, and that can't help but make me wonder if maybe a whole lot of us are fooling ourselves when we think we're even saved. The reason I say that is the things I've witnessed over the last few years, & I retain enough of the Word to know that something is badly wrong with what people call Christianity today. I know it wasn't perfect even in those first days, but even the ones we would call weak Christians, the ones talked about in the Word, were stronger in their faith than what I've seen nowadays. Why?