I got a very serious infection when I went into the hospital for some surgery a few months ago, and I almost lost my life. Who saved me, God or my doctors? My wife says it was God, but what difference does it make?
My family never had anything to do with religion, and they've always made fun of people who did. Well, recently I became a Christian, and now I'm the butt of their jokes. I feel so alone. I admit I don't like being an outcast from my own family. Have I done something wrong?
My son is in fifth grade, and he's having a lot of problems. The school says he might need medication to calm him down. I'm not sure I like the idea of him being on a drug.
How should I deal with someone who wants nothing to do with God and cuts me off if I mention anything about God, Jesus, or church?