A friend of mine says the Bible is very negative toward women, and this is why society has oppressed women throughout history. Is this true?
I believe in Jesus, but honestly I don't get anything out of church and can't see much point in going.
I'm very scared about the future, especially after all the economic turmoil we've seen in recent months. Are these signs that Jesus is about to return? I understand that things will get worse and worse just before He comes, and it looks like this is what's happening.