We're faced again with a challenge of our faith. We ask that those around us who ask us to be Christian at least understand what we believe. We aren't asking anyone to be Muslim... just to have a clue about what we believe. Frankly, I'm sick of seeing emails and literature that talk about how Islam is the black man's religion and Christianity is the white man's religion. Hello?!?!?! I'm white as paper, and while my husband tans nicely, he's white too. Do you think we'd get into a religion that taught us that white people are the devil? Ugh. Why aren't logic and reason used? And why is what "makes you feel good" so dang important? It used to feel good to do bad things... but now, that doesn't feel good anymore. We instead need to stick with what is solid, what matters, and what is not going to change over time. The things that are important in this life are not always going to make you "feel good." Heck, the baby just did a big flip in my belly, which didn't feel good. Labor isn't going to feel good either. That doesn't mean we run away from having kids.
I try to understand why we are faced with such opposition over something people know next to nothing about. All I can come up with is fear - fear of what is outside of themselves. Do we all assume that we know everything, or that our pastor knows all? Is any new idea thrown out the window because it wasn't given to us on the silver platter of "what makes you feel good," or because it wasn't presented in a way that matches what you already believe? Why are we always wrong when no one even knows what we believe? *sigh*
I just don't get it.