NUDITY: THE NEW NORMAL?
Every time I get on the net or scroll through my Instagram feed, I ask myself, "Where is our generation headed?" When did nudity become a fashion statement, or a way of taking a stand? What happened to women being encouraged to dress modestly? I keep seeing celebrities posing nude and saying they are standing against body shaming and bullying. Why do they have to do it naked? Why can't we women take pride in our bodies by covering them? When did it become okay to walk around with your breasts out for all to see? I sometimes wonder why some people even bother wearing clothes. God created clothes for covering, but these days it seems that people would rather be naked. Please understand that I am not judging; this is just the cold, hard truth.
When will this generation learn? When will we women learn that dressing that way only attracts the wrong kind of attention? Don't you know that your body is the temple of the Holy Spirit? Comments on your body may be nice, but they are empty and meaningless. I mean, what respectable man goes around telling random women "nice butt" or "nice breasts"? And why would you seek words like that from strangers? Beware of any guy who only talks about the way you look and neglects your mind and character. May God help us all to see the truth and embrace it.
What are your thoughts on this?