I live in the far South, where the women talk with sweet Southern accents, love their sweet tea and Diet Coke, welcome you into their homes with "something to eat or drink," and tan until their skin is burned and wrinkled.....WHAT?!? Ah, it sounded like a fairy tale, and I'm sure you were already imagining a sweet Southern Belle in her long dress with a lace fan, but truth be told, most of the women here are covered in dark, leathery skin (and it's not natural).
Tanning seems to be all about fashion, whatever is hot at the time. Any doctor or beauty magazine will tell you the dangers of tanning, including skin cancer, premature aging, and eye damage. Yet we still do it. As Christian women we say that we don't want to do what the world does, but we're still burning away in the sun or in tanning booths. Why?
Here's my plea: let us ladies go back to being natural! Love who you are and the skin God gave you! Why wouldn't you want to do what is healthy anyhow? Are you going to let some model or teenager in a mini tell you what shade your skin should be? No.
Who is with me?!?!
*Note - I have no problem with getting a little color while working outside in the yard, or the garden, or what-have-you.