I just wondered if anyone notices a change in how they feel about their body when the weather warms up. I remember talking to a woman who said she hated spring because she'd spent all winter hibernating and eating, and had been able to hide the weight gain under big sweaters. Once spring came along, she felt exposed. Does anyone prefer a cold climate for this reason?


Trevy Thomas
Body Image Editor