Why is it important to love and accept your body?
Many, many companies profit off the low self-esteem of women: diet companies tell us that losing weight will make us feel better about ourselves; cosmetics companies tell us that wearing makeup will make us feel better about ourselves; drug companies tell us that getting face-tightening injections will make us feel better about ourselves. All we have to do is give them some money, and they will give us better body images. And sometimes we do, and sometimes their promises pan out. But the marketing machinery is still whirring in the background, so that once we feel decent about our wrinkles, we begin focusing on our love handles; once we’ve got those under-eye circles under control, worries about hair texture and color crop up. Make no mistake: money is being made off of women’s body insecurities.
Originally posted 2014-08-05 06:44:38.