Often I hear about plastic surgery and how it supposedly makes women feel insecure, and how we as a society should crack down on it. But is that really the answer? Do we really want to teach people that the way to restore self-esteem is to pass laws or pressure society into making something go away? Do we really want to teach women that the best way to heal their self-esteem is to make plastic surgery disappear, instead of teaching them to look towards their own inner strength, or to conjure strength from within?
Self-esteem and confidence come from within. Instead of focusing on society and laws, we should focus on the individual and his or her strength.
The same goes for many of the so-called examples of sexism that people claim to find. These days, almost anything can be labeled sexist or demeaning to women, and almost anyone can be accused of being sexist, if someone feels like putting together pieces that were never there. The fact is, by trying to use society to get rid of things, we are victimizing people, especially women, rather than empowering them. If we truly wanted to empower people, we would teach them to look to their inner strength, rather than use society to make things go away, or victimize people by telling them what they should feel insecure about.