Women and young girls are taught to embrace their sexuality, but recent studies show that women who view themselves in a sexualized manner are at risk for poorer attitudes toward other women, toward their own bodies, and possibly for lower self-esteem.