Men are allowed to show their chests, but women aren't. Breasts are just a body part; their purpose is to breastfeed babies. In many European countries, women are allowed to bare their chests in public, and their breasts are treated as just another body part, like arms. In America, breasts have been sexualized precisely because women hide them. Women have been conditioned to believe the body should be concealed, and that only leads to problems: as long as you conceal your body, men are going to stare, and women will keep feeling harassed.

I think the body is natural and should be allowed to be seen. Other countries, such as Australia, take this view too: women are allowed to be topless in public, and they feel much freer. There are even nude beaches where you can be completely nude, and you see entire families enjoying the beach without anyone giving them nasty looks. Why is America like this? Here, people who show even a little bit of their bodies are labeled sluts.