Why is it that some Black people feel the need to have fair skin like white people? I'm Black and I love my skin. But other Black people, women especially, bleach their skin to look white. I think it's stupid and it doesn't do them any favours. Any comments?