With feminist talks and progressive movements on the rise, it has me thinking about what role men play in women's lives. Have men been so domineering and oppressive over the years that women were forced to ask for equality?
In the case of single-parent households, are men to blame for leaving those mothers alone with the kids? Are men the ones who set the tone and shape the image of what a man should be for their children?
In the case of romantic relationships, are men to blame for the relationship woes that women experience, because men have treated them so badly?
In the case of music and entertainment, are men to blame for "B's" and "Hoes" content gaining so much popularity?
Tune in to Darryel's Daily Dialogue tonight, "where positive words manifest positive action."