It seems that men and women have switched roles in life. Women have become the aggressors and men have become laid back. What has caused this change? Has society forced women to become assertive? Have women stopped allowing men to be men?
Or have the roles really changed?
Let's Talk About It!!!