The Church is losing its influence on American society. Why? I say it's because she's become too preoccupied with trying to be accepted by society to cause any change. She's become too concerned with being loved by sinners to have any effect at all on sinful behavior. What do you think?