What is the relationship between blacks and Hollywood? Most people don't know that Hollywood was built on the exploitation of black inferiority. There is a culture and legacy of black degradation in Hollywood. Movie studios, writers, film directors, casting directors, producers, and actors all seem to prefer whiteness. The fact of the matter is that we have only a few Oscar winners because our work isn't respected by the dominant culture, and our stories go untold because of it. Is a boycott enough? Or should we be more concerned with creating our own?