Hollywood has successfully produced many films framed by anti-racist or pro-integrationist story lines. My guess is that ever since 'Gone With The Wind,' Hollywood has understood that films about racism and segregation pull at everyone's heartstrings and, with luck, help purge a sense of guilt.