Has Hollywood Taken a Step Back When It Comes to Women?

If you think things have gotten better for women in Hollywood, think again. A recent Elle magazine article revealed that sexism is alive and well in Tinseltown. While things have certainly improved over the decades, New York actress Katrina Day is pointing out areas where it still exists.