Black women are routinely denied positions of power in America—and there are consequences

Black leadership in America is a necessity for ending racism. Here's the story of why so few Black women have taken positions of power in the United States.

Photo: Stocksy/BONNINSTUDIO