Answer - B - Africa decolonized and mostly became independent of European control during the 1960s-1990s.
Key Takeaway: The decolonization of Africa followed World War II, as colonized peoples agitated for independence and colonial powers withdrew their administrators from the continent. In the early 1940s, world leaders such as Roosevelt and Churchill discussed provisions for making the imperial colonies autonomous. But because the African colonies were not deemed "mature" enough for self-rule, democratic government was introduced only at the local level. It was not until the 1950s-1990s that the majority of Africa's nations gained their independence.