Image: Bernd von Jutrczenka/dpa/picture alliance
German colonialism
Until 1918, the German Empire ruled over colonies in Africa, Asia and Oceania. The systematic, violent oppression of the people living there is only gradually being confronted.