

2019-04-22 (DE) • Documentary, History, TV Movie
Germans colonized the land of Namibia, in southern Africa, for a brief period, from 1884 until the First World War. The story of the so-called German South West Africa (1884-1915) is hideous: a hidden and silenced account of looting and genocide.