Meaning of Namibia

Definition of Namibia

(noun) a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa
