Meaning of West Berlin

Definition of West Berlin

(noun) the part of Berlin that remained under United States, British, and French control until German reunification in 1990
