- West Germany as a noun:
1. A republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat in World War II; reunified with East Germany in 1990.
synonym: Federal Republic of Germany.
Dutch: West-Duitsland
Polish: Niemcy Zachodnie