Meaning of West Indies

Definition of West Indies

(noun) the chain of islands lying between North America and South America and enclosing the Caribbean Sea; a popular resort area
