Meaning of Haiti

Definition of Haiti

(noun) a republic in the West Indies occupying the western part of the island of Hispaniola; achieved independence from France in 1804; one of the poorest nations in the Western Hemisphere
(noun) an island in the West Indies; a former name for the island of Hispaniola
