Meaning of katar

Definition of katar

(noun) an Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil
(noun) a peninsula extending northward from the Arabian mainland into the Persian Gulf
