Definition of tropical medicine
(noun) the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions