Meaning of Florida

Definition of Florida
(noun) a state in the southeastern United States, between the Atlantic Ocean and the Gulf of Mexico; one of the Confederate states during the American Civil War.