Meaning of US Army

Definition of US Army

(noun) the army of the United States of America; the agency that organizes and trains soldiers for land warfare.
