Meaning of Western United States

Definition of Western United States

(noun) the region of the United States lying to the west of the Mississippi River
