The Civil War and the Far West
The Civil War and the American conquest of the West were two of the most important events that changed the United States in the nineteenth century; however, they are often treated and taught separately in history texts and classrooms. This separate categorization is hardly surprising since, in terms of geography, the majority of the Civil War took place in the southern and border states, with little military engagement in the trans-Mississippi West that occupies the focus of most western history specialists. But it is also odd, given the importance of the West to American politics and identity in the decades leading up to 1861.