This article describes how the South transformed from a disjointed, locally oriented rural region into a new nation (albeit, in retrospect, a failed one). With the exception of Gone with the Wind, tales of southern nationalism and romanticizations of Dixie are rarely told, owing to the shame of slavery, Jim Crow, and the seeming backwardness of the southern tradition.
Does the article successfully disprove some of these notions? Do you believe that the Civil War helped create the modern South, even though the Confederacy lost? Which side was more nationalistic: the North or the South?