From an outsider's perspective, the Southern USA has some absolutely great stuff, plenty of friendly people and a good attitude towards life ... and also some terrible stuff, with some people being extremely bigoted and backwards in their thinking. It really seems to depend on what state, and what PARTS of those states, you're in. To an extent, the South is shown in a bad light that tends to overshadow the good parts, and that shouldn't happen. HOWEVER, there does also seem to be more racism and whatnot in the South at the same time.