No, they don't. Seriously, I'm not a big fan of the American South, but people keep acting as if nothing has changed since the 1960s. I'm always amazed at how people around the world think of the South in such caricatured terms. As far as I know, most Southerners are mainstream conservatives nowadays, and I doubt you'd ever hear them say anything against Jews or Israel - quite the contrary.