The Walking Dead TV show is absolutely horrible. The writing, characterizations, and directing are all sloppy and terribly done. I feel like people only like it so they can jump on the "zombie bandwagon," since that's the popular thing right now.
I get so angry when people mention it and Breaking Bad in the same sentence as if they're comparable.
Which sucks, because I've heard the comics are actually really well done. I guess the show has kinda ruined the name for them.