blankscole
Member
Hey guys, I wanted to make this thread to share some thoughts and my opinion on The Walking Dead. First of all, I think The Walking Dead has gone way downhill. It seems like they're not putting as much effort into the episodes anymore. The first and second seasons were without a doubt the best of the show; the third season, in my opinion, is almost a disappointment. I think the reason people originally liked the show was the survival aspect and how it showed the characters' lives in the zombie apocalypse. Now it seems like they're focusing more on story than on surviving. Don't get me wrong, that wouldn't be a bad thing if they included at least a little bit of survival. Plus, the current story (the Governor shenanigans) isn't good at all; I find it really boring. So I'm wondering what your opinions are. Do you like the show more now than before, and if so, why? Or do you like it less now, and if so, why?
~blankscole.