I'm a latecomer to both of these shows; I decided to watch them on Netflix about a month ago and enjoyed them tremendously.
The Walking Dead took a theme I cared very little about (zombies) and turned it into something I could appreciate.
Breaking Bad offered a creative take on crime and justice.
Both show how people can fall apart when faced with an unfortunate turn of events.
So which one do you find more addicting, or simply like better?
Please, no spoilers.