Unless you’re living under a rock, you’re probably aware that zombies are experiencing something of a pop culture renaissance these days, as evidenced by cult flicks like Zombie Wars and the ubiquitous The Walking Dead series. Why have these shambling, flesh-hungry monsters become the stars of the entertainment world?
There’s no one answer to this question, but there are a few possible explanations for their popularity. To begin with, zombies pose an interesting paradox: they were human (at one point, at least), yet they are clearly something else. That’s why scenes where a person has to deal with a “zombified” relative or friend are always so heart-rending.
These horror staples can also be surprisingly symbolic. For instance, they can act as reminders of our inherent fear of death. They can be stand-ins for the disenfranchised, a critique of corporate greed, or even horror-movie counterparts of terrorists. Unsurprisingly, this rich subtext has paved the way for zombie flicks of all genres, from straight-up horror to satire and even romantic comedy.
Sometimes, though, a zombie movie is riveting not because of the living dead themselves, but because of how their existence affects the fabric of society. Much like wars and political turmoil, the zombie apocalypse brings out both the best and the worst in humanity.
Are you a zombie fanatic? Why do you think zombies have become so popular?