Horror Would the Zombie Apocalypse make religion more or less important?

Would the Zombie Apocalypse make religion more or less important?

  • More

    Votes: 0 0.0%
  • Less

    Votes: 0 0.0%
  • No Difference

    Votes: 3 100.0%

  • Total voters
    3

Kevin

Code Monkey
Staff member
The TV series The Walking Dead, through Hershel, showed how somebody might embrace their faith even more strongly during the Zombie Apocalypse. It is a theme found in other genre movies as well, where a person of faith, perhaps a priest fighting vampires, relies upon that faith during the collapse of society.

Others might see the cataclysm as a sign that their faith has failed them. After all, what kind of mighty entity would repay devotion by allowing such a thing to happen?

Then, of course, there are those who did not follow a faith before the apocalypse but find themselves drawn to questions of faith afterward.

What do you think? Would you be inclined to lose your faith or to find faith following a zombie apocalypse? Would the apocalypse be a sign that Man is being punished, or that nobody is actually watching?
 
For me, it would make no difference at all. For those who are passionate about their faith, it'd either make them lose their faith or embrace it more. Some people NEED something to believe in.
 