Especially in religion. For example, they'll support gay people even though the Bible calls it a sin, saying God is forgiving and will forgive us, and so on. The same goes for sex before marriage. My point is: why do so many people act like they belong to a certain religion when, deep down, they know they don't believe in God?