The mainstream media is reporting recent survey results from the Pew Research Center as if they signal the end of Christianity in America. But is that really the case?