
Should We Embrace or Evict AI in Churches?

January 10th, 2024 | 5 min. read

By Patrick Miller


It took Twitter two years to reach 1 million users. Spotify? Five months. Instagram? Two and a half months.

ChatGPT? Five days.

In the span of five days, AI broke into the conscious awareness of everyday people. For the first time, people played ChatGPT’s linguistic slot machine: tough questions in, surprisingly good answers out. White-collar workers experienced exactly what blue-collar workers did decades earlier: Here’s a machine that can do what I can do at a fraction of the cost. 

Alarm bells clanged across culture with a ferocity that, in some cases, bordered on panic. Serious thinkers who knew nothing about AI before ChatGPT felt a sudden need to share their hot takes on social media and podcasts. But another set of thinkers took a different tack: they relished the generative possibility of AI, launching a cottage industry of new AI products promising to change the world.

Within a few months, Christians have divided mostly into two camps about the place of AI in the church: (1) critics who fear generative AI will take jobs and sabotage spiritual formation and (2) pragmatists who hope AI will free ministry leaders to do more.

The rapid technological polarization didn’t surprise me, but I didn’t find it helpful. Over several years of writing about AI, I have struck a mostly cautious tone. Yet, despite my fears, I have become increasingly convinced that generative AI—used ethically—could serve kingdom ends.

Now is the time to pause, converse, and think, not choose sides in a war about technology most of us still know little about. The wise man is correct: “It is dangerous to have zeal without knowledge” (Prov. 19:2, NET). Pure critique and pure pragmatism are both dangerous because they leave us far more susceptible to the unethical use of AI than we would be otherwise.

Danger of AI Critics

Let’s start with the fearful. Generative AI (i.e., algorithms that can generate text, images, code, videos, etc.) can do sermon research, create sermon graphics, generate small group questions, and write sermons, blogs, and podcast scripts. Ordinary Christians can bypass pastors and mentors (and Google, for that matter) when they have spiritual questions. Instead, they may ask an AI, which happily dispenses “wisdom.”

Where does this all-knowing computer get its information, and how does it produce it? All large language models (LLMs) are trained on a specific data set. For example, ChatGPT was trained on a snapshot of the internet up to 2021. When you ask it a question, it predicts an answer you’ll find satisfactory given the parameters of your inquiry and its own training on what counts as satisfactory. LLMs give crowdsourced answers, calibrated to be crowd-pleasers.
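To make that prediction idea concrete, here is a toy sketch in Python. It is not how any real LLM is built (real models use neural networks trained on billions of documents, not word-count tables), but it shows the same basic move: given a prompt, return whatever continuation the “crowd” of training text produced most often. The corpus, phrases, and function names below are invented purely for illustration.

from collections import Counter, defaultdict

# A tiny invented "training corpus" standing in for the internet-scale
# text a real model learns from.
corpus = [
    "follow your heart",
    "follow your dreams",
    "follow your heart",
    "follow your conscience",
]

# For every two-word prefix, count which word follows it in the corpus.
next_word_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i in range(len(words) - 2):
        prefix = (words[i], words[i + 1])
        next_word_counts[prefix][words[i + 2]] += 1

def predict(prefix):
    """Return the most frequent continuation of the prefix: the crowd's favorite."""
    counts = next_word_counts.get(prefix)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

# The "answer" is simply whatever the training text said most often.
print(predict(("follow", "your")))  # -> heart

The toy model has no idea whether “follow your heart” is wise counsel; it only knows that the phrase appears most often. Scaled up enormously, that is why the conventional answer tends to win.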

If you ask ChatGPT for Christian life advice, it gives only the most conventional wisdom—highly individualistic, self-expressive, rote answers. But the mediocrity of ChatGPT’s answers isn’t the only problem.

Quick, easy access to seemingly infinite information can hijack discipleship. Why do the hard work to learn the Bible and grow in wisdom when a bot can do it for you? LLMs like ChatGPT offer the promise of mastery without work.

So when people say the sky is falling, they’re not totally wrong. AI is a technological shift so titanic that it’ll make the widespread adoption of the internet look like a skiff.

But one big problem is unacknowledged by most people warning that the sky is falling: it has already fallen. We’re already living in the fog. ChatGPT awoke the public to AI, but it didn’t bring AI into our everyday lives. AI was already there in spell-check, Google search, navigation apps, rideshare apps, Siri, Alexa, voice-to-text, social media feeds, video games, facial recognition, spam filters, AI-coded apps, AI-automated shipping and logistics, AI-assisted medical scans, AI warfare, and much more. Virtually everything you see online, you see because an AI predicted it’d interest you. Even when we rage against AI online, AI mediates the rage. It determines who sees what, thereby shaping who engages and how they see reality.

Moreover, none of these examples deals with the technology itself. Do we take umbrage at machine learning, neural networks, or algorithmic computation?

To have robust theological discussions of AI, Christians should have basic competency in the technology itself. Thankfully, there is a growing body of accessible writing and podcasts that can introduce pastors, theologians, and ethicists to the application of AI across various fields. That said, there is no replacement for dialogue with practitioners who understand AI at a more granular level (AI engineers, developers, and researchers).

If you want to protect the church from the malformative effects of AI, you can’t play whack-a-mole every time a new variation of the technology pops up. Instead, you need to open the box, look inside, and prepare disciples to respond ethically to all consumer-facing use cases.

Danger of AI Pragmatists

Not everyone is shouting “The sky is falling!” while oblivious to the fog around them. Some Christians are aware of the fog yet embrace AI without asking serious ethical questions.

They’re pragmatists, buying into the idea that utility justifies use. They ask managerial questions only: Will this save time? Will this save money? Will this help me reach more people?

Pragmatic questions are important for anyone leading institutions like churches, so we shouldn’t dismiss them. They’re simply insufficient. Actions must conform first to the norms of the kingdom, not to the norms of efficiency.

For example, generative AI may be capable of writing (bland and conventional) sermons, but that work is the scriptural duty of pastors. Neglecting this responsibility isn’t only unethical; it’s also unwise. A machine—however advanced it may be—can’t know the hearts of people in a congregation, so it can’t responsibly calibrate its words to shepherd them toward the living truth they need to hear. It can’t tune itself to the Holy Spirit who ought to guide our homiletic endeavors.

If you come to AI without any ethical convictions, you will make ethical missteps. Why? Because your de facto ethical grid will be utilitarianism: If doing x achieves y goal most efficiently, then doing x is the right thing to do.

A Better Way

In Acts 17:26, Paul tells the Athenians, “[God] made from one man every nation of mankind to live on all the face of the earth, having determined allotted periods and the boundaries of their dwelling place.” If God sovereignly ordains national order in history, we should trust we’re not living in the era of early AI by accident.

Just as David “served the purpose of God in his own generation,” we’re called to serve God’s purpose in this generation (Acts 13:36). How we navigate AI today in our churches and in our personal lives will structure the ethical norms inherited by our children. We need to think about AI cross-generationally.

Pragmatists tend to function with a short time horizon, asking only what can be done now without considering future consequences. The fearful are equally stuck in the present. Because they don’t seek a deeper understanding of how AI works, and how it’s invisibly integrated into our lives, they end up offering knee-jerk reactions to every hot AI headline while ignoring the truly nefarious (though more invisible) ways AI is harming us.

So we need to bring together people with diverse competencies (theology, ethics, and technology) to explore the ethical ramifications of AI in everyday life, discover what uses are ethically permissible, and create simple frameworks for everyday Christians to both see and evaluate their own uses of AI.

We’ll neglect our unique intergenerational responsibility if we continue to waste energy feeding the arguments between the fearful and the pragmatists. Instead, we should embrace charitable conversations, where people with diverse expertise can educate one another and improvise novel ethical solutions for a world none of us chose—but God saw fit to put us in.

NOTE: This article was originally published on The Gospel Coalition.

Patrick Miller

Patrick Miller (MDiv, Covenant Theological Seminary) is a pastor at The Crossing. He offers cultural commentary and interviews with leading Christian thinkers on the podcast Truth Over Tribe, and is the coauthor of Truth Over Tribe: Pledging Allegiance to the Lamb, Not the Donkey or the Elephant. He is married to Emily and they have two kids.
