Most pastors use AI. Almost none have a plan for it.
60% of pastors use AI monthly, but only 5% of churches have a policy. New research reveals where churches actually stand.
Somewhere in your church, someone is using AI. Maybe it’s you. Maybe it’s your communications director, running newsletter drafts through ChatGPT because it saves her a couple hours on Tuesday afternoons. Maybe it’s your youth pastor, who started using it for lesson outlines six months ago and hasn’t mentioned it to anyone.
This isn’t a problem. It’s just what adoption looks like before institutions catch up to individual habits.
The 2026 State of Church Technology Report, developed by Pushpay in partnership with Barna Group and reflecting input from over 1,300 church leaders, puts a number on something most leaders already sense: 60% of pastors use AI in their personal lives at least monthly. Only 33% say their church is using it in ministry or operations. And only 5% have any kind of policy around it.
The report doesn’t attempt to settle whether AI is good or bad for ministry. What it does is map where churches actually are right now and what separates the ones navigating this well from the ones that will spend the next two years cleaning up decisions made without a framework.
The adoption problem
AI adoption in churches, at least right now, looks less like a formal rollout and more like individual staff members solving individual problems with whatever tools are available.
The communications director who discovered she could cut first-draft time in half. The associate pastor who uses AI to research sermon background faster than any commentary library he owns. The volunteer coordinator who, after the third time rewriting the same email, decided there had to be a better way.
None of this is inherently wrong. But it creates a specific organizational problem: when AI use is distributed across individual habits and nobody has decided what’s acceptable, you don’t actually have an AI approach. You have a collection of workarounds that nobody’s officially reviewed.
That matters because undiscussed tools create undiscussed expectations. One staff member uses AI to draft pastoral care follow-ups. Another considers that a boundary he’d never cross. Both are right, but only if someone has actually drawn the lines. When no one has, you find out where the lines were after something goes sideways.
64% said policies matter. 5% have one.
This is the number that should bother church leaders most, because it’s not a knowledge gap. Church leaders overwhelmingly understand that AI governance matters: 64% of leaders in our research called it important. They just haven’t acted on that belief.
The reasons are understandable. Nobody went to seminary to write an AI use agreement. Policy work feels bureaucratic, and ministry culture tends to resist bureaucracy for good reasons. There’s also genuine uncertainty about what an AI policy for a church would even cover.
But the cost of inaction is concrete. Without any shared agreement, AI decisions in your church get made by default: by whoever is most enthusiastic, least cautious, or most time-pressured. When something feels off (a sermon that sounds like it was written by committee, or a donor communication that doesn’t quite match your church’s voice), there’s no process to learn from it because there was no process to begin with.
An AI policy for most churches doesn’t need to be a legal document (but if you’d like some help getting something in place, check out our AI policy generator). A 30-minute staff conversation that produces a shared understanding of where AI is appropriate, where it isn’t, and who’s responsible for what is a policy. Most churches haven’t even scheduled that conversation yet.
The churches pulling ahead
Picture an organization that uses its church management software to flag anyone who’s been attending for 90 days without connecting to a community group, then actually assigns someone to follow up. The software isn’t doing anything extraordinary. Most church management systems can surface that data. The difference is that someone decided that information would be used to inform pastoral responsibility.
That decision is what the report calls a “missional approach to technology.” About 1 in 4 churches in the study qualify as highly missional in their use of technology, meaning they’ve woven it into discipleship, worship, and community formation rather than treating it purely as an operational convenience. Those churches significantly outperform the rest in Gen Z and Millennial connection, and they show measurably stronger spiritual effectiveness across multiple indicators.
The differentiator is orientation. These churches have brought technology questions into the same conversations where ministry decisions get made, and that changes how every tool in the stack gets used.
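The 90-day follow-up flag described above is, mechanically, just a filter over data most church management systems already hold. As a rough illustration, here is a minimal sketch in Python; the field names and record format are hypothetical, not any particular platform’s API, and a real ChMS would surface this through its own reports or integrations:

```python
from datetime import date

# Hypothetical attendance records; real systems expose similar fields
# through built-in reports or an export/API.
members = [
    {"name": "Avery", "first_attended": date(2025, 9, 1), "in_group": False},
    {"name": "Jordan", "first_attended": date(2026, 1, 10), "in_group": False},
    {"name": "Sam", "first_attended": date(2025, 8, 15), "in_group": True},
]

def needs_follow_up(member, today, threshold_days=90):
    """Flag attendees past the threshold who haven't joined a community group."""
    tenure = (today - member["first_attended"]).days
    return tenure >= threshold_days and not member["in_group"]

today = date(2026, 2, 1)
flagged = [m["name"] for m in members if needs_follow_up(m, today)]
print(flagged)  # ['Avery'] — 153 days attending, no group; Jordan is still new
```

The point of the sketch is that the logic is trivial. What’s rare isn’t the query; it’s the decision that someone owns the list it produces.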
And that orientation is something your leadership team can discuss this month, regardless of what platform you’re on.
What the confidence gap actually costs
Most leaders in the report believe that technology opens real ministry opportunities. The ones pulling ahead have acted on that belief. But most haven’t, and the reason isn’t a lack of training on specific tools.
The more common problem is that nobody has put AI on the agenda as a team decision. It’s been treated as a personal productivity question, something each staff member figures out individually, rather than a leadership question about what the church collectively expects and permits. That means no shared expectations, no agreed boundaries, and no shared language for talking about what’s working and what isn’t.
The practical cost compounds quietly. Your most tech-forward staff push further ahead. Your most cautious ones grow more resistant. Nobody’s making a bad individual decision, but the team ends up fragmented around tools nobody officially agreed to use or avoid. Then AI produces something that feels wrong, and because there’s no framework, the response is either to abandon the tool entirely or shrug and move on. Neither is useful.
One honest note: no policy resolves every AI question, and the technology is moving fast enough that any agreement written today will need revisiting. The goal is a baseline your team can build from that gives everyone permission to make decisions together rather than defaulting to individual judgment.
Where to start
Before building any policy, find out what’s actually happening. Ask your staff what AI tools they’re using and what they’re using them for. The answers will surprise you, and the inventory itself becomes the foundation for an honest conversation about where the lines should be.
Then put it on a staff meeting agenda. Not a committee, not a task force. One agenda item: What are we using AI for right now? What feels like a clear yes, and what does not? Thirty minutes of that conversation produces more useful guidance than a policy drafted in isolation and handed down.
That’s genuinely all most churches need to do right now.
The churches outperforming their peers aren’t operating with better technology. They’re having more intentional conversations about it. If you’ve been waiting for a good moment to start that conversation at your church, the report gives you the data to do it well.