David Capes
Theologically, one of the things I know people are concerned about these days with AI is the loss of jobs and the loss of income and the loss of livelihood, and with that, some would say comes the loss of dignity.
Lyndon Drake
I completely sympathize with that. And it’s not just the loss of jobs. It may be changes in the kinds of work that people will have to contemplate. And certainly, in the cultural West, even in the AI industry, we’ve often exported dehumanizing and undignified labor to the majority world. To build the AI systems we currently use, we’ve typically relied on the labor of people in majority-world countries to do often very unpleasant work, such as tagging and categorizing images for offensive content. That’s a very degrading and dehumanizing task to be given, and we benefit from it in the richer parts of the world.
I think what we’re foreseeing is something we can also look back on to some degree, even in recent years, and see happening. There’s a difference of kind, though, in that in the past it was often lower-status work that was eroded through technological improvement. What we’re about to see is quite a few high-status types of work vanish entirely. I’ll give the example of medical imaging, where machine learning-based systems really do perform a lot better than people in the domains they’ve been trained on. None of us wants to go in and have a more error-prone diagnosis.
And it seems like a human good: if better diagnosis, and hence better treatment, is a good thing, then we ought to want that in society. But it’s going to come at the cost of an entire profession that, up until now, we’ve valued in particular ways. You could say, “I’m a radiographer,” and implied in that is that you’re doing something very worthwhile for other human beings. It seems evident that, over the next few years, it’s probably going to largely vanish as a profession. Certain kinds of legal work are undoubtedly going to vanish as well, and again, that has been a high-status profession. That’s a different thing to come to terms with socially.
And the last thing I’ll say is that technological changes are often unpredictable in their details, but what we can say, looking back on history, is that the disruption they bring is often quite damaging to society and to cohesion. I’m very cautious about getting too tied down in the details of which jobs will change. I know I’ve just done that, but we’ve got to be a little bit humble about exactly what’s going to work and not work. I do think it’s going to be very disruptive to society, and that calls on the church to show particular forms of care, to look back on how we’ve cared well or badly for people in contexts like that in the past, and then to ask: what ought we to do now?
David Capes
Let’s talk about ethical AI. Is that the right phrase, or what people are talking about? Is that thinking through the ethics of what we’re about to do, or what we’re engaging in at this point? I’ve heard other phrases bandied about, but is that the one you’re tying into at Oxford?
Lyndon Drake
I think there is a range of terminological choices, and none of them really seems to me to be absolutely ideal. Another one that’s used a lot is “responsible AI.” That’s partly because, when you get right down to it, we don’t actually share commonality, even in the West, about what it is we’re shooting for as an ethical goal. Take, for example, Elon Musk, who places a very, very high priority on being able to escape the bounds of Earth and go to other planets. I think there’s a very strong sense in what he talks about, ethically, of the absolute priority of the survival of the whole human race, and at that point the dignity of individuals comes second. Because the key thing is that life on Earth is constrained, we need to escape, and that means there are sacrifices worth making.
To my mind, the dignity of individuals shouldn’t be given that lower level of priority, and I have to say I’m uncomfortable with it. There’s a sort of utilitarian bent in a lot of ethics that I don’t agree with; I don’t think utilitarianism is a deeply Christian approach. You run into this thing where we don’t share the ethical givens that would allow us to have a shared conversation, and that’s why “responsible AI” has sort of attended to a different side of things. What we’re trying to do in our project is to say unashamedly that we want to think as Christians for the benefit of the whole of society. We’re not saying everyone needs to share our presuppositions, but at the same time, we do think we have something to offer out of Christian theology.
David Capes
You know, one of the things I appreciate is those who think theologically about these new questions that keep coming up. There’s an ethical side to it too, as you’ve indicated. There’s a theological approach when we think about who God is and who we are as human beings, made in His likeness as His image bearers. There are a lot of aspects of that which I think you’re going to be helping us with when you come and join us. Let me use a baseball metaphor: you are our lead-off hitter for 2026. You’re giving an early January lecture, and we’re very grateful for that.
Lyndon Drake
I’m so grateful to have the invitation. I’ve been twice, and I have real warmth for the place and the people. And for your listeners, my family and I had the privilege of living in the equivalent of the Houston Theological Library, in Yarnton, which is just north of Oxford, where Mark and Becky Lanier have very generously supported, not an identical setup, but a very similar setup to here.
David Capes
Yes. It’s a manor house built back in 1611 that has been brought back to its former glory.
Lyndon Drake
It has, and for a much broader beneficial purpose as well than just the family who built it.
David Capes
And there’s a new library open there, which we’re pleased about. I’m grateful that you and I have had a chance today to talk a little about what you’ll be saying in your lecture when you come. Give us the title of your lecture once again.
Lyndon Drake
I’m going to be speaking on “AI Theology and Human Formation: How Our Tools Shape Us.” The big thing I want to do there is reach back into this wonderful, long tradition of Christian theology. One of the insights people in the Middle Ages had, as they started developing new physical tools, was that we don’t just use tools: tools change us as we use them. I want to interact with that whole idea of Christian formation in the lecture.
David Capes
I think about the invention of the Gutenberg press and how it changed not only individuals but also the world, because now you could create identical documents very quickly, compared to having them handwritten on vellum, a type of animal skin prepared for writing. That tool really changed the world, didn’t it?
Lyndon Drake
It absolutely did. Behind each of us is a bookshelf with some books on it, and we now have access to an unimaginable wealth of written information which, prior to the invention of the printing press, was confined to such a small group of people. That absolutely changed the world, but it also changed the way people inhabit the world and how we participate in the goods of God’s world. There were some things to react against in that, and there were a lot of goods in it as well. I think there’s a real parallel between the printing press and some of the AI progress that we see. I mean, there’s actually been a lot of AI progress already. But this sense that it’s going to change how we function as created beings within God’s world is something we ought to think about, and I think the printing press is a strong parallel.
David Capes
Dr. Lyndon Drake, thanks for being with us today on “The Stone Chapel Podcast.” We look forward to your presentation in January of 2026.
Lyndon Drake
Thanks so much. It’s great to be with you, and I look forward to being there in person.
