What place does AI have in ministry or the lay apostolate? Part 2

(Image: Hitesh Choudhary / Unsplash.com)

Editor’s note: Part One of this essay can be read here.

The Dangers of AI in Ministry

AI’s development has potentially profound implications for various fields of ministry, where it could serve as a powerful adjunct for evangelization, education, and community building. But does this mean that it should be used in these fields? Not everything that can be done should be done. In many cases there is no room for doubt that certain technological capabilities should never be employed: abortion, IVF, so-called sex-change surgeries, human embryonic stem cell research, and the like. In other areas the technology itself may not be intrinsically evil, but its applications have mostly negative potential, such as nuclear weapons technology, chemical weapons technology, and human cloning.

However, there are many technologies for which the prudence of particular applications is not so clear, because the potential benefits are great but the potential for catastrophic and widespread abuse is also high. Artificial intelligence is one of these. Yet there is almost no conceivable scenario in which the genie might be put back in the bottle. Ongoing debate about its use is needed and will certainly provide helpful insights, cautions, and restraint, but the technology is here to stay.

As a matter of full disclosure, I will say that I have been surprised by its capabilities. I have been familiar with artificial intelligence since the 1990s through study and research, when the main focus was on expert systems driven by fuzzy logic and on improving the performance of neural networks (my first two degrees are in electrical engineering, and I worked for 20 years in weapon systems development). Back then, I could not have imagined that its current capability would be possible. By “nature” I tend to be cautious and risk-averse. As such, my initial reaction to using this technology for direct human interaction in areas such as education, pastoral ministry, and the lay apostolate has been one of severe restraint.

My concerns center primarily on its likely negative anthropological impacts. My experience of teaching in higher education has been one of a widespread and persistent decline in students’ intellectual performance over at least the last 25 years. Many factors have certainly contributed to this, but I think the wide availability of the internet and effective search engines has been a significant negative contributor. It has become the rare student who can think cogently, write clearly, and speak articulately. When I find one, my first guess is that the student was homeschooled and protected from overexposure to technology, and this usually proves correct.

Too many students have been habituated to believe that they are supposed to be fed answers and that their success as students lies in their ability to regurgitate information. This “information brokering” is the approach so many students take to “research”: using the internet to tell them the answers to their questions and stitching those answers verbatim into a confused mess of text. Forming young minds for thinking, understanding, and reasoning has largely been left aside within the contemporary education system, replaced by inculcation with information and, increasingly, indoctrination with ideological tropes. Irrespective of causation, this decline has corresponded with the rise of communications technologies. I fear the situation will get significantly worse with the misuse of AI chatbots within and outside of contemporary education. It very well could be the final nail in the coffin for public education and for all private educational institutions that follow the current trend of overusing technology in education.17

I am also concerned about the excessive amount of time people of all ages are spending with computing technology, mostly interacting with others through it. It is making us all dopamine addicts at increasingly young ages, shortening attention spans and fostering dopamine-induced vices that stunt the development of affective maturity. The human person is a unity of body and soul, who requires regular, nurturing, incarnational interpersonal interchange for his flourishing. I am extremely cautious about contributing to the amount of time people spend online, using AI to give them answers rather than reading books, thinking through issues on their own, and discussing them incarnationally with others. I am also wary of fostering more activities that may limit or even prevent authentic, incarnational interpersonal engagement.

Moreover, it is only through this incarnational, interpersonal exchange that Jesus Christ is encountered in His disciples, that faith in Him is shared, the struggling are encouraged, the doubtful find reassurance, the weary receive strength, the lost are guided, the wounded find healing, those who are seeking are effectively catechized, and the broken experience restoration. Because faith is only shared by those with faith, it goes without saying that AI cannot substitute for Christian disciples. It would seem there should be no place for AI even in education, much less in Christian discipleship. My initial and cautious proclivities urge me in this direction. However, Jesus’ final missionary discourse as St. Mark records it keeps coming back to me: “Go out into all the world…”, and so much of the world is now spending much of its time in cyberspace.

Disciples must be wherever there are people who need to hear the Good News. Catholic Answers’ decision to employ AI also provides some points to consider. They appear to be one of the better-funded apostolates, and even they do not have the resources to respond to a significant portion of those who come to them for help. In addition, a rapidly growing number of people will rely first on AI, as a more sophisticated kind of search engine, for their initial introduction to the Catholic faith. AI apps can be more effective at connecting them with real people who can assist them pastorally than would be possible by relying on the current AI-driven search engines.

The Potential of AI in Ministry

If employed with prudence, AI could effectively assist in ministry. It might be considered for such matters as aiding in homily preparation, helping to identify patterns in one’s nightly examination of conscience and other spiritual growth activities, and facilitating the dissemination and understanding of catechetical materials. AI can also analyze data to better understand the needs of a congregation, which can help tailor pastoral care more effectively. However, it is essential to recognize that AI should augment, not replace, the interpersonal interaction that is the hallmark of authentic ministry.

As we navigate this new frontier, we must do so with discernment, ensuring that AI serves the greater good and that its use comports with the authentic fulfillment of the human person. AI, at its best, could be a tool for enhancing aspects of ministry and the lay apostolate, and for assisting in bringing the light of faith into the dark places of our digital age.

The Safeguards Required

Generative AI is a technology that is almost certain to change our society significantly. I will admit that such transformative technologies seem to arrive well before a society has much of an idea about their potential negative consequences, much less the wherewithal to minimize or even deal with those impacts. In a perfect world, I would have preferred that Generative AI be developed and rolled out more slowly and with more caution. However, I do not think this is likely or even possible at this point. Nor will it do for Christians to ignore it or to condemn it wholesale.

The Gospel needs to be proclaimed everywhere there are ears to hear. Today, the venues for reaching people are changing, and much of this is due to technology. Therefore, Christians need to know where technology is moving people. For example, a virtual world is developing that connects people in immersive, 3D environments which they need not leave for anything but the most basic human functions. We should expect that incarnational access to coming generations may become increasingly limited. While we cannot accept leaving people in such an anti-human situation, the Gospel first needs to reach them there.

Perhaps the approach might be something like that of St. John Bosco, who would visit the gambling dens of guttersnipes, begin betting with them, then steal their betting pots and run to the church with the youngsters hot on his heels. There, Don Bosco would require them to listen to him speak before returning their money. He saved not a few young boys this way. So we too must go where the people are and engage with them there. To do so, we need to employ the technologies they will be using, and we can even use the strengths of these technologies to enhance our effectiveness at reaching them. But the Gospel is proclaimed to persons by persons, and so we need to learn how to overcome the limitations inherent in such outreach. There are cautions we must take with the technologies we employ.

With Generative AI, as with other information-based technologies, we need to realize that too much of a good thing will have negative consequences. Prolonged exposure to digital displays can alter one’s state of consciousness, making a person passive and fostering dopamine dependence. This reduces his ability to think, concentrate, or focus on one thing for any length of time. It contributes to laziness in thinking and a dangerous habit of needing even relatively simple concepts explained, leaving him unable to understand more complex ideas and generally needing to be told what to think. Too much time spent alone, with only virtual interactions, also has a variety of other negative consequences for integral human fulfillment, as it now feeds the loneliness epidemic.

There is something about an incarnational encounter with another that permits a person to go out of himself and to experience being received by another. This is what human persons require. An authentic exchange of persons is not purely an experience of sensible phenomena, but even more a metaphysical exchange in which the soul of each is abstracted and inheres in the other. To the degree that this exchange is done rightly and selflessly, it becomes an authentically fulfilling human experience. Such an experience is certainly mediated by phenomena such as eye contact, facial expressions, and the tonal quality of the voice, but the human person is nourished ultimately by the metaphysical exchange obtained in face-to-face encounters with other persons. Encounters mediated by technology are not adequate for this.

So it is not surprising that encounters with simulated persons (i.e., AI apps) lack any capacity for such fulfillment. The era of virtual relationships has already begun. As such relationships become increasingly common, we can expect the phenomenon once called “Facebook depression” to become increasingly debilitating for persons and for society. It is now recognized that increasing amounts of time spent on social media contribute to negative mental health effects. Many phenomena contribute to this, but the net effect of excess time spent using technology for work, communication, entertainment, leisure, and so on is that it turns a person in on himself rather than out toward another. This diminishes rather than fulfills the human person. The earlier in life a person begins to limit his incarnational encounters with others, the more his ability to engage normally with others will be diminished and the greater the long-term harm to his integral fulfillment. If the use of AI and other technology-based outreach is not to deform persons through misuse or overuse, there are cautions that must be put in place.

To avoid promoting the habit of having to be told what to think, we can use AI to encourage thinking and reflection by training the system to engage users in a Socratic manner. We should develop systems that avoid simply giving answer after answer, and instead help develop thinking and understanding by helping the person learn to ask the right questions, consider the reasonableness of responses, and go deeper on a matter by reading recommended resources, with regular encouragement to do so. AI systems should also regularly remind users that they are not talking to a person: the system cannot think and so can make mistakes, it does not really understand what is being discussed, and it cannot empathize with the user or intend his good.

In this light, users should be regularly warned within the system’s responses against overreliance on them or naïve trust in them. System design should include limits on the length of the encounter and on the number and types of questions that can be asked, perhaps even requiring the user to read a resource before continuing a line of discussion with the AI app. The user must be reminded of the need for authentic human interaction and encouraged, with specific recommendations, to address his particular issue through real human encounters.
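
To make these cautions concrete, here is a rough sketch of how such limits might be written down in a system’s configuration. Everything in it (the prompt wording, the class and field names, the numbers, the sample reading assignment) is merely an illustrative assumption, not a description of our system or of anyone else’s.

```python
# A minimal, illustrative sketch only. The prompt text, class, field names,
# and numeric limits are assumptions for the sake of example.

from dataclasses import dataclass, field

SOCRATIC_SYSTEM_PROMPT = (
    "You are a study aid, not a person. Do not simply hand out answers: "
    "respond with guiding questions, test the reasonableness of the user's "
    "own answers, and recommend resources to read before going further. "
    "Remind the user periodically that you cannot think, can make mistakes, "
    "do not understand what is discussed, and cannot empathize with him or "
    "intend his good."
)

@dataclass
class SessionLimits:
    max_minutes: int = 20             # cap the length of a single encounter
    max_questions: int = 15           # cap the number of questions per session
    reminder_every_n_turns: int = 5   # how often to restate that this is not a person
    # topics that require the user to read a resource before continuing
    required_reading: dict[str, str] = field(default_factory=dict)

limits = SessionLimits(
    required_reading={"grace": "Catechism of the Catholic Church, 1996-2005"}
)
```

The particular values matter far less than the principle they encode: the encounter is bounded by design, and the system keeps pointing the user back toward reading, thinking, and real human conversation.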

To avoid fostering the false sense that using an AI system is a real human interaction, the system should not be given a personality or in any way try to mimic or replicate the qualities expected in human interactions. It must try to prevent users from developing a sense of personal attachment. We should design systems to detect such attachments and to respond by reminding the user that the system is not a person, reminding him of his need for authentic human interaction, and providing specific recommendations for how he might seek it out.

A more robust system should be designed to look for signs, in the types of questions being asked, that the person may need real spiritual guidance or even mental health intervention: for example, detecting when the user shares fears, anxieties, or temptations to harm himself or others. If the need for human intervention is suspected, the user should be encouraged to consult a priest, deacon, and/or mental health provider. Some questions should not be entertained at all, such as requests for personal opinions or for guidance about what the user should do in specific circumstances that could have significant impacts on his life. The restraint currently seen in chat systems against giving medical or mental health diagnoses, advice, or prognoses should be extended to vocational discernment, moral choices, relationship issues, and spiritual warfare considerations. The system must avoid answering questions such as “What should I do?” or “What would you do?” Here again, the system should recommend that the user seek out a priest, deacon, or mental health provider.
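
Again only as an illustrative sketch, with hypothetical keyword lists, a made-up function name, and placeholder wording, such triage might look something like this in code. Real detection of distress would have to be far more careful than simple phrase matching.

```python
# A minimal, illustrative sketch only. The keyword lists, patterns, and
# referral wording are assumptions for the sake of example.

ESCALATION_SIGNS = ["hurt myself", "harm myself", "hurt someone", "end my life"]
REFUSED_PATTERNS = ["what should i do", "what would you do",
                    "should i become a priest", "should i leave my spouse"]

REFERRAL = ("This is not something a chat system should answer. "
            "Please speak with a priest, deacon, or mental health provider.")

def triage(user_message: str) -> str | None:
    """Return a referral message when a question must be redirected to a person."""
    text = user_message.lower()
    if any(sign in text for sign in ESCALATION_SIGNS):
        return REFERRAL   # possible need for real spiritual or clinical help
    if any(pattern in text for pattern in REFUSED_PATTERNS):
        return REFERRAL   # personal guidance the system must not give
    return None           # otherwise, continue the normal Socratic exchange
```

The point is not the particular keywords but the design decision they illustrate: certain questions are answered only with a referral to a real person.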

Mother of the Americas Institute is currently working with a developer to provide AI assistance for users of our products. We have significantly fewer financial resources than Catholic Answers, and so we are even less able to help all of those who come to us. Yet we are convinced that we have uniquely effective formation programs that can help form better evangelists, marriages, and families, and we think we can safely use AI to assist in our mission. One MAI program in which we will employ AI is an online version of our marriage formation program, The Great Mystery (GM). It will help couples understand their couples inventory and the actions they might take to address any issues it uncovers. We also intend to use the app to improve our GM facilitator formation program and to provide more insights into, and practical application of, MAI’s other work than we otherwise have the resources to provide. If you would like to try it out, you can do so here.

After trying out an AI application based on St. Thomas Aquinas, we were surprised by its accuracy and usefulness. You might also check out the Catholic Tyro website, which is evaluating ways in which AI might aid Catholic formation. Tyro also has AI apps for St. Augustine, St. John Henry Newman, St. Theresa of Avila, and others. Our own system testing, Catholic Answers’ Justin, and Tyro’s effectiveness have all given us confidence that Mother of the Americas Institute should move forward with our plans to allow AI to assist our apostolate in its outreach. We will make mistakes, but we will learn from them and continue to improve how we safely and effectively use AI to assist us in promoting the Gospel.

Endnote:

17 However, I can see it as helpful, for example, to use an AI Socratic chatbot to augment classical pedagogy by giving students the opportunity to practice their thinking skills (asking the right questions, avoiding errors in thinking, etc.) when the opportunity for incarnational practice with other students is lacking. Yet the primary pedagogy must still be incarnational and communal to the degree possible.

