AI and the meaning of life: Tech industry turns to religious leaders
March 29, 2023 at 3:25 a.m.
VATICAN CITY – The people behind chatbots are asking questions of priests and ethicists rather than turning to their artificially intelligent creations. They want to know: What is consciousness? What is the nature of humanity? What is the purpose of life?
According to Father Phillip Larrey, dean of the philosophy department at Rome's Pontifical Lateran University, Silicon Valley techies are posing those questions to ethicists and religious leaders as artificial intelligence develops rapidly and is used in myriad ways affecting people's daily lives.
In a conversation with Catholic News Service March 21, Father Larrey, a native of Mountain View, California, and author of two books on the rise of AI, reflected on how society should engage with AI as it becomes increasingly embedded in the lives of ordinary people through accessible technologies.
AI-operated programs such as ChatGPT, a popular chatbot created by the company OpenAI, "can access data to an enormous extent that for human beings is no longer possible," said Father Larrey. "That is why as a species we tend to look at AI with a certain fear, because we fear the unknown."
An artificially intelligent chatbot, ChatGPT uses learning algorithms to consume, produce and infer information for human users. The software is intended to mimic human conversation and can instantaneously produce essays and articles, write programming code and give people advice based on information input by users.
Its most sophisticated model, GPT-4, was released for public use March 14.
Father Larrey said there are several "catastrophic risks" to unchecked and widespread AI use, such as its potential for spreading disinformation and creating code that can be used by hackers.
He also identified potential adverse effects of AI for everyday users, noting that minors can ask chatbots for advice on committing illicit activities and students can use them to complete their assignments without doing the work of learning.
A major downside of AI, he said, is that "we become dependent on the software, and we become lazy. We no longer think things out for ourselves, we turn to the machine."
Yet Father Larrey said that rejecting AI technology is a mistake. In particular, he pointed to the decision of some universities to ban the use of ChatGPT, noting that educators "are going to have to learn how to incorporate this into how they teach, what they test for, and how we can use these tools to our advantage."
"I don't think you can put the genie back in the bottle," he said. "The market motivation is so strong that you're not going to stop it."
In January, Microsoft announced a multiyear investment in OpenAI, which the New York Times and other media reported would total $10 billion. Other tech companies, including Google and Amazon, are testing their own AI-powered products to compete with existing software on the market.
That's why Father Larrey said conversations on AI must shift to what Pope Francis calls "person-centered AI." The Pope, he said, "is insisting that you need to put the human person at the center of this technology."
In January, Pope Francis addressed tech-industry leaders from companies such as Microsoft and IBM as well as members of the Jewish and Muslim communities during a conference on ethics in AI at the Vatican.
The Pope urged them to "ensure that the discriminatory use of these instruments does not take root at the expense of the most fragile and excluded" and gave an example of AI making visa decisions for asylum-seekers based on generalized data.
"It is not acceptable that the decision about someone's life and future be entrusted to an algorithm," said the Pope.
At the end of the conference, Catholic, Jewish and Muslim representatives signed a declaration calling on AI researchers to engage with ethicists and religious leaders to develop a framework for the ethical use of AI.
"On social media and other technologies that came very quickly, we were trying to catch up and we weren't exactly sure how to do this," said Father Larrey.
But with AI, he said, the tech companies themselves are "beginning to think about how to structure some guidelines and some concerns so that this technology will be used for human well-being and human flourishing."
Tech companies such as Microsoft are "looking for philosophers and theologians" to respond to those questions, he said. "They are looking for people who know how to think."
"These people, who are really changing the future of humanity, they want to talk with us, they want to talk with priests, they especially want to talk with Pope Francis," he said. "They're looking for guidance and they're looking for support. They're looking for some way to make this help people and not harm people."
Some of those guidelines, he noted, include adding parental controls to technology so that parents can monitor how their children are using AI-powered devices, or establishing structures so that human decision-making is not cut out of the equation when AI is used, such as in legal decisions based on generalized data.
Aware of the challenges AI poses to society, Father Larrey said he is still optimistic people can use AI responsibly and for the betterment of humanity if it is developed properly.
"I think that people will win over the technology," he said. "It's not without perils, it's not without difficulties."
And within the Church, Father Larrey said he thinks "priests will be one of the last to be substituted (by AI), even though they have AIs that will hear your confession and celebrate Mass."
"People want to talk with a priest or a sister, they want the experience of the religious person that they can't get in an AI," he said.
Contributing to this story was Robert Duncan in Rome.