
India's religious chatbots condone violence using the voice of god

Many people in India are foregoing in-person contact with a guru interpreting Hindu scripture and turning to online religious chatbots, but experts are warning that chatbots using artificial intelligence to respond and play the voice of the Hindu god Krishna can be a dangerous mix.

Experts warn of potential dangers as new chatbots use AI to interpret Hindu scripture

Online chatbots that give answers to probing questions about the meaning of life based on teachings from Hindu scripture have millions of users, according to their websites. (Salimah Shivji/CBC)

As Vandana Poddar performs the Hindu puja ceremony daily at her home shrine in Mumbai, she's guided on her spiritual journey by the Bhagavad Gita, a 700-verse scripture.

She even attends a weekly class dissecting the deeper meaning of the ancient religious text, with her teacher providing examples to illustrate a particular passage. 

"Interpretation is the backbone of this text," Poddar, 52, told CBC News. "Superficial knowledge can be misleading."

But many in India are foregoing that in-person contact with a guru interpreting the Bhagavad Gita and turning to online chatbots, which imitate the voice of the Hindu god Krishna and give answers to probing questions about the meaning of life based on the religious scripture's teachings. 

It's new technology with the tendency to veer off script and condone violence, according to experts, who warn that artificial intelligence chatbots playing god can be a dangerous mix. 

Several of the bots consistently provide the answer that it's OK to kill someone if it's your dharma, or duty.

In the Bhagavad Gita, written more than 2,000 years ago, the prince Arjuna is hesitant to go into battle where he will have to kill his family and friends, until the Hindu god Krishna reminds him that as a warrior from the Kshatriya caste, it is his duty to fight.

Vandana Poddar uses the traditional Bhagavad Gita book as her guide during her morning puja prayer ceremony at her Mumbai home. (Salimah Shivji/CBC)

"It's miscommunication, misinformation based on religious text," said Lubna Yusuf, a Mumbai-based lawyer and a co-author of The AI Book. "A text gives a lot of philosophical value to what they are trying to say and what does a bot do? It gives you a literal answer and that's the danger here." 

At least five Gita chatbots appeared online in early 2023, powered by the language model Generative Pre-trained Transformer 3 (GPT-3). They're using artificial intelligence, which simulates a conversation and creates answers based on statistical probability models. The sites say they have millions of users. 

The main page of one of them, Gita GPT, asks, in an imitation of the voice of the Hindu god Krishna, "What troubles you, my child?" to users typing in a question. 

Another chatbot, Bhagavad Gita AI, introduces itself as "a repository of knowledge and wisdom" before telling the online user: "Ask me anything." 

The smaller print on the same page states that "the answer may not be factually correct" and exhorts the user to do their own research "before taking any action."

WATCH | The dangers of using AI to get spiritual advice: 

AI chatbots are playing god in India. What could go wrong?

Several GitaGPT chatbots that use generative AI to offer spiritual guidance have sprung up in India, but experts say the new technology has the power to veer off script and into dangerous territory, condoning violence.

Yusuf said the potential danger of answers that condone violence is more acute in a country like India, where religion is so emotionally charged. 

"You're creating confusion in the chaos," Yusuf said, adding that some could use the chatbots' answers to further their own political interests and cause irreversible damage. "It might incite more violence, it might create religious bias." 

She would like to see government regulation or guidelines on what topics should not be left in the hands of chatbots, such as philosophy, religion and law.

Other experts have also spoken out about the ethical concerns of mixing religion and statistical models, with one AI ethicist telling CBC the world of artificial intelligence is the "Wild West ethically right now."

"We can't control technology but we can control its application," said Jibu Elias, a New Delhi-based AI ethicist and researcher, when referring to the need for governments to set out guidelines. 

Several of the chatbots using generative artificial intelligence replied to users that it's acceptable to kill someone if you are fulfilling your duty, or dharma. (GitaGPT)

But the Indian government, in a written submission, informed parliament in April that it has no plans to regulate artificial intelligence in the country, even while acknowledging the ethical concerns and risks around AI and promising to promote best practices. 

Disclaimers and toxicity filters 

Samanyou Garg, an AI entrepreneur who created the chatbot on Bhagavad Gita AI through his non-profit spiritual organization Ved Vyas Foundation, acknowledged there is still work to do on the technology, but said that is the case for all new technology. 

"AI is still not there yet, where it can be totally trusted," he told CBC News at his home in New Delhi. 

A handful of chatbots interpreting religious scripture using generative artificial intelligence are seemingly condoning violence. (Screengrab of bhagavadgita.ai)

He pointed to a screen and highlighted steps he said he's taken to protect users from dubious answers, including very careful language and a disclaimer that shifts responsibility onto the user's own judgment.

"We've mentioned Gita AI there. We haven't said it's the actual Gita or [that] it's Krishna speaking," Garg, 26, said, adding that he wanted the chatbot to be a companion, not a replacement, for a spiritual teacher. 

The site is also working to constantly improve its toxicity filters, he said, but it takes time for the chatbot to catch up.

"We filter out the bad responses, we keep on training the model to be able to detect these newer toxicity questions."

Experts warn mixing religious interpretation of one of Hinduism's most sacred texts with artificial intelligence based on statistical probabilities is a dangerous path, particularly in a country like India. (Salimah Shivji/CBC News)

For the young tech entrepreneur, the surge of interest his chatbot received without any promotion proved that the service is essential for exposing an ancient religious text to a younger audience.

He said that outweighed any initial pitfalls of the nascent technology. 

But that's not the consensus at Poddar's weekly Bhagavad Gita class in the Juhu suburb of Mumbai, where chants ring out from the dozen or so students intent on extracting more wisdom from the scriptures. 

Most here think outsourcing spirituality to computers is distasteful and short-sighted.

'We haven’t said it’s the actual Gita or [that] it’s Krishna speaking,' Samanyou Garg told CBC News, when asked about the potential dangerous effects of his chatbot that has provided answers that seem to condone violence, based on the ancient Hindu text, the Bhagavad Gita. (CBC)

"When you are listening to somebody, your mind works," Bijal Pandya, the guru leading the study session, said. "You start thinking, you get new questions in your mind." 

The Bhagavad Gita is full of emotions that keep changing, the 53-year-old said, and that's why debate is needed to tease out the text's true meaning.

"It's always better, that human touch," he added. "It's a spiritual thing. AI can never replace spirit. It is only replacing our intelligence." 

ABOUT THE AUTHOR

Salimah Shivji

Journalist

Salimah Shivji is CBC's South Asia correspondent, based in Mumbai. She has covered everything from natural disasters and conflict to climate change and corruption across Canada and around the world in her nearly two decades with the CBC.