How AI healthcare chatbots are learning from the questions of an Indian women’s organization

Women learn to use an AI-powered chatbot developed by the Myna Mahila Foundation at the local women’s organization’s office in Mumbai, India, on February 1, 2024. The chatbot, currently a pilot project, represents what many hope will be part of AI’s impact on healthcare around the world: delivering accurate medical information in personalized responses that can reach far more people than in-person clinics or trained medical professionals. (AP Photo/Rafiq Maqbool)

NEW YORK — Komal Vilas Tatkare says she has no one to ask about her most personal health questions.

“There are only men in my home – no ladies,” said the 32-year-old mother and housewife in Mumbai. “I’m not talking to anyone here. So I used this app as it helps me with my personal problems.”

The app she uses is being developed by the Myna Mahila Foundation, a local women’s organization, and is powered by artificial intelligence built on OpenAI’s ChatGPT model. Tatkare asks the Myna Bolo chatbot questions and it offers answers. Through these exchanges, Tatkare learned about birth control pills and how to take them.

Tatkare is one of 80 test users hired by the foundation to help train the chatbot. The bot draws on a custom database of medical information about sexual health, but its success depends on test users like Tatkare refining its answers.

The chatbot, currently a pilot project, represents what many hope will be part of AI’s impact on healthcare around the world: providing accurate medical information in personalized responses that can reach far more people than in-person clinics or trained medical professionals. In this case, the chatbot’s focus on reproductive health also offers vital information that – due to social norms – is hard to come by elsewhere.

“If this can actually provide that non-judgmental, personal advice to women, then it could really be a game-changer when it comes to accessing sexual reproductive health information,” said Suhani Jalota, founder and CEO of the Myna Mahila Foundation, which received a $100,000 grant from the Bill & Melinda Gates Foundation last summer to develop the chatbot as part of a cohort of organizations in low- and middle-income countries trying to use AI to solve problems in their communities.

Funders such as the Gates Foundation, the Patrick J. McGovern Foundation and Data.org are seeking to fill this gap in AI development, particularly in areas such as healthcare and education. These philanthropic initiatives give developers access to AI tools they might not otherwise be able to afford, so they can tackle problems that hold little profit potential and are therefore low priorities for corporations and researchers, if they are on their radars at all.

“No longer can the Global North and high-income countries drive the agenda and decide what does and does not need to be addressed in local communities in the Global South,” Trevor Mundel, president of global health at the Gates Foundation, wrote in an October online post, adding: “We cannot risk creating a new gap of inequality when it comes to AI.”

The Associated Press receives financial support for news coverage in Africa from the Bill & Melinda Gates Foundation.

The Myna Mahila Foundation recruited test users like Tatkare to write real questions they had, for example, “Does using a condom cause HIV?” or “Can I have sex on my period?” Foundation staff then closely monitor the chatbot’s responses, building a customized database of vetted questions and answers along the way, which helps improve future responses.
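To illustrate the general idea (this is a sketch, not the foundation’s actual system), a chatbot built around a vetted question-and-answer database can check each new question against the vetted entries before falling back to a general model. The Python below is a minimal version of that lookup; every name, sample entry and threshold in it is an assumption.

```python
# Minimal illustrative sketch, not the Myna Mahila Foundation's actual code:
# check an incoming question against a database of staff-vetted question-and-
# answer pairs before falling back to a general language model. All names,
# sample entries and the similarity threshold are assumptions.
from difflib import SequenceMatcher

# Hypothetical vetted entries; in the real project these are reviewed by staff.
VETTED_QA = {
    "does using a condom cause hiv?": "Staff-vetted answer about condoms and HIV.",
    "can i have sex on my period?": "Staff-vetted answer about sex during menstruation.",
}

def best_vetted_answer(question: str, threshold: float = 0.8):
    """Return a vetted answer if a stored question is similar enough, else None."""
    question = question.strip().lower()
    best_score, best_answer = 0.0, None
    for stored_question, answer in VETTED_QA.items():
        score = SequenceMatcher(None, question, stored_question).ratio()
        if score > best_score:
            best_score, best_answer = score, answer
    return best_answer if best_score >= threshold else None

answer = best_vetted_answer("Does using a condom cause HIV?")
if answer is None:
    # Unmatched questions would go to the ChatGPT-based model, and staff review
    # of that reply could add a new vetted entry for next time.
    print("No vetted answer yet; route to the model and to staff review.")
else:
    print(answer)
```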

The chatbot is not yet ready for wider distribution. The accuracy of its answers is not good enough and there are translation problems, Jalota said. Users often write questions in a mix of languages and may not give the chatbot enough information for it to offer an appropriate response.

“We’re still not completely sure whether women can understand everything clearly or not, and whether all the information we’re sending is completely medically accurate,” Jalota said. The team is considering training some women to prompt the chatbot on others’ behalf, but it still aims to improve the chatbot so that it can operate on its own.

Dr. Christopher Longhurst, chief medical officer at UC San Diego Health, is leading the implementation of AI tools in healthcare facilities and said it is important to test and measure the impact of these new tools on patient health outcomes.

“We can’t just assume or believe or hope that these things will be good. You actually have to test it,” Longhurst said. He thinks the promise of AI in healthcare is overstated for the next two to three years. “But I think in the long term, in the next decade, AI will be as impactful as penicillin in healthcare.”

Jalota’s team is consulting with other Gates Foundation-funded projects that are designing healthcare chatbots so they can solve similar problems together, said Zamir Bray, interim deputy director of technology dissemination for the Gates Foundation.

The Myna Mahila Foundation is also partnering with another Gates grantee to propose developing privacy standards for handling reproductive health data. The foundation, which is working with an outside technology firm to develop the chatbot, is also considering other steps to ensure user privacy.

“We’ve been discussing whether we should delete messages within a certain period of time after women send them, to add to that privacy,” Jalota said, since some women share phones with family members.
