A growing share of people are using artificial intelligence (AI) chatbots to track their mental health and to access companionship, emotional support, or even psychotherapy. On April 8, 2026, SciLine interviewed Dr. Tanzeem Choudhury, a professor of integrated health and technology and chief of health innovation at Cornell Tech.
TV bundle includes:
Soundbite (SOT)
VOSOT script (can be used as-is or modified)
Raw, full-length interview video & log with timecodes (upon request via form below)
These resources are free to use. No attribution to SciLine is required.
VOSOT script (can be used as-is or modified)
ANCHOR AS MORE PEOPLE TURN TO A-I CHATBOTS FOR THINGS LIKE COMPANIONSHIP… LIFE ADVICE… OR EVEN AS A STAND-IN FOR THERAPY… EXPERTS SAY THE TECHNOLOGY OFFERS BOTH NEW OPPORTUNITIES AND NEW RISKS WHEN IT COMES TO MENTAL HEALTH.
VO DOCTOR TANZEEM CHOUDHURY IS A PROFESSOR AND CHIEF OF HEALTH INNOVATION AT CORNELL TECH… AND HAS STUDIED A-I AND MENTAL HEALTH FOR 15 YEARS. SHE SAYS A-I TOOLS ARE AVAILABLE ANY TIME… OFTEN AT LOW COST… MAKING IT EASIER TO ACCESS SOME FORM OF MENTAL HEALTH SUPPORT. BUT SHE ALSO WARNS THAT OVER-RELYING ON CHATBOTS COULD KEEP PEOPLE FROM BUILDING THE COPING SKILLS THEY NEED TO HANDLE STRESS AND ANXIETY ON THEIR OWN.
SOT
Duration: 0:51
Super: Dr. Tanzeem Choudhury, professor and chief of health innovation at Cornell Tech
“I think the biggest kind of guidance is be informed, really understand the strength and limitations, know that it’s not human, know that it doesn’t have any of the human kind of traits that we expect, although it can mimic those traits of empathy and trust and it’s basically a machine and a set of numbers, and it has none of those kind of emotional feelings or it doesn’t even know what kind of how to build kind of trust. So I think just be informed, be aware of the limitations, and understand that it does not replace any of the human kind of connection.”
VO DOCTOR CHOUDHURY ALSO CAUTIONS THAT THESE TOOLS MAY FALL SHORT IN HIGH-RISK SITUATIONS… LIKE MENTAL HEALTH CRISES. AND WHILE A-I CAN HELP… SHE SAYS IT’S NO SUBSTITUTE FOR HUMAN RELATIONSHIPS AND PROFESSIONAL CARE.
Raw, full-length interview covers:
Trends in how people are using chatbots, including as a stand-in for psychotherapy;
Recent reports of people experiencing delusions, mental health crises, or suicidality after interactions with AI chatbots;
Her proposal for a labeling system for AI tools that would use green, yellow, and red labels to identify apps as helpful or harmful for users’ mental health;
Advice for people considering using an AI tool to help with their mental health; and
Her other research areas, including using smartphone data to help predict schizophrenia relapses and developing a wearable sensor that can assess changes in mental health states and provide clinically actionable information.