Topic: Generative AI Tools

Building JARA: A Judgment-Free AI Companion for Youth and Young Adults Who Use Substances or Alcohol

Tyler Marshall in Case Study
JARA is a judgment-free, evidence-based AI companion designed to help youth and young adults navigate substance use, harm reduction, and recovery goals safely, confidentially, and without stigma—bridging the gap between early support and formal care.

ChatGPT Health: OpenAI Introduces a Dedicated Health and Wellness Experience

OpenAI has announced ChatGPT Health, a new dedicated experience within ChatGPT designed to support people in understanding and organising their health information, with enhanced privacy protections and an explicit focus on supporting—not replacing—clinical care.

Feasible but Fragile: MindBench.ai and the Path to Responsible AI in Mental Health

Dr. John Torous discusses how AI in mental health is “feasible but fragile” and how MindBench.ai is being developed to ensure patient-centered, safe, and effective innovation.

Responsible AI for Youth Mental Health: What We Are Learning and What Must Change

Caroline Figueroa from Delft University of Technology in Thought Leadership

Localized Crisis Helplines Integrated into ChatGPT and Sora

ChatGPT and Sora now offer localized crisis helplines, providing immediate, confidential support and ensuring human intervention remains available for those experiencing severe distress.

OpenAI’s Curated Therapist Network Set to Accelerate Global Demand for Mental Health Professionals

OpenAI’s new curated network of therapists, integrated with ChatGPT, could dramatically increase global demand for mental health professionals by providing seamless, immediate access to qualified support.

OpenAI Updates ChatGPT to Improve Handling of Sensitive Mental Health Conversations

Since August 2025, OpenAI has been updating ChatGPT to improve its handling of sensitive mental health conversations, introducing safeguards for distress, self-harm, psychosis, and emotional reliance; subsequent releases, including GPT‑5.1, have maintained and expanded these safety measures.

FDA to Review AI Mental Health Devices: Setting the Stage for Regulation

The US FDA’s Digital Health Advisory Committee (DHAC) will meet on November 6, 2025 to evaluate AI-enabled digital mental health devices (including chatbots and virtual therapists), exploring how they could bridge the mental health service access gap while assessing the associated risks.

PsychAdapter: Revolutionizing AI with Personality & Mental Health-Aware Language Generation

“PsychAdapter” is a new AI modification that enables large language models to generate text reflecting specific personality, mental health, and demographic traits. This lightweight, plug-and-play adapter moves beyond “average” AI language to create more personalized and human-like interactions.

FTC to Investigate AI Chatbots’ Impact on Children

The FTC is preparing to investigate leading AI firms, including OpenAI, Meta, and Character.AI, over potential mental health risks chatbots pose to children. The move comes amid growing complaints and state-level probes into unsafe or misleading AI-driven interactions.

Our Audience

eMHIC has an audience spanning 12 member countries (and growing), with thousands of subscribers around the world.

Share your thoughts

Have quality news or resources to share on the eMHIC Knowledge Bank? We’d love to consider publishing them for our global community.