From Promise to Practice: Scaling Safe, Human-Centered AI in Mental Health Care

The Evolution of Wysa: From Chatbot to Scalable Clinical Support

Wysa was born not in a lab or boardroom, but from deeply personal experience. Our co-founders, Jo Aggarwal and Ramakant Vempati, came to mental health innovation through lived challenges. This sparked a profound question: could technology offer real, compassionate support to people struggling with their mental health, especially in moments when human connection felt out of reach? What began as a machine learning experiment analyzing behavioral data quickly evolved into something more powerful.

In 2020, Wysa’s AI-powered mental health support was already gaining momentum. But it was in the last two years that we experienced a profound transformation, both in our technology and in the scale of impact. Today, Wysa supports over 7 million users across 95 countries. We’ve evolved from a general wellness tool into a clinically governed AI platform, supporting public mental health systems, large employers, insurers, and care delivery networks.

This evolution wasn’t just technological. It demanded rethinking how digital tools can responsibly partner with human clinicians and safeguard the most vulnerable users. It pushed us to build the infrastructure, protocols, and partnerships that enable safe, equitable, and culturally adaptable care at scale.

Lessons from the Field: What the Last Two Years Taught Us

Scaling AI in mental health is not simply a matter of adding users or markets; it is about ensuring that every new deployment is safe, relevant, and deeply integrated into real-world systems. Our experiences highlighted three key challenges:

1. Clinical Governance in Complex Systems

Our expansion into regulated health systems, especially in the UK and US, required us to mature our clinical governance rapidly. Gaining NHS approval through DCB0129 compliance meant that we were not only meeting baseline safety requirements but embedding best practices into every layer of our platform.

We developed robust risk detection and escalation protocols, an ethics board, and internal clinical audit loops. Most importantly, we established that AI support must never operate in isolation; it must link seamlessly with care teams, safeguarding pathways, and human-in-the-loop support.

2. Building Culturally Sensitive AI

One-size-fits-all mental health tools do not work globally. A student in rural India, a parent in inner-city London, and a nurse in Minnesota all have different lived experiences, and their mental health journeys differ. In India, we partnered with public health agencies and NGOs to ensure Wysa could serve low-literacy populations while still supporting early detection and psychoeducation. In the UK, we co-designed Wysa’s interface with young people and NHS safeguarding leads to ensure acceptability and trust.

3. Earning and Sustaining Trust in AI Care

We’ve seen that trust is the make-or-break factor in digital mental health. Users want transparency: they need to know when they’re speaking with a bot, how their data is used, and what happens if they disclose risk. We’ve learned to make this clear, not through disclaimers, but through empathetic UX and consistent behavior from the AI itself.

Our triage approach reflects this. When someone shares thoughts of self-harm or suicide, Wysa doesn’t “solve” it with AI. Instead, the AI gently guides the user toward human help, whether that’s an in-app escalation or a connection to a local helpline. Trust is built not by pretending the AI is human, but by making it feel safe, supportive, and honest.
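
To make that handover concrete, here is a minimal, hypothetical sketch in Python of the routing decision. It assumes a risk level has already been assigned by an upstream detection step; the model, thresholds, and wording are not Wysa’s actual implementation. The point it illustrates is simply that, at elevated or high risk, the AI’s job is to route the person to human support, not to treat.

from dataclasses import dataclass
from enum import Enum, auto


class RiskLevel(Enum):
    NONE = auto()
    ELEVATED = auto()
    HIGH = auto()


@dataclass
class TriageAction:
    message: str
    escalate_to_human: bool
    show_helplines: bool


def route_message(risk: RiskLevel, in_app_escalation_available: bool) -> TriageAction:
    # Illustrative rule: the AI never tries to "solve" risk itself. As soon as
    # risk is detected, it hands the user to a human pathway (in-app escalation
    # where available, plus local helplines).
    if risk is RiskLevel.HIGH:
        return TriageAction(
            message="I'm really glad you told me. Let's connect you with human support right now.",
            escalate_to_human=in_app_escalation_available,
            show_helplines=True,
        )
    if risk is RiskLevel.ELEVATED:
        return TriageAction(
            message="Thank you for sharing that with me. Would you like to talk to a person?",
            escalate_to_human=in_app_escalation_available,
            show_helplines=True,
        )
    return TriageAction(
        message="I'm here with you. Want to keep talking, or try an exercise together?",
        escalate_to_human=False,
        show_helplines=False,
    )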

A young person engaging with Wysa’s chatbot app. Image courtesy of Wysa.

Implementation Insights: What Makes Digital Mental Health Work

Through working with more than 30 enterprise clients across sectors, from the NHS and US health systems to Fortune 100 employers and global insurers, we’ve identified three ingredients that consistently predict success:

1. Use AI to Extend, Not Replace, Human Care

Wysa’s most effective use cases are those that embed AI support within a broader care ecosystem. For example, in the US, we’ve integrated Wysa into the Collaborative Care Model (CoCM), where AI handles mood tracking, psychoeducation, and early symptom detection, freeing up care managers and clinicians to focus on high-acuity needs.

In the UK, Wysa supports users on IAPT waiting lists, helping maintain engagement, reduce dropout, and guide users into the right step of care. AI is not a shortcut to therapy; it is a bridge to it.

2. Co-Design with Local Stakeholders

Every deployment begins not with a demo, but with a design session. We work with local safeguarding teams, clinical leads, and sometimes even service users to tailor escalation thresholds, response phrasing, and support pathways.

In one NHS trust, for instance, we adjusted Wysa’s signposting responses based on local referral backlogs. In an Indian youth program, we added voice note functionality to support users with limited literacy. True integration only happens when solutions are built with the system, not for it.

3. Measure Outcomes Beyond Usage

In digital mental health, engagement metrics like downloads or daily active users are often used as proxies for impact, but they don’t tell the whole story. At Wysa, we’ve deliberately gone further, building a robust outcomes framework grounded in clinical rigor and real-world relevance. We measure how users improve over time using validated clinical tools like the PHQ-9 and GAD-7, tracking shifts from clinical to subclinical symptom levels. We monitor the accuracy of our AI’s risk triage and the speed with which high-risk cases are escalated to human support. We evaluate user trust not just through surveys but through behavioral signals: do people return? Do they open up more deeply over time? And most importantly, we look at whether people feel better. Across multiple deployments, including NHS pilots and insurer partnerships, we’ve consistently seen meaningful improvements in mood, self-regulation, and functional recovery. For us, outcomes aren’t an afterthought; they are the point.
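
To illustrate what a shift from clinical to subclinical levels means in practice, the sketch below applies the commonly used severity cutoff of 10 for both the PHQ-9 (scored 0-27) and the GAD-7 (scored 0-21) to paired baseline and follow-up scores. The cutoffs and the calculation are illustrative assumptions, not Wysa’s actual analysis pipeline.

# Commonly used severity cutoffs; Wysa's exact thresholds and analysis
# pipeline may differ - this is illustrative only.
PHQ9_CUTOFF = 10   # depression; PHQ-9 scores range 0-27
GAD7_CUTOFF = 10   # anxiety; GAD-7 scores range 0-21


def moved_to_subclinical(baseline: int, follow_up: int, cutoff: int) -> bool:
    # True if a user started at or above the clinical cutoff and finished below it.
    return baseline >= cutoff and follow_up < cutoff


def recovery_rate(scores: list[tuple[int, int]], cutoff: int) -> float:
    # Share of users who shifted from clinical to subclinical levels,
    # among those who started in the clinical range.
    clinical_at_baseline = [(b, f) for b, f in scores if b >= cutoff]
    if not clinical_at_baseline:
        return 0.0
    moved = sum(moved_to_subclinical(b, f, cutoff) for b, f in clinical_at_baseline)
    return moved / len(clinical_at_baseline)


if __name__ == "__main__":
    phq9_scores = [(14, 8), (12, 11), (18, 9), (7, 5)]  # (baseline, follow-up) pairs
    print(f"PHQ-9 clinical-to-subclinical rate: {recovery_rate(phq9_scores, PHQ9_CUTOFF):.0%}")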

Evidence and Experience: What the Data Shows

Wysa has built a substantial body of both peer-reviewed and real-world clinical evidence demonstrating its effectiveness across diverse populations and use cases. The following three examples offer just a snapshot of how our AI-supported mental health platform is delivering measurable outcomes in complex, high-need settings.

1. Chronic Pain

Wysa’s AI-supported interventions have shown promising results in supporting individuals living with chronic pain, particularly in managing the mental health impacts associated with long-term physical conditions. In a study conducted with a chronic pain population in the UK, users who engaged with Wysa’s cognitive-behavioral and acceptance-based techniques reported significant improvements in emotional regulation, self-efficacy, and perceived pain interference. Notably, over 60% of users demonstrated a measurable reduction in anxiety and depressive symptoms, factors known to exacerbate the pain experience. 

Wysa for Chronic Pain

2. Return to Work

Among employees on leave due to stress, anxiety, or burnout, Wysa has been shown to aid both mental health recovery and reintegration into the workforce. In collaboration with an insurer partner, Wysa was offered to policyholders on short-term disability leave. Within eight weeks, 45% of users reported a clinically meaningful improvement in GAD-7 or PHQ-9 scores, and 38% successfully returned to work. The data also showed that users who engaged with Wysa’s AI tools at least three times per week returned to work one third faster than those without the tool. 

3. NHS Waitlist Recovery

One of the most compelling examples of Wysa’s real-world clinical impact has been within the UK’s NHS, where long wait times for talking therapies can leave individuals without timely support. In pilot programs with several NHS trusts, Wysa was offered as an interim resource to individuals on waitlists. Among those who engaged with the app regularly over four weeks, 40% moved from clinical to subclinical thresholds on standardized measures of depression (PHQ-9) and anxiety (GAD-7). Importantly, this group also showed improved self-reported hopefulness and emotional resilience.

Feedback from Users

“[Wysa is] an excellent complement to therapy. It’s useful for follow-up thought exercises, quick self-checks, and relaxation techniques. It’s always available for things like that.”
– Tim, 40, Hotel Front Desk Manager, USA, 2025. Discovered Wysa at his therapist’s recommendation after attempting to end his life.

“Wysa has been a life saver as waitlines for therapy is so long and private ones are so expensive. I have had it for 5 years now and also speak to a coach on Wysa. The blended support really helps me through my tough days, have panic attacks and social anxiety and am way better from how i was when i first got Wysa. my coach shares relevant tools and techniques and this helps me with more support with my once a month in person therapy sessions”
– Helen, 30, Professional, Canada, 2025


“With Wysa, you can get help at any time whenever you’re feeling depressed or have immense anxiety [and] you can be yourself with Wysa because it’s confidential.”
– Keith, 14, Student, USA, 2024.

“After listening to me, Wysa would provide me with a task or exercise which helped me to feel in control and now I know what I need to do to take care of myself in those moments. Wysa helped me to reframe my thoughts and switch up the way I think. I now focus on my strengths, which has led to an improvement in my mood and my sleep routine is so much better after using the guided sleep stories and meditations to wind down.”
– Ade, 26, PhD Student, Australia, 2023

A Look Ahead: Wysa’s Vision for 2025 and Beyond

As digital mental health matures, the conversation is shifting from access alone to integration, equity, and clinical depth. At Wysa, we see 2025 not simply as a year to scale up, but as a pivotal moment to embed what we’ve learned about safe AI, collaborative care, and population-scale resilience into more meaningful, measurable systems of care. Here’s how we’re doing it:

1. Strengthening Integration in Clinical Ecosystems

Wysa is actively advancing clinical integrations to ensure our AI-enabled mental health support seamlessly embeds into traditional care pathways. A major development was our merger with April Health, designed to embed behavioral health directly into primary care settings. Through this integration, primary care providers can now offer patients a hybrid care model where Wysa’s Therapeutic AI Coach works alongside behavioral health specialists and psychiatrists, eliminating wait times and ensuring no referral goes unanswered. This extends the Collaborative Care Model (CoCM) to rural and underserved communities, enabling proactive monitoring, referrals, and support—all within existing workflows.

Building on this, our recent partnership with RxCap brings next-generation remote monitoring to the clinical ecosystem. RxCap’s smart adherence devices and cloud software are now embedded within Wysa’s platform, allowing care teams real-time insight into medication use. This integration supports timely, informed interventions, especially for patients on complex pharmacological regimens. Initial rollouts, planned for mid‑2025, are already showing promise in improving adherence, boosting engagement, and unlocking new reimbursement pathways.

To complete the technical picture, we’re connecting Wysa with electronic health records. Through EHR integrations with primary care and behavioral health systems, symptom data, risk alerts, and medication adherence insights now flow directly into clinicians’ dashboards. This enables measurement-based care, where care managers make informed, data-driven decisions, and clinicians access AI-generated summaries without adding administrative burden. Together, these integrations position Wysa not as a standalone app, but as a fully connected clinical tool, supporting both patient outcomes and operational efficiency.
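
As a purely illustrative example of measurement-based care data flowing to a clinician dashboard, the sketch below defines a hypothetical patient summary record and a simple review rule. The field names and thresholds are assumptions made for illustration; real integrations would follow the EHR’s own data standards rather than this ad hoc shape.

from dataclasses import dataclass
from datetime import date


@dataclass
class PatientSummary:
    # Illustrative clinician-facing summary; not Wysa's actual schema or any EHR standard.
    patient_id: str
    phq9_latest: int
    gad7_latest: int
    risk_alert: bool                  # set when in-app triage flagged elevated risk
    medication_adherence: float       # 0.0-1.0, e.g. from a connected adherence device
    last_check_in: date
    ai_summary: str                   # short AI-generated narrative for the clinician


def needs_review(s: PatientSummary) -> bool:
    # Simple illustrative rule for surfacing a patient on the care manager's worklist.
    return (
        s.risk_alert
        or s.phq9_latest >= 15          # moderately severe depression or worse
        or s.gad7_latest >= 15          # severe anxiety
        or s.medication_adherence < 0.6
    )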

2. Supporting Public Mental Health at Scale

As governments grapple with rising demand and workforce shortages, we are partnering on national and regional mental health initiatives. In India, this means deploying Wysa through schools and state programs. In the UK, it means augmenting NHS access while safeguarding trust and safety.

We aim to provide the infrastructure for tiered digital support, enabling proactive care for mild-to-moderate conditions and smart triage for higher acuity needs.

3. Leading in Ethical, Human-Centered AI

At Wysa, ethical responsibility is not an afterthought; it is a foundational principle that shapes how we design, deploy, and govern our technology. We’ve built safeguards into every layer of the platform, from how conversations are handled to how risk is detected and escalated. Our internal clinical governance team works alongside an external ethics board to regularly review policies, language models, and real-world use. We are transparent about what AI can and cannot do, and we ensure users are informed and in control of their data. As the regulatory landscape for AI in healthcare evolves, we actively contribute to global standards, advocating for rigorous, clinically grounded frameworks that prioritize safety, accountability, and user trust.

Closing Thoughts

AI has the potential to democratize mental health support like nothing before it. But impact depends on how that AI is built, deployed, and governed. At Wysa, we’ve learned that AI alone is not the answer, but when used to augment, support, and extend human care, it becomes a powerful tool for healing and hope.

We’re proud of the millions of lives we’ve reached, but more than that, we’re committed to listening, so we can continue to evolve in partnership with the people we serve. The future of mental health care isn’t AI or humans. It’s AI and humans, working together to meet people where they are.

About the Author

Smriti Joshi, Chief Psychologist at Wysa

Built for care, Wysa is a clinically validated, AI-driven platform offering personalized, evidence-based support that is always available. We help individuals, organizations, healthcare providers, and young people thrive with compassionate, trusted care.
