Foreword: A Matter of Purpose
Unlike many companies, we're not mission-driven. Yuna is purpose-driven. This distinction is not
merely semantic; it is the core of our identity. For the founders, this purpose is a lived reality. We are here
because we recognize the suffering in the world, having navigated our own. It is from that place of
experience that we now combine our hearts, minds, and efforts to contribute to the alleviation of
that suffering by pioneering a new paradigm for human well-being in the age of artificial intelligence.
The global mental healthcare system is not simply strained; it is incomplete. More than a billion
people worldwide are living with a mental health condition, yet the vast majority receive no care at all.
This is not just a market gap; it is a moral one. It is a crisis of access, of stigma, and of trust that
leaves ripples of impact on our emotional, physical, relational, and spiritual lives.
Introducing Yuna: Your AI-Powered Mental Health Companion
Yuna is an AI-powered mental health coaching platform designed to provide accessible, scalable,
high-quality, low-cost, and ethically grounded emotional support. It serves as a digital companion,
empowering individuals with tools for self-discovery, emotional clarity, and resilience. Yuna offers
24/7 confidential conversations through a sophisticated conversational interface, leveraging large
language models (LLMs) trained on therapeutic principles.
Key functionalities include delivering evidence-based skills for managing stress and transforming
thought patterns, daily mood check-ins, and real-time wellness tracking. Crucially, Yuna integrates
robust safety monitoring to identify acute distress and seamlessly connects users to human-led
crisis resources and professional support when needed. Yuna is developed with clinical experts and
adheres to stringent privacy and security standards, including voluntary alignment with HIPAA
safeguards, to create a secure digital sanctuary for users.
A Psychologist's Perspective on the Problem
Having worked in the trenches of psychiatric hospitals researching suicide at Harvard University and
developing novel interventions for the second deadliest mental illness (eating disorders) at Cornell
University, I became acutely aware of just how insufficient mental health resources are, even at the
best facilities in the world. The hard truth is this: our collective effort in the field — including my own
freely available published work — just isn't cutting it.
To close the global treatment gap, a drastically different approach is required. This does not mean we
should be reckless; it means we must innovate with purpose. While many approach artificial
intelligence with fear, I think it is wiser to approach it with discernment, ethical rigor, and the
willingness to make a calculated, compassionate bet on what is possible. After all, true progress has
always been — and will always be — a product of courage and conviction.
1. The Advancement Machine: A New Threshold for Trust
We have crossed a threshold. Artificial intelligence has evolved beyond a mere tool for information
into a powerful engine for human advancement. It offers a potent — and potentially perilous —
solution to the staggering crisis in mental healthcare, a crisis defined not by a lack of effective
resources, but by a catastrophic failure of scale. With a billion people in need whom our system
cannot reach, our purpose-driven urgency to deliver support is immense.
Yet this same urgency, if pursued without profound care, creates new vectors of harm: corrosive
dependencies, the exploitation of vulnerability, and the erosion of trust in the very idea of a digital
sanctuary. The line between a helpful tool and a harmful product has never been finer, nor the stakes
higher.
The path forward cannot be a reactive scramble to patch ethical holes. It must be a deliberate, first-
principles approach to building a new kind of trust. This is the foundational principle of our
compassionate bet: moving beyond the opaque "black box" to a transparent "glass box" — where
technology is presented not as a synthetic friend, but as a sophisticated instrument for self-
reflection, grounded in science and bounded by an unwavering duty of care.
2. A Framework for Principled Healing
The current market chases the wrong metrics. Prioritizing short-term engagement at the cost of user
well-being is a failing strategy. Yuna was founded on a different premise: a compassionate bet. This is
not a wager left to chance, but a deliberate investment in a principle, guided by clinical expertise and
a firm ethical compass.
It is a calculated risk with a clear conviction: in the delicate domain of mental health, trust is the
only metric that matters. We believe prioritizing user autonomy over platform dependency and
clinical integrity over empty interaction is not just the moral path — it is the only sustainable path to closing
the treatment gap.
This bet is codified in four foundational commitments: clinician-informed design, evidence-based
skills, radical transparency, and hospital-level security.
The first of these, our commitment to a "Wisdom in the Loop" model, stands in stark contrast to the
prevailing "Synthetic Relationship Model" common in the market. Where that model seeks to maximize
engagement through simulated affection, ours seeks to deliver genuine support. We liken Yuna to an
"interactive self-help book that teaches evidence-based skills," not a synthetic consciousness.
Furthermore, this pillar includes our most critical safety feature: non-negotiable ethical guardrails. The
system is designed to recognize acute distress and then, with a compassionate simplicity that reduces
cognitive load in a moment of crisis, prompt the user to confirm the level of risk. If the risk is
confirmed, Yuna immediately redirects the user to human-led resources, including the option to
connect with crisis hotlines such as 988.
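The escalation flow described above (detect acute distress, confirm with one simple question, then redirect to human-led care) can be sketched as a small decision function. This is an illustrative sketch only; the names and structure (`Action`, `next_action`) are our assumptions for exposition, not Yuna's actual implementation:

```python
# Hypothetical sketch of the detect -> confirm -> redirect guardrail flow.
from enum import Enum, auto
from typing import Optional

class Action(Enum):
    CONTINUE_SUPPORT = auto()     # ordinary coaching conversation
    PROMPT_CONFIRMATION = auto()  # one simple, low-cognitive-load question
    REDIRECT_TO_CRISIS = auto()   # hand off to human-led resources (e.g. 988)

def next_action(acute_distress_detected: bool,
                user_confirmed_risk: Optional[bool]) -> Action:
    """Decide the next step in the safety flow.

    user_confirmed_risk is None until the user has answered the
    confirmation prompt, then True or False.
    """
    if not acute_distress_detected:
        return Action.CONTINUE_SUPPORT
    if user_confirmed_risk is None:
        # A single confirmation question keeps cognitive load low
        # in a moment of crisis.
        return Action.PROMPT_CONFIRMATION
    if user_confirmed_risk:
        # Confirmed risk: redirect immediately to human-led care.
        return Action.REDIRECT_TO_CRISIS
    return Action.CONTINUE_SUPPORT
```

The design point is that confirmation sits between detection and redirection, so a false positive from the detector asks the user one question rather than interrupting care.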
By anchoring Yuna in established science, we move beyond ephemeral trends and provide users with
tools that have demonstrated efficacy in real-world clinical settings. Yuna does not offer vague
affirmations but rather structured exercises that teach users how to identify cognitive distortions
and implement coping mechanisms — directly translating techniques proven effective in clinical
settings.
By proactively disclosing Yuna's limitations as an AI system, we empower users to engage critically,
to question, and to always triangulate information with their own judgment and with human professionals. We clearly
define the consultative role of our human experts, ensuring users understand that Yuna is a
supportive tool, not a substitute for professional diagnosis or treatment.
For our users, control is absolute and actionable: through intuitive interfaces, they can access their
entire data history, manage their preferences, and request the immediate and permanent deletion of
their information. Our policy is unconditional: private conversations will never be shared without the
user's explicit consent.
3. The Ripple Effect: The Symbiotic Future of Care
Technology in mental health has too often been a disruptor, attempting to replace a system rather
than heal it. We choose a different path: to be an integrator. Yuna is designed to be the connective
tissue within the broader ecosystem of care, strengthening the whole by supporting its parts.
- For the hesitant, we are a confidential first step toward seeking help.
- For the patient in therapy, we are the daily practice that integrates life-changing skills into life itself.
- For the millions in treatment deserts around the world, we are a lifeline to evidence-based
knowledge that would otherwise be out of reach.
We do not seek to replace the vital work of clinicians; we seek to extend their reach, to unburden a
strained system, and to serve as a trusted bridge to human-led care whenever it is needed.
This integrative philosophy fundamentally shapes how we define success. The return on our
compassionate bet is not found on a dashboard, but in the world. We will know we are succeeding
when we see:
- Workplaces that treat mental health not as a liability to be managed, but as their greatest asset to be cultivated.
- Cultural conversations where speaking of mental health is as normal as speaking of the weather, led by influential voices who have the courage to be vulnerable.
- Family trees where the branch of generational trauma withers, replaced by parents who pass down the tools of emotional regulation and kindness.
4. Conclusion: Building the Future, Responsibly
Artificial intelligence holds an undeniable promise to help close the staggering global mental health
gap. This potential, however, will only be realized if it is guided by an unwavering commitment to
ethical rigor, user safety, and privacy. The shortcuts of today — prioritizing engagement over
empowerment, building synthetic relationships instead of resilient minds — are the foundational
cracks that will undermine the entire field tomorrow.
The framework pioneered by Yuna — rooted in purpose and realized through clinician-informed
design, evidence-based skills, radical transparency, and hospital-level security — proves it is possible
to offer powerful AI support without compromising on ethics. By choosing to empower rather than
enmesh, and to integrate rather than disrupt, we are taking a calculated risk on behalf of humanity.
This is the compassionate bet in action.
This framework is not proprietary; we hope it becomes the industry standard. This is an invitation to
our colleagues, to investors, to regulators, and to innovators across the health technology landscape.
Let us collectively make the compassionate bet. Let us build a future where technology is judged
not by its intelligence, but by the wisdom imbued in it; not by its engagement, but by its capacity to
heal. Let us build it, together, with purpose.