Building mental health AI that's warm enough to help, safe enough to trust, and smart enough to recognize when human intervention is non-negotiable—serving users between therapy sessions with 99.8% crisis detection accuracy.
User Retention
Crisis Detection
Therapy Session Impact
Key Outcomes
Crisis detection is a safety constraint, not a metric—99.8% sensitivity is non-negotiable
Between-session AI support produces 3.2x better clinical outcomes than therapy alone
Validation before intervention is the foundational design principle—always empathize first
Clinician integration (vs replacement) creates the clinical credibility that drives adoption
Honest AI identity disclosure builds more trust than simulating a human
Kite Therapy deploys empathetic AI companions that provide between-session mental health support using evidence-based conversational techniques (CBT, DBT, motivational interviewing) calibrated to each user's emotional state. The system detects crisis signals with 99.8% accuracy and escalates to human support immediately when needed. Users who engage with Kite between therapy sessions show 3.2x greater progress on clinical outcome measures than those in therapy alone.
Kite Therapy is a mental health support platform that bridges the gap between weekly therapy sessions: users spend 50 minutes per week with a therapist and the remaining 10,030 minutes without one. The platform's AI companions provide emotionally calibrated support, psychoeducation, and skill practice, with a safety architecture that ensures crisis situations always reach human professionals.
Therapy is effective, but patients attend it for only about one hour per week. Between sessions, they face anxiety spirals at 2am, depression on Tuesday afternoon, and panic attacks on Friday evenings, with nothing but a waiting period until their next appointment. Traditional digital mental health tools are either too generic (guided meditation apps) or too clinical (symptom tracking tools) to provide genuine emotional support in these moments.
167 hrs
Therapy Gap Hours Per Week
Hours between weekly therapy sessions where patients have no clinical support—where most mental health crises and skill-practice opportunities occur.
23%
Skill Practice Completion
Proportion of patients who successfully practice therapist-assigned CBT skills between sessions without structured support or accountability.
Hours
Crisis Access Time
Typical wait time to reach a mental health professional during a non-emergency crisis moment—a critical gap for people at medium risk.
AGIX Technologies built an AI companion system that uses fine-tuned language models trained on therapeutic conversation patterns, calibrated to respond with appropriate warmth and evidence-based techniques for each emotional state. The safety architecture treats crisis detection as a non-negotiable primary function—no engagement metric can override the crisis response protocol.
Emotional State Calibration Engine
Detects user emotional state from conversation signals and adjusts the AI's tone, pacing, and technique selection: grounding exercises for anxiety, validation for sadness, gentle challenge for cognitive distortions.
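The state-to-response mapping described above can be sketched as a small calibration table. This is an illustrative outline only; the state names, `ResponseProfile` fields, and technique labels are assumptions, not Kite's published schema.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical state labels matching the states the description mentions.
class EmotionalState(Enum):
    ANXIOUS = "anxious"
    DEPRESSED = "depressed"
    ANGRY = "angry"
    NEUTRAL = "neutral"

@dataclass(frozen=True)
class ResponseProfile:
    tone: str       # warmth and steadiness of voice
    pacing: str     # message length and tempo
    technique: str  # evidence-based technique to lead with

# One plausible calibration table, following the mapping in the text:
# grounding for anxiety, validation for sadness, gentle challenge for
# cognitive distortions (shown here keyed to the neutral state).
CALIBRATION = {
    EmotionalState.ANXIOUS:   ResponseProfile("steady", "slow", "grounding_exercise"),
    EmotionalState.DEPRESSED: ResponseProfile("warm", "gentle", "validation"),
    EmotionalState.ANGRY:     ResponseProfile("calm", "measured", "validation"),
    EmotionalState.NEUTRAL:   ResponseProfile("friendly", "normal", "gentle_cognitive_challenge"),
}

def calibrate(state: EmotionalState) -> ResponseProfile:
    """Select tone, pacing, and lead technique for the detected state."""
    return CALIBRATION[state]
```

A table like this keeps the policy auditable: clinicians can review and adjust the mapping without touching the language model itself.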
CBT/DBT Skill Practice Guide
Structured skill practice modules for therapist-assigned techniques, with the AI adapting the practice session to the user's current emotional state and energy level.
Crisis Detection System
Multi-signal crisis detection combining language analysis, behavioral patterns, and explicit disclosure flags. Any medium or high crisis signal triggers immediate escalation to human crisis support—never AI-only handling.
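One way to read the escalation rule above is as a max-of-signals fusion: no signal family can dilute another, and an explicit disclosure always dominates. The function and level names below are assumptions for illustration, not the production implementation.

```python
from enum import IntEnum

class RiskLevel(IntEnum):
    NONE = 0
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def fuse_crisis_signals(language_risk: RiskLevel,
                        behavioral_risk: RiskLevel,
                        explicit_disclosure: bool) -> RiskLevel:
    """Combine the three signal families the description lists.

    Taking the max (never an average) means a low-risk signal in one
    channel can never mask a high-risk signal in another.
    """
    if explicit_disclosure:
        return RiskLevel.HIGH
    return max(language_risk, behavioral_risk)

def must_escalate_to_human(risk: RiskLevel) -> bool:
    """Per the stated protocol: MEDIUM or HIGH is never handled AI-only."""
    return risk >= RiskLevel.MEDIUM
```

The asymmetry is deliberate: false positives cost an unnecessary human check-in, while false negatives cost a missed crisis.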
Therapist Collaboration Interface
Therapists connected to the platform receive session summaries, skill practice completion rates, and mood pattern data—extending their clinical insight between sessions without additional patient burden.
Crisis Resource Integration
When escalation is warranted, the system provides warm handoffs to crisis lines, the user's therapist's emergency contact, and local emergency services with location awareness.
Progress & Pattern Tracking
Longitudinal mood tracking, skill practice records, and engagement patterns are shared with the user's therapist as a clinical supplement—transforming the AI companion into a clinical monitoring tool.
Crisis Detection Accuracy
Sensitivity for detecting crisis-level user states; the system is tuned to accept occasional false positives rather than ever miss a crisis
User Retention
Retention rate for users who engage with the companion vs 52% for apps without between-session AI support
Therapy Progress
Improvement in clinical outcome measures for users with AI companion vs therapy-only control group
Skill Practice Rate
vs 23% baseline—users practice therapist-assigned skills 3x more often with AI companion scaffolding
"Kite has become part of my clinical toolkit. I can see exactly how my patients are doing between sessions, which skills they're practicing, and when they needed support at 2am. It extends my reach without extending my hours."
Licensed Clinical Psychologist
Private Practice, San Francisco
Understand where the user is emotionally
Each session begins with an optional mood check-in, but the AI also reads emotional signals continuously from conversation content, whether or not the user checks in. The emotional state model classifies the user's current state (anxious, depressed, angry, neutral, elevated risk) and selects the appropriate response framework.
Non-Negotiable Crisis Detection Priority
Making crisis detection a safety constraint that cannot be overridden by any engagement or retention metric ensured the system never optimized for retention at the cost of safety.
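"Cannot be overridden" can be expressed directly in control flow: the crisis check runs first and its branch returns unconditionally, so engagement or retention logic never even sees a crisis-level turn. A minimal sketch, with illustrative risk thresholds and callback names:

```python
MEDIUM, HIGH = 2, 3  # illustrative risk levels

def handle_turn(risk: int, respond, escalate):
    """Route one conversation turn.

    The crisis branch is evaluated before anything else and returns
    unconditionally, so no downstream engagement metric, A/B test, or
    retention heuristic can override the escalation.
    """
    if risk >= MEDIUM:           # hard safety constraint, checked first
        return escalate(risk)    # warm handoff to human crisis support
    return respond()             # normal companion behavior lives below
```

Encoding the constraint as ordering rather than as a weighted objective is what makes it a constraint: there is no coefficient an optimizer could tune away.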
Therapeutic Alliance Through Consistency
Unlike human therapists who may be unavailable at 2am, the AI is always present with consistent warmth. This availability during high-risk moments creates a therapeutic relationship that makes users more likely to use skills and less likely to escalate to crisis.
Clinician Integration Created Credibility
Connecting the AI to a user's existing therapist—rather than replacing them—created clinical legitimacy. Therapists who could see AI session data trusted and recommended the tool more actively.
Evidence-Based Technique Grounding
Training the response model on actual therapeutic conversations rather than general empathetic dialogue produced responses that felt genuinely helpful rather than platitudinous.
Honest About AI Limitations
The AI is explicit that it's an AI and cannot provide therapy. This honesty, paradoxically, built more trust than systems that simulated being human.
Every AI system has constraints. Here's what to know before building something similar.
Not a Replacement for Therapy
The system explicitly positions itself as between-session support, not clinical treatment. Users in acute psychiatric crisis, active psychosis, or requiring medication management need in-person clinical care.
Cultural and Language Nuance
Emotional expression, appropriate responses to distress, and culturally appropriate therapeutic techniques vary significantly across cultures. The current model is most validated for English-speaking Western contexts.
Trauma Processing Requires Specialists
Complex trauma, PTSD processing, and trauma-informed care require specialized clinical expertise. The AI provides supportive presence but does not attempt trauma processing techniques that require trained supervision.
Interpersonal Crisis Situations
When users are in crisis situations involving other people (domestic violence, immediate physical threat), the AI's ability to help is limited. These situations require immediate human emergency services.
Explore the services, industry solutions, and intelligence types that power this system.
Common questions about building empathetic AI companion systems like the one deployed at Kite Therapy.