Mental Health Technology
Empathetic AI Companion

Kite Therapy: AI Companions That Know When To Call For Help

Building mental health AI that's warm enough to help, safe enough to trust, and smart enough to recognize when human intervention is non-negotiable—serving users between therapy sessions with 99.8% crisis detection accuracy.

+89%

User Retention

99.8%

Crisis Detection

3.2x

Therapy Session Impact

Key Outcomes

Crisis detection is a safety constraint, not a metric—99.8% sensitivity is non-negotiable

Between-session AI support produces 3.2x better clinical outcomes than therapy alone

Validation before intervention is the foundational design principle—always empathize first

Clinician integration (vs replacement) creates the clinical credibility that drives adoption

Honest AI identity disclosure builds more trust than simulating a human

Direct Answer

"How does Kite Therapy use AI for mental health support?"

Kite Therapy deploys empathetic AI companions that provide between-session mental health support using evidence-based conversational techniques (CBT, DBT, motivational interviewing) calibrated to each user's emotional state. The system detects crisis signals with 99.8% accuracy and escalates to human support immediately when needed. Users who engage with Kite between therapy sessions show 3.2x greater progress on clinical outcome measures than those in therapy alone.

About Kite Therapy

Client Context

Kite Therapy is a mental health support platform that bridges the gap between weekly therapy sessions: users spend 50 minutes per week with a therapist and roughly 10,030 minutes without one. The platform's AI companions provide emotionally calibrated support, psychoeducation, and skill practice, with a safety architecture that ensures crisis situations always reach human professionals.

Founded: 2019
Scale: 200,000+ active users, 40+ therapeutic partner organizations
HQ: Boston, MA, USA
Industry: Mental Health Technology
The Problem

Mental Health Care Has a Between-Session Problem

Therapy is effective, but most patients spend only one hour per week in session. Between sessions, they face anxiety spirals at 2am, depression on a Tuesday afternoon, and panic attacks on a Friday evening—with nothing to do but wait for their next appointment. Traditional digital mental health tools are either too generic (guided meditation apps) or too clinical (symptom trackers) to provide genuine emotional support in these moments.

167 hrs

Therapy Gap Hours Per Week

Hours between weekly therapy sessions where patients have no clinical support—where most mental health crises and skill-practice opportunities occur.

23%

Skill Practice Completion

Proportion of patients who successfully practice therapist-assigned CBT skills between sessions without structured support or accountability.

Hours

Crisis Access Time

Typical wait time to reach a mental health professional during a non-emergency crisis moment—a critical gap for people at medium risk.

The Solution

Emotionally Intelligent AI With Evidence-Based Therapeutic Techniques

AGIX Technologies built an AI companion system that uses fine-tuned language models trained on therapeutic conversation patterns, calibrated to respond with appropriate warmth and evidence-based techniques for each emotional state. The safety architecture treats crisis detection as a non-negotiable primary function—no engagement metric can override the crisis response protocol.

1

Emotional State Calibration Engine

Detects user emotional state from conversation signals and adjusts the AI's tone, pacing, and technique selection: grounding exercises for anxiety, validation for sadness, gentle challenge for cognitive distortions.

2

CBT/DBT Skill Practice Guide

Structured skill practice modules for therapist-assigned techniques, with the AI adapting the practice session to the user's current emotional state and energy level.

3

Crisis Detection System

Multi-signal crisis detection combining language analysis, behavioral patterns, and explicit disclosure flags. Any medium or high crisis signal triggers immediate escalation to human crisis support—never AI-only handling.

4

Therapist Collaboration Interface

Therapists connected to the platform receive session summaries, skill practice completion rates, and mood pattern data—extending their clinical insight between sessions without additional patient burden.

5

Crisis Resource Integration

When escalation is warranted, the system provides warm handoffs to crisis lines, the user's therapist's emergency contact, and local emergency services with location awareness.

6

Progress & Pattern Tracking

Longitudinal mood tracking, skill practice records, and engagement patterns are shared with the user's therapist as a clinical supplement—transforming the AI companion into a clinical monitoring tool.
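As a concrete illustration, the state-to-technique mapping at the heart of the calibration engine (component 1 above) might look like the following sketch. The state labels, tone settings, and technique names here are hypothetical, not Kite's actual taxonomy.

```python
from dataclasses import dataclass

@dataclass
class ResponsePlan:
    tone: str       # warmth/pacing hint passed to the response generator
    technique: str  # evidence-based technique guiding the reply

# One plan per detected state; gentle challenge is reserved for cognitive
# distortions, and unknown states fall back to validation, never challenge.
STATE_PLANS = {
    "anxious":   ResponsePlan(tone="calm, slow pacing", technique="grounding_exercise"),
    "sad":       ResponsePlan(tone="warm, unhurried",   technique="validation"),
    "distorted": ResponsePlan(tone="gentle, curious",   technique="cognitive_reframing"),
    "neutral":   ResponsePlan(tone="friendly",          technique="skill_practice_prompt"),
}

def select_plan(state: str) -> ResponsePlan:
    # Unrecognized states default to pure validation.
    return STATE_PLANS.get(state, ResponsePlan(tone="warm", technique="validation"))
```

The fallback encodes the "validation before intervention" principle: when the model is unsure of the user's state, it empathizes rather than challenges.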

System Architecture

Kite Therapy AI Architecture

User Interface
Mobile App (iOS/Android)
Text & Voice Input
Mood Check-In Widget
Skill Practice Modules
Emotional Intelligence Layer
Emotional State Detection
Crisis Signal Classification
Therapeutic Technique Selection
Tone Calibration Engine
Conversation Intelligence
Therapeutic Response Generation
CBT/DBT Pattern Library
Empathy-First Response Templates
Context Window Management
Safety Architecture
Multi-Signal Crisis Detection
99.8% Sensitivity Crisis Model
Mandatory Escalation Protocols
Audit Trail Generation
Clinician Integration
Therapist Dashboard
Session Summary Generation
Mood Pattern Reporting
Crisis Alert System
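The mandatory escalation protocol in the safety architecture reduces to a conservative OR over risk signals: a single medium-or-higher signal escalates, regardless of what the other signals say. A minimal sketch (signal names and risk levels are assumptions for illustration):

```python
from enum import IntEnum

class Risk(IntEnum):
    NONE = 0
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def should_escalate(signals: dict) -> bool:
    """Escalate to human crisis support if ANY signal reaches MEDIUM.

    Signals might come from language analysis, behavioral patterns, or
    explicit disclosure flags. The rule is an OR, never an average, so a
    single strong signal cannot be diluted by several benign ones.
    """
    return any(level >= Risk.MEDIUM for level in signals.values())
```

Averaging would let two LOW signals mask one HIGH; the OR guarantees that AI-only handling never occurs at medium or high risk.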
Results

Clinical and Safety Outcomes for Mental Health Support

99.8%

Crisis Detection Accuracy

Sensitivity for detecting crisis-level user states—the system is calibrated to never miss a crisis at the cost of occasional false positives

+89%

User Retention

Retention rate for users who engage with the companion vs 52% for apps without between-session AI support

3.2x

Therapy Progress

Improvement in clinical outcome measures for users with AI companion vs therapy-only control group

73%

Skill Practice Rate

vs 23% baseline—users practice therapist-assigned skills 3x more often with AI companion scaffolding
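The "never miss a crisis, accept occasional false positives" calibration can be sketched as a threshold search: pick the highest decision threshold that still satisfies a hard sensitivity floor. Everything below (scores, floor, function name) is illustrative, not the production model:

```python
def pick_threshold(crisis_scores, non_crisis_scores, sensitivity_floor):
    """Return the highest decision threshold whose recall on known
    crisis cases still meets the floor.

    Recall can only grow as the threshold drops, so scanning candidate
    thresholds from high to low and stopping at the first one that
    satisfies the floor gives the best trade-off: maximum specificity
    under the non-negotiable sensitivity constraint.
    """
    for t in sorted(set(crisis_scores) | set(non_crisis_scores), reverse=True):
        recall = sum(s >= t for s in crisis_scores) / len(crisis_scores)
        if recall >= sensitivity_floor:
            return t
    return 0.0  # degenerate fallback: flag every conversation
```

Treating sensitivity as a floor rather than one term in a weighted objective is what makes it a constraint instead of a metric: no other number in the system can trade against it.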

"Kite has become part of my clinical toolkit. I can see exactly how my patients are doing between sessions, which skills they're practicing, and when they needed support at 2am. It extends my reach without extending my hours."

Licensed Clinical Psychologist

Private Practice, San Francisco

How It Works

How Kite Therapy's AI Supports a User in Distress

1

Check-In & State Detection

Understand where the user is emotionally

Each session begins with an optional mood check-in, but the AI is continuously reading emotional signals from conversation content regardless. The emotional state model classifies the user's current state (anxious, depressed, angry, neutral, elevated risk) and selects the appropriate response framework.
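The "optional check-in, continuous reading" behavior described above can be sketched as a merge rule in which a risk signal from either source always wins and is never downgraded. State labels and names are hypothetical:

```python
def detect_state(checkin, conversation_signal):
    """Merge an optional self-reported mood with the continuously
    inferred conversational state.

    checkin: the user's mood check-in label, or None if skipped.
    conversation_signal: state inferred from conversation content.
    """
    # Elevated risk from either source is authoritative: a calm
    # self-report never downgrades a risk signal, and vice versa.
    if "elevated_risk" in (checkin, conversation_signal):
        return "elevated_risk"
    # Otherwise prefer the explicit self-report when one was given.
    return checkin or conversation_signal
```

The asymmetry is deliberate: users can skip the check-in or understate how they feel, so the conversational model must be able to raise, but never lower, the detected risk.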

Why It Worked

Why Safety-First Design Created Better Clinical Outcomes

Non-Negotiable Crisis Detection Priority

Making crisis detection a safety constraint that cannot be overridden by any engagement or retention metric ensured the system never optimized for retention at the cost of safety.

Therapeutic Alliance Through Consistency

Unlike human therapists who may be unavailable at 2am, the AI is always present with consistent warmth. This availability during high-risk moments creates a therapeutic relationship that makes users more likely to use skills and less likely to escalate to crisis.

Clinician Integration Created Credibility

Connecting the AI to a user's existing therapist—rather than replacing them—created clinical legitimacy. Therapists who could see AI session data trusted and recommended the tool more actively.

Evidence-Based Technique Grounding

Training the response model on actual therapeutic conversations rather than general empathetic dialogue produced responses that felt genuinely helpful rather than platitudinous.

Honest About AI Limitations

The AI is explicit that it's an AI and cannot provide therapy. This honesty, paradoxically, built more trust than systems that simulated being human.

Honest Limitations

What This System Doesn't Do Well

Every AI system has constraints. Here's what to know before building something similar.

Not a Replacement for Therapy

The system explicitly positions itself as between-session support, not clinical treatment. Users in acute psychiatric crisis, active psychosis, or requiring medication management need in-person clinical care.

Cultural and Language Nuance

Emotional expression, appropriate responses to distress, and culturally appropriate therapeutic techniques vary significantly across cultures. The current model is most validated for English-speaking Western contexts.

Trauma Processing Requires Specialists

Complex trauma, PTSD processing, and trauma-informed care require specialized clinical expertise. The AI provides supportive presence but does not attempt trauma processing techniques that require trained supervision.

Interpersonal Crisis Situations

When users are in crisis situations involving other people (domestic violence, immediate physical threat), the AI's ability to help is limited. These situations require immediate human emergency services.

When To Use This Approach

Is This Right For Your Business?

A Good Fit For...
Mental health platforms seeking to support users between clinical appointments
Therapist networks wanting to extend clinical reach without additional session time
Healthcare organizations reducing crisis escalation rates for medium-risk populations
Employee assistance programs providing accessible mental health support at scale
Not a Good Fit For...
Acute psychiatric care settings where clinical supervision must be immediate
Platforms serving users with active psychosis or complex trauma without clinical oversight
Applications where there is no clear escalation path to human clinical support
Populations where technology access or digital literacy is a significant barrier
Frequently Asked Questions

Kite Therapy AI Case Study — FAQ

Common questions about building empathetic AI companion systems like the one deployed at Kite Therapy.