AI
02 Sep, 2025
10 Min read
Mental health and addiction are among the biggest silent crises of our time. According to the World Health Organization, nearly 970 million people worldwide live with a mental health condition or addiction. These struggles appear in many forms.
Recovery from these issues is rarely achieved in a single therapy session. Research shows that multiple, consistent sessions with professional guidance are often required for long-term healing and behavior change. Yet most people fail to access this level of support because of barriers such as cost, limited accessibility, and the difficulty of sustaining care over time.
These barriers leave millions without the consistent care they need. At Stixor, our team asked: could AI, combined with the ubiquity of WhatsApp, make therapy-like support more continuous, accessible, and affordable?
In recent years, AI-powered mental health apps (such as chatbot therapists) have emerged. They provide instant conversations and guidance at a fraction of the cost of human therapy. However, they face a critical limitation: low retention.
At Stixor, we tried to address this gap. Our team of AI engineers experimented with whether AI, integrated with WhatsApp, could support people in a way that:
1. Improves retention by engaging users where they already spend their time.
2. Provides structured, therapy-like first sessions using CBT and DBT techniques.
3. Offers continuous follow-ups with micro-tasks and adaptive support.
4. Ensures safety mechanisms for serious cases through escalation protocols.
The experiment was designed to mimic the flow of therapy: a structured first session, followed by daily engagement on WhatsApp.
Users began by selecting the challenge they wanted to work on, such as quitting smoking.
This created a personalized profile for the AI to work with.
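As a minimal sketch (not Stixor's actual schema), such a profile might look like a small data record that the session logic fills in; the field names here are illustrative assumptions:

```python
from dataclasses import dataclass, field

# Illustrative only: these field names are assumptions, not Stixor's schema.
@dataclass
class UserProfile:
    user_id: str
    challenge: str                                      # selected at onboarding, e.g. "smoking"
    triggers: list[str] = field(default_factory=list)   # filled in during the first session
    engagement_score: float = 0.5                       # updated from daily check-ins

profile = UserProfile(user_id="u123", challenge="smoking")
```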
The AI simulated a therapist’s first session, applying principles from Cognitive Behavioral Therapy (CBT) and Dialectical Behavior Therapy (DBT). Structured questions helped uncover root causes and triggers.
Example:
AI: “When do you feel the strongest urge to smoke?”
User: “Usually after meals.”
AI: “That’s a common trigger. Let’s explore replacing it with a healthier action after meals.”
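One simple way to drive a session like this is a fixed sequence of questions whose answers populate the profile. The sketch below is an assumption about the control flow, not Stixor's implementation; it reuses the hypothetical UserProfile above and takes an ask() callback supplied by the messaging layer:

```python
# Illustrative CBT-style intake flow; the questions and logic are assumptions.
INTAKE_QUESTIONS = [
    "When do you feel the strongest urge related to {challenge}?",
    "What usually happens right before that urge?",
    "What healthier action could you take in that moment?",
]

def run_first_session(profile, ask):
    """ask(question) sends a question to the user and returns their reply."""
    for template in INTAKE_QUESTIONS:
        answer = ask(template.format(challenge=profile.challenge))
        profile.triggers.append(answer)  # e.g. "after meals"
    return profile
```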
After the initial session, the AI transitioned to WhatsApp for daily check-ins. Why WhatsApp? Because of its ubiquity: it reaches users where they already spend their time.
The AI acted as a daily accountability partner, sending reminders and motivational nudges. Example for smoking:
AI: “It’s just after lunch, your usual trigger time. How about a five-minute walk instead of a cigarette?”
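The post doesn't detail the integration, but one common way to deliver such nudges is Meta's WhatsApp Business Cloud API. A minimal sketch with placeholder credentials:

```python
import requests

# Placeholders: obtain a real token and phone-number ID from Meta's developer console.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
PHONE_NUMBER_ID = "YOUR_PHONE_NUMBER_ID"

def send_whatsapp_nudge(recipient: str, body: str) -> None:
    """Send a plain-text message through the WhatsApp Business Cloud API."""
    resp = requests.post(
        f"https://graph.facebook.com/v19.0/{PHONE_NUMBER_ID}/messages",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "messaging_product": "whatsapp",
            "to": recipient,  # recipient's number in international format
            "type": "text",
            "text": {"body": body},
        },
        timeout=10,
    )
    resp.raise_for_status()

# Example usage (requires real credentials):
send_whatsapp_nudge("15551234567", "It's just after lunch, your usual trigger time. How about a short walk?")
```

One practical constraint worth noting: outside the 24-hour customer-service window, the Cloud API only accepts pre-approved template messages, which matters for proactive daily check-ins.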
User responses fed into a dynamic progress tracker, and the AI adjusted task difficulty and check-in frequency based on engagement, as sketched below.
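A minimal sketch of that adaptive loop, reusing the hypothetical engagement_score field from earlier; the smoothing factor and thresholds are illustrative, not Stixor's actual tuning:

```python
def update_plan(profile, responded_today: bool) -> dict:
    """Fold today's response into the engagement score and pick tomorrow's plan."""
    signal = 1.0 if responded_today else 0.0
    # Exponential moving average keeps the score responsive but stable.
    profile.engagement_score = 0.8 * profile.engagement_score + 0.2 * signal

    if profile.engagement_score > 0.7:    # engaged: lean in with more support
        return {"difficulty": "harder", "checkin_interval_days": 1}
    if profile.engagement_score > 0.3:    # steady: keep the routine
        return {"difficulty": "standard", "checkin_interval_days": 1}
    # Disengaged: gentler tasks and fewer messages rather than more pressure.
    return {"difficulty": "gentle", "checkin_interval_days": 2}
```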
Objection: “AI can’t replace a human therapist.”
Our Response: True. But our AI was never intended as a replacement, only as a support layer to extend human care. Escalation protocols ensured critical cases reached professionals.
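As an illustration only, an escalation check could look like the sketch below. A production system would rely on a trained risk classifier and human review; the phrase list here is a deliberately simple stand-in:

```python
# Simplified stand-in for a real risk classifier; phrases are illustrative.
RISK_PHRASES = ("hurt myself", "end my life", "can't go on", "suicide")

def needs_escalation(message: str) -> bool:
    text = message.lower()
    return any(phrase in text for phrase in RISK_PHRASES)

def handle_message(message: str) -> str:
    if needs_escalation(message):
        return "escalate_to_professional"  # e.g. notify the on-call clinician
    return "continue_ai_session"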
Objection: “Daily messages will feel intrusive.”
Our Response: We experimented with adaptive frequency, as in the tracker sketch above. If users ignored messages, the AI slowed down; if users engaged more, it leaned in with additional support.
Objection: “What about the privacy of such sensitive conversations?”
Our Response: Data was encrypted, anonymized, and processed in compliance with GDPR and HIPAA. No raw conversation data was shared externally.
Objection: “Where is the evidence that this actually works?”
Our Response: Exactly why we framed this as an experiment. Early trials suggested better retention than standalone AI apps, but larger-scale studies are needed.
At Stixor, our team’s experimentation showed both promise and limitations in combining AI with WhatsApp for mental health and addiction recovery.
By simulating structured first sessions, providing continuous nudges, and adapting dynamically, the AI companion demonstrated that daily micro-interventions can meaningfully support recovery journeys.
Yet the experiment also highlighted the importance of ethical safeguards, privacy protections, and hybrid models where AI works alongside, not instead of, human care.