Initiative

Alesvia Mind

Planned

AI & Mental Health Professional Support

Mental health professionals are on the front lines of AI's impact on human wellbeing — yet most received no formal training on the topic. Alesvia Mind equips psychologists, psychiatrists, and therapists to navigate the AI age ethically, confidently, and on their own terms.

The Challenge

Patients are arriving in therapy already using AI chatbots for emotional support. A Brown University study found AI chatbots systematically violate mental health ethics standards. The WHO convened experts in March 2026 specifically on responsible AI for mental health. Yet 71% of psychologists have never used AI tools, and most training programs don't cover AI literacy. The gap between what practitioners face and what they're prepared for is growing daily.

What We're Building

  • Continuing education programs for mental health professionals on AI literacy
  • Ethical assessment frameworks for AI tools used in clinical settings
  • Patient-facing resources: how to talk to your therapist about your AI use
  • Clinical guidelines for AI-related issues — emotional dependency, parasocial relationships, AI-induced anxiety
  • Advocacy for regulatory guardrails on AI therapy bots

Who This Serves

  • Psychologists and psychiatrists
  • Psychotherapists and counselors
  • Clinical social workers
  • Their patients — indirectly, through better-informed care

Get Involved

Alesvia Mind is in the planning phase. If you are a mental health professional, researcher, or institution interested in shaping this initiative, we want to hear from you.