How AI Could Make Us More Human
three ways to improve AI's impact on human relationships and wellbeing
A quick note: In my last blog, I hinted at starting a group coaching circle. Well, I’m doing it! I’m late to sharing the news on Substack, and since getting the word out on other platforms, I’ve heard from a mix of product leaders, founders, and designers, all of whom are navigating a career transition or seeking a more sustainable way of being. Check out this page for all the details and reach out if you’re interested! Also, I’d really appreciate it if you could send this to anyone who you think could benefit.
I’m excited to be kicking off a letter series on AI x Human Agency with my friend Cissy! Cissy and I have been on very similar, but different paths. We met through Writing Club at The Commons in SF and a few months later, as I approached month #6 of my sabbatical, we had a long phone call as she was about to embark on her own. Since then, we’ve become close friends and we even lived with each other for a month last July. Our interests have evolved in parallel from sabbaticals to inner work to now: exploring AI's implications on what it means to be human.
During a recent catch-up, Cissy suggested writing public letters to each other. The format appeals to me as a way to share emerging thoughts before they become polished essays.
My career arc has had its fair share of twists and turns. After studying EECS at UC Berkeley and starting my career in product management, I've spent the past two years focused on human-to-human work in the form of coaching. Recently, though, with the impending tsunami of AI, I’ve felt the desire to reconnect with technology's potential to enhance human capabilities and extend our sense of agency.
This first letter of our series explores three ideas for how AI can improve human wellbeing, drawing from my experience in product management, coaching, and more recently, somatic therapy training. Some ideas are possible to implement now, while others are further out. I’d love to receive any feedback, criticism, or questions in the comment section below.
Dear Cissy,
In your post on Human AIgency, you mentioned that you’ve been wrestling with questions like “What does it mean to live well and become more human in the age of AI?” This is the #1 question I have been asking myself for the past three years, and it’s one of the most important questions to ask oneself, AI or not. As we enter the age of AI, our lives will increasingly intertwine with AI-enhanced products, services, and experiences.
I also want to address two other questions you shared:
“How do our relationships with AI bode for our human relationships?”
“Can we use AI to become more emotionally and socially intelligent?”
This first letter leans more towards how AI can help facilitate healing in a therapeutic way rather than extending our agency, but it’s all interrelated. In my experience, the more I’m able to process “negative” emotions, harmful patterns, and stories that no longer serve me, the more my sense of agency expands. I become more self-confident, empowered, and trusting of others. AI notetakers and AI agents are exciting and will increase our productivity, but there’s an entire other world of ways that AI can improve the human qualities of agency, wellbeing, and consciousness.
Three key ideas structure this exploration:
Relational Care: How AI can help improve human-to-human relationships
AI Mindfulness: How AI can become more present and attuned
Somatic AI: How incorporating somatic and visual data can deepen AI interactions
These concepts interweave to suggest possibilities for what an effective AI somatic coach might look like. (side note: If this is something you’re interested in, I’d love to chat!)
Relational Care
There’s a popular dystopian view that AI will erode our human relationships by replacing real connection with something easier, cheaper, and shallower. Instead of scheduling in-person hangs with logistical overhead or having to worry about hurting someone else’s feelings, we could just open an app and talk to a machine that has been programmed to say the right thing at the right time, all the time. I share a similar concern that little Timmy with braces and acne in the 11th grade will avoid asking his crush to prom because he can just talk to Janet, his sexy AI companion, from the comfort of his bedroom. But frankly, I’m more fascinated by how AI might actually improve our human relationships.
To start off, I need to share why relationships are so important. Something I’ve been learning from The Systems View of Life is that the quality of your experience and sense of self are relational. Imagine a bunch of dots representing people and lines representing relationships. While we typically think of Self as each individual dot, the relational view sees Self as the lines, which represent all of our relationships. The quality of your experience and your overall health are closely tied to the quality of your relationships. When valued relationships flourish, we typically feel good. Most problems brought to therapy or coaching deal with relationship issues.
Technology already plays an important role in human relationships. We use iMessage to communicate, Partiful to plan gatherings, the camera app to capture memories, and FaceTime to connect over long distance. AI is fundamentally different, but I don't think that necessitates deviating from viewing it as still just a tool. While most current AI interactions happen in isolation with one person talking to their own AI, the future lies in AI that can integrate multiple perspectives and facilitate real human connection. Here are three ways AI could enhance rather than replace our human connections:
1. AI as Relationship Mediator
Imagine a Marriage Counselor AI working with Bob and Alice, a couple considering divorce. Instead of creating separate accounts that are unlinked, they create a joint account that represents their relationship. Each partner can share experiences independently without the other person knowing, but the AI has the full story. Bob can express frustrations about the morning, while Alice can share her perspective later that day. The AI, understanding both contexts, can facilitate healthier conversations about situations, feelings, and paths forward.
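To make the shape of this concrete, here's a minimal sketch in Python. Everything here is hypothetical (the names `JointAccount`, `share`, and `mediator_prompt` are mine, not any real product's API); it just illustrates how entries could stay private between partners while the mediator model sees the combined history:

```python
from dataclasses import dataclass, field

@dataclass
class Entry:
    author: str  # which partner shared this, e.g. "bob" or "alice"
    text: str    # what they shared privately

@dataclass
class JointAccount:
    """One account per relationship: partners never see each other's
    entries, but the mediator model composes a prompt from both sides."""
    partners: tuple
    entries: list = field(default_factory=list)

    def share(self, author, text):
        self.entries.append(Entry(author, text))

    def mediator_prompt(self):
        # Only the AI ever sees both perspectives combined.
        history = "\n".join(f"[{e.author}] {e.text}" for e in self.entries)
        return ("You are a couples mediator. Each partner shared these "
                "perspectives privately. Suggest a healthier way for them "
                "to discuss the situation together:\n" + history)

account = JointAccount(partners=("bob", "alice"))
account.share("bob", "I felt dismissed when our morning plans changed.")
account.share("alice", "I was overwhelmed by work and needed quiet.")
prompt = account.mediator_prompt()
```

The key design choice is that the unit of the account is the relationship, not the individual, so the model's context window holds the full story even when neither human does.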
2. AI as Personal Assistant for Relationships
An AI could piece together a living map of your relationships by accessing your contacts, social media, and calendar. After hanging out with a friend or meeting with your coworker, you'd share quick updates about how things went via journaling or voice memo. At first, this might seem too similar to any other personal CRM, but over time the AI magic would kick in. Imagine being nudged to seek out introductions you weren’t aware of, being challenged in how you show up in certain dynamics, or being reminded of past interactions before meetings. Rather than replacing relationships, this would amplify them, using AI’s unforgettable memory and infinite brainpower to guide us towards healthier, more human relationships.
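As a toy sketch of that "living map" (again, hypothetical names, nothing resembling a real product), the core loop could be as simple as logging each interaction and surfacing the relationships that have gone quiet:

```python
from datetime import date, timedelta

class RelationshipMap:
    """Toy relational CRM: log quick updates after each interaction,
    then surface nudges for relationships that have gone quiet."""

    def __init__(self):
        self.last_seen = {}  # person -> date of last interaction
        self.notes = {}      # person -> list of quick updates

    def log(self, person, when, note):
        self.last_seen[person] = when
        self.notes.setdefault(person, []).append(note)

    def nudges(self, today, stale_after=timedelta(days=30)):
        # Relationships untouched for too long, oldest first.
        stale = [(p, d) for p, d in self.last_seen.items()
                 if today - d > stale_after]
        return [p for p, _ in sorted(stale, key=lambda x: x[1])]

m = RelationshipMap()
m.log("Cissy", date(2025, 2, 1), "Caught up about the letter series")
m.log("Alex", date(2024, 11, 20), "Coffee chat about his new role")
m.nudges(date(2025, 2, 10))  # only "Alex" — Cissy is still recent
```

The "AI magic" in the essay would replace the hand-written `nudges` heuristic with a model reasoning over the accumulated notes, but the data shape (people, dates, free-text updates) stays this simple.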
3. AI as Relationship Advisor
From a practical standpoint, it’s too time-consuming to tell an AI every detail, so I’ve wondered what an efficient, yet effective shortcut would look like. Out of all the personality assessments out there, the Enneagram is the one that I vibe with the most. Understanding your Enneagram reveals the patterns that drive your deepest motivations, fears, and behaviors. Unlike Myers-Briggs and the Big Five, it draws on ancient wisdom and the spiritual depths of human nature. I’ve found it to be astonishingly accurate not only for myself, but also my coaching clients.
But I’ve taken it one step further by adding a relational twist. I’ve asked Claude, given my girlfriend’s and my Enneagram types, to identify our most common challenges and how we can work on them together. I’ve also done this for other relationships, from work to friendship to family. The insights aren’t always groundbreaking, but being able to study my relationships at a deeper level, and then continuously update Claude with more anecdotes, has been really helpful for me.
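For anyone who wants to try this, the prompt itself can be a simple template. This is a sketch of the kind of structure I mean, not my exact wording:

```python
def enneagram_prompt(type_a, type_b, relationship="romantic partners"):
    """Build a relational Enneagram prompt to paste into an LLM chat
    (illustrative template, not a prescribed phrasing)."""
    return (
        f"We are {relationship}. I am Enneagram Type {type_a} and "
        f"they are Type {type_b}. Identify our most common relational "
        f"challenges and concrete ways we can work on them together."
    )

p = enneagram_prompt(4, 8)
```

From there, each new anecdote you add to the same conversation sharpens the picture the model has of the dynamic.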
In the heat of the moment, it can be difficult to be self-aware in an argument or tense disagreement. By bringing the context to AI once I’m in a more grounded state, I’m able to “playback” the situation and review it with greater objectivity and stronger reasoning. When done correctly with the right prompts and without a need to prove myself right to the AI, I’m able to discover new triggers and common pitfalls leading up to these experiences. For example, many of the conflicts I end up in start because I’m either overwhelmed, not well-rested, or rushing to read a message or write my own response.
AI Mindfulness
The current user experience of AI tools shares a fundamental flaw: they're reactive. Whether it's chatbots, conversational AIs, or embedded workflows, tools wait for user input before responding. Each input triggers machine logic to compute and format a response. This works for basic queries but falls short in coaching contexts.
In coaching, therapy, and healing work, mindfulness is crucial—a state of relaxed, nonjudgmental awareness fully attuned to the present moment. We recognize mindfulness in others when they track our every subtle movement and shift in energy. We embody mindfulness when our attention rests fully on the present experience.
In coaching sessions, I sit in a quiet room, notifications off, focused entirely on my client. I track not just their words, but their movements, expressions, and silences. The seemingly empty periods of stillness often yield the most powerful insights.
No AI system today is always on, tracking my state even when I’m not typing or talking to “it”. When I’m silent, not typing, or simply thinking, the AI doesn’t know what I’m doing. If I’m using Claude or ChatGPT as my AI therapist (haven’t we all?), then it doesn’t know what’s going on in between my messages. I could be frowning, tensing up, crying, or who knows what, and it wouldn’t know.
However, in coaching, these periods where words aren’t being exchanged are often the most powerful moments. When a client utters something from their subconscious, it sometimes takes extended silence to fully process what just happened. When an insight emerges or a breakthrough occurs, it’s vital to slow down and pause so that the development can be fully digested and integrated.
What if AI could cultivate true mindfulness? This would expand the definition of a prompt beyond conscious text, voice, or visual signals. Anything could become a prompt. A 10-second period of silence could be a prompt. A turn of the head to avoid something uncomfortable could result in a response. Even in text applications, typing speed, cadence, and pauses could serve as valuable signals. Voice-based systems could achieve better attunement, with even greater potential once video is incorporated.
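Even a crude version of this is easy to sketch. Assuming a stream of timestamped interaction events (a made-up event format, purely for illustration), meaningful stretches of silence can themselves become prompts:

```python
def detect_implicit_prompts(events, silence_threshold=10.0):
    """events: list of (timestamp_seconds, kind) pairs from a session.
    Returns implicit prompts: gaps of silence long enough to be
    treated as signals rather than absence of input."""
    prompts = []
    prev_t = None
    for t, kind in events:
        if prev_t is not None and t - prev_t >= silence_threshold:
            prompts.append(("silence", round(t - prev_t, 1)))
        prev_t = t
    return prompts

events = [(0.0, "keypress"), (1.2, "keypress"),
          (14.0, "keypress"), (15.0, "message_sent")]
gaps = detect_implicit_prompts(events)  # one 12.8s silence detected
```

A real system would feed such events to the model as context ("the user paused for 13 seconds after mentioning their father") rather than waiting for the next typed message, which is the shift from reactive to present.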
This shift from reactive to continuously present AI could transform mental health tech into multi-modal and multi-sensory applications grounded in mindfulness.
Somatic AI
Traditional talk therapy, despite its proven benefits, is limited by its primary focus on verbal communication. Research¹ shows that therapies incorporating movement, breathwork, and body awareness achieve better outcomes than talk therapy alone.
A recent study of 830 participants found that text-based conversations with ChatGPT-as-therapist outperformed human therapists. This presents an intriguing paradox: while LLMs exceed humans in text-based therapeutic interactions, AI applications share the core limitation of talk therapy by restricting communication to text and voice interfaces.
When we communicate solely through text or voice, we miss crucial signals: the extended pause when someone grapples with trauma, the unconscious head turn when reliving painful memories, the telltale slouch that reveals deeply-rooted patterns. These physical manifestations of our inner world carry profound meaning. I've experienced this firsthand. My body often perceives and responds to situations before my conscious mind catches up. Following these bodily sensations and gut feelings helps me navigate life with greater clarity and confidence.
This isn't just personal experience or mystical thinking. Research from somatic therapy and modern psychology validates this understanding, with studies from Aalto University demonstrating how emotions manifest physically in specific body regions:
[Image: bodily maps of emotions from the Aalto University study]
The Hakomi method of mindful somatic psychotherapy exemplifies this approach, recognizing that a person's embodied expression (gestures, posture, facial expressions, habits, tone of voice, and physical structure) provides real-time information about unconscious beliefs and outlook on life. As AI therapists and coaches become more accessible, incorporating somatic principles and transcending talk therapy will be crucial. We need to build systems that can perceive, process, and respond to these subtle dimensions of the human experience.
The potential of Somatic AI is closer than we might think. The hardware is already in our pockets: our phones contain cameras capable of capturing visual information about our state. We could have a conversation with an AI therapist through the front-facing camera, just like connecting with a human therapist over Zoom.
The main limitations lie in access to high-quality training data and accurate labeling of frame-by-frame visual data. Video processing demands more computational power than text, but these are solvable challenges given the velocity of AI development.
This suggests a vision of a multimodal therapeutic experience that uses voice and video to guide people toward wholeness. Consider an AI that notices how your voice constricts and your shoulders tense when discussing your relationship with your father, gently inviting you to explore that reaction. An AI that perceives when you dissociate—your eyes glazing over and breath becoming shallow—when processing difficult memories, helping you stay grounded in your body. Or an AI coach that recognizes when you're speaking from a defended intellectual place rather than from felt experience, softly guiding you back to bodily sensation.
This approach could revolutionize how we approach trauma when building tech products. By training models to recognize somatic markers of overwhelm, like breath-holding or gaze aversion, AI systems could automatically adjust their pacing and approach. Instead of bombarding users with heart rate graphs and metrics, these systems could offer gentle, embodied guidance: "I see you're slumping. Let's try a 'power posture' breath."
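Here's a minimal sketch of that pacing logic. It assumes some upstream perception model (hypothetical; the marker names `breath_holding` and `gaze_aversion` are mine) already emits per-marker scores between 0 and 1:

```python
def adjust_pacing(markers, overwhelm_threshold=0.7):
    """markers: dict of hypothetical somatic-marker scores in [0, 1]
    from an upstream classifier. Decide how the session should pace
    itself, offering embodied guidance instead of metrics."""
    overwhelm = max(markers.get("breath_holding", 0.0),
                    markers.get("gaze_aversion", 0.0))
    if overwhelm >= overwhelm_threshold:
        # Slow down: gentle invitation, no graphs or numbers.
        return {"pace": "slow",
                "invite": "I notice we might be moving fast. "
                          "Want to pause and take one slow breath?"}
    return {"pace": "normal", "invite": None}

adjust_pacing({"breath_holding": 0.9, "gaze_aversion": 0.2})
```

The point of the sketch is the interface: the system's response to overwhelm is a change in pacing and a soft invitation, not a dashboard shown to the user.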
This isn't science fiction. Companies like Hume.ai are already developing models that can classify 37 facial expressions and 20 emotions, while also analyzing emotional qualities in voice. We stand at the threshold of a new era in AI-assisted therapy, one that honors the wisdom of the body alongside the mind.
Guiding Principles
With great power comes great responsibility. To ensure AI tools don't accidentally lead to a Black Mirror episode, here are some principles:
Tool, Not Friend: History will tell, but I think AI companions will be viewed as incredibly harmful to human wellbeing. In theory, it’s possible to treat them as just technology, but reality tells a different story of dependence, addiction, and isolation from fellow humans. AI with this much power must remain clearly framed as a relationship-support tool.
Healthy Boundaries: To avoid getting stuck in a loop of perpetual problem-seeking, we need strict usage guardrails, which I acknowledge are incredibly rare these days. Instead of being accessible 24/7 like doom-scrolling on social media at 3am, there could be a set number of sessions per month, or sessions could be capped to a certain time limit.
Aligned Incentives: The company's success should depend on helping relationships thrive, not maximizing screen time. We already have enough apps trying to hijack our attention. Pricing should be structured around value and milestones rather than subscription-based retention: think of a personal trainer who charges you when you hit new PRs rather than for just showing up at the gym. Your transformation should be their success metric, not your time.
Loving Challenge: Instead of being a yes-man that validates everything, the AI should help surface and nudge the user towards their growth edges.
Non-Violence: But challenge doesn’t mean harming the person. Inspired by the Hakomi method (which I’m currently training in), rather than breaking through resistance, the AI should support the client’s defenses so they naturally soften and reveal the wisdom they carry.
Dancing with Complexity: While AI can spot patterns like Trevor Rainbolt on GeoGuessr, it needs to remember that human connections are beautifully complex and can't be reduced to just data points and algorithms. Across these use cases, allowing for occasional uncertainty may be a better strategy than relentlessly optimizing for specific solutions.
Conclusion
What if therapy wasn't about feeling better, but about getting better at feeling? AI could help us expand our capacity to experience the full spectrum of human emotion, rather than numbing out difficult emotions. By combining relational wisdom, mindful presence, and somatic awareness, AI systems could truly support human development and consciousness. Right in front of us lies the potential to amplify our humanity rather than diminish it.
Cissy—I’m curious what resonated with you as well as what you disagreed with. I’m looking forward to reading your response 😊.
For those of you who got this far: if any of these ideas resonate with you, I'd love to hear your thoughts. Leave a comment or email me!
¹ Integrating somatic approaches (e.g., movement, breathwork, body awareness) with traditional therapies enhances outcomes by addressing trauma stored in the body and regulating the nervous system. Studies show somatic interventions reduce PTSD symptoms by 44–90%, with randomized controlled trials confirming Somatic Experiencing significantly decreases PTSD severity and chronic pain. Neuroimaging reveals somatic therapy rewires brain activity, improving emotional regulation and stress resilience. Unlike CBT’s "top-down" cognitive focus, somatic methods employ "bottom-up" processing to resolve physiological stress responses and trauma. These approaches show sustained efficacy for complex PTSD, chronic pain, and anxiety where talk therapy alone falls short.