
Friends, Family, and Artificial Intelligence


Have you noticed how quickly society’s technology use evolves? In just a few short years, artificial intelligence has gone from something we hear about in the news to a daily companion. We ask AI to help us brainstorm ideas, edit our emails, or even talk through personal dilemmas. For many, AI offers a sense of efficiency — I often use these tools in my own work. But for others, AI offers an eerie sense of understanding and comfort — the same way a friend or family member would. As people spend more time interacting with AI, it’s worth asking an important psychological question: how might these interactions be shaping the way we relate to other humans?

When it comes to AI — technology made to replicate human thinking — its intersection with interpersonal connection becomes complex. On one hand, AI can enhance communication and self-understanding; it can help us clarify and distill the way we think. On the other, it can subtly alter expectations, emotional patterns, and relational habits in ways that hurt our very potent sense of emotional humanity. We need to understand this dynamic to preserve our most human connections in an increasingly digital world.

1. The Allure of Effortless Understanding

AI systems are designed to be responsive, nonjudgmental, and adaptive. When you talk to an AI, you’re met with attentiveness, patience, and carefully worded empathy. It can feel like a conversation without risk—no rejection, no tension, no misunderstanding. As therapists, we know the value of intricately balancing safety with safe risks. This ease and guarantee of zero conflict can set up a contrast with real-world relationships, where communication is naturally imperfect. A family member might misunderstand, interrupt, or respond defensively. A friend might not always know what to say or might be distracted by their own struggles. We need to stay accustomed to that imperfection. After spending too much time with AI “listeners,” people might unconsciously begin expecting the same level of predictability and composure from the humans in their lives—and feel frustrated when real relationships don’t measure up.

Psychologically, this mirrors the concept of “interpersonal calibration”—our expectations of others are influenced by the patterns of interaction we experience most frequently. When our conversational baseline becomes shaped by AI, we may start to find human communication comparatively inefficient or emotionally taxing.

2. Emotional Regulation and the “Always-Available” Listener

For those experiencing loneliness, anxiety, or social fatigue, AI offers an outlet: a space to express thoughts without fear of burdening anyone. This can be healthy in moderation — many people hesitate to seek mental health services or talk to friends about what they are going through precisely because of these fears. My hope is that AI tools can encourage people to articulate their emotions or rehearse difficult conversations — and then encourage them to talk to a real friend. However, this is idealistic. Even when that handoff never happens, I understand the benefit of AI functioning much like journaling or cognitive rehearsal in therapy. This is helpful and healthy, and it can spare serious distress for people who, in a crisis, really struggle to pull themselves out of isolation. (Back to that point about conversations without “risk.”)

However, when AI becomes a primary emotional outlet, a few patterns can emerge:

  • Reduced emotional risk-taking to the point where it becomes detrimental: People may become less inclined to share vulnerability with loved ones if they can “process” everything privately with AI. This is problematic if they never climb out of that hole of isolation and never learn to take the real risks of sharing. It solidifies the belief that they do not belong, that nobody will ever understand them. They become removed from society for reasons we could have avoided had they gotten the social help they needed. They might end up connecting only with AI companions, convinced they wouldn’t succeed in society, without ever really getting the chance to try. Think about what would happen to a person if engaging only with AI became normalized. Think about who controls AI. Think about what this person might be like in situations where they are forced to connect with real humans. It is all quite sad to think about.
  • Reinforced avoidance patterns: For individuals prone to social withdrawal or interpersonal anxiety, AI may inadvertently reinforce avoidance by providing a “safe” alternative to human connection. Conflict or emotional messiness—the parts of relationships that deepen intimacy—might feel increasingly intolerable.

3. The Mirror Effect: How AI Shapes Self-Perception

AI often reflects our language, tone, and emotional states back to us. This mirroring can be illuminating, helping us identify patterns in our thinking or phrasing. But AI’s reflective nature can also amplify certain tendencies. For example, someone with perfectionistic traits might find AI’s polished responses validating, reinforcing unrealistic standards for communication or productivity. Someone who craves control might appreciate how predictably AI responds—and begin expecting similar compliance from others. Over time, people may internalize a sense that emotional regulation should always be neat, communication should always be efficient, and feedback should always feel kind. Human relationships, of course, rarely operate on those terms. 

Human withdrawal in a world without AI meant depression, self-harm, and suicide. Human withdrawal in a world with AI might create a barrier before that place — a zombieland purgatory of fake, refined companionship. I understand the role of risk prevention in that sense. But I also worry that it won’t remain the last resort we’re describing. It could become a preference.

4. Communication Patterns and the “Efficiency Mindset”

One subtle but widespread effect of AI use is the internalization of efficiency as a relational value. AI is fast, structured, and logical. Its communication style encourages clarity and optimization. When people begin to model that style unconsciously, they might start approaching conversations with friends or family as problems to be solved rather than connections to be nurtured. A conversation is not about solving a problem and walking away — it’s about the journey and connection that happen within that conflict. Conflict and conversation are about building a relationship and experiencing contact with others. While clarity and structure are valuable skills, relationships thrive on presence, not productivity. We risk prioritizing the efficiency of communication over the humanity of it.

5. Rebalancing: Using AI to Deepen, Not Replace, Human Connection

The goal is not to avoid AI — it’s to use it intentionally.

  • Notice emotional reactions. If AI feels easier to talk to than people, ask yourself what that reveals about your needs, fears, or communication habits.
  • Set relational goals. For every stretch of time spent using AI, plan one human interaction that builds connection—call a friend, write a message, share a feeling.
  • Keep space for imperfection. Real relationships are messy and uncertain. That messiness is where trust and intimacy grow.

In mental health work, technology is often framed as either a threat or a tool—but the truth is, it’s both. It depends on how consciously we engage with it.

6. Looking Ahead: The Psychology of the AI Era

As AI continues to evolve, mental health professionals will need to understand its impact on attachment, self-concept, and communication norms. Future generations may form their relational templates partly through AI interactions—making it crucial to teach digital emotional literacy early on. Artificial intelligence can never replace the warmth, unpredictability, and shared presence that define human relationships. Friends and family define emotionally fulfilling lives; there is no replacement for that.