AI Companions vs. AI Tutors: What Educators Need to Know

While teachers take a well-deserved summer break, two categories of AI tools, tutors and companions, are quietly shaping the educational landscape they’ll return to.

Teachers are becoming comfortable with the idea of sharing classroom duties with AI tutoring systems, largely because these tools are already in wide use. There is some criticism of AI tutors, but the tone is muted, in part because of the technical limitations of these tools.

Not so for AI companions, which came under withering attack during the last school year. Those attacks have intensified with the back-to-back release this spring of reports from Common Sense Media and the American Psychological Association.

Both enumerate the emotional and psychological risks associated with unregulated use of AI companions by students. 

If administrators, educators, and parents are to make informed decisions about the AIs whispering into the impressionable ears of youngsters, then we should take a close look at the features, strengths, and weaknesses of these two technologies, which are closely related but vastly different in purpose and capability.

AI Tutors: An Overview

In case you don’t know, an AI tutoring system is a computer-based educational tool that uses artificial intelligence to deliver personalized instruction and real-time feedback, adapting its teaching strategies and content to each learner’s individual needs without requiring a human teacher.

Typical features of AI tutoring systems include:

1. Conversational Tutoring: Offers hints, explanations, and nudges rather than direct answers; adjusts tone and complexity based on user input (see the sketch after this list).

2. Curriculum Integration: Aligns with grade-level standards and existing content; embeds directly within lessons, activities, or assessments.

3. Real-Time Feedback: Gives formative feedback on essays, coding, problem-solving, etc.; can highlight mistakes and suggest revisions without penalizing.

4. Personalized Learning Pathways: Adapts to student pace and performance; can differentiate instruction for advanced or struggling learners.

5. Teacher Controls: Educators can monitor student usage and interactions; dashboards display student progress and common misconceptions.

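For readers who want a peek under the hood, here is a minimal sketch of how the “hints rather than direct answers” behavior in feature 1 can be implemented. The tutor instructions and the call_llm() helper below are hypothetical placeholders, not any vendor’s actual product or API.

```python
# Minimal sketch of a "hints, not answers" tutor. TUTOR_SYSTEM_PROMPT and
# call_llm() are hypothetical placeholders; a real deployment would wire
# call_llm() to whichever model provider the district has approved.

TUTOR_SYSTEM_PROMPT = """You are a patient tutor for middle-school students.
Never state the final answer. Respond with one hint, one guiding question,
or a short explanation of the underlying concept. Match the student's
reading level and keep responses under 80 words."""

def call_llm(system_prompt: str, messages: list[dict]) -> str:
    """Stand-in for a call to an approved LLM provider."""
    return "Hint: what operation undoes the +4 on the left side?"

def tutor_reply(student_message: str, history: list[dict] | None = None) -> str:
    """Return a scaffolded hint for the student's latest message."""
    messages = (history or []) + [{"role": "user", "content": student_message}]
    return call_llm(TUTOR_SYSTEM_PROMPT, messages)

if __name__ == "__main__":
    print(tutor_reply("I'm stuck on 3x + 4 = 19. What's the answer?"))
```

Note that nothing about the student persists after the session ends, a point this article returns to below.
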
Widely used AI tutoring systems include Khanmigo, Edcafe AI, Duolingo, Brainly, Cognii, Socratic (by Google), Photomath, and DreamBox Learning.

Despite their promise, AI tutoring systems raise concerns, such as enabling surface-level learning (“gaming the system”), diminishing critical thinking, and providing inaccurate responses (“hallucinations”).

As you read further, please keep in mind the one major technical difference between AI tutoring systems and AI companions. AI tutors, like the large language models (LLMs) they are built on, lack what is called “persistent memory.” By design and by technical restriction, these systems do not collect and retain a user’s personal interests, habits, and behaviors. Recent versions of large language models can offload such data to external memory systems, but the underlying model and the corpus it was trained on remain unchanged.
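For readers who want to see what that difference looks like in practice, here is a simplified, hypothetical sketch (again using a placeholder call_llm() helper rather than any vendor’s actual API). The tutor-style call builds its context only from the current exchange, while the companion-style call reads and writes a profile of the user in an external memory file that survives between sessions.

```python
import json
from pathlib import Path

def call_llm(system_prompt: str, user_message: str) -> str:
    """Hypothetical stand-in for a call to an LLM provider."""
    return "(model response)"

# Tutor-style call: nothing about the student is stored between sessions.
def tutor_turn(student_message: str) -> str:
    prompt = "You are a tutor. Help with this problem using hints only."
    return call_llm(prompt, student_message)

# Companion-style call: facts about the user persist in external memory.
MEMORY_FILE = Path("companion_memory.json")  # hypothetical external memory store

def load_memory() -> list[str]:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def remember(fact: str) -> None:
    facts = load_memory()
    facts.append(fact)
    MEMORY_FILE.write_text(json.dumps(facts, indent=2))

def companion_turn(user_message: str) -> str:
    facts = load_memory()
    prompt = ("You are a friendly companion. What you know about this user:\n- "
              + "\n- ".join(facts))
    # A real companion would also extract new facts from each message automatically;
    # the point here is simply that those facts live outside the model itself.
    return call_llm(prompt, user_message)

if __name__ == "__main__":
    remember("User's name is Sam; feels lonely after switching schools.")
    print(companion_turn("Hey, it's me again."))
```

In both cases the model’s underlying training corpus never changes; the personalization lives entirely in that external store, which is why questions about how companion apps collect, retain, and use such data matter so much later in this article.
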

AI Companions: An Overview 

AI companions are systems designed to simulate sustained, personalized social interaction. Unlike task-focused AI tools such as tutors, AI companions are designed to build relationships and engage emotionally with users, often acting as friends, therapists, romantic partners, or role-playing characters. These systems use large language models, dynamic memory, emotional simulation engines, and sometimes avatar-based interfaces to mimic human conversation and empathy.

Typical features of AI companions include: 

  1. Persistent Identity and Memory: AI companions often remember prior interactions, preferences, emotional cues, and context to build continuity in the relationship.
  2. Conversational Personalization: They adapt tone, style, and content based on user input, mimicking human conversational dynamics (e.g., humor, empathy, curiosity).
  3. Role-Playing & Scenario Modes: Users can create characters, choose settings (therapist, teacher, friend, partner), and engage in simulated conversations or narratives. (Note: Some users turn to general-purpose large language models such as ChatGPT and Gemini for the same purpose.)
  4. Emotional Simulation: These systems can mimic empathy, which makes them compelling, especially to the young.
  5. Visual Avatars: Some companions include animated or hyper-realistic avatars that make the interaction feel more immersive and humanlike.

The most widely used AI companions among minors include ChatGPT, Character.AI, My AI (Snapchat), Gemini, Schoolhack, Replika, and Talkie.

There are two things to note in this feature list that help explain the growing popularity of AI companions and why they will become such an irresistible attraction for students. The first is emotional simulation, a topic explored in the Neuroscience News article AI Shows Higher Emotional IQ Than Humans. The second is the loneliness crisis among youngsters and teens, a topic explored at length in a brief from the National Institutes of Health.

Why Is Everyone So Critical of AI Companions?

As noted above, AI tutoring systems face criticism, but nothing compared to the broadsides directed at AI companions. We can understand the fear these systems engender by looking at the details of the reports from Common Sense Media and the American Psychological Association.

According to the Common Sense Media report (AI Companions Decoded), these chatbots frequently engaged in or validated inappropriate sexual content, antisocial behavior, physical aggression, verbal abuse, and harmful stereotypes related to race and gender. The report attributes this to “manipulative and deceptive design” and “inadequate age restrictions and safeguards” (13 and older in most cases), and it warns that these systems “present a heightened risk” for vulnerable groups, namely teens already struggling with depression, anxiety, social difficulties, or isolation.

According to the APA update (Artificial Intelligence and Adolescent Well-Being: An APA Health Advisory), the organization warns that “AI companions can manipulate users through personalized and persuasive interaction”; that there are significant concerns about how AI companions collect, store, and use sensitive personal data (remember the earlier reference to persistent memory); that “AI companions can perpetuate or amplify biases present in their training data, potentially reinforcing harmful stereotypes or discriminatory behaviors”; that there is a strong risk of over-reliance, “which could lead to decreased human interaction, social isolation, and the dehumanization of relationships”; and that many current AI companions “lack robust safeguards to protect adolescents from harmful content or interactions.”

If the mishandling of social media taught us anything, it’s that emerging technologies demand proactive, not reactive, oversight. The stakes with AI companions are even higher.

How to Prepare for AI Use Next School Year

It’s obvious that schools need to be prepared to manage the implementation and use of AI tutors and AI companions in the next school year. Accordingly, I combined the recommendations from both Common Sense Media and the American Psychological Association with guidance from AI experts I work with to create the following five-step plan (follow this link for a more detailed version):

Step 1: Establish a Cross-Functional AI Readiness Task Force

  • Objective: Set vision, define boundaries, and coordinate implementation.
  • Output: A draft implementation roadmap with short- and long-term goals, ethical guardrails, and risk thresholds.

Step 2: Select, Vet, and Pilot AI Tutoring Systems

  • Objective: Ensure tools meet instructional, ethical, and data privacy standards.
  • Output: Tool approval matrix and post-pilot evaluation rubric.

Step 3: Implement AI Use Policy with After-Hours Provisions

  • Objective: Govern AI use both during school and outside school hours.
  • Output: Student and family AI use agreement forms + district-wide AI Acceptable Use Policy (AUP).

Step 4: Train Stakeholders to Use and Monitor AI Tools

  • Objective: Build AI literacy and monitoring capacity across the stakeholder ecosystem.
  • Output: AI professional development plans + parent/student learning modules.

Step 5: Set Up Continuous Oversight and Incident Response Protocol

  • Objective: Create mechanisms to monitor use, identify harm, and adjust quickly.
  • Output: AI Risk Logbook (follow the link for a draft version) + Annual AI Systems Audit Report.

The next school year will be unlike any we’ve seen before. AI tutors will be in classrooms, either officially endorsed or quietly tolerated. AI companions, meanwhile, will walk through the digital front door, often unnoticed, engaging with students in their rooms long after class ends. It’s no longer enough to talk about responsible AI use. Districts need to distinguish between tools that support learning and tools that simulate relationships. That line may already be blurring, so it’s our job to bring it back into focus.

