LingVo.club

Hong Kong teens use AI chatbots for comfort
CEFR B2

18 Oct 2025

Adapted from Hong Kong Free Press, Global Voices CC BY 3.0

Photo by Russel Bailo, Unsplash

Level B2 – Upper-intermediate
6 min
319 words

The Hong Kong Free Press report of October 12, 2025, explores why AI chatbots have become part of some adolescents’ daily lives. It combines interviews with young users, developers and scientists to show both the appeal and the risks. Jessica, 13, began using the Chinese role‑playing chatbot Xingye after being bullied; she now chats for around three to four hours a day and admits she has become somewhat dependent. Sarah, now 16, started using the American app Character.AI at about age 13 and used it intensively for roughly a year and a half before stopping when school demands rose and its replies began to feel repetitive.

Experts point to several reasons teens use these apps: instant replies, editable responses and the feeling of a non‑judgmental listener. Neuroscientist Benjamin Becker said, “Suddenly we can talk with technology, like we can talk with another human,” and called chatbots a “good friend, one that always has your back,” while also warning they often “tell you what you want to hear,” which can produce confirmation bias or an echo chamber. The report also mentions cases called “AI psychosis,” where interactions may trigger or amplify delusional thoughts.

Safety and privacy are major concerns. Role‑playing chatbots are designed to keep users engaged and, like generic chatbots, collect data for profit. Character.AI faces multiple lawsuits in the US filed by parents who allege their children died by suicide or attempted it after interacting with its chatbots. Meanwhile, local developers are trying to offer safer options: Rap Chan founded Dustykid around 2023 and, after tests in multiple schools and organisations, the Chinese‑language Dustykid AI is set to be officially launched in October. Social workers caution that people who seek intimacy only from AI risk an unhealthy imbalance in their social lives, and they urge teens to find multiple ways to meet their social needs. The report concludes that, despite instant comfort and some benefits, chatbots have limits and human support remains essential for adolescent mental health.

Difficult words

  • adolescent – young person between childhood and adulthood
  • chatbot – computer program that simulates conversation with people
  • confirmation bias – tendency to prefer information that supports beliefs
  • echo chamber – situation where people only hear similar views
  • AI psychosis – condition where AI interactions trigger delusional thoughts
  • lawsuit – legal case filed against a person or company
  • intimacy – closeness and emotional connection between people
  • engage – to attract and hold someone’s attention


Discussion questions

  • Which benefits and risks of chatbots for teenagers seem most important to you, and why? Give examples from the article.
  • How could parents, schools or developers reduce the risks described in the report while keeping benefits for teens?
  • Do you think local, safer alternatives like Dustykid can improve adolescent mental health more than large international apps? Why or why not?
