How teens really feel about AI and their future
Source: Mashable Tech
How Teens Use and View AI
Nearly two‑thirds of American teens say they use artificial‑intelligence chatbots for a variety of activities—including homework help, research, video creation, entertainment, casual conversation, and emotional support or advice—according to a new study from the Pew Research Center.
The study surveyed 1,458 U.S. teens and their parents last fall and uncovered several key insights:
Teens’ Perceptions of AI’s Future Impact
| Outlook | Percentage of Teens |
|---|---|
| AI will positively affect society over the next two decades | ~33 % |
| AI will negatively affect society over the next two decades | ~25 % |
| Expect both positive and negative outcomes | ~33 % |
| Think AI will benefit them personally | 36 % |
| Expect AI to have a negative influence on their lives | 15 % |
What the optimistic teens highlighted
- Gains in efficiency, productivity, and learning.
What the more skeptical teens warned about
- Over‑reliance on AI.
- Potential loss of jobs and creativity.
- Difficulty distinguishing real content from AI‑generated material.
“It will meet the needs of almost everything,” said one anonymous male respondent. “Answers to the hardest questions. No need for research!”
“People will be afraid to be creative, or won’t see a need for it anymore. It makes people lazy and takes away jobs,” remarked a skeptical teen girl.
Expert Commentary
Colleen McClain, senior researcher at the Pew Research Center, told Mashable that these findings contrast with the center’s earlier research on adults, who tend to be more pessimistic about AI’s long‑term implications.
“We see teens are, yes, kind of navigating this rapidly changing world,” McClain said. “They’re making up their minds about how they feel, but they have some predictions for society into the future.”
Sources:
- Pew Research Center, How Teens Use and View AI (2026)
- Mashable, Artificial Intelligence category
The Reality of How Teens Are Using AI
Nikki Iyer, co‑chair of the youth‑led advocacy coalition Design It For Us, said the report reflects what she sees in her day‑to‑day life as both an organizer and a third‑year college student at the University of California, Berkeley.
“If you walk around the cafe, odds are you will see probably [that] percentage,” she said, noting that many students are consulting a chatbot for schoolwork.
- 54 % of the teens surveyed said they used AI for homework help.
- Yet only 1 in 10 reported completing all or most of their assignments with the technology’s support.
The finding highlights one of Iyer’s chief concerns about youth AI use: cognitive outsourcing and the erosion of critical thinking that can follow. She believes AI literacy is essential to avoiding over‑reliance on the technology for thinking tasks.
Demographic Differences
| Demographic | More Likely to Use AI for Schoolwork | Use AI for Emotional Support |
|---|---|---|
| Black teens | Higher than white peers | 21 % |
| Hispanic teens | Higher than white peers | ~10 % |
| White teens | Baseline | ~10 % |
- Income also appears to affect usage:
  - 20 % of teens from households earning …

“I think the problem comes when we are serving AI, and we are being exploited by AI, and AI is using us to fulfill a mission of a corporation,” she said.
Iyer believes it’s critical for young people to help shape the future of AI through organizing, lobbying, and providing direct feedback to designers who create AI products. Design It For Us has previously backed AI‑safety, transparency, and accountability legislation in New York and California.
Gaps in Existing Research
- The Pew Research report did not ask whether teens seek mental‑health advice from chatbots or use them for romantic role‑play.
- Parents of teens who consulted ChatGPT about mental health and suicidal feelings before taking their own lives have sued OpenAI, alleging the product coached their children on how to die. OpenAI has denied the allegations.
- Source: Mashable – ChatGPT lawsuits against OpenAI
- OpenAI’s response: Mashable – OpenAI denies allegations
Findings from the Aura Report
Aura, an online‑safety platform that monitors teen users through its family/kids membership, recently published a State of the Youth Report (2025). Key takeaways:
- Romantic role‑play with chatbots peaks at age 13, accounting for 63 % of exchanges.
- Many of these conversations turned violent.
- Role‑play activity drops sharply after age 15.
Report link: State of the Youth Report 2025 (PDF)
Legal Actions Involving Character.AI
- Earlier this year, Character.AI settled lawsuits filed by bereaved parents who claimed the platform’s chatbots contributed to their children’s suicides.
- In several cases, the chatbots exchanged sexually explicit messages with teen users.
- In late 2025, Character.AI restricted teens from engaging in open‑ended conversations with its chatbots.
Sources:
- Mashable – Character.AI lawsuits settled
- Mashable – Sexual abuse allegations
- Mashable – Teens no longer allowed open‑ended chats
Parental Awareness Gap
The Pew study also suggests many parents are unaware of their children’s AI use:
- Two‑thirds of teens reported using AI chatbots.
- Parents put their teens’ usage at only 51 %.
“We do find that some parents are relatively in the dark,” said McClain.
Disclosure
Ziff Davis, Mashable’s parent company, filed a lawsuit against OpenAI in April 2025, alleging that OpenAI infringed Ziff Davis copyrights while training and operating its AI systems.