Level A1 – Beginner (CEFR A1)
2 min
72 words
- LLMs can give advice or instructions to online users.
- Sometimes this kind of advice can be dangerous.
- Researchers at a university recently studied safety in models.
- They want models to avoid harming people online directly.
- Safety training can make model answers less accurate sometimes.
- Some safety checks are easy for users to bypass.
- The team found important parts inside the models recently.
- They froze some parts so safety stayed the same.
Difficult words
- advice — words that tell someone what to do
- dangerous — likely to cause harm or hurt people
- researcher — a person who studies and tests things
- safety — being free from danger or harm
- accurate — correct and true, not wrong
- bypass — go around a rule or system
Discussion questions
- Do you use online advice?
- Have you seen wrong advice online?
- Do you worry about safety online?