AI scans student emails for threats, but is it ethical?
Lawrence High School in Kansas is under fire for using Gaggle, an AI tool that scans students' emails, papers, and uploads for threats like violence or self-harm.
The school signed a $160,000 contract for the software, which is used in over 1,500 US districts.
But students say the system invades their privacy and frequently misidentifies harmless content as threats.
Lawsuit challenges the tech's use
Students reported harmless things—like an art project or a joke email—being flagged as suspicious by the AI.
Investigations found that LGBTQ students were at times outed to parents or school officials by the system.
There are also concerns about sensitive content being briefly exposed online and about outside reviewers seeing private student data.
Now, a lawsuit is challenging how this tech is used, with many questioning whether constant surveillance does more harm than good for student trust and expression.