this post was submitted on 08 Sep 2024

Imagine that your search terms, keystrokes, private chats, and photographs are monitored every time they are sent. Millions of students across the U.S. don't have to imagine this deep surveillance of their most private communications: it's a reality that comes with their school districts' decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students' school-issued machines and accounts.

"As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth," the Electronic Software Foundation (EFF) says.

The companies making the software claim it's all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second leading cause of death among American youth aged 10-14, but no comprehensive or independent study has linked the use of this software to an increase in student safety. Quite the contrary: a recent comprehensive RAND study shows that such AI monitoring software may cause more harm than good.

I feel you're coming at this from an abstract angle rather than from how these things actually play out in practice. This software isn't reliable, it isn't proven to work, and the social and economic realities of the students, families, and districts involved have to be taken into account. The article does a better job of explaining that. There are documented harms here. You, an adult, might have a good understanding of how to use a monitored device in a way that keeps you safe from some of the potential harms, but this software is predatory and markets itself deceptively. It's very different from what I think you're describing.