Last month, the White House released its highly anticipated executive order on artificial intelligence. Unfortunately, the Biden administration missed another critical opportunity to address the AI civil rights crisis unfolding across Black communities nationwide.
With each passing day, it becomes clear that these dangerous technologies are drawing new color lines for the 21st century, threatening to relegate yet another generation of Black people to second-class citizenship. Nowhere is this more evident than in the use of algorithmic technologies in America’s public schools.
Controversial, data-driven technologies are showing up in public schools nationwide at alarming rates. AI-enabled systems such as facial recognition, predictive policing, geolocation tracking, student device monitoring and even aerial drones are commonplace in public schools.
For example, a recent national survey of educators found that over 88 percent of schools use student device monitoring, 33 percent use facial recognition and 38 percent share student data with law enforcement. Many of these tools are designed for, and routinely used by, repressive regimes to target ethnic minorities — making their use in schools all the more frightening.
The harms of these technologies are not evenly shared. Research shows that these tools disproportionately affect Black youth, youth with disabilities, immigrant youth, LGBTQ youth and youth in low-income communities. For example, researchers at Johns Hopkins University found that schools with extensive surveillance infrastructure suspend students at higher rates, leading to worse academic outcomes for Black students.
Students, parents and teachers are concerned by these developments. Black families are especially concerned about how these technologies may be used to expand police presence in schools. Their concerns are supported by research demonstrating that students’ digital footprints are increasingly used to disproportionately discipline, expel and even arrest Black schoolchildren — effectively opening a new digital frontier in the longstanding school-to-prison pipeline.
As a civil rights attorney who investigates how new technologies violate the rights of communities of color, I’ve seen these challenges play out in stunning ways.
For the last three years, I’ve worked with advocates to end a secret predictive policing program used by a Florida sheriff’s office against vulnerable schoolchildren. Through litigation and open records requests, we obtained copies of the sheriff’s secret youth database, which contained the names of up to 18,000 children each academic year.
The office built this database using an algorithm that assessed confidential student records — including grades, attendance records and histories of child abuse — to identify students believed to be at the greatest risk of falling into “a life of crime.” Public records also showed that school-based police officers were instructed to surveil these children in school and gather “actionable intelligence” for criminal investigations. Local reporting revealed that police were directed to target these children and their families with such intensity that they would feel pressure to either “move or sue.”
To be sure, the experiences of this community reflect an especially egregious example of technology-driven rights abuses in schools. However, what used to be an outlier is quickly becoming the norm.
For example, in one community, local leaders are planning to launch a “school safety” drone surveillance program to monitor “high crime areas” near schools. In some instances, schoolchildren will be trained to operate the new aerial surveillance system.
Some states are using a dropout prevention algorithm that explicitly treats a student’s race as a risk factor, despite widespread inaccuracies and encoded bias.
In one state, the national guard is using geofencing technologies to target select schools for military recruitment. Meanwhile, several cities have used a suite of policing technologies to build massive “gang databases” that almost exclusively target Black and Hispanic children.
And some states are rolling out sophisticated social media surveillance and content moderation technologies. These tools could open the door to censoring classroom conversations on race, gender and social inequality — denying students the freedom to learn.
Despite the prolific use of these technologies, there are still solutions to undo digital authoritarianism in America’s public schools.
The clearest solution is a ban on using federal funds for schools to purchase these technologies in the first place. Federal funding is a key driver behind the widespread adoption of these technologies. Eliminating federal funds as a revenue source for school districts to procure these systems would go a long way toward addressing this challenge.
The Biden administration can follow the lead of New York State, which recently issued a ban on the use of facial recognition in its public schools. Federal agencies like the United States Department of Education have preexisting legal authority under federal civil rights and student privacy laws to develop a federal ban on technologies that violate students’ rights.
To be sure, there are constructive roles for technology in schools. However, most reasonable minds would agree that advancing racial injustice is not one of them. Policymakers must address the growing threat these technologies present to the freedoms and rights that protect our children and youth.
Clarence Okoh is a civil rights attorney and Just Tech fellow at the Center for Law and Social Policy in Washington, D.C. He is a Public Voices Fellow on Technology in the Public Interest with The OpEd Project in partnership with The MacArthur Foundation.
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.