Civil rights

Some of society’s most important decisions are being made automatically, and there’s no guarantee they’re being made fairly. The shift to algorithmic decision-making has opened up a new front in the fight for civil rights, one algorithm at a time. Algorithms are choosing who can rent an apartment and whether an accused criminal gets bail, with little accountability for whether those decisions are being made fairly. As police and border agents embrace facial recognition, a small shift in error rates can produce massive racial disparities in who gets searched. And in every instance, the agencies responsible will point to automation as proof that no bias could be present. The push for algorithmic accountability is one of the most important fights in technology today, with implications for nearly every industry and sector of society. This is where we’ll track those fights and try to shine a light on this chilling new threat to civil rights.

The American system of democracy has crashed

Some patriotic reflections on this Independence Day.

Elizabeth Lopatto and Sarah Jeong
How to secure your phone before attending a protest
Tech
The Verge guide to privacy and security
Barbara Krasnoff and Aliya Chaudhry
Tina Nguyen
Here’s how the State Department decides which students to deport.

Journalist Ken Klippenstein obtained an internal memo directing employees to conduct a “social media review” of foreign students applying for visas and to flag any online posts or screenshots “advocating for, sympathizing with, or persuading others to endorse [or] support” a terrorist organization. The directive is seemingly aimed at students who participated in pro-Palestine – or, as the memo characterized it, “pro-Hamas” – campus protests.

Richard Lawler
Clearview AI CEO Hoan Ton-That resigns.

According to Forbes reporter David Jeans, the problem isn’t that Clearview is a facial recognition surveillance company with a database of pictures scraped from social media without permission, one that threatens the end of privacy and has drawn attention from regulators. It’s the company’s failure to secure government contracts.

Surveillance has a body count
Policy