Everyday government decisions, from bus routes to policing, were once based on limited information and human judgment; increasingly, they rely on algorithms. Check out our new database and join MuckRock in learning more about these uses.
The largest city in the country seemed ready to create a framework for others to follow. But the final report of the NYC Automated Decision Systems Task Force, released last month, concluded that there are no quick answers to questions of fairness and accountability, not even for the task of defining ADS itself.
When undercover officers with the Jacksonville Sheriff’s Office bought crack cocaine from someone in 2015, they couldn’t actually identify the seller. Less than a year later, though, Willie Allen Lynch was sentenced to eight years in prison, identified through a facial recognition system. His case, and others like it, could reshape the limits placed on AI.
Two years after Boston Public Schools faced parental backlash over an algorithm intended to improve bus routing, school districts are looking to give the software another chance, drawn by its potential to save millions.
The AI Now Institute is calling for checks on the datasets used by predictive policing systems because of concerns that the technology can perpetuate, rather than address, “dirty” policing practices.