Research shows that facial recognition software makes frequent errors and is particularly inaccurate on darker-skinned faces. When police use it to identify criminals, those errors can turn innocent people into suspects. The criminal justice system also uses risk assessment algorithms to assist in bail and sentencing recommendations; these algorithms have been shown to overrate the risk of Black suspects and underrate the risk of white ones.
In the latest issue of The Takeaway, "Artificial Intelligence: A Double-Edged Sword," Bush School of Government and Public Service faculty member Justin Bullock describes some of the more questionable applications of artificial intelligence (AI) tools and proposes a framework for evaluating them.
The Takeaway is a publication of the Mosbacher Institute for Trade, Economics, and Public Policy at the Bush School of Government & Public Service at Texas A&M University.