Committee challenges police use of AI

Police use of AI and facial recognition technology is not subject to proper oversight and risks exacerbating discrimination, a parliamentary committee has warned.

The Lords Justice and Home Affairs Committee said new technologies were being developed in a “new Wild West”, with the law and public awareness failing to keep pace, and warned that the lack of oversight meant users were in effect making it up as they went along.

The cross-party group added that while AI has the potential to improve people’s lives, it could have “serious implications” for human rights and civil liberties in the justice system.

“Algorithms are used to improve crime detection, facilitate security categorization of prisoners, streamline entry clearance processes at our borders, and generate new intelligence that feeds the entire criminal justice pipeline,” the peers said.

The committee added that there was no proper scrutiny to ensure the new tools were “safe, necessary, proportionate and effective”.

According to the group, police and other law enforcement agencies were buying equipment in a “disturbingly opaque” market, with details of how the systems work kept secret by companies citing commercial confidentiality.

As a result, they called for a mandatory registry of algorithms used in criminal justice tools, a national body to set standards and certify new technologies, and new local ethics boards to oversee their use.

They also called for a duty of candor to be placed on the police to ensure full transparency. The committee said AI can have huge effects on people’s lives, especially those in marginalized communities, and that without transparency there can be no oversight or accountability when things go wrong.

Baroness Hamwee, chair of the committee, said: “What would it be like to be convicted and jailed on the basis of an AI that you don’t understand and can’t challenge?

“Without adequate safeguards, advanced technologies can affect human rights, compromise fair trials, increase inequalities and weaken the rule of law. The tools available must be suitable for their purpose and not be used without supervision.”

Hamwee said the committee welcomed the benefits AI could bring to the UK justice system, but only with proper oversight in place. “Humans need to be the ultimate decision makers, knowing how to question the tools they use and how to challenge their results,” she said.

Peers also raised concerns about the use of AI in “predictive policing” (predicting crime before it happens), highlighting fears that it could compound discrimination by embedding “human bias” into the algorithms.

Professor Karen Yeung, an expert in law, ethics and informatics at the University of Birmingham, told the committee that “criminal risk assessment” tools did not focus on white-collar crimes such as insider trading, because of a lack of data, but instead on crimes for which more information was available.

“It’s really pernicious. We’re looking at high-volume data that is mostly about the poor, and we’re turning it into predictive tools about the poor,” Yeung explained. “We’re leaving whole sections of society untouched by these tools.”
