An Amazon facial recognition tool has wrongly identified 28 members of the US Congress as police suspects.

The American Civil Liberties Union (ACLU) compared official photos of the politicians with a database of public arrest photos.

Amazon took issue with the findings, saying the system should have been set to a 95% confidence threshold rather than the 80% used by the ACLU.

But the civil rights group said the test highlighted the inadequacy of facial recognition technology.

"Our test reinforces that face surveillance is not safe for government use," said Jacob Snow, ACLU's technology and civil liberties lawyer.

"Face surveillance will be used to power discriminatory surveillance and policing that targets communities of colour, immigrants, and activists. Once unleashed, that damage can't be undone."

In response to the test, a spokesperson for Amazon Web Services told the BBC: "We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement."

"With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds."

Unintended consequences
Rekognition is touted by Amazon as being useful for a range of things, from detecting offensive content to identifying celebrities.

Amazon is also working with some local US law enforcement agencies to implement the system for identifying criminal suspects.

The 80% confidence threshold used by the ACLU is the system's default setting, but a spokeswoman for Amazon Web Services told Reuters that, for identifying individuals, it recommended setting a threshold of 95% or higher.
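For illustration only, the sketch below shows how that threshold might be set when calling Rekognition through the AWS SDK for Python (boto3); the file names are placeholders and the example is not a description of the ACLU's actual test.

    # Minimal sketch: comparing two face images with Amazon Rekognition via boto3,
    # raising the similarity threshold from the 80% default to the 95% level
    # Amazon says it recommends for identifying individuals.
    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("official_photo.jpg", "rb") as source, open("arrest_photo.jpg", "rb") as target:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": source.read()},
            TargetImage={"Bytes": target.read()},
            SimilarityThreshold=95,  # only return matches at 95% similarity or above
        )

    # Matches below the threshold are filtered out by the service itself.
    for match in response["FaceMatches"]:
        print(f"Possible match at {match['Similarity']:.1f}% similarity")

Raising the threshold returns fewer, higher-similarity matches, which is why the choice of setting is central to the dispute between Amazon and the ACLU.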

According to ACLU, nearly 40% of the system's false matches were for black Congress members, even though they make up only 20% of the legislature.

Among those wrongly identified was civil rights leader John Lewis, a member of the Congressional Black Caucus.

That group recently wrote to Amazon chief executive Jeff Bezos expressing concerns about the "profound negative unintended consequences" facial recognition systems could have for black people.

"Congress should press for a federal moratorium on the use of face surveillance until its harms, particularly to vulnerable communities, are fully considered," said ACLU's legislative counsel Neema Singh Guliani.

"The public deserves a full debate about how and if face surveillance should be used."

In the UK, lawyers for civil liberties group Big Brother Watch have launched a legal challenge against the use of automatic facial recognition technology by police.