The image recognition software built into Google Photos is impressive. Show it a photo of the Eiffel Tower, and it will know that picture was taken in Paris. Snap an image of a dog or a tree, and it will automatically group it with all your other pictures of dogs or trees.

But the software is far from foolproof. And when it fails, it does so in a spectacular way — as when it recently processed a photo of two black friends and labeled them “Gorillas.”

Jacky Alcine, a 21-year-old programmer who lives in Brooklyn, N.Y., was checking his Google Photos account last night when he saw that the service had automatically generated a folder titled “Gorillas.” It contained nothing but pictures he had taken of himself and a friend in 2013.

When alerted to the error, Google issued an immediate mea culpa and delivered a fix within hours.

“We’re appalled and genuinely sorry that this happened,” a Google representative told Yahoo Tech.

“We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future,” the spokesperson added.

After discovering the folder, Alcine, whose friend in the photos is also black, took to Twitter to call out Google for the issue.

Shortly thereafter, Yonatan Zunger, Google’s chief architect of social, tweeted at Alcine, asking if Google could access his account to see where things had gone wrong. A few hours later, Google told him the problem had been fixed.

Alcine said that as of Monday evening, the issue had largely been addressed, though he noted, “there’s still complications with the hands obscuring the face causing it to still match to the gorilla tag. Chimp gives results as well (but not chimpanzee).”

Alcine believes the gaffe was caused by a faulty Google algorithm. But he added, “This could have been avoided with accurate and more complete classifying of black people, especially darker-toned people of color like myself and my friend.”

For its part, a Google representative said, “We test our image recognition systems on people of all races and colors.”

This isn’t the first time a tech company’s facial recognition software has run into trouble when dealing with people of color.

In 2009, the face-tracking software built into HP’s webcams was unable to detect a black user’s face but had no problem identifying and tracking a white user’s.

It’s clear there was no malicious intent on the part of either Google or HP, but the incidents illustrate that image and facial recognition technologies still have work ahead of them.

Update: Flickr had a similar issue with its automatic tagging algorithm. In May, several users complained that tags with the words “animal” and “ape” were being automatically added to images of both black and white people.

Source:
https://www.yahoo.com/tech/google-ph...793782784.html