Your Flickr Photos May Have Been Used for Facial Recognition

It has become common for people to post selfies, and photos of friends and family, online. Professional photographers who work with models may post their work in online portfolios. Unfortunately, photos that include people’s faces are being used, without permission, by researchers who want to train facial recognition algorithms.

NBC News reported that, in January of 2019, IBM released a collection of nearly a million photos scraped from Flickr, annotated to describe each subject’s appearance. According to NBC News, IBM promoted the collection to researchers as a progressive step toward reducing bias in facial recognition.

I personally feel that there are a lot of ethical problems with what IBM has done. The most obvious one is that it didn’t ask the photographers if it could use their photos.

A company as large as IBM has the money to pay photographers for the use of their photos. Taking other people’s art without permission is wrong. IBM is also big enough to hire a few people to collect consent forms from the people who appear in the photographs.

Another ethical problem is that facial recognition software is controversial. It evokes a “Big Brother is watching you” kind of feeling. Personally, I would feel disgusted if my face was used to train facial recognition software.

In July of 2018, the ACLU tested Amazon’s facial recognition tool (called “Rekognition”). It incorrectly matched 28 members of Congress, identifying them as other people who had been arrested for a crime. False matches like these could result in police arresting the wrong person.

NBC News reported that IBM said Flickr users can opt out of the database. However, NBC News found that actually getting photos removed is nearly impossible.

Now would be a good time to make your Flickr and Instagram accounts private. Don’t let grabby companies steal your photos and use them in an ethically questionable algorithm.