Why You Should Secure Your Online Photos
Your mug may be in more places than you think
By Sascha Brodsky, Senior Tech Reporter

Sascha Brodsky is a freelance journalist based in New York City. His writing has appeared in The Atlantic, the Guardian, the Los Angeles Times, and many other publications.

Updated on February 4, 2021, 11:29 a.m. EST

Fact checked by Rich Scherr, a seasoned technology and financial journalist who spent nearly two decades as the editor of Potomac and Bay Area Tech Wire.

Key Takeaways
Software companies search through millions of publicly available online photos to build facial recognition systems. A new website helps you find out if your Flickr photos were used for AI research. The use of online photos by big tech companies is an invasion of privacy, some experts say.

(Photo: Dimitris Otis / Getty Images)

Software companies are scooping up private photos to build facial recognition systems, and a new website can help determine if your pictures are among them. The website, called exposing.ai, searches through public databases to determine if your Flickr photos were used for AI research. Software developers often use publicly available images to train their recognition systems. The practice may be legal, but some experts believe it's not ethical.

"The fact that these photos are used without people's knowledge is a significant privacy violation," Thierry Tremblay, the CEO and founder of the database software company Kohezion, said in an email interview. "That's a particular concern for minorities who could be profiled and targeted. Furthermore, users don't necessarily consent to get scanned every time they go out in public."

Flickr May Reveal More Than You Know
The exposing.ai website checks whether your photos were included in publicly available datasets by searching for Flickr usernames and photo IDs. All you have to do is enter your Flickr username, a photo URL, or a hashtag in the site's search bar. The site was launched last month and is based on years of research into public image datasets, exposing.ai's creators wrote on the website. "Telling the complex story of how yesterday's photographs became today's training data is part of the goal of this ongoing project," they said.

(Photo: izusek / Getty Images)

The site searches millions of records, but "countless more face recognition training datasets exist and are continuously being scraped from social media, news, and entertainment sites," they wrote. Companies are hoovering up images to power their software projects. "Certainly tech giants like Google, Amazon, Facebook, and Apple are all deep into researching and exploring facial recognition technology," Nat Maple, chief marketing officer of the cybersecurity company BullGuard, said in an email interview.

An Arms Race for Photos
The scraping of photos is part of an arms race among companies to develop better facial recognition. The company Clearview AI, for example, scraped up 3 billion images and went a step further by building an AI app, Maple noted. The app acts as a search engine: a user can take a photo of someone, upload it, and see a list of public pictures of that person, along with links to where they came from.

"Interestingly enough, we see the most hesitation for this software at the government/law enforcement level, due to legalities and profiling concerns," Laura Hoffner, a crisis manager at the risk consultancy firm Concentric Advisors, said in an email interview. "But that means the private industry is superseding the government in experience and access."

Users who want to keep the photos they have already posted online private have limited options. "There isn't much you can do other than take the nuclear option, that is, hire a lawyer and sue the company in question," Maple said. "But of course, you've got to be dedicated and moneyed."