Founded in 2017 by two computer programmers in Poland, PimEyes is an AI-powered facial recognition search engine that works like a reverse image search on steroids: it scans a face in a photo and crawls obscure corners of the internet to surface images many people never knew existed, such as photos of themselves in the background of a restaurant or in a concert crowd.
While the company claims the service helps people monitor their online presence, it has drawn controversy for enabling stalkers to use it as a surveillance tool, for collecting countless images of children, and for adding images of dead people to its database without permission.
Allyn, B. (2023, October 11). ‘Too dangerous’: Why even Google was afraid to release this technology. NPR. https://www.npr.org/2023/10/11/1204822946/facial-recognition-search-engine-ai-pim-eyes-google