Forget Big Brother. A stranger in a coffee shop can watch you, learn virtually everything about you, including where you’ve been, and even predict your movements “with greater ease and precision than ever before,” experts say.
All the stranger would need is a photo and advanced artificial intelligence technology that already exists, said Kevin Baragona, a founder of DeepAI.org.
“There are services online that can use a photo of you, and I can find everything. Every instance of your face on the internet, every place you’ve been and use that for stalker-type purposes,” Baragona told Fox News Digital.
“So, for example, if you run into someone in public, and you’re able to get a photo of them, you might be able to find their name using online services. And if you pay enough, you might be able to find where they’ve been, where they might currently be and even predict where they’ll go.”
One company, PimEyes, an online face search engine that scours the internet to perform reverse image searches, is fending off a legal complaint in the United Kingdom filed by the privacy campaign group Big Brother Watch.
The company says its product is intended to allow people to search for publicly available information about themselves, but Big Brother Watch said its uses can be much more sinister and are a “great threat to privacy of millions of U.K. residents,” according to the complaint.
“Images of anyone, including children, can be scoured and tracked across the internet,” said Madeleine Stone, a legal and policy officer at Big Brother Watch.
PimEyes responded in a statement, saying, “PimEyes has never been and is not a tool to establish the identity or details of any individual. The purpose of the PimEyes service is to collect information about URLs that publish certain types of images in public domains.”
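Neither Baragona nor PimEyes has published the internals of these services, but the general technique behind any face search engine is well understood: each detected face is converted into a numerical embedding, and two faces are treated as a match when their embeddings are close. The snippet below is a minimal sketch of that idea using the open-source face_recognition Python library; the file names and the 0.6 distance threshold are illustrative assumptions, not details of PimEyes or any other commercial product.

```python
# Minimal sketch of face matching by embedding distance (illustrative only).
# Assumes two local files, "query.jpg" and "candidate.jpg"; a real search
# service compares a query face against millions of faces scraped from the web.
import face_recognition

# Load the images and compute 128-dimensional face embeddings.
query_image = face_recognition.load_image_file("query.jpg")
candidate_image = face_recognition.load_image_file("candidate.jpg")

query_encodings = face_recognition.face_encodings(query_image)
candidate_encodings = face_recognition.face_encodings(candidate_image)

if query_encodings and candidate_encodings:
    # Euclidean distance between embeddings; smaller means more similar.
    distance = face_recognition.face_distance(
        [candidate_encodings[0]], query_encodings[0]
    )[0]
    # 0.6 is the library's conventional default threshold, used here as an assumption.
    print(f"distance={distance:.3f}", "match" if distance < 0.6 else "no match")
else:
    print("No face found in one of the images.")
```

A commercial engine scales the same comparison with an index over millions of stored embeddings, which is what makes the “every instance of your face on the internet” scenario Baragona describes technically plausible.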
C.A. Goldberg, a New York City-based victims’ rights law firm that handles AI-related crimes, has warned about AI’s potential uses in stalking.
“AI could enable offenders to track and monitor their victims with greater ease and precision than ever before,” the law firm wrote in a blog post on its website.
“AI-powered algorithms could, for example, analyze and predict a person’s movements by gathering data from an array of sources: social media posts, geotagged photos, etc., to approximate or even anticipate a victim’s location,” the firm wrote.
“Advanced facial recognition technology powered by AI is far more effective than humans at identifying individuals from images or videos, even when the quality is low or the person is partially obscured. Stalkers could track victims in real time through surveillance cameras, social media or other online sources.”
Anyone with access to these databases “could exploit them,” according to the firm.
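One concrete example of the “geotagged photos” the firm mentions: many phones embed GPS coordinates directly in an image’s EXIF metadata, and reading them takes only a few lines of code. The sketch below uses the Pillow library to pull latitude and longitude from a photo; the file name is a placeholder, and this is an illustration of how exposed that metadata can be, not a description of any specific tool.

```python
# Illustrative sketch: reading GPS coordinates embedded in a photo's EXIF data.
# "photo.jpg" is a placeholder; many social platforms strip this metadata on
# upload, but original files shared directly often still carry it.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def read_gps(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = next(
        (value for tag_id, value in exif.items() if TAGS.get(tag_id) == "GPSInfo"),
        None,
    )
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    if "GPSLatitude" not in gps or "GPSLongitude" not in gps:
        return None

    def to_decimal(dms, ref):
        # EXIF stores degrees/minutes/seconds as rationals; convert to decimal degrees.
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (
        to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
        to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"]),
    )

print(read_gps("photo.jpg"))  # e.g. (40.7128, -74.0060) for a geotagged photo (illustrative)
```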
Baragona told Fox News Digital that AI will redefine humanity, but how that plays out depends on how the technology is used and who is using it.
The technology is already in use and continues to advance rapidly, and countries around the world are grappling with how to implement guardrails and protections.
“In general, the industry is not having its ‘come to Jesus’ moment,” Baragona said. “While I’m very concerned about the perils of AI, I’m also a strong believer in the power of AI and how it can make the world a better place.
“It’s a technological leap, and history has shown our lives tend to vastly improve due to these leaps.”