Software engineer Vishnu Mohandas decided to leave Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020, he quit his job working on Google Assistant and also stopped saving all of his images to Google Photos. He was concerned that his content could be used to train AI systems, even ones not specifically tied to the Pentagon project. “I have no control over any of the future outcomes this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”
Mohandas, who taught himself programming and is based in Bangalore, India, decided to develop an alternative photo storage and sharing service that would be open source and end-to-end encrypted. Something “more private, wholesome, and trustworthy,” he says. The paid service he designed, Ente, is profitable and claims to have more than 100,000 users, many of whom already belong to a privacy-obsessed crowd. But Mohandas struggled to explain to a broader audience why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then, one weekend in May, an Ente intern had an idea: give people a sense of what some of Google’s AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo to the website, which is then sent to a Google Cloud computer vision program that writes a surprisingly in-depth, three-paragraph description of it. (Ente prompts the AI model to document small details in uploaded images.)
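Ente has not published its exact implementation, but a pipeline like the one described, sending a photo along with a prompt to a Google-hosted vision model and getting back a detailed description, could look roughly like the sketch below. The model name, SDK choice, prompt wording, and file name are all illustrative assumptions, not Ente’s actual code.

```python
# Illustrative sketch only: sends an image plus a prompt to a Google-hosted
# multimodal model and prints its description. Model name, prompt text, and
# file name are assumptions for demonstration purposes.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # assumed: caller supplies their own key

# Assumed model choice; any multimodal Gemini model accepts image + text input.
model = genai.GenerativeModel("gemini-1.5-flash")

prompt = (
    "Describe this photo in three short, objective paragraphs. "
    "Note visible details such as clothing, objects, and setting."
)

image = Image.open("family_photo.jpg")  # hypothetical uploaded photo
response = model.generate_content([prompt, image])
print(response.text)
```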
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even documenting the specific watch model his wife wore, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to tweak the prompts to make them slightly more wholesome but still spooky,” says Mohandas. Ente began asking the model to produce short, objective outputs, nothing dark.
The same family photo uploaded to Theyseeyourphotos now returns a more generic result that includes the name of the temple and the “partially cloudy sky and lush greenery” surrounding it. But the AI still makes a number of assumptions about Mohandas and his family, such as that their faces express “shared contentment” and that “the parents are likely of South Asian descent, middle-class.” It judges their clothing (“suitable for sightseeing”) and notes that “the woman’s watch shows a time of approximately 2 p.m., which corroborates the image’s metadata.”
Google spokesperson Colin Smith declined to comment directly on Ente’s project. He referred WIRED to support pages that state uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, such as those that analyze the age and location of photo subjects. The company says it does not sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can turn off some of Photos’ analysis features, but they can’t completely block Google from accessing their images, because the data isn’t end-to-end encrypted.