Ecovacs robot vacuums, which were found to suffer from critical cybersecurity flaws, are collecting photos, videos and voice recordings – taken inside customers’ homes – to train the company’s AI models.
The Chinese home robotics company, which sells a range of popular Deebot models in Australia, says its users “readily participate” in a product improvement program.
When users opt into this program via the Ecovacs smartphone app, they are not told what data will be collected, only that it will “help us to strengthen the improvement of product functions and attached quality”.
Users are invited to click “above” to read the details, but no link is available on that page.
Ecovacs’s privacy policy – available elsewhere in the app – allows the collection of user data for research purposes, including:
- The 2D or 3D map of the user’s home generated by the device
- Voice recordings from the device’s microphone
- Photos or videos recorded by the device’s camera
It also states that voice recordings, videos and photos deleted via the app may continue to be held and used by Ecovacs.
An Ecovacs spokesperson confirmed that the company uses data collected through its product improvement program to train its AI models.
Critical cybersecurity flaws – allowing some Ecovacs models to be hacked from a distance – cast doubt on the company’s ability to protect this sensitive information.
Cybersecurity researcher Dennis Giese reported the issues to the company last year after finding a series of basic errors that put the privacy of Ecovacs customers at risk.
“If their robots are broken like that,” he asked, “what does their back-end look like?
“Even if the company is not malicious, it could fall victim to industrial espionage or nation-state actors.”
Ecovacs – which is valued at $4.6 billion – said it was “exploring more comprehensive testing methods”, and it committed to fixing the security issues in its flagship robot vacuum in November.
Ecovacs says the data is anonymised
In a 2020 blog post, two engineers from the Ecovacs robotics department described a problem they had faced.
“Building a deep learning model without a lot of data is like building a house without blueprints,” wrote Liang Bao and Chengqi Lv.
“Due to the unique ground-view perspective and rare object categories, we could not find any public data that met our needs.
“Consequently, we first cooperated with many institutions to collect data from around the world.”
A company spokesperson told the ABC that this pre-launch data set did not involve “real user household information”.
But since the products launched, they confirmed, data from users who opted into its “product improvement program” has been used to train its AI model.
“During this data collection, we anonymise user information at the machine level, ensuring that only anonymised data is uploaded to our servers,” the spokesperson said in a statement.
“We have implemented strict access management protocols to view and use this anonymised user data.”
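Ecovacs has not published how this machine-level anonymisation works. As a rough illustration only, a pipeline of this kind typically strips direct identifiers and replaces the device serial with a salted one-way hash before anything leaves the device. Every field name and step below is a hypothetical stand-in, not Ecovacs’s actual code:

```python
import hashlib
import json

# Hypothetical sketch only – illustrative field names, not Ecovacs's implementation.
SALT = b"per-deployment-secret"  # assumed secret kept on the device


def anonymise_record(record: dict) -> dict:
    """Strip direct identifiers from a record before it is uploaded."""
    cleaned = dict(record)
    # Remove fields that directly identify the user or household.
    for field in ("user_id", "email", "home_address", "wifi_ssid"):
        cleaned.pop(field, None)
    # Replace the device serial with a salted one-way hash, so records from
    # one device can still be grouped without revealing the serial itself.
    serial = cleaned.pop("device_serial", "")
    cleaned["device_key"] = hashlib.sha256(SALT + serial.encode()).hexdigest()
    return cleaned


record = {
    "device_serial": "DEEBOT-1234",   # hypothetical example values
    "user_id": "u-789",
    "wifi_ssid": "HomeNetwork",
    "map_blob": "...",                # the 2D/3D map payload itself
}
print(json.dumps(anonymise_record(record), indent=2))
```

Note that even a pipeline like this leaves the content itself – maps, photos, audio – intact, which is why the security of the company’s back-end still matters.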
Intimate photos shared on social media
Robot vacuum imagery has leaked before. In 2022, intimate photos taken by iRobot devices were shared on Facebook, including one of a person sitting on a toilet.
The robots that took them were, in that case, part of a test program users had opted into.
A company spokesperson told MIT Technology Review that they were “special development robots with hardware and software modifications that are not and have never been present on iRobot consumer products for purchase”.
The devices were physically labelled with bright green stickers reading “video recording in progress”, and users had agreed to send data to iRobot for research purposes.
It is one thing to give a US-based company access to device imagery. It is another for those photos to end up on a social media site.
And then there is the question of how they got there.
Images leaked by contract workers
iRobot had contracted an AI training data company called Scale AI to analyse raw images for the training of its object detection algorithm.
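The article does not describe iRobot’s annotation format, but object-detection training data generally pairs an image with human-drawn bounding boxes and class labels. A generic, hypothetical record – loosely modelled on the widely used COCO layout, not Scale AI’s actual schema – might look like this:

```python
# Generic, hypothetical object-detection annotation record.
# Not iRobot's or Scale AI's actual schema.
annotation = {
    "image_id": "frame_000123.jpg",
    "width": 640,
    "height": 480,
    "objects": [
        # Each box is [x, y, width, height] in pixels, with the class label
        # a contract worker assigned by drawing the box on the raw image.
        {"bbox": [212, 301, 88, 64], "label": "shoe"},
        {"bbox": [10, 255, 140, 120], "label": "power_cord"},
    ],
}
```

The privacy-relevant point is that contract workers must see the raw frame to draw those boxes; that labelling step is where the iRobot images escaped.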
Scale AI’s founder, Alex Wang, has described his business – which is valued at an estimated $20 billion – as solving the “data problem” for the AI industry.
“Our data engine generates almost all of the data needed to fuel the leading language models,” he said in an interview with CNBC.
The reality for its millions of contract workers, as described in a 2023 Washington Post report, is far less glamorous.
“Workers differentiate pedestrians from palm trees in videos used to develop automated driving algorithms; they label images so that AI can generate representations of politicians and celebrities; they edit chunks of text to ensure that language models like ChatGPT do not produce gibberish.”
iRobot ended its relationship with Scale AI after its contractors leaked photos on social media.
Do cleaning robots even need high-definition cameras?
Researchers at the Australian Centre for Robotics have developed a solution that could avoid the problem entirely.
To keep sensitive images out of the reach of hackers, they developed a technology that changes “how a robot sees the world”.
In effect, it is an inherently privacy-preserving camera.
By blurring the image captured by the camera beyond recognition before it is even digitised, remote attackers have no way to access the raw imagery.
And yet enough information is retained in the blurred image for the robot to navigate.
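The researchers’ system does its blurring optically, before the image is ever digitised, so a sharp frame never exists for an attacker to steal. A software-only sketch can still illustrate the principle that heavy blur destroys recognisable detail while coarse structure survives; the blur strength and the gradient-based navigation proxy below are my assumptions, not the researchers’ method:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustration only: the real system blurs in the optics, before digitisation,
# so an unblurred frame never exists in memory. Here we simulate the effect.
rng = np.random.default_rng(0)
frame = rng.random((120, 160))              # stand-in for a camera frame

# Heavy blur: fine detail (faces, documents, screens) becomes unrecoverable.
blurred = gaussian_filter(frame, sigma=12)

# Coarse spatial gradients survive blurring and can serve as a crude proxy
# for navigation cues such as obstacle edges (an assumption for this sketch).
gy, gx = np.gradient(blurred)
coarse_structure = np.hypot(gx, gy)

print("blurred frame range:", blurred.min(), blurred.max())
print("mean gradient magnitude:", coarse_structure.mean())
```

Blurring in software after capture would not give the same guarantee, since the sharp frame would momentarily exist in memory – which is why doing it in the optics matters.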
“There is no potential for a security breach,” said Donald Dansereau, a lecturer at the University of Sydney who supervised the project.
The technology is not quite ready for market, but Dr Dansereau is confident it will be adopted by tech companies.
He points out that there is “no magic solution on the technology side – good policy and good literacy are still necessary”.