Roomba sends intrusive photos, including one of a woman on the toilet, to data labelers
Roomba vacuums used for development and training captured sensitive photos of volunteers, which were later shared on social media.

Roomba vacuums collect sensitive images, such as one showing a woman sitting on a toilet, which were reportedly shared among those tasked with training the robotic cleaning system. In August 2022, Amazon announced a $1.7 billion deal to acquire iRobot, and Amazon's own record on privacy is rather sketchy: several investigations that surfaced more than a year ago revealed that the company handled sensitive customer data irresponsibly.
According to documents reviewed by MIT Technology Review, images taken by Roomba robotic vacuums were shared in groups on Discord and Facebook by workers at the contractor responsible for labeling them. iRobot claims the leaked images were captured by test hardware whose volunteer users agreed to provide audiovisual data. Training data captured by the cleaning robot's on-board cameras is collected by iRobot and sent to a startup called Scale AI, which hires contractors to label it so the AI can learn to recognize what it sees.
Beware Of Your Friendly Robo Cleaner

The work of annotating data for AI training is often outsourced, through contractors, to low-wage workers in Asia and Africa. It's worth noting that none of the hardware involved is commercially sold, consumer-grade equipment. The two companies are now severing ties after iRobot said Scale AI violated its privacy policy by allowing the dissemination of sensitive media; Scale AI also admitted that its data-labeling workers violated its code of conduct.
The soon-to-be Amazon subsidiary also says that these "specially developed robots with hardware and software modifications" come with volunteer agreements expressly stating that the data they capture will be used for training. Volunteers were additionally advised to keep sensitive subjects and children out of the areas where the robot moved around taking photos, videos, and audio clips. Worryingly, 95 percent of iRobot's training data is collected from real households, and most of the volunteer households are paid and recruited by iRobot employees or contractors. Only a small fraction of the AI training data comes from the staged model homes iRobot builds for testing.
The leaked images, 15 of which were seen by MIT Technology Review, were collected in 2020 from homes in countries including the US, Japan, France, and Germany. iRobot claims to remove all personally identifying information from its training data, and says that if any images appear sensitive in nature, such as those depicting private moments or nudity, the entire log is automatically deleted. However, it's unclear whether the leaked images were removed automatically, or whether human moderators saw them first and then decided what action to take.
A rather worrying conclusion to draw from the iRobot saga is that these leaks occurred during the testing phase of a cleaning robot still in development, where volunteers were at least told what they were getting into. With a commercial device like the Roomba, most people don't bother to read the data collection terms carefully, and could end up unknowingly handing large amounts of personally identifiable data to companies with lax data security practices.