Amazon’s Astro robot is a disaster and useless, according to its developers

While Amazon unveiled Astro, its first home robot, yesterday, internal opinions of it are far from kind. Several anonymous sources report that the robot barely works at all: its facial recognition is a disaster, and it even goes so far as to throw itself down the stairs. Not to mention the phenomenal amount of personal data it collects.

Credit: Amazon

Yesterday, Amazon unveiled its first home robot, named Astro. Equipped with multiple cameras, microphones and other sensors, it lets you keep a constant eye on your house. It can detect potential dangers and unusual events, and it warns you when a stranger enters your home. In short, Astro is the security solution that makes you feel safe at home, according to Amazon. But its own creators do not seem to share that opinion.

Beyond its fairly steep price, $1,499.99 no less, it is the robot’s actual behavior that is the problem, or at least its many flaws. And the developers do not mince their words in describing them. According to two internal sources who wished to remain anonymous, buying the robot would simply be a mistake, so far are Amazon’s promises from reality. Worse, it represents a real danger to its users’ privacy.

Astro could throw itself down the stairs at any time

In its promotional video, everything seems to run, no pun intended, like clockwork for Astro. But the reality appears to be quite different, reveals a person who worked on the project. “Astro is terrible and will certainly throw itself down a staircase if the opportunity presents itself,” the source admits, before calling Amazon’s “home security proposition” “laughable,” so unreliable are its features.

And it does not stop there. If it turns out that your robot does indeed have suicidal tendencies, there is a good chance it will not survive the fall. “The device seems fragile for a product whose cost is absurd,” the source reports. “The mast has broken on several devices, getting stuck in the extended or retracted position, and there is no way to send it back to Amazon when that happens.”

Astro is after your personal data

If that finding is already embarrassing for Amazon, the handling of users’ personal data is even more so. In an internal document, Astro’s recognition and analysis system is referred to as Sentry. Its role is to determine whether a situation is abnormal and, if so, to conduct an investigation to assess the risks. This applies in particular to people who are not registered in its directory as residents of the house.


“Sentry must first try to identify the person, if they are not already known, for a period of up to 30 seconds,” the document reads. “When the person is identified as unknown, or 30 seconds have passed, Sentry should start following the person until Sentry Mode is deactivated.” What Amazon then calls “a series of actions taken by Sentry to investigate” actually turns out to be an audio and video recording of the person in question, made without their consent.

“In my opinion, it’s a privacy nightmare that illustrates how we trade our personal data for convenience with devices like Vesta [Astro’s internal code name, ed.],” commented a second source who worked on the robot. The same source adds that Astro’s facial recognition performance is laughable, which is hardly reassuring given that the robot is supposed to follow strangers in order to assess how dangerous they might be.

Credit: Amazon

Amazon insists that Astro works properly

For its part, Amazon says there is no problem with using Astro. “We designed Astro to handle a large amount of data processing on the device itself, including the images and raw sensor data it processes as it moves around your home,” the company stated. “This allows Astro to respond quickly to its surroundings. In addition, your visual identity is stored on the device, and Astro uses on-device processing to recognize you.”

Ayanna Howard, an algorithmic bias expert who participated in the project, adds that “Amazon has designed, tested and improved its approaches based on data and feedback to minimize bias in its visual identification feature. They set performance goals that were relevant to their product use cases, trained their models on a large volume of incredibly diverse data, and shared enough detail with me about their methods in a sincere effort to ensure that the feature not only works statistically well for all of their customers, but also continues to improve over time on behalf of those customers.”

Source: Vice
