This robot has crossed a line it shouldn't have...

This robot crossed a line it shouldn't have, because people told it to.



Video of a sidewalk delivery robot breaching yellow warning tape and rolling through a Los Angeles crime scene went viral this week, garnering more than 650,000 views on Twitter and sparking debate over whether the technology is ready for prime time.

It turns out that the robot’s error, at least in this case, was caused by humans.

The video was taken and posted to Twitter by William Gude, the owner of Film The Police LA, a police watchdog account in Los Angeles. Gude was near the scene of a suspected school shooting at Hollywood High School around 10 a.m. when he captured the bot on video as it lingered on the corner, looking confused, until someone lifted the tape, letting the bot continue on its way through the crime scene.

Uber spinout Serve Robotics said that the robot's self-driving system did not decide to cross into the crime scene; that was the choice of a human operator who was controlling the bot remotely.

The company’s delivery robots have so-called Level 4 autonomy, which means that they can drive themselves under certain conditions without a human taking over. Serve has been testing its robots with Uber Eats in the area since May.

Serve Robotics has a policy that requires a human operator to remotely monitor and assist its bots at every intersection. The operator also takes remote control if a bot encounters an obstacle, such as a construction zone or a fallen tree, and cannot figure out how to navigate around it within 30 seconds.
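That handoff policy boils down to two escalation triggers. As a sketch only (this is not Serve's actual code; the function and constant names are hypothetical), the decision rule could look like this:

```python
AUTONOMY_TIMEOUT_S = 30  # seconds a bot may stay stuck before escalating


def control_mode(at_intersection: bool, seconds_stuck: float) -> str:
    """Decide who drives: the bot itself, or a remote human operator.

    Mirrors the two triggers described in the article: a human
    monitors and assists at every intersection, and takes over if
    the bot cannot clear an obstacle within 30 seconds.
    """
    if at_intersection:
        return "human"  # operator assists at every intersection
    if seconds_stuck >= AUTONOMY_TIMEOUT_S:
        return "human"  # stuck too long; escalate to a person
    return "autonomous"
```

Under this rule, the crime-scene crossing happened during the "human" mode, which is why Serve attributes the error to the operator rather than the autonomy stack.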

In this case, the bot, which had just completed a delivery, approached the intersection, and a human operator took over in line with the company's internal policy. Initially, the operator paused at the yellow warning tape. But when bystanders lifted the tape and apparently waved the bot through, the operator decided to carry on, Serve Robotics CEO Ali Kashani said.

“The robot would never have crossed (by itself),” Kashani said. “There’s just a lot of systems in place to make sure it never crosses until a human gives permission.”

The lapse in judgment was that someone decided to keep crossing, he added.

Regardless of the reason, Kashani said it shouldn’t have happened. Serve has extracted data from the incident and is working on a new set of protocols for humans and AI to prevent this in the future, he added.

A few obvious steps are to make sure employees follow standard operating procedures (SOPs), including proper training, and to develop new rules for what to do if a person tries to wave a robot through a barricade.

But Kashani said there are also ways to use software to prevent this from happening again.

Software can help people make better decisions, or steer robots away from an area altogether, he said. For example, the company could work with local law enforcement to send real-time information about police incidents to its robots so they can navigate around those areas. Another option is to give the software the ability to identify law enforcement officers, then alert human decision makers and remind them of local laws.
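The first idea, routing around active police incidents, is essentially a geofence check. As an illustration only (not Serve's software; the data shapes, names, and coordinates below are hypothetical), a dispatcher could test each waypoint of a planned route against a keep-out radius drawn around each reported incident:

```python
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt


@dataclass
class Incident:
    lat: float
    lon: float
    radius_m: float  # keep-out radius around the incident, in meters


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points."""
    earth_radius_m = 6_371_000
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_m * asin(sqrt(a))


def route_is_clear(waypoints, incidents) -> bool:
    """Return False if any waypoint falls inside an incident geofence."""
    for lat, lon in waypoints:
        for inc in incidents:
            if haversine_m(lat, lon, inc.lat, inc.lon) <= inc.radius_m:
                return False
    return True
```

A route that fails the check could be replanned automatically, or escalated to a human operator with the incident context attached, which is the "help people make better decisions" half of Kashani's suggestion.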

These lessons will be critical as the robots progress and expand their operational domains.

“The funny thing is that the robot did the right thing; it stopped,” Kashani said. “So this really goes back to giving people enough context to make good decisions until we’re sure we don’t need people to make those decisions.”

Serve Robotics’ bots haven’t reached that point yet. However, Kashani said the robots are becoming more independent and mostly operate on their own, with two exceptions: intersections and blockages.

The scenario unfolding this week goes against how many people see AI, Kashani said.

“I think the story in general is that people are really great in edge cases and then AI makes mistakes, or maybe it’s not ready for the real world,” Kashani said. “Funnily enough, we learn the opposite, which is that we notice that people make a lot of mistakes and that we should rely more on AI.”

Shreya Christina