General Motors Wants To Stop People From Being Mean To Autonomous Vehicles
General Motors has been granted a patent for a system that will attempt to prevent autonomous cars from being bullied by pedestrians or human drivers.
As autonomous vehicles become increasingly integrated into urban environments, at least in California and Arizona, they face a unique and growing challenge: mean behavior from pedestrians, cyclists, and other human road users. Such behavior often hilariously exploits the safety-first programming of autonomous systems, creating situations that disrupt traffic flow and potentially compromise safety.
The USPTO recently published a patent application from General Motors for a system of "Responses To Vulnerable Road User's Adversarial Behavior" or in regular speak: bullying by pedestrians.
GM's system will use sensors, cameras, and machine learning algorithms to identify potential threats from pedestrians, analyzing a person's movement patterns and proximity before evaluating their intent to interact with the autonomous vehicle.
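The filing doesn't spell out the actual model, but an intent check built on movement and proximity data might look something like the toy heuristic below. The feature names, thresholds, and scoring are all assumptions made up for illustration, not anything taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PedestrianTrack:
    # Hypothetical features a perception stack might report for a tracked pedestrian
    distance_m: float         # current distance from the vehicle
    closing_speed_mps: float  # how fast the gap is shrinking
    direction_changes: int    # abrupt reversals in the last few seconds
    facing_vehicle: bool      # whether the person is oriented toward the car

def looks_adversarial(track: PedestrianTrack) -> bool:
    """Toy heuristic: flag behavior that is close, fast-closing, and erratic,
    rather than an ordinary crossing. Thresholds are invented for illustration."""
    score = 0
    if track.distance_m < 5.0:
        score += 1
    if track.closing_speed_mps > 1.5:
        score += 1
    if track.direction_changes >= 2:
        score += 1
    if track.facing_vehicle:
        score += 1
    return score >= 3

# A pedestrian close by, closing fast, reversing direction, and facing the car
print(looks_adversarial(PedestrianTrack(3.0, 2.0, 3, True)))  # True
```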
The document lists examples of adversarial behavior including deliberate violations of right-of-way, sudden movements designed to force an autonomous vehicle to brake, tailgating by aggressive drivers, and even rude gestures or verbal provocations directed at the vehicle. The included illustrations show the "OK" hand symbol as an example of a rude gesture.
GM claims the system will be able to distinguish between a pedestrian inadvertently stepping into the road and one intentionally challenging the vehicle's braking system. Once adversarial behavior is detected, the system enables the autonomous vehicle to respond appropriately.
Basically, once the AV decides people are being mean, it works through a decision tree to pick the response most appropriate to the level of risk it faces. The choices are visual and audio warnings, like honking and flashing lights to scare the threat away; phoning home to a call center that can alert authorities; giving up and transferring control to one of its passengers; or simply backing away and finding an alternate route.
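The patent doesn't include pseudocode, but the escalation logic it describes could be sketched roughly like this. The risk tiers, threshold for handing over control, and function names below are hypothetical illustrations, not GM's actual implementation.

```python
from enum import Enum, auto

class Risk(Enum):
    """Hypothetical risk tiers an AV might assign to adversarial behavior."""
    LOW = auto()       # e.g. a rude gesture or verbal provocation
    MODERATE = auto()  # e.g. a deliberate right-of-way violation
    HIGH = auto()      # e.g. sudden movements forcing an emergency brake

def choose_response(risk: Risk, passenger_can_drive: bool) -> str:
    """Walk a simple decision tree from least to most drastic response,
    mirroring the options listed in the patent filing."""
    if risk is Risk.LOW:
        return "honk and flash lights to warn off the aggressor"
    if risk is Risk.MODERATE:
        return "notify a remote call center, which may alert authorities"
    # HIGH risk: either hand over control or retreat and reroute
    if passenger_can_drive:
        return "transfer manual control to a qualified passenger"
    return "back away and plot an alternate route"

# Example: a pedestrian repeatedly stepping in front of the car to force it to brake
print(choose_response(Risk.HIGH, passenger_can_drive=False))
```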
The problem is quite topical considering the slew of complaints emerging in recent months from Waymo passengers trapped in their autonomous cabs while the vehicles were vandalized, or, in one case, while a man pestered a female passenger for her phone number.
Pedestrian and autonomous vehicle interaction is a touchy subject for General Motors. In October 2023, one of its automated Cruise vehicles operating in San Francisco ran over and dragged a woman 20 feet after she had been struck by another vehicle in a hit-and-run. Cruise vehicles have recently returned to action in the Bay Area after the company halted operations in the wake of the incident.