Friday, July 8th, 2022 11:05 am EDT
The biggest concern most people have about autonomous vehicles is safety. We not only want the vehicle to take us places without killing us, but we also don’t want the difficult job of sorting through the Trolley Problem. In other words, we don’t want the trip to directly cost anybody their life. But, that’s not the only thing that can go wrong with autonomous vehicles. Even when everyone gets to go home at night, some crazy things can still happen.
A little over a week ago, some residents of San Francisco got to see this firsthand.
Normally, GM’s Cruise cars drive around the city giving people rides. Nobody’s driving the modified Chevy Bolt EVs, but that doesn’t mean they aren’t connected to a central server room somewhere that tells them where to go, looks out for problems, etc. When someone fires up their Cruise app and requests a ride, these servers send the car to go get the passenger, and then send it to the passenger’s destination.
But, as we all should know by now, sometimes things don’t go according to plan. Some sort of computer error told dozens of Cruise cars in the city to all go to the same street. But, none of the cars were ready to deal with encountering their clones, with the earlier arrivals blocking the way of those that showed up later.
To add to a night of technical oddities there are three Cruise vehicles, all (literally) driverless, stuck at and partially blocking the corner of Geary and Mason pic.twitter.com/ypBze8nrnW
— Smerity (@Smerity) June 21, 2022
One guy on the scene posted his account of this strange traffic jam on Twitter:
“There were originally four Cruise vehicles but one eventually made a grand escape. The leading Cruise vehicle has been there at least fifteen minutes as that’s how long I had to wait for fast food. Occasionally one of them would lurch forward a little just for added suspense
“Other than the ‘blocking road’ aspect it was pretty cute. One partially inebriated fellow was cheering the cars on, telling them they could do it, and together we informed a driver waiting patiently to turn right that the cars ahead were driverless and they’d need to overtake ^_^”
Some of the vehicles had enough presence of artificial mind to send an error message to support personnel, and the self-driving software was disabled on these vehicles. The interior screens, visible from the outside, said: “A Cruise support specialist is on the way to help in person. First responders should contact Cruise at [phone number]. Self driving off, we’ve parked the car while the issue is resolved.”
Like all Cruise cars and your credit card during a meme scam, the vehicle displaying the error message had a name: Melon. But, the other cars behind Melon were having a rougher time and couldn’t phone home the way Melon could. They had their normal display going, and would occasionally lurch forward slightly when stuck behind the disabled cars.
Eventually, people from Cruise came and manually drove the cars back home.
Some Good Things We Can Take Away From This
Plenty of people on social media bagged on the Cruise cars, especially Tesla fans who wanted people to see that other autonomous vehicles in development can have problems, too. But, it wouldn’t really be fair to make that kind of comparison.
First off, the vehicles all arrived at the erroneous destination safely, and with no human supervision. They didn’t try to drive in front of any trains, stop in front of people for no reason, randomly hit the brakes, or do other things that would land you in a collision. They made it to the destination safely, even if there was a problem with the chosen destination.
Once they were there, they didn’t do anything dangerous. No system will be perfect, but being able to identify that there’s a problem and phone home for support is better than just blundering into trouble. The cars seemed to be as safely stopped as they could in that situation, and didn’t hurt any of the people running around checking them out.
Finally, we can see that thought has gone into preparing stalled out Cruise cars to interact with law enforcement. They can’t take direct commands or anything, but they do have a message for anyone happening upon them, including a phone number that police, fire, or EMS workers can call to get assistance dealing with the vehicles. This means that instead of having to train people to deal with the cars, the cars are prepared to meet public safety personnel where they are.
What all of this adds up to is at least part of a “failover” system, or a system that keeps a vehicle from becoming a safety hazard when a situation exceeds the autonomous software’s abilities and there’s no safety driver right there ready to take over. The cars pulled as far to the right as possible, stopped, engaged hazards/flashers, and mostly sat still. They didn’t circle the block or end up in anybody’s lawn.
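The behavior described above resembles what the autonomous-driving world calls a “minimum risk maneuver”: stop safely, warn others, and call for help. A minimal sketch in Python of that decision logic (the names and structure here are my own illustration, not Cruise’s actual software):

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Toy model of a stuck robotaxi (illustrative only)."""
    path_blocked: bool = False      # can the car make progress?
    parked: bool = False
    hazards_on: bool = False
    support_notified: bool = False

def minimum_risk_maneuver(state: VehicleState) -> VehicleState:
    """If the car can't proceed on its own, fail gracefully instead of blundering on."""
    if state.path_blocked:
        # 1. Pull over as far as the lane allows and stop (modeled as a flag here).
        state.parked = True
        # 2. Turn on the hazard flashers so other road users see something is wrong.
        state.hazards_on = True
        # 3. Phone home so a human support specialist can come take over.
        state.support_notified = True
    return state
```

The key design point is that every branch ends in a safe, stationary state rather than in continued driving, which is exactly what the stuck Cruise cars appear to have done.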
Really, any vehicle with autonomous features that isn’t prepared to at least pull over (out of major traffic lanes if possible) and call for help can’t ever call itself an autonomous vehicle. Why? Because the lack of such a feature means that a human must be present to serve as backup, and that makes a vehicle very much non-autonomous.
While having a bunch of Cruise cars show up and clog one street isn’t great, we did get to see that GM’s autonomous vehicles are at least able to fail gracefully instead of ending up as a hood ornament on a train or serving as an unplanned gate in an irrigation system (in a ditch).
I don’t want to give GM too much credit here, though. As with Waymo, we don’t see Cruise cars everywhere. They operate in geofenced areas where a lot of preparation has gone into making sure they do the right thing. You can’t drop one of these weird-looking Bolts on a random street in Bleiblerville, Soda Springs, or Truth or Consequences and expect it to do as well as a Tesla with FSD Beta would do in that situation.
What we ultimately have here are two very different approaches to autonomy that are eventually going to converge, and that’s a good thing as long as nobody gets hurt.
Featured image by Reddit user seansinha (Fair Use).