Cops Pulled Over a Driverless Car and Their Dumbfounded Reaction Has People Cracking Up
Updated Oct. 1, 2023, 11:38 a.m. ET
One thing that's sure to make your heart drop is seeing the flashing blue lights of a police car behind you when you're driving. Even when you know you weren't doing anything wrong, a million things start racing through your head. Will the officer hit you with some arbitrary ticket, leaving you to pay a bunch of money you didn't plan on spending or waste a day of work contesting the claim in court?
Or maybe you were speeding, but only to keep up with the flow of traffic, and you got singled out for whatever reason and now have to suffer the consequences. Heck, even if you were driving poorly and unsafely, getting pulled over and handed a ticket still isn't fun, even though you're legally in the wrong.
But what happens when a car gets pulled over and there's no driver? What are the legalities surrounding that? New autonomous driving technology seemed to befuddle, at least temporarily, two San Francisco Police Department officers who pulled over a Cruise driverless car for driving at night without its headlights on.
The clip has been making the rounds on social media and various local news outlets. In the video, the car appears to recognize that it's being pulled over and stops for the officers.
However, the car then drives farther up the street to a safer location. The officers follow the Cruise vehicle and inspect its interior, apparently unsure how to handle a traffic stop of a car with no human at the wheel.
It's hard to blame them, as autonomous driving isn't exactly a widespread practice, something that Cruise, a General Motors-backed autonomous taxi service, is looking to change with its fleet of Chevy Bolt electric vehicles.
Cruise currently operates in San Francisco, Calif., and Chandler, Ariz., but the story of one of its autonomous vehicles being pulled over by police has made national news.
The company responded to the attention by offering up additional details on the incident along with its protocols. Cruise also noted that the car didn't receive a citation and that the company works "closely with the SFPD on how to interact with our vehicles, including a dedicated phone number for them to call in situations like this."
On average, about 38,000 people die in car crashes in the United States every year. Proponents believe that autonomous driving technology, which pairs software with cameras and sensors, will help reduce that number. Tesla's Full Self-Driving has received mixed reviews, with some users claiming their vehicles ignored road closure signs or careened into rocks and poles.
I own a Tesla Model 3 and have personally had issues with its Autopilot and cruise control. Sometimes the car's "safety" features steer the vehicle in directions I don't want, and on a few occasions the car has stopped on the highway while I was using adaptive cruise control. I've yet to hear back from the company with a crash report despite repeated attempts to get one.
Cruise vehicles have been involved in accidents, though none have resulted in passenger fatalities. Several Tesla owners have also documented instances in which they credit the vehicle's Autopilot safety features with reading the road and anticipating collisions, preventing damage in the process.