Uber vs. California - Robot cars not ready for roads?

“Thanks to LiDAR, the test cars aren’t reliant on the sun shining, nor cameras detecting painted white lines on the asphalt,” says Ford’s technical leader for autonomous vehicles, Jim McBride. “In fact, LiDAR allows autonomous cars to drive just as well in the dark as they do in the light of day.”

That doesn’t seem to be true. This also doesn’t look like a suicide or a sudden jump in front of the car either; just someone not paying attention while walking a bike across the road.

Yeah, that’s really, really off.

Hmmm. Hard to tell from the video, really. Human eyes have much better dynamic contrast than a visible-light camera, so an observant human may have noticed something earlier. The car did not even appear to brake at all, and on repeat viewing I am leaning towards thinking that same observant human would at least have got a foot on the brake. As to the other sensors onboard…???

More damning is the interior footage. The human driver does not appear all that engaged in caretaking. Moments before the collision his eyes are downcast for a good five or six seconds. Inexcusable, but autonomous vehicles will by their very nature breed complacency from caretaking drivers.

Humans as failsafe is the dumbest aspect of the inevitable autocar revolution.

Damn, I actually agree with something Adam_B says! :)

In early stages, when you’re not at all sure the machines can even navigate the car, maybe, but when you are asking Joe Driver to step in and backstop cutting-edge sensors and processing power, um, you have to wonder.

How did Uber ever get a license to test on public roads in the first place? :(

If a human driver had hit that cyclist I would not blame them. But darkness shouldn’t make a big difference to the automatic system.

Money, jobs, and the chance to become “the next Silicon Valley”.

I think a lot of articles are jumping to conclusions about that dash cam footage, as we don’t know what she is looking down at. If you have ever seen the demonstrations Google/Waymo has put on, they have monitors in the front that show all the data the sensors see. So it doesn’t seem far-fetched to me that she might have been watching the sensor data, not necessarily playing on her phone.

It’s really a mystery to me why the lidar didn’t pick this person up several seconds earlier. I had a couple of possibilities in mind, but looking at the video again, it just doesn’t make sense.

I don’t think it matters whether the operator was playing on her phone, stargazing, or looking at a telemetry readout on the dash. The whole point of the in-car backup is that the person is supposed to be there to take over in case of an emergency. That person should be looking at the road.

I think @inactive_user is right though. The whole human backup idea is dumb. Humans will never be able to just sit in a car and passively watch for emergency events. They’re going to zone out or do other things if they’re not actively driving.

Clearly there should be sensors in the car to detect when the human backup is distracted and give them an electric shock.

I think the less silly human backup idea is to allow for testing on real roads when the system is not fully mature (i.e., this scenario). Unfortunately, that only works if the companies pay high salaries and monitor their backup drivers to ensure they are paying attention.

Driving instructors are supposed to be able to sit in a car and watch for emergency events (although there’s a lot more interactivity there), so this isn’t entirely new territory.

Then we can have a human monitor that system to make sure there aren’t erroneous shocks given, then we can set up another system that shocks that human when they’re distracted, and…

Cadillac has something in production already that monitors drivers:

To ensure you’re always aware of what’s going on, Super Cruise uses an attention-detection system to ensure the human behind the wheel can retake control in the event a handoff is requested. Infrared sensors in the steering wheel and a video camera in the top of the steering column keep tabs on the driver’s eye movements and head position. Misbehave or block the camera and the car goes through various warning levels. If the driver continues to be unresponsive, it will ultimately bring the vehicle to a halt within the lane of travel, an aspect of Super Cruise that has already raised safety concerns from federal regulators.

https://blog.caranddriver.com/hands-free-caddy-2018-cadillac-ct6-launches-super-cruise-semi-autonomous-feature/

Seems like the test vehicle ought to have something like that.

Arizona.

Right, but that’s a scenario in which the expectation is that the student/test driver has a good chance of making mistakes and the pro/instructor is not just watching for an emergency, they’re actively teaching and grading.

These automated drivers are being sold as full replacements for human piloting because they’re supposed to be safer and more efficient than people. Heck, the end goal of some of these projects is a cockpit area with no wheel at all.

Please don’t respond to only half my post and take it out of context. I’m talking about backup drivers during the development and testing phase. Obviously, once these systems are ready for sale they can’t rely on user supervision.

Ok. My post applies to the backup drivers in the testing phase. She should be watching the road.

I have “adaptive cruise control” as part of my car, which has a vision system for seeing approaching vehicles/obstacles, braking for me, etc.

While it can be nice in very slow traffic on the highway so I don’t have to gas/brake a million times, it has an almost “uncanny valley” sort of effect, where it’s just useful enough to lull you into an accident. I find sometimes I’m MORE tense when I have the thing on at full speed, as it tends to brake harder and later than I do.

I can completely understand why you might feel like you don’t need to pay attention while in a Level 3 car, because you aren’t needed 99% of the time, but that probably makes it incredibly difficult to snap back to attention in a split second and make a difference.

Here’s what happens when you are asked to do a repetitive, boring thing over and over again with the rare emergency exception:

According to officials briefed on the results of a recent Homeland Security Inspector General’s report, TSA agents failed 67 out of 70 tests, with Red Team members repeatedly able to get potential weapons through checkpoints.