Robotic voice: No driver in car. No driver in car. No driver in car.
Seems like it would be closer to the passenger airbags. I believe those use weight to determine whether a small person is in the seat, right? Or maybe it's the reverse, but it's basically weight.
Well then no one has to reinvent the wheel. We’ve already got this.
How does it work while wearing sunglasses?
I only know what I read in the article I linked above, but I imagine it works based on head orientation rather than pupils.
I see what you did there.
I recall a blurb from Popular Mechanics (I think, or maybe Popular Science) from sometime in the late '80s or early '90s about a system that would monitor whether a driver was awake or asleep.
There were no robot drivers then, so the concept involved lighting up a signal on the car roof if the operator nodded off. Needless to say, it went nowhere.
If I'd been the designer, the system would've involved electric shocks. Would've made it useful.
A collar with a cattle prod just under the chin.
The car did detect the pedestrian in the fatal Arizona incident, but the software chose to ignore her.
Yeah they say the same thing, mostly.
The only possibilities that made sense were:
A: Fault in the object recognition system, which may have failed to classify Herzberg and her bike as a pedestrian. This seems unlikely since bikes and people are among the things the system should be most competent at identifying.
B: Fault in the car’s higher logic, which makes decisions like which objects to pay attention to and what to do about them. No need to slow down for a parked bike at the side of the road, for instance, but one swerving into the lane in front of the car is cause for immediate action. This mimics human attention and decision making and prevents the car from panicking at every new object detected.
The sources cited by The Information say that Uber has determined B was the problem.
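Option B is easier to picture with a toy sketch. This is purely hypothetical code I wrote to illustrate the failure mode, not anything from Uber's actual stack; every name and threshold is invented. The idea: a planner that suppresses reactions to "new" objects (to avoid panicking at every fresh detection) will also suppress reactions to an object whose classification keeps flipping, because each reclassification makes it look new again.

```python
# Hypothetical illustration of hypothesis B. All names and thresholds
# are invented for this sketch -- this is not Uber's software.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str          # e.g. "pedestrian", "vehicle", "bicycle", "unknown"
    in_lane: bool       # predicted to enter the car's path
    track_age: float    # seconds the object has kept its current label

def needs_action(obj: TrackedObject, min_track_age: float = 1.0) -> bool:
    """Decide whether the planner should react to a detected object.

    Ignoring objects outside the lane is the sensible part (no need to
    slow down for a parked bike). The dangerous part: if each
    reclassification (unknown -> vehicle -> bicycle) resets track_age,
    an in-lane object can stay below the threshold forever.
    """
    if not obj.in_lane:
        return False  # parked bike at the roadside: safely ignored
    return obj.track_age >= min_track_age

# An object whose label just flipped looks "new" again:
flipping = TrackedObject(label="bicycle", in_lane=True, track_age=0.2)
stable = TrackedObject(label="bicycle", in_lane=True, track_age=2.0)
print(needs_action(flipping))  # False: filtered out despite being in-lane
print(needs_action(stable))    # True
```

So under this (assumed) design, the perception system can be working fine while the higher logic still never acts, which matches the "great eyes, bad judgment" framing.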
Having great eyes only works if you know what to do with them.
Maybe the pedestrian was a bad person.
I guess I learned something in grad school after all.
Yah grad school! ;-)
Google keeps suggesting these stories to me. I think they’re trying to groom me to buy robot insurance.
This time the article doesn't even know whether Autopilot was engaged; the story was reported simply because the vehicle has the feature. Newsworthy? Who knows. Apparently they'll know more on Monday.
SOUTH JORDAN, Utah (AP) — A Tesla sedan with a semi-autonomous Autopilot feature has rear-ended a fire department truck at 60 mph (97 kph) apparently without braking before impact, but police say it’s unknown if the Autopilot feature was engaged.
Uber lays off 300 Arizona employees and ceases its self-driving operations there.
Most of those impacted by the layoffs are vehicle operators, who were paid to supervise the vehicles during tests.
Arizona had been widely regarded as the most welcoming state for self-driving vehicles. In December 2016, Arizona governor Doug Ducey said in a statement, “Arizona welcomes Uber self-driving cars with open arms and wide open roads.”
But following the fatality, Ducey suspended Uber’s ability to test on state roads.
The fatal crash in Tempe has raised questions about the safety of self-driving vehicles. A recent AAA study revealed Americans are increasingly afraid of riding in self-driving vehicles. About 73% of participants said they would be scared to ride in a fully autonomous vehicle.
There are some interesting tidbits in the preliminary NTSB report:
According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
In addition, the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.
The inward-facing video shows the vehicle operator glancing down toward the center of the vehicle several times before the crash. In a postcrash interview with NTSB investigators, the vehicle operator stated that she had been monitoring the self-driving system interface. The operator further stated that although her personal and business phones were in the vehicle, neither was in use until after the crash, when she called 911.
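Worth running the report's numbers. The 6 s, 1.3 s, and 43 mph figures are from the NTSB excerpt above; the deceleration value is my assumption (roughly hard emergency braking on dry pavement), so treat the stopping distance as a ballpark only.

```python
# Back-of-the-envelope check on the NTSB figures. Times and speed come
# from the report; the deceleration is an assumed value, not a measured one.
v = 43 * 0.44704             # 43 mph in m/s (~19.2 m/s)
first_detection = 6.0 * v    # distance to pedestrian at first detection
decision_point = 1.3 * v     # distance left when braking was deemed needed
a = 7.0                      # assumed emergency deceleration, m/s^2
stopping = v**2 / (2 * a)    # distance needed to stop from 43 mph

print(f"first detection:  ~{first_detection:.0f} m out")
print(f"braking decision: ~{decision_point:.0f} m out")
print(f"stopping distance: ~{stopping:.0f} m")
```

Under these assumptions the car had on the order of 115 m of warning at first detection, but by the 1.3-second mark only about 25 m remained, versus roughly 26 m needed to stop, which is presumably why the report says "mitigate" a collision rather than avoid one.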
Jeezus, that’s insane. So emergency braking is up to the driver, but the driver is also supposed to be monitoring the center screen? That’s a monstrous conflict of responsibilities right there.
Look, I’m no engineer, but this sounds nuts.
No wonder Uber settled with the family within days of the actual incident. Any competent lawyer would tear this system apart at trial, and the jury would hammer the company in damages.