Uber vs. California - Robot cars not ready for roads?

The question is whether accidents will happen more often with self-driving trucks than with human truck drivers who are tired from long drives and monotonous routes.

Semis get into accidents today on the clearest of days with almost no traffic issues, so if self-driving trucks can reduce the rate at which that happens, I’ll be happy.

We can always have high-risk routes (mountain passes and areas prone to heavy icing in the winter) flagged as not suitable for automation yet. It’s not a zero-sum game, and it can be rolled out in waves (starting with the low-risk routes first).

I just don’t know why we’re thinking we should test this new tech with the heaviest loads on the road. When one of these vehicles strikes another, anyone inside that second vehicle is very unlikely to survive.

It’s being tested with loads of all sizes. Everything from scooters to semis is undergoing a huge amount of self-driving R&D. The only difference is that there’s a much bigger economic incentive to automate middle-mile shipping than individual cars.

Bad weather is expected on basically all routes. You can’t drive across a continent without hitting bad weather. And parking in the middle of Nebraska and waiting for someone to drive out and help doesn’t seem very cost-effective.

Not to mention the mechanical issues that big rigs deal with constantly. I don’t really see much support for turning the system over to a machine. It would be akin to turning air travel over to them. I could see some sort of hybrid system happening though, where the driver basically sits there and lets the rig drive. You could pay them next to nothing. “Oh cause sitting in a seat listening to music is such hard work, you get min wage.”

The problem is that you’re right back to the situation in which the driver is going to catch the blame (certainly, no manufacturer or company vehicle owner is going to step up when there’s a driver right there in the robo-car), but this theoretical minimum-wage driver is going to do exactly what you’d expect any minimum-wage driver tasked with babysitting a robot to do: not give one shit about watching what’s going on.

This is actually what the truck platooning systems are supposed to address. You could have a convoy of three trucks driving 24/7 with only three drivers; at any one time, two could be off duty; the on-duty driver would directly control one of the trucks and the others would follow its lead with minimal autonomy. In exceptional conditions you could use all three drivers so you could still make progress. These are much closer to real-world use than self-driving cars (but still a few years out).
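The duty rotation for a convoy like that can be sketched in a few lines. This is a toy illustration only; the three-driver, 8-hour-shift numbers come from the scenario above, not from any actual hours-of-service regulation:

```python
# Toy rotation schedule for a three-truck platoon running 24/7.
# One driver leads the convoy at any given hour; the other two are
# off duty. Shift length and driver count are illustrative assumptions.

DRIVERS = ["A", "B", "C"]
SHIFT_HOURS = 8

def on_duty(hour: int) -> str:
    """Return which driver leads the convoy at a given hour of the day."""
    return DRIVERS[(hour // SHIFT_HOURS) % len(DRIVERS)]

schedule = [on_duty(h) for h in range(24)]
print(schedule)  # A for hours 0-7, B for 8-15, C for 16-23
```

Each driver logs 8 hours of actual driving per day while the convoy covers 24 hours of road, which is where the economic case comes from.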

A better solution to the problem you suggest is for the “driver” to actually be a mechanic who gets paid whatever the smallest amount allowable is to be physically present while the truck drives itself, and is only officially “on the clock” when there’s a maintenance issue, difficult road situation, or other task that requires a physical solution the computer can’t attempt. Really, though, how many rig drivers are also fixing the rig as they go? I thought they were paid to drive and say funny things on the CB, not to fix the engine.

EDIT: Or what antlers said

I think we’re talking about very different types of “bad weather”. I reckon the robots would very likely drive in a typical rain shower. If caught in a massive downpour (the likes of which Florida sees daily in late summer), the robot would probably pull over and wait until the minimum safe visual distance was restored… which, honestly, is what humans should do in such a situation. But we don’t.
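That pull-over logic could be as simple as comparing visibility with stopping distance. A rough sketch using textbook stopping-distance physics; the reaction time and deceleration figures are illustrative guesses, not real truck numbers:

```python
# Rough "should we pull over?" check: if the truck can't see farther
# than it needs to stop, it shouldn't be driving. Reaction time and
# braking deceleration below are assumed values for illustration.

REACTION_TIME_S = 1.5   # perception + actuation lag (assumed)
DECEL_MS2 = 2.5         # gentle braking for a loaded semi (assumed)

def stopping_distance_m(speed_ms: float) -> float:
    """Distance covered during the reaction lag plus braking to a halt."""
    return speed_ms * REACTION_TIME_S + speed_ms ** 2 / (2 * DECEL_MS2)

def should_pull_over(speed_ms: float, visibility_m: float) -> bool:
    """Pull over when visibility drops below the stopping distance."""
    return visibility_m < stopping_distance_m(speed_ms)

# At ~27 m/s (about 60 mph) the assumed truck needs roughly 186 m to
# stop, so 80 m of visibility in a downpour means pull over.
print(should_pull_over(27.0, 80.0))   # True
print(should_pull_over(27.0, 300.0))  # False
```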

I think the “bad weather” Menzo was referencing was stuff like driving snow, unsafe winds, freezing rain, etc. That type of weather is common too… but most truckers would also pull over in high winds, blizzard conditions, freezing rain, etc., so it’s not too far from the baseline here either.

Oh, I’ll bet it’s far more cost-effective to have a couple of recovery teams operating in Nebraska than paying the salary, medical, insurance, etc. for a few hundred truckers. Especially if they were contractors on standby for a bunch of firms.

[quote=“ShivaX, post:44, topic:127557”]Not to mention the mechanical issues that big rigs deal with constantly. I don’t really see much support for turning the system over to a machine. It would be akin to turning air travel over to them.
[/quote]
Funny - drone air travel is potentially more within reach than drone trucks… especially for cargo planes. Robot ships too. Far fewer variables.

On the maintenance thing, a case can be made that robotic control would result in fewer mechanical issues than human control. Smoother shifting, more even brake usage, that type of thing.

Other questions: have these things figured out how to deal with road construction and traffic flaggers?

How about boarding a ferry?

That would be a feature for the employers though. If the driver doesn’t pay attention and something happens, it’s his fault and you saved a ton of money.

Sure. Like I said, no company owner or manufacturer is going to willingly fall on their sword for a robo-car plowing into an intersection. It’s going to be “operator error” every time.

The issue is that no minimum wage robo-car operator/minder is going to actually do that job with any amount of care or watchfulness. It’s just not going to happen.

Oh, they’ll find these employees. It will be from the same group running around in their personal cars, not realizing that their non-commercial car insurance might not cover any incident they incur while driving commercially… aka Lyft, Uber, etc.

Many semi accidents are weather or sleepy driver caused, because whatever they are hauling has to be there on time.

I don’t understand the technology, but I have read that the first widespread use of driverless vehicles will probably come in the shipping industry. It seems crazy to me that driverless would somehow be safer than human-driven. But is it safety, or the bottom line, that actually determines when new technology is accepted?

I am aware of those issues in the industry, but the other part of the problem is exactly as I said, smaller vehicles not behaving properly around large vehicles that can take a full football field length to stop, and that doesn’t include the ones that carry liquids. You know those giant signs on the back of the trailers that tell you if you can’t see their mirrors they can’t see you… why do you suppose they put those there?

Thing is, the AI driving the truck would probably respond better than a real driver would. And if you don’t blame the semi’s human driver for the stupidity of others, how can you blame the AI?

But it does just “feel” wrong.

I don’t know if you read my entire post up above, but I can understand if you missed it. It’s not about who gets the blame. Semi truck drivers know they sometimes cannot avoid an accident. Ideally they’re going to try to go for the scenario that kills the fewest people.

So in that scenario, is the AI going to drive down the side of a hill rather than plow into the cars in front of them? Will they avoid causing the vehicle to tip and potentially cause a 100 car pile-up and just… take out the car that got in front of them? How will the AI make that kind of a decision? These are not light vehicles we’re talking about here.

For example, if the vehicle in front of the truck is a school bus full of children, I am assuming most humans will take option B over hitting that bus, even if it means rolling the truck over. We tend to value children above even 100 parked cars with adults and some children in them.
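For what it’s worth, the way engineers would likely frame that choice is as a cost function over predicted outcomes, not a hand-written rule per scenario. A deliberately crude sketch; the outcome names and weights are invented to illustrate the structure, and picking those weights is exactly the ethical problem being debated here:

```python
# Toy "least-harm" chooser. Each maneuver maps to a predicted outcome,
# and the planner picks the maneuver with the lowest assumed cost.
# The outcome costs are made up for illustration -- deciding them is
# the hard (and contested) part.

OUTCOME_COST = {
    "hit_school_bus": 1000.0,
    "hit_single_car": 100.0,
    "roll_truck": 50.0,
    "run_off_road": 40.0,
}

def choose_maneuver(options: dict[str, str]) -> str:
    """Pick the maneuver whose predicted outcome has the lowest cost."""
    return min(options, key=lambda m: OUTCOME_COST[options[m]])

options = {
    "brake_straight": "hit_school_bus",
    "swerve_left": "roll_truck",
    "swerve_right": "run_off_road",
}
print(choose_maneuver(options))  # swerve_right (lowest assumed cost)
```

Note that the code itself is trivial; all the moral weight lives in the `OUTCOME_COST` table, which is the point several posters above are circling.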

Eh, only in obvious, objective situations or ones based on reaction time. Not to mention accidents that can’t be avoided and how the AI would deal with those. Dude pulls out in front of you, you can’t stop, does the AI swerve? What if there is a kid standing on the side of the road? Does it choose the kid over the idiot? Does it even know the kid is there?

Then you get into other issues like people actively fooling AIs, either to be dicks or to steal the cargo or whatever.

In a word, no.

Self-driving is pretty much at level 3 now in the SAE chart shown in the Forbes article linked above (level 2 is the kind of collision-avoidance system that is now commonly available). That’s pretty much an autopilot that can handle only the simplest traffic conditions, along with a GPS. A complicated lane change or non-standard signage requires human intervention. Level 3 is probably worse than useless for safety in passenger cars, because driver attention is still required but not continuously demanded, so the temptation to be inattentive is high. Level 3-type technology does allow for truck platooning, and so could be economically viable that way. (The fact that companies are spending a lot of money developing platooning is one sign that more advanced autonomous vehicles won’t be available in the near term.)

Level 4 vehicles are ones that, in some subset of conditions, can really drive themselves. These are in the prototype stages now, but the conditions in which they can drive themselves are so severely limited (only certain neighborhoods with minimal traffic; specific infrastructure requirements) that their applications are similarly limited. The industry is saying that Level 4 is “right around the corner”, but IMO driverless vehicles with applications beyond the very easiest cases (campus/retirement community shuttles) are still years away.
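For anyone without the Forbes article handy, here’s the SAE chart as a quick lookup. The descriptions are my own shorthand paraphrase of the J3016 levels, not the standard’s wording, so check the actual document:

```python
# Shorthand paraphrase of the SAE J3016 driving-automation levels
# referenced in the posts above. Wording is mine, not the standard's.

SAE_LEVELS = {
    0: "No automation: human does everything",
    1: "Driver assistance: one function automated (e.g. adaptive cruise)",
    2: "Partial automation: steering + speed, driver monitors constantly",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: self-driving within a limited domain, no fallback driver",
    5: "Full automation: self-driving everywhere a human could drive",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```

The gap the posts above are pointing at is the jump from 3 to 4: level 3 still needs a human in the loop, which is exactly the “minimum-wage babysitter” problem.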

Has history shown that in those instances we can depend on the human driver to make the “right” decision?

Like I said above, I am not comfortable with autonomous cars or trucks. If trains still need drivers, I don’t see how a car or truck on a freeway doesn’t need one.

Has history shown us that the AI can be that… complex? If the AI is told “don’t choose one car over ten,” does it recognize that a school bus is different? Can we rely on a developer to remember that? Humans make mistakes, but Scuzz, your daughter is dead because the developer forgot that one. How do you feel?