Waymo robotaxi hits a child near an elementary school in Santa Monica
148 points | 4 hours ago | 25 comments | techcrunch.com | HN
BugsJustFindMe
3 hours ago
[-]
From the Waymo blog...

> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.

> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.

> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.

I honestly cannot imagine a better outcome or handling of the situation.
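A quick back-of-envelope check of the quoted numbers (assuming hard braking at roughly 0.8 g, which the blog post doesn't state):

```python
# Hypothetical sketch, not Waymo's model: how far and how long does it take
# to scrub 17 mph down to 6 mph under an assumed 0.8 g of braking?
G = 9.81            # m/s^2
MPH_TO_MS = 0.44704

v0 = 17 * MPH_TO_MS  # initial speed, m/s
v1 = 6 * MPH_TO_MS   # speed at contact, m/s
a = 0.8 * G          # assumed deceleration (hard braking)

braking_distance = (v0**2 - v1**2) / (2 * a)  # from v1^2 = v0^2 - 2*a*d
braking_time = (v0 - v1) / a

print(f"{braking_distance:.1f} m over {braking_time:.2f} s")
```

So under that assumption the whole event, from brake onset to contact, spans only about three meters and two-thirds of a second.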

reply
jobs_throwaway
3 hours ago
[-]
Yup. And to add

> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”

It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.

reply
jjav
52 minutes ago
[-]
> It's likely that a fully-attentive human driver would have done worse.

We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.

The car reacted quickly once it saw the child. Is that enough?

But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?

If that's the scenario, there is a real probability that a child might appear, so I'm going to be slowing way down pre-emptively even though I haven't seen anyone, just in case.

The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.

reply
oakesm9
41 minutes ago
[-]
As someone who lives on a residential street right by a primary school in the UK, the majority of drivers are going over 20mph even at the peak time when there are children everywhere.

While in theory human drivers should be situationally aware of the higher risks of children being around, the reality is that the majority will be in their own bubble of being late to drop their kid off and searching for the first free spot they can find.

reply
2b3o4o
26 minutes ago
[-]
According to the article the car was traveling at 17 miles an hour before it began braking. Presumably this was in a 25 mph school zone, so it seems the Waymo was already doing exactly what you describe - slowing down preemptively.
reply
recursive
9 minutes ago
[-]
This is close to a particular peeve I have. Occasionally I see signs on the street that say "Slow Down". I'm not talking about the electronic ones connected to radar detectors. Just metal and paint.

Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.

But maybe there's a loophole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.

You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.

reply
Aloisius
11 minutes ago
[-]
It was going 17 mph. That is rather slow.

To put it another way. If an autonomous vehicle has a reaction time of 0.3 seconds, the stopping distance from 17 mph is about the same as a fully alert human driver (1 second reaction time) driving 10.33 mph.
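Sketching that comparison, with an assumed deceleration of 0.8 g (the comment doesn't state one), the two total stopping distances come out nearly identical:

```python
# Total stopping distance = reaction distance + braking distance
#                         = v * t_react + v^2 / (2 * a)
# Assumption (not from the comment): both vehicles brake at 0.8 g.
G = 9.81
MPH_TO_MS = 0.44704

def stopping_distance(mph, t_react, a=0.8 * G):
    v = mph * MPH_TO_MS
    return v * t_react + v**2 / (2 * a)

av = stopping_distance(17, 0.3)      # autonomous vehicle, 0.3 s reaction
human = stopping_distance(10.33, 1.0)  # fully alert human, 1 s reaction
print(f"AV from 17 mph: {av:.2f} m; human from 10.33 mph: {human:.2f} m")
```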

reply
Sparkle-san
12 minutes ago
[-]
There's a bus stop right behind my house. I routinely hear the driver honking and yelling at people who ignore when the stop sign is extended (which is a misdemeanor in my state). So forgive me for not assuming a human would have done better.
reply
usefulposter
43 minutes ago
[-]
Precisely. Environmental context is not considered in Waymo's "peer-reviewed model" (I encourage reflexive commenters to first read it: https://waymo.com/safety/collision-avoidance-benchmarking), only basic driver behavior and traffic signal timings.

Note the weaselly worded "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.

If they already do, why isn't it factored in the model?

reply
jefftk
32 minutes ago
[-]
How is "immediately detected the individual as soon as they began to emerge" worded weaselly?
reply
JKCalhoun
1 hour ago
[-]
I think my problem is that it reacted after seeing the child step out from behind the SUV.

An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.

(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)

reply
coryrc
1 hour ago
[-]
First, it's still the automobile's fault.

At least it was already slowed down to 17 mph to start. Remember that viral video of some Australian in a pickup ragdolling a girl across the road? Most every comment is "well, he was going the speed limit, no fault for him!" No, asshole, you hit someone. It's your fault. He got zero charges and the girl was seriously injured.

reply
WheatMillington
26 minutes ago
[-]
You seem to be implying that there are no circumstances in which a vehicle can hit a pedestrian and the driver not be at fault... which is absurd.
reply
fennecbutt
38 minutes ago
[-]
You mean the Aussie one where the guy was going an appropriate speed for the area and when the cops arrived the parents and their neighbors LIED TO THE POLICE and said he was hooning down the road at excess speed and hit the kid? And that he was only saved from prison by having a dash cam that proved the lies to be lies? That one?

That logic is utter bs. If someone jumps out when you're travelling at an appropriate speed and you do your best to stop, then that's all that can be done. Otherwise, by your logic, the only safe speed is 0.

reply
yibg
54 minutes ago
[-]
I don't see how that's feasible without introducing a lot of friction.

Near my house, almost the entire trip from the freeway to my house is via a single lane with parked cars on the side. I would have to drive 10 MPH the entire way (speed limit is 25, so 2.5x as long).

reply
hombre_fatal
47 minutes ago
[-]
It's hard to consider it "lots of friction" in a vehicle where you press a button to go faster and another button to slow down.

A single lane residential street with zero visibility seems like an obvious time to slow down. And that's what the Waymo did.

reply
yibg
55 seconds ago
[-]
That's why the speed limit is 25 (lower when children are present in some areas) and not 35 or 40, etc. It's not reasonable to expect people to drive at 40% of the posted speed limit the entire way. We're also not talking about zero visibility (e.g. heavy fog). We're talking about blind spots behind parked cars, which in the dense parts of a city is a large fraction of the city. If we think as a society that in those situations the safe speed is 10 mph, then the speed limit should be 10 mph.
reply
jeffbee
44 minutes ago
[-]
I mean, you are putting your finger right on the answer: the whole car thing doesn't work or make sense, and trying to make autonomous vehicles solve the unsolvable is never going to succeed.
reply
jakewins
57 minutes ago
[-]
Aye, and to always look for feet under and by the front wheel of vehicles like that.

Same with stopped buses: people get off the bus, whip around the front of it, and go straight into the street. So many times I've spotted someone's feet under the front before they come around and into the street.

Not to take away from Waymo here; I agree with the thread sentiment that their handling seems exemplary.

reply
fennecbutt
36 minutes ago
[-]
You can spot someone's feet under the width of a bus when they're on the opposite side of the bus and you're sitting in a vehicle at a much higher position on the opposite side that the bus is on? That's physically impossible.
reply
WheatMillington
24 minutes ago
[-]
I think you're missing something though, which I've observed from reading these comments - HN commenters aren't ordinary humans, they're super-humans with cosmic powers of awareness, visibility, reactions and judgement.
reply
fennecbutt
41 minutes ago
[-]
>reacted after seeing the child step out from behind the SUV.

Lmao most drivers I see on the roads aren't even capable of slowing down for a pedestrian crossing when the view of the second half of the crossing is blocked by traffic (ie they cannot see if someone is about to step out, especially a child).

Humans are utterly terrible drivers.

reply
Sparkle-san
9 minutes ago
[-]
They don't even stop when it's a crosswalk with a flashing light system installed and there are no obstructions.
reply
kakacik
1 hour ago
[-]
Yes and no. There are tons of situations where this is simply not possible: whole lanes of traffic going the full allowed speed right next to a row of parked cars. If somebody distracted unexpectedly pops out, a tragedy is guaranteed regardless of the driver's skill and experience.

In low traffic, of course, it can be different. But it's unrealistic to expect anybody to drive as if behind every single parked car they pass there may be a child about to jump right in front of them. That can easily be thousands of cars, every day, for a whole life. Impossible.

We don't read about 99.9% of the cases where even semi decent driver can handle it safely, but rare cases make the news.

reply
jsrozner
44 minutes ago
[-]
I slow down considerably near parked cars. And I try to slow down much earlier approaching intersections where there are parked cars blocking my view of cross walk entries. I need to be able to come to full stop earlier than intersection if there happens to be a pedestrian there.
reply
JKCalhoun
1 hour ago
[-]
I kind of drive that way. I slow down, move as far away in my lane from the parked cars as possible. It's certainly what I would expect from a machine that would claim to be as good as the best human driver.
reply
chaboud
3 hours ago
[-]
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls recently with Waymos driving on the wrong side of the road, and a Waymo cut in front of my car at a one-way stop (T intersection) when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS-brake to avoid an accident.

Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.

So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.

reply
jobs_throwaway
3 hours ago
[-]
Certainly, I'm not against constructive criticism of Waymo. I just think it's important to consider the counterfactual. You're right too that an especially prudent human driver may have avoided the scenario altogether, and Waymo should strive to be that defensive.
reply
veltas
3 hours ago
[-]
Absolutely, I can tell you right now that many human drivers are probably safer than the Waymo, because they would have slowed down even more and/or stayed further from the parked cars outside a school; they might have even seen the kid earlier in e.g. a reflection than the Waymo could see.
reply
mlyle
2 hours ago
[-]
It seems it was driving pretty slow (17MPH) and they do tend to put in a pretty big gap to the right side when they can.

There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows/reflections). But there's also the seeing-in-all-directions, radar, superhuman reaction time, etc, on the side of the Waymo.

reply
drunner
6 minutes ago
[-]
A human driver in a school zone during morning drop-off would be scanning the sidewalks and paying attention to children that disappear behind a double-parked SUV or car in the first place, no?

As described by the nhtsa brief:

"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"

The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" part means that Waymo was driving recklessly by merely obeying the speed limit here (assuming it was 20 mph), in a way that many humans would not.

reply
tintor
4 minutes ago
[-]
"fully attentive human driver ..." is Waymo's claim, and it could be biased in their favor.
reply
torginus
3 hours ago
[-]
I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV'; was the Waymo overtaking?), and overtaking is something I would probably never do there (and it should be banned in school zones by road signs).

This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.

reply
mikkupikku
1 hour ago
[-]
> especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?)

Yep. Driving safe isn't just about paying attention to what you can see, but also paying attention to what you can't see. Being always vigilant and aware of things like "I can't see behind that truck."

Honestly I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, to allow inferring possible risks from incomplete or absent data.

reply
ndsipa_pomu
2 hours ago
[-]
I appreciate your sensible driving, but here in the UK, roads outside schools are complete mayhem at dropping off/picking up times. Speeding, overtaking, wild manoeuvres to turn round etc.

When reading the article, my first thought was that only going at 17mph was due to it being a robotaxi whereas UK drivers tend to be strongly opposed to 20mph speed limits outside schools.

reply
zdragnar
1 hour ago
[-]
Most US states cap speed limits around schools at 15 mph when children are present. There may also be blinking lights above these signs during the times when children are likely to be present.

I'm not sure how much of that Waymo's cars take into account, as the law technically takes into account line of sight things that a person could see but Waymo's sensors might not, such as children present on a sidewalk.

reply
jefftk
20 minutes ago
[-]
> Most US states cap speed limits around schools at 15mph when children are present.

Are you sure? The ones I've seen have usually been 20 or 25mph.

Looking on Image Search (https://www.google.com/search?q=school+zone+speed+limit+sign) and limiting just to the ones that are photos of real signs by the side of the road, the first 10 are: 25, 30, 25, 20, 35, 15, 20, 55, 20, 20. So only one of these was 15.

reply
cucumber3732842
41 minutes ago
[-]
School pick up and drop off traffic is just about the worst drivers anywhere. Like visibly worse than a bunch of "probably a little drunk" people leaving a sports stadium. It's like everyone reverts to "sixteen year old on first day behind the wheel" behavior. It's baffling. And there's always one token dad picking up his kid on a motorcycle or in a box truck or something that they all clutch their pearls at.
reply
shaky-carrousel
14 minutes ago
[-]
A fully attentive human would've known he was near a school and wouldn't have been driving at 17 mph to begin with.
reply
ahahahahah
8 minutes ago
[-]
You clearly don't spend much time around a school measuring the speed of cars. Head on down and see for yourself how often or not a human driver goes >17mph in such a situation.
reply
micromacrofoot
3 hours ago
[-]
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.

Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.

reply
IncreasePosts
1 hour ago
[-]
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?

Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.

reply
scarmig
3 hours ago
[-]
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
reply
kilotaras
1 hour ago
[-]
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.

UK driving theory test has a part called Hazard Perception: not reacting on children milling around would be considered a fail.

[0] https://www.safedrivingforlife.info/free-practice-tests/haza...

reply
mlyle
59 minutes ago
[-]
Many states in the US have the Basic Speed Law, e.g. California:

> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.

The speed limit isn't supposed to be a carte blanche to drive at that speed no matter what; the basic speed law is supposed to "win." In practice, enforcement is a lot more clear cut at the posted speed limit and officers don't want to write tickets that are hard to argue in court.

reply
matt-attack
2 hours ago
[-]
Exactly. That's why I've always said that driving is truly an AGI-requiring activity. It's not just about sensors and speed limits and feedback loops. It's about having a true understanding of everything that's happening around you:

Having an understanding of the density and makeup of an obstacle that just blew in front of you, because it was only a cardboard box. Seeing how it tumbles lightly through the wind, and instantaneously forming a complete model of its mass and structure in your mind. Recognizing that the flimsy fragment, though large, will do no damage and doesn't justify a swerve.

Getting into the mind of the driver in front of you by noticing subtle hints that they're looking down, and recognizing that they're not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they're not quite there yet.

Or, in this case, perhaps hearing the sounds of children playing, recognizing that it's 3:20 PM and school is out, seeing the other cars double-parked as you mentioned: all of it instantly screaming to a human driver to be extremely cautious, because kids could be jumping out from anywhere.

reply
webdood90
1 hour ago
[-]
How many human drivers do you think would pass the bar you're setting?

IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.

reply
mlyle
56 minutes ago
[-]
> How many human drivers do you think would pass the bar you're setting?

How many humans drivers would pass it, and what proportion of the time? Even the best drivers do not constantly maintain peak vigilance, because they are human.

> IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.

In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust. And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.

I think "better than the average performance of a 75th or 90th percentile human driver" might be a good way to look at things.

It's going to be a weird thing, because odds are the distribution of accidents that do happen won't look much like human ones. It will have superhuman saves (like that scooter one), but it will also crash in situations that we can't really picture humans doing.

I'm reminded of airbags; even first generation airbags made things much safer overall, but they occasionally decapitated a short person or child in a 5MPH parking lot fender bender. This was hard for the public to stomach, and if it's your kid who is internally decapitated by the airbag in a small accident, I don't think you'll really accept "it's safer on average to have an airbag!"

reply
mlyle
2 hours ago
[-]
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast

Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.

But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.

(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
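The 17^2 - 5^2 > 16^2 point is easy to verify. With constant deceleration, v_impact^2 = v_start^2 - 2*a*d, so over a fixed braking distance the amount of speed-squared shed is the same regardless of the actual deceleration, and the units cancel, so we can stay in mph:

```python
# Speed-squared shed over the available braking distance, taking the
# reported 17 mph -> ~5 mph as given (decel-independent comparison).
scrubbed = 17**2 - 5**2
print(scrubbed, ">", 16**2, "?", scrubbed > 16**2)

# Starting 1 mph slower over the same distance: a negative value means
# the car would have reached a full stop before the point of contact.
impact_sq = 16**2 - scrubbed
print("impact speed^2 from 16 mph:", impact_sq)
```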

reply
pastage
2 hours ago
[-]
Swedish schools still have students who walk there. I live near one and there are very few cars that exceed 20km/h during rush hours. Anything faster is reckless even if the max over here is 30 km/h (19 mph).
reply
mlyle
2 hours ago
[-]
The schools I'm thinking of have sidewalks with some degree of protection/offset from street, and the crossings are protected by human crossing guards during times when students are going to schools. The posted limits are "25 (MPH) When Children Are Present" and traffic generally moves at 20MPH during most of those times.

There are definitely times and situation where the right speed is 7MPH and even that feels "fast", though, too.

reply
drcongo
2 hours ago
[-]
Whoa! You're allowed to double park outside a school over there?!
reply
recursive
1 hour ago
[-]
Wait, is double parking allowed anywhere?
reply
something765478
1 hour ago
[-]
Pretty common at airports; of course, the "parking" only lasts a few minutes at most.
reply
acdha
58 minutes ago
[-]
It’s common but almost always illegal based on the posted signage.
reply
dboreham
1 hour ago
[-]
People loitering in their cars waiting for a space to pick up their kid. So not actually parked.
reply
trollbridge
35 minutes ago
[-]
More like standing, and quite common in a school zone.

I would not race at 17 MPH through such an area. Of course, Waymo will find a way to describe themselves as the heroes of this situation.

reply
calchris42
3 hours ago
[-]
AVs with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.

The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car had some level of caution, which is good.

reply
barbazoo
56 minutes ago
[-]
For me it would be interesting to know whether 17 mph was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather? Were there parked cars that would make a defensive driver slow down even more?
reply
alphazard
1 hour ago
[-]
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions. That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
reply
WheatMillington
23 minutes ago
[-]
I dont think Waymo is interested in using a video of their car striking a child as marketing.
reply
jonas21
26 minutes ago
[-]
Presumably, it would be better marketing if they didn't hit the kid.
reply
boh
29 minutes ago
[-]
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
reply
moomoo11
16 minutes ago
[-]
The general public is stupid.

That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.

And why they can be ignored and just fed some slop to feel better.

I could lie but that’s the cold truth.

reply
random_duck
3 hours ago
[-]
They are being very transparent about it.
reply
direwolf20
3 hours ago
[-]
As every company should, when they have a success. Are they also as transparent about their failures?
reply
dylan604
2 hours ago
[-]
How is hitting a child not a failure? And actually, how can you call this a success? Do you think this was a GTA side mission?
reply
trillic
1 hour ago
[-]
Why didn't Sully just not hit the birds?
reply
recursive
4 minutes ago
[-]
Skill issue, presumably.
reply
direwolf20
2 hours ago
[-]
Immediately hitting the brakes when a child suddenly appears in front of you, instead of waiting 500 ms like a human, and thereby hitting the child at 6 mph instead of 14, is a success.

What else do you expect them to do? Only run in grade-separated areas where children can't access the road? Blare sirens so children get scared away from roads? Shouldn't human-driven cars do the same thing then?

reply
recursive
1 hour ago
[-]
I don't know the implementation details, but success would be not hitting pedestrians. You have some interesting ideas on how to achieve that but there might be other ways, I don't know.
reply
gruez
23 minutes ago
[-]
>I don't know the implementation details, but success would be not hitting pedestrians.

So by that logic, if we cured cancer but the treatment came with terrible side effects it wouldn't be considered a "success"? Does everything have to perfect to be a success?

reply
recursive
14 minutes ago
[-]
If you clearly define your goals in advance, then you can make success whatever you want. What are Waymo's goals?
reply
dylan604
13 minutes ago
[-]
This isn't Apollo 13 with a successful failure. A driverless car hit a human that just happened to be a kid. Doesn't matter if a human would have as well, the super safe driverless car hit a kid. Nothing else matters. Driverless car failed.
reply
direwolf20
10 minutes ago
[-]
If failure is defined such that failure is the only possible outcome, I don't think it's a useful part of an evaluation.
reply
orwin
1 hour ago
[-]
17 mph is way too fast near a school if it's around the time children are getting out (or in).
reply
seanmcdirmid
1 hour ago
[-]
The limit is 20 MPH in Washington state, in California the default is 25 MPH, but is going to 20 MPH soon and can be further lowered to 15 MPH with special considerations.

The real killer here is the crazy American on street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no street parking zone. But parents are going to whine they can't load and unload their kids close to the school.

reply
jerlam
54 minutes ago
[-]
On street parking is so ingrained into the American lifestyle that any change to the status quo is impossible. Cars have more rights on public property than people. Every suburban neighborhood has conflicts over people's imagined "ownership" of the street parking in front of their house. People rarely use their garages to store their car since they can just leave it on the street. There are often laws that prevent people from other neighborhoods from using the public street to park. New roads are paved as wide as possible to allow both street parking and a double-parked car to not impede traffic. And we've started building homes without any kind of parking that force people to use the street.
reply
JumpCrisscross
38 minutes ago
[-]
> On street parking is so ingrained into the American lifestyle that any change to the status quo is impossible

Plenty of American cities regulate or even eliminated, in various measures, on-street parking.

reply
seanmcdirmid
3 minutes ago
[-]
Europe is much better at this than we are. Even when you have on street parking, they make sure there are clearances around cross walks and places where there are lots of pedestrians. Most US cities don't even care, even a supposedly pedestrian friendly one like Seattle.
reply
the_other
38 minutes ago
[-]
In the UK we have a great big yellow zig-zag road marking that extends 2/3rds the width of an average car across the road. It means "this is a school, take your car and fuck off". You find it around school gates, to a distance of a few car lengths either side of the gate, and sometimes all along the road beside a school.

It doesn't stop all on street parking beside the school, but it cuts it down a noticeable amount.

reply
trollbridge
34 minutes ago
[-]
If it had no parking, the parents would just park somewhere else and load and unload their kids there, and then that would need to be a no-parking zone too.

I guess you could keep doing that until kids just walk to and from school?

reply
seanmcdirmid
18 minutes ago
[-]
Our local school has them unload a block away unless they are handicapped. A kid isn't going to die walking a block. But it's pointless, because they still allow residential on-street parking around the school, and my son has to use a crosswalk that cars routinely park so close to that I had to tell him the (pretty heavy) traffic on the road wouldn't see him easily, and that he should always ease his way into the crosswalk and not assume he would be seen.
reply
dboreham
1 hour ago
[-]
This isn't universal. The schools in our Montana town have pickup lanes and short term parking areas for pickup. Stopping on the road isn't allowed.
reply
trollbridge
33 minutes ago
[-]
Same for my tiny town. Stopping on the road is 100% not allowed, and parking isn't allowed there either. The school has its own parking area to park and pick up/drop off kids, and cars in there creep at 2 or 3 MPH.
reply
parl_match
1 hour ago
[-]
> and thereby hitting the child ... is a success.

> What else to you expect them to do, only run on grade–separated areas where children can't access?

No, I expect them to slow down when children may be present.

reply
direwolf20
1 hour ago
[-]
how slow?
reply
autoexec
1 hour ago
[-]
They've gone to the courts to fight to keep some of their safety data secret

https://www.theverge.com/2022/1/28/22906513/waymo-lawsuit-ca...

reply
BugsJustFindMe
2 hours ago
[-]
Well, as a comparison, we know that Tesla has failed to report to NHTSA any collisions that didn't deploy the airbag.
reply
voidUpdate
3 hours ago
[-]
Is this a success? There was still an incident. I'd argue this was them being transparent about a failure
reply
TeMPOraL
2 hours ago
[-]
Being transparent about such incidents is also what stops them from potentially becoming business/industry-killing failures. They're doing the right thing here, but they also surely realize how much worse it would be if they tried to deny or downplay it.
reply
xnx
1 hour ago
[-]
> they also surely realize how much worse it would be if they tried to deny or downplay it.

Indeed. Waymo is a much more thoughtful and responsible company than Cruise, Uber, or Tesla.

"Cruise admits to criminal cover-up of pedestrian dragging in SF, will pay $500K penalty" https://www.sfgate.com/tech/article/cruise-fine-criminal-cov...

reply
direwolf20
3 hours ago
[-]
They handled an unpredictable emergency situation better than any human driver.
reply
micromacrofoot
3 hours ago
[-]
as far as we know
reply
dcanelhas
2 hours ago
[-]
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.

What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents, to see how "most people" would have acted in the minutes leading up to the event as well as during the accident itself.

reply
aaomidi
2 hours ago
[-]
17 mph is pretty slow unless it’s a school zone
reply
dcanelhas
2 hours ago
[-]
Indeed, 15 or 25 mph (24 or 40 km/h) are the speed limits in school zones (when in effect) in CA, for reference. But depending on the general movement and density and category of pedestrians around the road it could be practically reckless to drive that fast (or slow).
reply
Teknoman117
1 hour ago
[-]
If my experience driving through a school zone on my way to work is anything to go off of, I rarely see people actually respecting it. 17 mph would be a major improvement over what I'm used to seeing.
reply
mholt
1 hour ago
[-]
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)

This is the fault of the software and company implementing it.

reply
recursive
2 minutes ago
[-]
What's the success rate of this intuitive sense that humans have? Intuitions are wrong frequently.
reply
BugsJustFindMe
18 minutes ago
[-]
> Humans have this intuitive sense.

Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.

reply
rdudek
3 hours ago
[-]
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
reply
dyauspitr
3 hours ago
[-]
It’s great handling of the situation. They should release a video as well.
reply
dust42
2 hours ago
[-]
Indeed. Rather than having the company telling me that they did great I'd rather make up my own mind and watch the video.
reply
croes
1 hour ago
[-]
We should take their reporting with a grain of salt and wait for official results
reply
veltas
3 hours ago
[-]
EDIT: replies say I'm misremembering, disregard.
reply
chaboud
3 hours ago
[-]
That was Cruise, and that was fixed by Cruise ceasing operations.
reply
seanmcdirmid
3 hours ago
[-]
I don’t think that was Waymo right? Cruise is already wound down as far as I know.
reply
lostlogin
1 hour ago
[-]
> I honestly cannot imagine a better outcome or handling of the situation.

If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.

reply
anovikov
3 hours ago
[-]
Most humans in that situation wouldn't have the reaction speed to do shit about it, and it could have resulted in severe injury or death.
reply
gensym
2 hours ago
[-]
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.

The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.

reply
pastage
2 hours ago
[-]
Building on my own experience, I think you have to own that if you crash into someone you made a mistake. I do agree that car and road design makes it almost impossible for bicycles to get around without risking things like that.
reply
jayd16
3 hours ago
[-]
Humans are not going to win on reaction time but prevention is arguably much more important.
reply
lokar
3 hours ago
[-]
How would standard automatic braking (standard in some brands) have performed here?
reply
aanet
12 minutes ago
[-]
This is the classic Suddenly Revealed Pedestrian test case, which afaik, most NCAP (like EuroNCAP, Japan NCAP) have as part of their standard testing protocols.

Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.

Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.

This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.

[1] Yes, I'm an AV safety expert

[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...

(edit: verbiage)

reply
maerF0x0
2 hours ago
[-]
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.

I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .

> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Meanwhile in my area of the world parents are busy, stressed, and on their phones, and pressing the accelerator hard because they're time pressured and feel like that will make up for the 5 minutes late they are on a 15 minute drive... The truth is this technology is, as far as i can tell, superior to humans in a high number of situations if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nit picking it.

A story, my grandpa drove for longer than he should have. Yes him losing his license would have been the optimal case. But, pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.

reply
Veserv
5 minutes ago
[-]
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7,000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.

However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
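The arithmetic above, as a quick sanity check (a sketch; both inputs are the rough estimates cited, so treat the output as order-of-magnitude only):

```python
# Rough rate comparison using the figures cited in this comment.
us_vehicle_miles_per_year = 3e12     # ~3 trillion US vehicle-miles annually [1]
child_ped_injuries_per_year = 7_000  # NHTSA child-pedestrian injury estimate [2]

miles_per_injury = us_vehicle_miles_per_year / child_ped_injuries_per_year
print(f"Human baseline: one injury per ~{miles_per_injury / 1e6:.0f}M miles")  # ~429M

# Waymo has logged on the order of 100-200M autonomous miles, with one such injury.
for waymo_miles in (100e6, 200e6):
    print(f"At {waymo_miles / 1e6:.0f}M miles: ~{miles_per_injury / waymo_miles:.1f}x the human rate")
```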

[1] https://afdc.energy.gov/data/10315

[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...

reply
maerF0x0
3 minutes ago
[-]
If that's the case, then that's great info. Thank you for adding :)
reply
dlg
53 minutes ago
[-]
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.

While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid went behind a car and ran right out into the street.

If those rumors are correct, I'd say it was the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.

reply
sowbug
35 minutes ago
[-]
When my kids were school age, I taught them that the purpose of crosswalk lines is to determine who pays for your funeral.

They got the point.

reply
joefarish
23 minutes ago
[-]
This is a very good way of putting it.
reply
trollbridge
32 minutes ago
[-]
I do not like the phrase "it's the kid's fault" for a kid being hit by a robot-car.

It is never a 6 year old's fault if they get struck by a robot.

reply
blell
22 minutes ago
[-]
Exactly. It's his parents' fault.
reply
Zigurd
46 minutes ago
[-]
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
reply
aimor
55 minutes ago
[-]
The school speed limit there is 15 mph, and that wasn't enough to prevent an accident.

https://www.yahoo.com/news/articles/child-struck-waymo-near-...

https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac

reply
JumpCrisscross
36 minutes ago
[-]
The interesting thing is a 12 mph speed limit would be honored by an autonomous vehicle but probably ignored by humans.
reply
nkrisc
10 minutes ago
[-]
Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up precisely because kids do dumb stuff like this. Others do too, though not everyone of course, incredibly.
reply
jsrozner
23 minutes ago
[-]
So the waymo was speeding! All the dumbasses on here defending waymo when it was going 17 > 15.

Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?

reply
cucumber3732842
3 minutes ago
[-]
> Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?

Can you imagine being dumb enough to think that exceeding a one-size-fits-all number on a sign by <10% is the main failing here?

As if 2mph would have fundamentally changed this. Pfft.

A double-parked car, on a street with chock-full parking (hence the double park), next to "something" that's a magnet for pedestrians, and probably a bunch of pedestrians in sight, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone warranting a particular magic number.

The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.

reply
simojo
3 hours ago
[-]
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.

This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.

reply
energy123
3 hours ago
[-]
From a purely stats pov, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously to not brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms assuming uncorrelated errors. This is why I love the concept of Lidar and optical together.
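A toy version of the p^n point (the per-mechanism miss probability p = 0.01 is purely illustrative, not a real sensor figure):

```python
# n independent detection mechanisms, each missing a hazard with probability p.
# The vehicle fails to brake only if every mechanism misses simultaneously,
# which under the independence assumption happens with probability p**n.
def miss_probability(p: float, n: int) -> float:
    return p ** n

for n in (1, 2, 3):
    print(f"{n} mechanism(s): P(all miss) = {miss_probability(0.01, n):.0e}")
# Each added uncorrelated sensor (e.g. lidar alongside cameras) multiplies
# the false-negative rate by another factor of p.
```

The independence assumption is the load-bearing part: correlated failure modes (fog, occlusion) erode the p^n benefit.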
reply
Dlanv
1 hour ago
[-]
With above-average human reflexes, the kid would have been hit at 14mph instead of 6mph.

About 5x more kinetic energy.
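Quick check of the "about 5x" figure (impact speeds from the comments above; kinetic energy goes as the square of speed, so the vehicle's mass cancels):

```python
# KE = 1/2 * m * v^2; comparing the same vehicle at two impact speeds,
# the mass cancels and the ratio is just (v1 / v2) ** 2.
v_modeled_human = 14.0  # mph, Waymo's modeled attentive-human impact speed
v_actual = 6.0          # mph, reported actual impact speed

ratio = (v_modeled_human / v_actual) ** 2
print(f"~{ratio:.1f}x the kinetic energy")  # ~5.4x, i.e. "about 5x"
```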

reply
samrus
48 minutes ago
[-]
But would a human be driving at 17 in a school zone during drop off hours? I'd argue a human may be slower exactly because of this scenario
reply
JumpCrisscross
35 minutes ago
[-]
> would a human be driving at 17 in a school zone during drop off hours?

In my experience in California, always and yes.

reply
Bukhmanizer
3 hours ago
[-]
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
reply
jayd16
3 hours ago
[-]
It honked at you? But local laws dictate that it angrily flashes its high beams at you.
reply
pengaru
1 hour ago
[-]
In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.

The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.

They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.

reply
koolba
44 minutes ago
[-]
> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?

reply
sweezyjeezy
34 minutes ago
[-]
These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.
reply
sowbug
37 minutes ago
[-]
You're right: a quick search shows that pedestrian fatalities are 43% higher on Halloween.
reply
Rudybega
34 minutes ago
[-]
That's probably more a function of more people being in the road than people not understanding what object they're about to hit.
reply
rullelito
36 minutes ago
[-]
Lidar would pick up a moving object in 3D, so it's unlikely to just keep going.
reply
Rudybega
35 minutes ago
[-]
"Oh that obstructing object doesn't look like a child? Gun it, YOLO." Lmao.

I suspect the cars are trying to avoid running into anything, as that's generally considered bad.

reply
pmontra
1 hour ago
[-]
Who is legally responsible in case a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In the case of Waymo?
reply
ssl-3
15 minutes ago
[-]
When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.

In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.

(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)

reply
jeffbee
1 hour ago
[-]
A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge in consequence. Part of Waymo's research is to show that even injury accidents are rarely reported to the police.
reply
hiddencost
1 hour ago
[-]
Are you thinking of civil liability or criminal liability?

Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.

For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.

reply
trollbridge
29 minutes ago
[-]
Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.
reply
NoGravitas
1 hour ago
[-]
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. It could have been a lot worse, and probably wouldn't have gone any better with a human driver (I'm just going to ignore, as carrying no signal, Waymo's own models that say an attentive human driver would have done worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
reply
WarmWash
3 hours ago
[-]
Oddly I cannot decide if this is cause for damnation or celebration

Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

reply
Filligree
3 hours ago
[-]
> Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.

reply
micromacrofoot
3 hours ago
[-]
> Waymo is held to a significantly higher standard than human drivers.

They have to be, as a machine can not be held accountable for a decision.

reply
dragonwriter
29 minutes ago
[-]
Waymo is not a machine, it is a corporation, and corporations can, in fact be held accountable for decisions (and, perhaps more to the point, for defects in goods they manufacture, sell, distribute, and/or use to provide services.)
reply
JumpCrisscross
33 minutes ago
[-]
> They have to be, as a machine can not be held accountable for a decision

This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.

reply
TeMPOraL
2 hours ago
[-]
The promise of self-driving cars being safer than human drivers is also kind of the whole selling point of the technology.
reply
micromacrofoot
1 hour ago
[-]
Sure, but the companies building them are just shoving billions of dollars into their ears so they don't have to answer "who's responsible when it kills someone?"
reply
myrmidon
1 hour ago
[-]
What? No? The main selling point is eliminating costs for a human driver (by enabling people to safely do other things from their car, like answering emails or doomscrolling, or via robotaxis).
reply
CaliforniaKarl
1 hour ago
[-]
For reference, here's a link to Waymo's blog post: https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
reply
alkonaut
3 hours ago
[-]
And before the argument "Self driving is acceptable so long as the accident/risk is lower than with human drivers" can I please get that out of the way: No it's not. Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or have personal liability. We accept the risks with humans because those humans accept risk. Self driving abstracts the legal risk, and removes the physical risk.

I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

reply
jillesvangurp
2 hours ago
[-]
I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar solutions gets a lot of press. This was a major incident with a happy ending. Those are quite rare. The lethal ones even rarer.

As for more data, there is a chicken egg problem. A phased roll out of waymo over several years has revealed many potential issues but is also remarkable in the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.

Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. The normal US statistics are ~17 deaths per 100K drivers per year. 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison of course. But the bar for safety is pretty low as soon as you include human drivers.

Liability weighs higher for companies than safety. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for amounts of miles driven with and without autonomous, there's very little doubt that autonomous driving is already much safer. We can get more data at the price of more deaths by simply dragging out the testing phase.

Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.

reply
alkonaut
2 hours ago
[-]
> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.

I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US centric (because that's where the companies and their testing is) I don't think it can be representative for the risks of driving in general. Naturally, robotaxis will benefit from better infra outside the US (e.g. better separation of pedestrians) but it'll also have to clear a higher safety bar e.g. of fewer drunk drivers.

reply
trillic
1 hour ago
[-]
It will also never get worse. This is the worst the algorithms will ever be, from this point forward.
reply
jerlam
39 minutes ago
[-]
I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. This is not like a compression algorithm where things are easily testable and verifiable. If Waymos start behaving extra-oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, other pedestrians, or other vehicles.

I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the amount of disengagements (errors):

https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...

reply
sowbug
20 minutes ago
[-]
And we haven't reached the point where people start walking straight into the paths of cars, either obliviously or defiantly. https://www.youtube.com/shorts/nVEDebSuEUs
reply
trollbridge
27 minutes ago
[-]
Has this been true of other Google products? They never get worse?
reply
jonas21
3 hours ago
[-]
> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

Do you mean like this?

https://waymo.com/safety/impact/

reply
alkonaut
2 hours ago
[-]
Yes but ideally from some objective source.
reply
xnx
50 minutes ago
[-]
reply
trollbridge
27 minutes ago
[-]
Maybe an objective source that isn't on the waymo.com domain?
reply
WarmWash
3 hours ago
[-]
If Waymo is to be believed, they hit the kid at 6 mph and estimated that a human driver at full attention would have hit the kid at 14 mph. The Waymo was traveling 17 mph. The situation of "kid running out between cars" will likely never be solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.

I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
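A rough sketch of that physical cap, as a friction-limited stopping distance (mu = 0.8 is an assumed dry-pavement value, and reaction time is taken as zero, i.e. the best case):

```python
# Maximum deceleration is roughly mu * g regardless of reaction time;
# from v^2 = v0^2 - 2*a*d, the minimum stopping distance is v0^2 / (2*mu*g).
MPH_TO_MS = 0.44704
g = 9.81   # m/s^2
mu = 0.8   # assumed dry-pavement tire-road friction coefficient

v0 = 17 * MPH_TO_MS  # the reported 17 mph, in m/s
d_min = v0 ** 2 / (2 * mu * g)
print(f"Minimum stopping distance from 17 mph: {d_min:.1f} m")
# ~3.7 m: a pedestrian appearing closer than this cannot be fully avoided,
# only hit at a reduced speed, no matter how fast the reaction.
```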

reply
recursive
1 hour ago
[-]
That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
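That rule can be sketched as a formula: given clearance d to a potential hazard, the highest speed you can still stop from is sqrt(2*mu*g*d). The mu = 0.8 dry-pavement friction value is an assumption, and this ignores the pedestrian's own motion into your lane, which argues for going slower still.

```python
import math

MPH_TO_MS = 0.44704

def max_safe_speed_mph(clearance_m: float, mu: float = 0.8, g: float = 9.81) -> float:
    """Highest speed from which friction-limited braking stops within clearance_m."""
    return math.sqrt(2 * mu * g * clearance_m) / MPH_TO_MS

for d in (2.0, 6.0, 12.0):  # metres of visible clearance to an occlusion
    print(f"{d:.0f} m clearance -> {max_safe_speed_mph(d):.0f} mph max")
```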
reply
crazygringo
1 hour ago
[-]
If we adopted that level of risk, we'd have 5mph speed limits on every street with parking. As a society, we've decided that's overly cautious.
reply
mhast
45 minutes ago
[-]
But with waymos it would be possible. Mark those streets as "extremely slow" and never go there unless you are dropping someone off. (The computer has more patience than human drivers.)

If that's too annoying, then ban parking by school areas so the situation doesn't happen.

reply
crazygringo
40 minutes ago
[-]
I don't know if you've been to some cities or neighborhoods but almost every street has on-street parking in many of them.

And why would you make Waymo's go slower than human drivers, when it's the human drivers with worse reaction times? I had interpreted the suggestion as applying to all drivers.

reply
xnx
48 minutes ago
[-]
Second-order benefit: More Waymos = fewer parked cars
reply
recursive
16 minutes ago
[-]
In high parking-contention areas, I think there's enough latent demand for parking that you wouldn't observe fewer parked cars until you reduce demand by a much greater amount.
reply
alkonaut
2 hours ago
[-]
Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.
reply
maerF0x0
1 hour ago
[-]
> The situation of "kid running out between cars" will likely never be solved

Nuanced disagree (I agree with your physics), in that an element of the issue is design. Kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.

One simple change could be adding a chain link fence / boundary between parked cars and driving cars, increasing the visibility and time.

reply
toast0
1 hour ago
[-]
How do you add a chain link fence between the parked and driving cars for on-street parking?
reply
maerF0x0
1 hour ago
[-]
There's still an inlet and outlet (kinda like hotel pickup/drop-off loops). It's not absolutely perfect, but it narrows where kids can dart out from down to 2 places instead of every parked car.

Also the point isn't the specifics, the point is that the current design is not optimal, it's just the incumbent.

reply
toast0
43 minutes ago
[-]
Ok, that's not really a simple change anymore, because you need more space for that. Unless it's really just a drop off queue, but then it's not parked cars, since a parked car blocks the queue.

We would really need to see the site to have an idea of the constraints; Santa Monica has some places where additional roadway can be accommodated and some places where that's not really an option.

reply
Archio
36 minutes ago
[-]
>We accept the risks with humans because those humans accept risk.

It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?

reply
trollbridge
27 minutes ago
[-]
I think a very good reason to want to know who's liable is because Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.

In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.

reply
sowbug
29 minutes ago
[-]
Even in terms of plain results, I'd say the consequences-based system isn't working so well if it's producing 40,000 US deaths annually.
reply
JumpCrisscross
33 minutes ago
[-]
> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it

It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.

reply
criddell
3 hours ago
[-]
Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?
reply
alkonaut
2 hours ago
[-]
"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.
reply
criddell
1 hour ago
[-]
If the current situation was every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people are going to die but someone will be held responsible for those deaths?
reply
crazygringo
1 hour ago
[-]
People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.
reply
zamadatix
43 minutes ago
[-]
Apart from a minority of car related deaths resulting in jail time, what kind of person wants many more people to die just so they can point at someone to blame for it? At what point are such people the ones to blame for so many deaths themselves?
reply
renewiltord
1 hour ago
[-]
Do they go to jail?

That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...

The driver cuts in front of one person on an e-bike so fast they can’t react and hit them. Then after being hit they step on the accelerator and go over the sidewalk on the other side of the road killing a 4 year old. No charges filed.

This driver will be back on the street right away.

reply
xnx
46 minutes ago
[-]
Ugh. That is so despicable both of the driver and as a society that we accept this. Ubiquitous Waymo can't come soon enough.
reply
xnx
56 minutes ago
[-]
> Self driving needs to be orders of magnitude safer for us to acknowledge it

All data indicates that Waymo is ~10x safer so far.

"90% Fewer serious injury or worse crashes"

https://waymo.com/safety/impact/

reply
jtrueb
3 hours ago
[-]
Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.
reply
lokar
3 hours ago
[-]
I generally agree the bar is high.

But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.

There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.

reply
cameldrv
3 hours ago
[-]
That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?
reply
Dlanv
1 hour ago
[-]
Basically, Waymo just prevented a kid's potential death.

Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
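For rough scale (a back-of-envelope sketch using only the speeds reported in the story, not any official Waymo analysis): kinetic energy grows with the square of speed, so comparing the reported contact speed to the modeled human-driver speed gives an energy ratio in roughly that range.

```python
# Back-of-envelope: kinetic energy is (1/2) m v^2, so for the same
# mass the impact-energy ratio between two speeds is (v1/v2)**2.
def energy_ratio(v1_mph: float, v2_mph: float) -> float:
    """Ratio of kinetic energies at two speeds (mass and units cancel)."""
    return (v1_mph / v2_mph) ** 2

# Modeled attentive-human contact speed (14 mph) vs. Waymo's reported
# contact speed (under 6 mph):
print(round(energy_ratio(14, 6), 1))  # -> 5.4

# Upper bound: no braking at all from the initial 17 mph:
print(round(energy_ratio(17, 6), 1))  # -> 8.0
```

So "4-10x more force" is in the right ballpark if read as impact energy; the actual injury outcome depends on much more than speed, of course.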

reply
Petersipoi
39 minutes ago
[-]
You just invented a hypothetical situation in your head then drew conclusions from it. In my version, the other car misses the kid entirely.
reply
Archio
40 minutes ago
[-]
It's hard to imagine how any driver could have reacted better in this situation.

The argument that asks "would a human be driving 17 mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones, and human drivers routinely drive above 17 mph (in some cases, over the typical 20 or 25 mph legal limit). In deconstructing these incidents, critics seem to imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident they know about in advance, rather than facing the reality of what human drivers are actually like on the road.

reply
fortran77
35 minutes ago
[-]
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.

> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”

BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.

Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!

reply
trollbridge
29 minutes ago
[-]
Waymo will 100% go down a route human drivers avoid because it will have "less traffic".
reply
tekno45
47 minutes ago
[-]
can we just get waymo tech in busses?

Big vehicles that demand respect and aren't expected to turn on a dime, known stops.

reply
xnx
1 hour ago
[-]
Alternate headline: Waymo saves child's life
reply
recursive
5 minutes ago
[-]
In this timeline, we want our headlines to somehow reflect the contents of the story.

Saved child from what? From themselves. You can't take full credit for partially solving a problem that you, yourself, created.

reply
bpodgursky
3 hours ago
[-]
A human driver would most likely have killed this child. That's what should be on the ledger.
reply
toast0
1 hour ago
[-]
That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. IIHS says [1] in an article about other things:

> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries

Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that shouldn't be minimized. Still, it's worth examining the situation (when we have sufficient information) and validating Waymo's claim that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school drop-off: many human drivers are extra cautious in that context and may never have reached that speed, and depending on the end-to-end route, some would have avoided the street with the school altogether at that time of day. It certainly seems like a good result for the premise (child unexpectedly appears from between large parked vehicles), but maybe that appearance should have been expected.

[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...

reply
xnx
45 minutes ago
[-]
There's a 50/50 chance that a distracted driver wouldn't have slowed at all and run the child over.
reply
thatswrong0
47 minutes ago
[-]
> To estimate injury risk at different impact speeds, IIHS researchers examined 202 crashes involving pedestrians ages 16 or older

A child is probably more likely than an adult to die in a collision at the same speed.

reply
gortok
3 hours ago
[-]
For me, the policy question I want answered is this: if this were a human driver, we would have a clear person to sue for liability and damages. With a computer, who is ultimately responsible when someone sues for compensation? The company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to get to trial, whereas a private driver would lean on their insurance.
reply
emptybits
1 hour ago
[-]
Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, a poor financial feedback loop. No real skin in the game.

John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.

reply
seanmcdirmid
1 hour ago
[-]
> John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't going to impound his car.

Honestly, I'd much rather be hit by a Waymo than John.

reply
emptybits
47 minutes ago
[-]
I see. Thank you for sharing. Insurance is mandatory here for all motorists.

If you are hit by an underinsured driver, the government steps in and additional underinsured motorist protection (e.g. hit by an out of province/country motorist) is available to all and not expensive.

Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who will face jail time as a human driver would in the same scenario?

Hit and run, leaving the scene, is also a criminal offence with potential jail time that a human motorist faces. You would hope this is unlikely with AI, but if it happens a small percentage of the time, who at Waymo faces jail as a human driver would?

I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.

reply
seanmcdirmid
23 minutes ago
[-]
Insurance here is also mandatory for all motorists. Doesn't matter if the rules aren't actually enforced.

Waymo has deep pockets, so everyone is going to try and sue them, even if they don't have a legitimate grievance. Where I live, the city/state would totally milk each incident from a BigCo for all it was worth. "Hit and run" by a drunk waymo? The state is just salivating thinking about the possibility.

I don't agree with you that BigCorp doesn't have any skin in the game. They are basically playing the game in a bikini.

reply
xnx
43 minutes ago
[-]
> John probably (at least where I live) does not have insurance, maybe I could sue him, but he has no assets to speak of

https://en.wikipedia.org/wiki/Judgment_proof

reply
asystole
56 minutes ago
[-]
>John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!

reply
emptybits
28 minutes ago
[-]
Yes, that is the specific deterrence effect.

The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.

reply
jobs_throwaway
3 hours ago
[-]
So you're worried that instead of facing off against an insurance company, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.
reply
entuno
2 hours ago
[-]
Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.
reply
bpodgursky
3 hours ago
[-]
Personally I'm a lot more interested in kids not dying than in making income for injury lawyers. But that's just me.
reply
rationalist
3 hours ago
[-]
Your comment implies that they are less interested in kids not dying. Nowhere do they say that.
reply
bpodgursky
3 hours ago
[-]
I'm not interested in the policy question.
reply
rationalist
3 hours ago
[-]
Then don't reply??

That still doesn't excuse trying to make them look bad.

reply
bpodgursky
2 hours ago
[-]
It was a reply to my comment.
reply
boothby
1 hour ago
[-]
No, "the ledger" should record actual facts, and not whatever fictional alternatives we imagine.
reply
direwolf20
1 hour ago
[-]
Fact: This child's life was saved by the car being driven by a computer program instead of a human.
reply
NoGravitas
1 hour ago
[-]
Instead of a human who was driving exactly the same as the Waymo up until the instant the child ran out. Important distinction.
reply
frankharv
3 hours ago
[-]
Would have. Could Have. Should have.

Most humans would be halfway into other lane after seeing kids near the street.

Apologists see something different than me.

Perception.

reply
axus
3 hours ago
[-]
Disagree. Most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would be driving very carefully near blocked sight lines.

Better reporting would have asked real people the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues

"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."

reply
AnotherGoodName
2 hours ago
[-]
We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxes/etc. than humans when riding a bicycle.

We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch and there's no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it repeatedly as I ride the short distance in the lane before the bike lane reopens.

To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi, it seems to drive on the cautious side already.

Give me the unemotional, infinitely patient, drives-on-the-cautious-side automated taxi over humans any day.

reply
henning
3 hours ago
[-]
Q: Why did the self-driving car cross the road?

A: It thought it saw a child on the other side.

reply
direwolf20
3 hours ago
[-]
That's Tesla. Waymo seems mostly ok.
reply
whynotminot
3 hours ago
[-]
I’m actually pretty surprised Waymo as a general rule doesn’t completely avoid driving in school zones unless absolutely unavoidable.

Any accident is bad. But accidents involving children are especially bad.

reply
dylan604
2 hours ago
[-]
That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.
reply
trollbridge
26 minutes ago
[-]
Well, I'm a human and I figure out how to avoid school zones.
reply
whynotminot
2 hours ago
[-]
It might not be possible for a lot of places — I don’t really know.

But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.

Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)

reply
dylan604
2 hours ago
[-]
Most people prefer the shortest ride. Circling around school zones would be the opposite of that. Rides are charged based on distance, so maybe this would interest Waymo, but one of the big complaints about taxi drivers was how drivers would "take them for a ride" to increase the fare.
reply
whynotminot
2 hours ago
[-]
Seems like a solvable problem: make it clear on the app/interior car screens that a school zone is being avoided — I think most riders will understand this.

You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.

At worst, maybe Waymo eats the cost difference of choosing a more expensive route. That hits the bottom line, but there's also a business and reputational cost to "child hit by Waymo in school zone" headlines.

Again, this all seems very solvable.

reply
ripped_britches
1 hour ago
[-]
Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.
reply
joshribakoff
3 hours ago
[-]
> The vehicle remained stopped, moved to the side of the road

How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.

reply
callumgare
3 hours ago
[-]
My reading is that they mean it stopped the progression of the journey, rather than that it made no movement whatsoever.
reply
lokar
3 hours ago
[-]
I agree, it’s poorly worded but I think that’s what they mean.

I also assume a human took over (called the police, moved the car, etc) once it hit the kid.

reply
BugsJustFindMe
3 hours ago
[-]
They mean the vehicle didn't drive away. It moved to the side of the road and then stopped and waited.
reply
jsrozner
29 minutes ago
[-]
So many tech lovers defending waymo.

If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.

Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.

The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.

reply