_______ __ _______
| | |.---.-..----.| |--..-----..----. | | |.-----..--.--.--..-----.
| || _ || __|| < | -__|| _| | || -__|| | | ||__ --|
|___|___||___._||____||__|__||_____||__| |__|____||_____||________||_____|
on Gopher (unofficial)
HTML Visit Hacker News on the Web
COMMENT PAGE FOR:
HTML US probes Waymo robotaxis over school bus safety
PeterStuer wrote 9 hours 1 min ago:
The news for me was: Yahoo still exists?
tialaramex wrote 12 hours 48 min ago:
IIRC There's a principle in Judaism about deliberately not doing things
which you know aren't forbidden but might reasonably be interpreted
(wrongly) as forbidden by any observers. So that not only are you
behaving correctly but also onlookers see you behaving correctly. For
example if you're not supposed to eat bacon, the fact this product
looks like bacon means you shouldn't eat that, even though it's not
bacon - because if you do and somebody else sees that, what they saw
was you eating bacon.
In this case it may well be safe for the Waymo to pass a bus but, the
rule says not to pass a bus because humans will assume if the Waymo can
pass a bus so can they and that's false.
m0llusk wrote 21 hours 43 min ago:
This is a great technology and has clearly made great strides, but at
this time it is hard to trust. These vehicles have had many problems
that human drivers do not. Problems with maps can cause dozens of them
to collect in dead end alleys. They may stop on busy one lane roads.
They consistently fail to react appropriately to responders and
emergency situations. And even if the supposedly reliable recording
and reporting work out it is not always clear who is responsible when
things go wrong. Simply not killing as often as humans is not good
enough for mass deployment of this technology.
atleastoptimal wrote 22 hours 10 min ago:
On net, Waymos are safer than human drivers. Really all that matters is
deaths per passenger mile and, weighted far less, injuries and crashes
per passenger mile.
Waymos beat human drivers on both metrics, thus it is reasonable to
say that Waymos have reduced crashes compared to the equivalent average
human driver covering the same distance.
Mistakes like this are very rare, and when they do happen, they can be
audited, analyzed with thousands of metrics and exact replays, patched,
and the improved model running the Waymo is distributed to all cars on
the road.
There is no equivalent in humans. There are millions of human drivers
currently driving who drive distracted, drunk, recklessly, or
aggressively. Every one of them who is replaced with a Waymo is
potentially many lives saved.
Approximately 1 in 100 deaths in the US is due to a car crash. Every
year that autonomous drivers aren't rapidly deployed just means
unnecessary deaths.
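As a rough illustration of the comparison being made (the human
baseline below is the commonly cited NHTSA figure; the fleet numbers
are placeholders, not published Waymo data):

    # Fatality rates per 100 million vehicle miles traveled (VMT).
    HUMAN_RATE = 1.3  # fatalities per 100M VMT, approx. NHTSA figure

    def rate_per_100m_vmt(events: int, miles: float) -> float:
        """Events per 100 million miles driven."""
        return events / miles * 100_000_000

    # Hypothetical fleet: 50M miles driven, 0 fatalities observed.
    fleet_rate = rate_per_100m_vmt(0, 50_000_000)
    print(fleet_rate, fleet_rate < HUMAN_RATE)  # 0.0 True

Though as the reply below notes, a small number of fatality-free
miles proves less than it appears to.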
6stringmerc wrote 8 hours 25 min ago:
So what will these humans alternatively perish from? Old age? How is
that fiscally possible?
bloppe wrote 1 hour 33 min ago:
Probably heart disease
YeGoblynQueenne wrote 8 hours 26 min ago:
>> Really all that matters is deaths per passenger mile, and weighted
far less, injury/crash per passenger mile.
That's not exactly right. You need to take into account how likely it
is for accidents to happen, not just the number of miles travelled.
If the low probability of accidents is taken into account it turns
out it takes many more millions or even billions of miles than
already travelled for self-driving cars to be considered safe. See:
Driving to Safety: How Many Miles of Driving Would It Take to
Demonstrate Autonomous Vehicle Reliability? [1]
Given that current traffic fatalities and injuries are rare events
compared with vehicle miles traveled, we show that fully autonomous
vehicles would have to be driven hundreds of millions of miles and
sometimes hundreds of billions of miles to demonstrate their safety
in terms of fatalities and injuries. Under even aggressive testing
assumptions, existing fleets would take tens and sometimes hundreds
of years to drive these miles - an impossible proposition if the
aim is to demonstrate performance prior to releasing them for
consumer use. Our findings demonstrate that developers of this
technology and third-party testers cannot simply drive their way to
safety. Instead, they will need to develop innovative methods of
demonstrating safety and reliability.
HTML [1]: https://www.rand.org/pubs/research_reports/RR1478.html
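The statistical core of that argument can be reproduced with the
"rule of three": if zero fatalities are observed over n miles, the
95% upper confidence bound on the per-mile fatality rate is roughly
3/n. A quick check against an approximate US human baseline:

    # How many fatality-free miles before the 95% upper bound on the
    # fatality rate drops below the human baseline? Baseline is an
    # approximation of the commonly cited US figure.
    baseline = 1.1 / 100_000_000   # fatalities per mile
    miles_needed = 3 / baseline    # rule-of-three requirement
    print(f"{miles_needed:,.0f}")  # ~273 million fatality-free miles

which matches the report's "hundreds of millions of miles" for
fatalities alone; demonstrating a rate meaningfully better than
human takes far more.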
tencentshill wrote 8 hours 46 min ago:
It just looks really bad when that one in a million death is caused
by something stupid a human would never do. So far these Waymos are
only replacing taxi and Uber drivers, which have a lower rate of
accidents than the general population.
"Uber reported 0.87 fatalities per 100 million vehicle miles traveled
(VMT) in 2021-2022"
HTML [1]: https://insurify.com/car-insurance/insights/rideshare-driver...
ozgrakkurt wrote 13 hours 30 min ago:
Taxi drivers or bus drivers are also much safer than regular people
if you interpret it like that
bloppe wrote 1 hour 33 min ago:
Which is also true
blub wrote 13 hours 34 min ago:
Actually, your assumption that human drivers cannot be improved is
wrong. Modern cars have a lot of safety features to help avoid
accidents:
* lane keeping with optional steering
* pedestrian and obstacle detection at the front
* pedestrian and obstacle detection when reversing with automatic
braking
* assisted driving with lane keeping and full stop / driving on in
case of traffic jam
Waymos are just fancy taxis. And taxis haven't replaced all human
drivers or solved traffic accidents.
ErroneousBosh wrote 17 hours 15 min ago:
> Approximately 1 in 100 deaths in the US is due to a car crash.
Every year that autonomous drivers aren't rapidly deployed just
means unnecessary deaths.
You could improve driver training. American drivers are absolutely
terrifying.
javagram wrote 13 hours 33 min ago:
"Could" is doing a lot of work there I think.
I suspect most problematic American drivers already know they
aren't supposed to text, drink, or watch or record TikToks while
they drive, but simply do it anyway because they are aware these
laws are under-enforced.
iambateman wrote 22 hours 18 min ago:
This is as close to functional as any car discussion gets... citizens
reported some issues, the government is checking on it, and it's
going to get fixed.
Avi-D-coder wrote 22 hours 43 min ago:
At some point self driving cars will need their own looser driving
laws. Perhaps allowing them to drive around school buses is not a good
idea, although personally I have felt far safer biking or walking in
front of a Waymo than a human. But relaxing rules few humans follow,
like rolling stops, or letting them go 5 over, seems like a
no-brainer. We have a real opportunity here to be more sensible with
road rules; let's not mess it up by limiting robots to our human laws.
platevoltage wrote 21 hours 4 min ago:
What do we have to gain by allowing self driving vehicles to roll
through stop signs?
lingrush4 wrote 13 hours 9 min ago:
Faster commutes and less wasted energy. This is obvious to anyone
even moderately intelligent.
ninalanyon wrote 14 hours 9 min ago:
We, the general public, gain nothing.
Corporations gain control of public spaces by casting other road
users as incompetent. Much the same as GM, etc., did with jaywalking
laws in the US.
Distinguishing between human and robot drivers in this way benefits
only corporations and the politicians they pay.
anitil wrote 1 day ago:
I have a question about the rules of school busses (I'm not American).
It seems like the expectation is that _all_ traffic is required to stop
if a bus is stopped, is that correct? If so, why?
Here (Australia) the bus just pulls over and you get off on to the
sidewalk, even children, why is it not the case in the US?
jofzar wrote 15 hours 9 min ago:
I'm an Australian also, this is the video that blew my mind. [1] It's
a long video but the tl;dr is that Americans don't have footpaths.
You would think they would, but nope, it's not like Australia where
everywhere you walk has a path and paths down to the road.
Even directly around schools there are no footpaths, and it's all
because it's no one's responsibility other than the homeowner.
HTML [1]: https://youtu.be/lShDhGn5e5s
brainwad wrote 14 hours 47 min ago:
Plenty of Australian suburbs have no footpaths either. The footpath
appearing and disappearing thing also happens.
jofzar wrote 14 hours 1 min ago:
Yes but not like America has it.
daemonologist wrote 1 day ago:
As mentioned, in a lot of suburban areas in the US where school buses
are common there are no crosswalks or traffic lights (or sidewalks or
physical bus stops, for that matter). Most of the time there isn't
so much traffic that stopping all of it is a huge burden.
Also, there's generally an exception for divided highways - if the
road has a physical median or barrier, the oncoming traffic doesn't
have to stop. I assume the bus route accounts for this and drops
kids off on the correct side of the road.
Loughla wrote 1 day ago:
(St)Roads where the kids have to cross a busy road to get to the
other side where their house is.
In my case, a rural highway where traffic goes 55mph.
It's better to stop all traffic than force kids to figure out how to
frogger through traffic.
Dylan16807 wrote 20 hours 23 min ago:
> (St)Roads where the kids have to cross a busy road to get to the
other side where their house is.
That's pretty different from my experience.
Almost all the school bus stops around here are on small low-speed
residential streets.
And while there are surely some stops on faster 2-lane roads...
A stroad or major road would mean 4+ lanes, which in my state means
the school bus only stops traffic on one side. No kids will be
crossing at those bus stops.
anitil wrote 1 day ago:
Ah there might be some assumption here that I didn't realise.
Typically we'd have a cross walk or traffic light near the bus stop
where you'd cross. I'm in Sydney so I don't know of anywhere that
you'd be going that fast that would also have bus stops (they may
exist, I'm just not aware of them).
strken wrote 21 hours 51 min ago:
The kids near me (in Melbourne, about 10km outside the CBD) just
take the same public transport system as everyone else. You don't
see school bus systems unless you're in the far outer suburbs, a
regional/rural area, or maybe some other special cases.
Growing up, our school bus stop was on a service road off a
100km/h highway, but it had good visibility in both directions
and most of the kids over the other side got dropped off by their
parents while they were young.
Loughla wrote 1 day ago:
Rural areas especially, but most small towns in the US don't have
crosswalks.
The closest crosswalk to my bus stop as a kid was about 45 miles away.
standardUser wrote 1 day ago:
San Francisco is the crucible (by US standards) of dealing with
pedestrians and I'm still shocked they launched there so early. But
with something as distinct and vulnerable as school busses, it's time
to think about hardware installation so automated vehicles can "see"
farther ahead.
platevoltage wrote 20 hours 59 min ago:
I'm sure it's only a matter of time before the taxpayers get to
subsidize these hardware installations instead of our own public
transit.
llsf wrote 1 day ago:
I cannot wait for the school bus to be a Waymo that could tell the
other Waymos around that it is full of vulnerable and unpredictable
little humans, and to be on the lookout.
beeflet wrote 17 hours 39 min ago:
I can't wait until every car on the road is required by law to be
self driving. You could have cars with no adults in them just driving
puppers around, and it can tell the other cars hey watch out I've got
a couple of good pupperinos inside so watch out!
The future is gonna be awesome. I fricking love science! Once we
unlock self driving car technology, we will finally be able to move
people and things from one place to another. All we have to do is
force everyone on the road to install a transponder in their car that
allows the government to track their location at all times, and
develop a massively complex computer-camera system inside of the car
that phones home and controls what the car is allowed to do.
Animats wrote 1 day ago:
There's a video of the actual incident.[1] (Yahoo posted a file
photo.)
The Waymo was entering from a side street, in front of the school bus.
It clearly recognized that it was in an iffy situation and slowed to
creeping speed, rather than blocking the intersection. No children are
visible.
If the school bus has a dashcam, much better info may be available.
This video starts too late.
HTML [1]: https://www.youtube.com/watch?v=uSjwolFxvpc
andoando wrote 1 day ago:
I don't get it. It looks like it just made a left turn in front of a
stopped school bus? That's illegal?
In any case it seems like a tiny issue. Illegal or not, it didn't do
anything dangerous.
cryptoegorophy wrote 20 hours 16 min ago:
And even then, the speed of the sensors and the compute would
outperform a human if a child were to sprint out from behind the
bus.
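The arithmetic behind that claim, using the standard ~1.5 s human
perception-reaction time from traffic engineering and a placeholder
figure for the machine (not a measured Waymo latency):

    # Distance covered during reaction time at a 25 mph zone speed.
    speed_fps = 25 * 5280 / 3600        # ~36.7 ft/s
    for who, t in [("human", 1.5), ("machine", 0.2)]:
        print(f"{who}: {speed_fps * t:.0f} ft before braking starts")
    # human: 55 ft, machine: 7 ft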
spike021 wrote 22 hours 44 min ago:
the point of a bus having lights flashing and the stop sign
extended is that kids could be coming or going from any direction
and especially when least expected. it's certainly a minor issue
until the worst case scenario happens.
bloppe wrote 1 hour 30 min ago:
And it would be an even bigger issue if the driver did not have
perfect 360-degree spatial awareness and could not react to a child
within single-digit milliseconds.
globular-toast wrote 16 hours 18 min ago:
This implies the absence of a school bus with flashing lights
means kids can't be coming and going from any direction when
least expected. It's a horrible solution and just another example
of reducing drivers' responsibility on the road and effectively
making it the victim's fault for being there.
The Waymo is going to be on high alert at all times, regardless
of any flashing lights or stop signs.
DangitBobby wrote 21 hours 4 min ago:
Lots of people make this mistake around school buses. It's
probably time for a different system if we are worried about
children's safety.
wiether wrote 17 hours 5 min ago:
We have some nice initiatives here.
Either completely removing cars from streets near schools, or
blocking cars when children are coming or leaving school.
HTML [1]: https://fr.wikipedia.org/wiki/Rues_aux_%C3%A9coles_%C3...
DangitBobby wrote 3 hours 27 min ago:
The issue in the US is not about safety near schools. School
buses often have to go pretty far to drop everyone off, so most
of their stops are not near the school. For route
optimization they'll drop kids off on the opposite side of
the road from where they need to be, not at a bus stop and
not anywhere near a crosswalk. Also kids in the US tend to
not be very mindful of how dangerous the road can be, so they
are liable to run into the street unpredictably. To make sure
kids don't get hit by a car that didn't see them, when
stopping a school bus deploys a stop sign from its side that
all drivers going either direction on the road must abide by
(usually, there are exceptions, which makes matters worse).
Drivers occasionally accidentally run these stop signs and
very rarely intentionally run them.
spike021 wrote 18 hours 53 min ago:
Sounds more like a testing problem to me. Honestly I can't even
remember if this particular rule was on the license exam when I
took it. I know it because I put more care into remembering
driving laws but many people don't.
DangitBobby wrote 3 hours 43 min ago:
Possibly an occasional refresher would help. I think it's
just a weird thing to have. A roving stop sign that appears
and disappears conditionally is going to have some people not
see it (people accidentally run stationary stop signs on
occasion), especially if you don't encounter school buses
often. It's been maybe a decade since I needed to stop for
one myself, I honestly cannot remember the last time.
cco wrote 23 hours 0 min ago:
haha great proof that humans don't follow the law all the time just
like Waymo.
Yes, if you see a school bus with its flashers on, you may not pass
it. Period.
FireBeyond wrote 10 hours 28 min ago:
See my sibling's comment about lanes and medians, but in general,
yes.
In fact, in my state, passing a school bus with its red flashers on
is the only thing we cannot do in an emergency vehicle (in my case,
ambulance and fire engine), even in "emergency mode" (lights and
sirens both active).
I've only ever had this happen twice though, and in both cases
the bus drivers stopped the process and turned their lights off
for us.
alasdair_ wrote 22 hours 5 min ago:
That depends on how many lanes there are and if there is a
median.
sjsdaiuasgdia wrote 23 hours 57 min ago:
> That's illegal?
The school bus' stop sign was extended and had red lights flashing.
With the proximity to the intersection, it's most appropriately
treated as an all-way stop.
Regardless of whether the bus' stop sign applies to cross streets,
at some point in the turn the car is now in parallel with the bus,
and the sign would apply at that point.
Also, you're blind to anyone who may be approaching the bus from
the opposite side of the intersection.
SoftTalker wrote 1 day ago:
"approached the school bus from an angle where the flashing lights and
stop sign were not visible"
I call bullshit on that. Yes the stop sign is only on the left side but
the flashing lights are on all four corners of the bus. You'd need to
be approaching the side of the bus from a direct right angle to not see
the flashing lights.
DangitBobby wrote 3 hours 20 min ago:
Nah, if you want people to stop reliably the stop sign needs to be
visible from all directions you care about. Just add it to the list
of reasons why the roving random stop sign deployment solution for
school buses is a bad one.
trhway wrote 1 day ago:
There has been an increase in the "aggressiveness" of autonomous
cars. My earlier comment - [1] . Maybe that aggressiveness is sold
internally as some optimization enabled by the higher skills of the
robot-driver.
HTML [1]: https://news.ycombinator.com/item?id=45609139
netsharc wrote 1 day ago:
Off-topic... what poor writing:
> a Waymo did not remain stationary when approaching a school bus with
its red lights flashing and stop arm deployed.
Because it's physically possible to approach something while remaining
stationary?
jmpman wrote 1 day ago:
I'm also curious about school zones. The one near my house has a
sign: "School"
"Speed Limit 35"
"7:00AM to 4:00PM School Days"
Now, how does a robotaxi comply with that? Does it go to the district
website and look up the current school year calendar? Or does it work
like a human, and simply observe the patterns of the school traffic,
and assume the general school calendar?
I suspect it continues in Mad Max mode.
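A conservative planner could resolve such a sign roughly like this
(a sketch only; the function names and calendar fallback are made up
for illustration, not Waymo's actual stack):

    from datetime import datetime, time

    def is_school_day(now: datetime) -> bool:
        # A real system would consult the district calendar here; the
        # conservative fallback treats every weekday as a school day.
        return now.weekday() < 5

    def zone_limit(now: datetime, zone=35, default=45) -> int:
        in_window = time(7, 0) <= now.time() <= time(16, 0)
        return zone if in_window and is_school_day(now) else default

    # A summer weekday still gets the cautious answer:
    print(zone_limit(datetime(2025, 7, 9, 8, 30)))  # 35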
ninalanyon wrote 14 hours 15 min ago:
It can just read the sign surely? My ancient Tesla S can read simple
speed limit signs and in France distinguishes between those that
apply to it and those that apply only to lorries because of the
notice below the speed limit sign.
Then it really would be as simple as looking up the calendar, or
simply erring on the safe side and treating all weekdays as school
days.
Waymo only operates on fully mapped roads anyway so I think that
Waymo could be reasonably expected to include such abilities.
renewiltord wrote 1 day ago:
Are school days ever Sundays? If not, perhaps all drivers just treat
every non-Sunday as school day. If so, they probably just slow every
day.
buckle8017 wrote 1 day ago:
They should just always observe the lower speed limit.
The difference is usually 5 or maybe 10 mph.
Which over the distance of a school zone is nothing.
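For a typical quarter-mile zone, the cost of always obeying the
lower limit works out to seconds (the zone length is an assumption;
they vary):

    # Extra travel time from always driving the school-zone limit.
    zone_miles = 0.25
    for fast, slow in [(40, 35), (35, 25)]:
        extra = zone_miles * 3600 * (1 / slow - 1 / fast)
        print(f"{fast} -> {slow} mph: +{extra:.1f} s")
    # 40 -> 35 mph: +3.2 s
    # 35 -> 25 mph: +10.3 s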
bink wrote 1 day ago:
It can be dangerous though. In my area we have roads with speed
limits of 45 that drop to 25 "when children are present". My EV
always assumes children are present as it has no real way to
determine if they are. Driving 25 in a 45 is dangerous for many
reasons.
buckle8017 wrote 21 hours 16 min ago:
Neither do you, the lower speed limit applies when children are
present inside the school.
A building that looks the same with or without children inside of
it.
izacus wrote 1 day ago:
Wait, how does that work? Every person in your city needs to know the
exact calendar of that school?
ErroneousBosh wrote 17 hours 5 min ago:
In the UK we have a sign saying something like "20MPH WHEN LIGHTS
ARE FLASHING". During term time when pupils are entering or leaving
the school (say between 8am and 9:30am, around lunchtime, and from
around 3pm to 4:30pm) someone at the school switches the lights on.
Usually it's one of the "lollipop men" who stand at crossing points
that are not otherwise marked, and hold out a sign to stop traffic
to let children cross, but often it's just programmed into some
timer somewhere.
It's pretty simple.
You don't need clever software or self-driving cars, you just need
to lift your right foot a little near schools. [1] Here is an
example of one that just lights up with a 20mph limit when it's
needed, from near where I grew up. Pretty high-tech for a remote
part of the world, eh?
HTML [1]: https://maps.app.goo.gl/34QgN2KTQmGML2Ae8
izacus wrote 16 hours 3 min ago:
Ok, I get it. Here we just have 30kph limit with speed bumps at
all times.
toast0 wrote 23 hours 52 min ago:
Where I am, the school zone signs fold up; during the off season,
they're folded and say things like drive nice; during the on
season, they are unfolded and present the limit.
anitil wrote 1 day ago:
In NSW (Australia) that's exactly how it works. And it includes
'pupil-free' days where there are no students present. My old
school even had a pedestrian bridge and barriers so that it wasn't
even possible to get to the road.
It's so silly, when the obvious solution is to make school zones
40km/hr (25mi/hr) at all times, or to fix the road design. Typical
speeds here are 60km/hr (40mi/hr), so anyone making the argument
that it would 'slow traffic' is being dramatic.
(There is one exception that I know of - our east coast highway
used to go near a school, which forced a change from 110km/hr
(70mi/hr) to 40km/hr. In this case I will concede the speed is not
the issue, the highway location is the issue)
Dylan16807 wrote 20 hours 48 min ago:
> (There is one exception that I know of - our east coast highway
used to go near a school, which forced a change from 110km/hr
(70mi/hr) to 40km/hr. In this case I will concede the speed is
not the issue, the highway location is the issue)
They couldn't just put up a fence?
askvictor wrote 1 day ago:
In Victoria there is usually (not certain if it's always) a
changeable sign and flashing lights if it's reduced to 40
phyzome wrote 1 day ago:
You can always just slow down for 30 seconds if you're not sure.
hamdingers wrote 1 day ago:
You are not penalized for failing to go over 35 on non-school-days.
School zones are sufficiently small that the time penalty for
complying on a summer weekday isn't that much of an inconvenience.
RaftPeople wrote 1 day ago:
In the area I live, the wording is frequently "when children
present" so you don't need to know school schedule.
username223 wrote 1 day ago:
This is the one most familiar to me. Usually the signs have
flashing orange lights to indicate when they're active, but
sometimes not. You generally know roughly when the kids are in
school (maybe look at the school?), and follow what other drivers
are doing. Things like this are why I think fully autonomous
driving basically requires AGI.
themafia wrote 1 day ago:
Present means "present in the school." It's not always
observable while driving by if you need to obey the reduced limit
or not. California does it and I find it absurd.
Many other states set up a flashing yellow light and program the
light with the school schedule. Then the limit only applies
"when light is flashing." Far more sensible.
slavik81 wrote 1 day ago:
Yes, that's how it works in Alberta. It's particularly confusing
because not all schools have the same academic calendar (e.g., most
schools have a summer break, but a few have summer classes).
Unlike the sibling comment, there are no lights or indications of
when school is in session. You must memorize the academic calendar
of every school you drive past in order to know the speed limit. In
practice, this means being conservative and driving more slowly in
unfamiliar areas.
ninalanyon wrote 14 hours 13 min ago:
Is it really such an imposition to simply drive at the posted
limit at all times when passing a school? It only takes a few
seconds even if you slow to a crawl.
jefftk wrote 1 day ago:
This is another example of something where, at least if you want
to get all the way to completely correct operation, it's easier
for a driverless car than for a human. A person can't memorize the
schedules of every school district they might pass, but an
automated system potentially could. Of course something like
Google Maps could solve this too, for both humans and Waymo.
daseiner1 wrote 1 day ago:
the sign says the hours for the reduced speed limit or, more
commonly in my experience, has a light that activates during school
hours.
AlotOfReading wrote 1 day ago:
The light and often even the sign itself are typically considered
informational aids rather than strict determinants of legality.
The driver is expected to comply with all the nitpicky details of
the law regardless of whether the bulb is burned out or the
school schedule changes.
Needless to say, most people regularly violate some kind of
traffic law, we just don't enforce it.
daseiner1 wrote 1 day ago:
of course. i'm confident slowing down near a school is pretty
intuitive for the vast majority of drivers, though.
AlotOfReading wrote 1 day ago:
Sure, but the context here is a discussion about how a
computer can know all of these "intuitive" rules humans
follow.
The answer is encoded in the map data in this case, but it's
an interesting category of problems for autonomous vehicles.
tanseydavid wrote 1 day ago:
Have you had the experience of riding in a Waymo making a
left-hand turn against oncoming traffic -- and seeing how it
handles the eventual yellow light?
I was very impressed about the decision making in this
situation. Seems very intuitive (at least superficially).
eep_social wrote 23 hours 31 min ago:
it wasnât at first but I suspect they received a ton of
feedback and fixed it.
in my estimation the robo driver has reached a
median-human level of driving skill. it still doesnât
quite know how to balance the weight of the car through
turns and it sometimes gets fussy with holding lanes at
night but otherwise it mimics human behaviors pretty well
except where theyâre illegal like rolling through the
first stop at a stop sign.
fragmede wrote 1 day ago:
Now I'm imagining the Waymo Driver calling out to Gemini
to determine "school hours" by looking it up on the
Internet, and wondering about the nature of life.
isodev wrote 1 day ago:
Aren't they supposed to read signs? Otherwise they'd also ignore
the overhead speed limits on the highway for traffic jams / air
quality adjustments during the day.
daemonologist wrote 1 day ago:
GP is saying that reading the sign is insufficient to determine
whether it is a school day. You have to either guess based on the
presence of students or busses, the lights being on, etc., or you
have to source the school calendar somehow.
isodev wrote 19 hours 18 min ago:
I don't know how it is there, but here those signs near schools
light up and blink during school hours (really can't miss it).
And for signs that do not, I think school days are pretty fixed,
shouldn't be difficult to program... and a default of just
slowing down would be just fine too.
krisoft wrote 1 day ago:
To be honest, I think this is one of the strengths of autonomous cars.
With humans, when they do this, at most we can punish that individual. To
increase population wide compliance we can do a safety awareness
campaign, ramp up enforcement, ramp up the fines. But all of these cost
a lot of money to do, take a while to have an effect, need to be
repeated/kept up, and only help statistically.
With a robot driver we can develop a fix and roll it out on all of
them. Problem solved. They were doing the wrong thing, now they are
doing the right thing. If we add a regression test we can even make
sure that the problem won't be reintroduced in the future. Try to do
that with human drivers.
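A toy version of such a regression test (everything here is a
stand-in; a real stack replays logged sensor data in simulation
rather than checking a dict of flags):

    def plan_speed(scene: dict) -> float:
        """Planner stub: hold stationary whenever a stop arm is seen."""
        if scene.get("school_bus_stop_arm_deployed"):
            return 0.0
        return scene.get("speed_limit_mps", 11.0)

    def test_stops_for_school_bus():
        scene = {"school_bus_stop_arm_deployed": True}
        assert plan_speed(scene) == 0.0  # guards against this regression

    test_stops_for_school_bus()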
kiba wrote 15 hours 22 min ago:
Road designs play an important role as well; it's not just about
enforcing the law.
Some roads are going to be safer simply because drivers don't feel
safe driving fast. Others are safer simply because there's less
opportunities to get into a collision.
Wide street in cities encourage faster driving which doesn't really
save a lot of time while making the streets more dangerous, for
example.
heavyset_go wrote 16 hours 15 min ago:
Unless there is one car that everyone drives, it will never be this
easy.
And if there is one car that everyone drives, it's equally easy for a
single bug to harm people on a scale that's inconceivable to me.
goobatrooba wrote 13 hours 29 min ago:
Maps and routing errors would likely lead to masses of deaths, an
entire motorway population rather than individuals not paying
attention.
Like the various "unfinished/broken bridge" deaths that have
happened with Google Maps involved (not saying it's to blame, but
certainly not innocent either) [1]
HTML [1]: https://www.bbc.com/news/world-us-canada-66873982
HTML [2]: https://www.bbc.com/news/articles/cly23yknjy9o
manwe150 wrote 9 hours 19 min ago:
Granted having all warning signs and barricades removed by
vandals seems like the more major issue there, which drivers
usually do pay attention to
falcor84 wrote 15 hours 51 min ago:
Well, maybe not "this easy", but if we can all agree on an
extensive test suite that all autonomous cars have to follow to be
allowed on the road, it'd be almost like that, without the risk of
a single bug taking down all of them.
tgv wrote 18 hours 36 min ago:
> With a robot driver we can develop a fix and roll it out on all of
them. Problem solved.
I find that extremely optimistic. It's almost as if you've never
developed software.
I am curious about Waymo's testing. Even "adding a regression test"
can't be simple. There is no well defined set of conditions and
outputs.
> Try to do that with human drivers.
At least where I live, the number of cars and car-based trips keeps
increasing, but the number of traffic deaths keeps falling.
krisoft wrote 8 hours 9 min ago:
> It's almost as if you've never developed software.
I do develop software. In fact I do develop self driving car
software.
Yes it is not easy. Just talking about this particular case. Are
the cars not remaining stationary because the legally prescribed
behaviour is not coded down? Or are they going around school busses
because the "is_school_bus" classifier or the
"is_stop_arm_deployed" classifier having false negative issues? If
we fix/implement those classifiers will we see issues caused by
false positives? Will we cause issues where the vehicles suddenly
stop when they think they see a stop arm but there isn't one
actually? Will we cause issues if a bus deploys a stop arm as we
are overtaking them? What about if they deploy the stop arm while
we are 10 meters behind them? 20? 30? 40? 100?
And that's just one feature. How does this feature interact with
other features? Will we block emergency vehicles sometimes? What
should we do if a police officer is signalling us to proceed, but
the school bus's stop arm is stopping us? If we add this one more
classifier will the GPU run out of vram? Will we cause thread
thrashing? Surely not, unless we implement it wrong. In which case
definitely. Did we implement it right? Do we have enough labeled
data about stop arms of school buses? Is our sensor resolution good
enough to see them far enough? Even in darkness? What about fog? Or
blinding light? Does every state/country use the same rules about
school buses?
> I am curious about Waymo's testing
They do publish a lot. This one is nice overview but not too
technical: [1] Or if you want more juicy details read their papers:
HTML [1]: https://downloads.ctfassets.net/sv23gofxcuiz/4gZ7ZUxd4SRj1...
HTML [2]: https://waymo.com/safety/research/
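To make the false-positive/false-negative tension concrete (the
scores below are made-up confidences from a hypothetical stop-arm
detector, not real model output):

    # Lowering the detection threshold cuts missed stop arms (cars
    # passing the bus) but creates phantom stops, and vice versa.
    dets = [(0.95, True), (0.60, True), (0.40, False), (0.30, True)]
    for thr in (0.5, 0.25):
        missed = sum(1 for s, real in dets if real and s < thr)
        phantom = sum(1 for s, real in dets if not real and s >= thr)
        print(f"threshold {thr}: {missed} missed, {phantom} phantom stops")
    # threshold 0.5:  1 missed, 0 phantom stops
    # threshold 0.25: 0 missed, 1 phantom stop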
userbinator wrote 19 hours 48 min ago:
What a dystopian view.
Spooky23 wrote 23 hours 55 min ago:
I disagree about the fixing, because ultimately self driving services
will have political power to cap their liability. Once they dial in
the costs and become scaled self sustaining operations, the incentive
will be reduced opex.
I think the net improvements will come from the quantitative aspect
of lots and lots of video. We don't have good facts about these
friction points on the road and rely on anecdotal information, police
data (which sucks) and time/motion-style studies.
JumpCrisscross wrote 20 hours 21 min ago:
> ultimately self driving services will have political power to cap
their liability
You're fighting an objectively safer future on the basis of a
hypothetical?
Also, we already have capped liability with driving: uninsured and
underinsured drivers.
Spooky23 wrote 13 hours 11 min ago:
The real cap is that the operator ultimately is accountable.
When a software defect kills a bunch of people, the robot
operator's owners will be subject to a way lower level of
liability. Airlines have international treaties that do this.
An objectively safer future is common carriers operating mass
transit. Robot taxis will create a monster that will price out
private ownership in the long term. Objectively safer remains to
be seen, and will require a nationwide government regulatory body
that won't exist for many years.
JumpCrisscross wrote 1 hour 59 min ago:
> real cap is that the operator ultimately is accountable
Which is in practice lower than what a large operator would
pay, particularly if they also write the software.
> will require a nationwide government regulatory body
It doesn't require any such thing. That would be nice. But
states are more than capable of regulating their roads.
bobthepanda wrote 20 hours 25 min ago:
even if we had good data, the major problem in the US is that the
funding liabilities of transportation agencies generally massively
outweigh revenues, particularly if legislators keep earmarking
already limited funds for yet more road expansion in their
districts.
1970-01-01 wrote 1 day ago:
It's a strength if you catch the bug and fix it before it injures
anyone. If anything, this proves edge-cases can take years to
manifest.
tehjoker wrote 1 day ago:
Why accept the company's say-so without any proof being offered or
even a description of the fix? If it's been years and this kind of
thing, described in regulations clearly enough that engineers paid
attention to it, still happens, then maybe fixing it isn't trivial.
kelnos wrote 23 hours 17 min ago:
Sure, perhaps we shouldn't accept the company's say-so, but this
seems like a fairly easy thing for a third party to verify. If
that's not being done, that's not Waymo's fault; lobby the local
regulatory body or legislature to get that sort of thing
required.
tehjoker wrote 16 hours 59 min ago:
Maybe verifying isn't trivial either? Sometimes bugs only
appear with a lot of interactions.
dangus wrote 1 day ago:
As a counterpoint, a large fine or jail time as a deterrent actually
has meaning to an individual.
For a company, it's a financial calculation. [1] .
(Add the period to the end of the link, HN won't do it)
HTML [1]: https://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co
05 wrote 1 day ago:
HTML [1]: https://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co%2E
themafia wrote 1 day ago:
> we can develop a fix and roll it out on all of them.
You have to know what you're fixing first. You're going to write a
lot of code in blood this way.
It's not that people are particularly bad at driving; it's that the
road is exceptionally dynamic, with many different users and use cases
all trying to operate in a synchronized fashion, with a dash of strong
regulation sprinkled in.
krisoft wrote 22 hours 42 min ago:
> You have to know what you're fixing first.
In this case the expected behaviour is clearly spelled out in the
law.
> You're going to write a lot of code in blood this way.
Do note that in this case nobody died or got hurt. People observed
that the autonomous vehicles did not follow the rules, the company
got notified of this fact and they are working on a fix. No blood
was spilled to achieve this result.
Also note that we spill much blood on our roads already. And we do
that without much of any hope of learning from individual
accidents. When George runs over John there is no way to turn that
into a lesson for all drivers. There is no way to understand what
went wrong in George's head, and then there is no way to adjust
all drivers' heads so that particular problem won't happen
again.
Earw0rm wrote 19 hours 28 min ago:
There are ways, but our individualistic, consumerist,
convenience-first society is reluctant to implement them - as,
same as gun control, they're incompatible with certain notions of
freedom.
jfoster wrote 22 hours 10 min ago:
And "much blood" is (globally) to the tune of ~1.2 million lives
lost, and many more injuries.
Compared to that, autonomous vehicles have barely harmed anyone.
Also they will probably save most of those lives once they become
good.
The "least harm" approach is to scale autonomous vehicles as
quickly as possible even if they do have accidents sometimes.
Earw0rm wrote 19 hours 23 min ago:
That's true at least once they surpass human drivers in
collisions per driver mile under equivalent conditions.
It seems like we're pretty close to that point, but the numbers
need to be treated with care for various reasons. (Robotaxis
aren't dealing with the same proportions of conditions - city
vs suburban vs freeway - and we should probably exclude
collisions caused by human bad-actors which should have fallen
within the remit of law enforcement - drink/drugs, grossly
excessive speed and so on).
sdenton4 wrote 18 hours 48 min ago:
Why should we exclude the cases of human bad-actors? That's
explicitly a major case solved by getting rid of the human
behind the wheel...
manwe150 wrote 9 hours 28 min ago:
At least some of them will likely still occur, as those
people may decide to override the robot driver's safer
choices to save 30 seconds or have fun.
Earw0rm wrote 11 hours 2 min ago:
Because the baseline of human-operated safety is "get law
enforcement to do their job of getting rid of the bad
actors."
sdenton4 wrote 10 hours 3 min ago:
Why is that the baseline? Actual human performance as it
exists today gives us tens of thousands of road
fatalities per year in the US. We have not solved that
problem despite decades of opportunity to introduce
regulations and enforcement. Getting rid of human drivers
looks like a very promising way forward.
inglor_cz wrote 17 hours 14 min ago:
This is a tradeoff, in which the original case might have
been the less dangerous one.
Autonomous fleets have a major potential flaw too, in the form
of a malicious hacker
gaining control over multiple vehicles at once and wreaking
havoc.
Imagine if every model XY suddenly got a malicious OTA
update and started actively chasing pedestrians.
sdenton4 wrote 10 hours 9 min ago:
Hm, so you would put a hypothetical scenario on the same
footing as thousands of actual deaths caused by drunk
drivers each year? Around 30% of US road fatalities involve a
drunk driver...
I seriously doubt that the "mass takeover and murder"
scenario would ever actually happen, and further doubt
that it would cause anywhere near 10k deaths if it did
occur.
inglor_cz wrote 4 hours 44 min ago:
"I seriously doubt that the "mass takeover and murder"
scenario would ever actually happen"
OK, so you are optimistic. My own specialization is
encryption/security, so I am not. State actors can do
such things, too, and we've already had a small wave of
classical physical-world sabotages in Europe that
everyone suspects Russia of.
"further doubt that it would cause anywhere near 10k
deaths"
This is something I can agree upon, but you have to
take into account that human societies don't work on a
purely arithmetic/statistical basis. Mass casualty
events have their own political and cultural gravitas,
doubly so if they were intentional.
The sinking of the Titanic shocked the whole world and it
is still a frequent subject for artists 100 years
later, even though 1500 deaths aren't objectively that
many. I don't doubt that way more than 1500 people
drowned in individual accidents worldwide in April 1912
alone, but the general public didn't care about those
deaths.
And a terrorist attack with merely 3000 dead put the US
on a war footing for more than a decade and made it
spend a trillion dollars on military campaigns, even
though drunk American drivers manage the same carnage
in five months or so.
rightbyte wrote 18 hours 30 min ago:
I don't think we are better off putting Elon Musk behind
every wheel.
TOMDM wrote 17 hours 40 min ago:
Good thing no one is suggesting that
rightbyte wrote 17 hours 38 min ago:
I was a bit hyperbolic, but having Teslas steer by wire
with remote code execution is close enough to an Elon
Musk behind every wheel. What was the name of the
movie, "Leave the World Behind"?
harperlee wrote 17 hours 31 min ago:
Not sure about a movie but that reminded me of the
"Driver" short story in the "Valuable Humans In
Transit and Other Stories" tome by QNTM ( [1] ).
I'd recommend to buy the book, but here's an early
draft of that particular story:
HTML [1]: https://qntm.org/vhitaos
HTML [2]: https://qntm.org/frame
kelnos wrote 23 hours 39 min ago:
> You're going to write a lot of code in blood this way.
Maybe? In this particular case, it sounds like no one was injured,
and even though the Waymos didn't follow the law around stopping
for school buses, it exercised care when passing them. Not great,
certainly! But I'd wager a hell of a lot better than a human
driver intentionally performing the same violation. And presumably
the problem will be fixed with the next update to the cars'
software. So... fixed, and no blood.
seanmcdirmid wrote 1 day ago:
I haven't dealt with a school bus in... maybe 20 years, and it
would definitely be an exception if I had to deal with one
tomorrow. I kind of know what I should do, but it isn't instinct at
this point.
A Waymo, even if it drove in urban Seattle for 20 years where
school buses aren't common, would know what to do if it was
presented with the exception tomorrow (assuming it was
trained/programmed correctly); it wouldn't forget.
MindSpunk wrote 1 day ago:
> You have to know what you're fixing first. You're going to write
a lot of code in blood this way.
This is exactly how the aviation industry works, and it's one of
the safest ways to travel in the world. Autonomous driving enables
'identify problem -> widely deployed and followed solutions' in a
way human drivers just can't. Things won't be perfect at first but
there's an upper limit on safety with human drivers that autonomous
driving is capable of reaching past.
It's tragic, but people die on roads every day, all that changes is
accountability gets muddier and there's a chance things might
improve every time something goes wrong.
ninalanyon wrote 14 hours 22 min ago:
But other countries have far fewer accidents than the US so it
isn't quite so black and white. The gain from autonomous
vehicles will be much less in the UK for instance.
If you really want to reduce accident rates you need to improve
road design and encourage more use of public transport and
cycling. This requires no new vehicles, no new software, no
driver training, and doesn't need autonomous vehicles at all.
heavyset_go wrote 16 hours 5 min ago:
Planes maintain vertical and lateral separation away from
literally everything. Autonomy is easier in relatively controlled
environments, navigating streets is more unlike flying than it is
similar.
gazook89 wrote 1 day ago:
Also, humans will intentionally act counter to regulations just
to be contrarian or send a message. Look at "rolling coal",
or people who race through speed meters to see if they can get a
big number. Or recently near me they replaced a lane with a
dedicated bus lane, which is now a "drive fast to pass every
rule follower" lane.
Earw0rm wrote 19 hours 18 min ago:
For some reason law enforcement seem to be particularly
reluctant to deal with this kind of overtime dumbfuckery when
it involves automobiles.
If you try something equivalent with building regs or tax
authorities, they will come for you. Presumably because the
coal-rolling dumbasses are drawn from the same social milieu as
cops.
fendy3002 wrote 1 day ago:
But you still don't have autonomous flying, even though the case
is much simpler than driving: take off, ascend, cruise, land.
It isn't easy to fix autonomous driving, not because the problem
isn't identified; sometimes two conflicting scenarios can happen
on the road, and no matter how good the autonomous system is, it
won't be enough.
Though I agree that having a different kind of human instead will
not make it any safer.
Nextgrid wrote 8 hours 53 min ago:
At least one reason for intentionally not having fully
autonomous flying is that you want the human pilots to keep
their skills sharp (so they are available in case of an
emergency).
Spooky23 wrote 23 hours 49 min ago:
Flying is the "easy" part. There's a lot more wood behind
the arrow for a safe flight. The pilot is (an important) part
of an integrated system. The aviation industry looks at
everything from the pilot to the supplier of lightbulbs.
With a car, deferred or shoddy maintenance is highly probable
and low impact. With an aircraft, if a mechanic torques a bolt
wrong, 400 people are dead.
inetknght wrote 23 hours 50 min ago:
> But you still don't have autonomous flying, even though the
case is much simpler than driving: take off, ascend, cruise,
land.
Flying is actually a lot more complicated than just driving.
When you're driving you can "just come to a stop". When you're
flying... you can't. And a hell of a lot can go wrong.
In any case, we do have autonomous flying. They're called
drones. There are even prototypes that ferry humans around.
JumpCrisscross wrote 20 hours 20 min ago:
> When you're driving you can "just come to a stop". When
you're flying... you can't
Would note that this is the same issue that made autonomous
freeway driving so difficult.
When we solve one, we'll solve the other. And it increasingly
looks like they'll both be solved in the next half decade.
fendy3002 wrote 21 hours 6 min ago:
a bit unclear from my statement before but that's the point.
Something that feels easy is actually much more complicated
than that. Like weather, runway condition, plane condition,
wind speed / direction, ongoing incidents at airport, etc.
Managing all that scenario is not easy.
the similar things also applied in driving, especially with
obstacles and emergency, like floods, sinkhole in Bangkok
recently, etc.
Dylan16807 wrote 21 hours 7 min ago:
Being unable to abort a flight with a moment's notice does
add complication, but not so much that flying is "a lot more
complicated" than driving. The baseline for cars is very
hard. And cars also face significant trouble when stopping.
A hell of a lot can go wrong with either.
programjames wrote 1 day ago:
The human traffic code is also written in blood. But humans are
worse at applying the patch universally.
hamdingers wrote 1 day ago:
We don't even try. In the US you demonstrate that you know the
rules at one point in time and that's it, as long as you never
get a DUI you're good.
For instance, the 2003 California Driver's Handbook[1] first
introduced the concept of "bike lanes" to driver education, but
contains the advice "You may park in the bike lane unless signs
say 'NO PARKING.'" which is now illegal. Anyone who took
their test in the early 2000s is likely unaware that changed.
It also lacks any instruction whatsoever on common modern roadway
features like roundabouts or shark teeth yield lines, but we
still consider drivers who only ever studied this book over 20
years ago to be qualified on modern roads.
HTML [1]: https://dn720706.ca.archive.org/0/items/B-001-001-944/B-...
kelnos wrote 23 hours 20 min ago:
> Anyone who took their test in the early 2000s is likely
unaware that changed.
That's silly. People become aware of new laws all the time
without having to attend a training course or read an updated
handbook.
I took the CA driver's written test for the first time in 2004
when I moved here from another state. I don't recall whether
or not there was anything in the handbook about bike lanes, but
I certainly found out independently when it became illegal to
park in one.
kevincox wrote 10 hours 20 min ago:
I don't doubt that many people are aware of many of the new
laws. But I strongly suspect that a very significant number
of drivers are unaware of many new laws.
Natsu wrote 1 day ago:
Some places will dismiss a traffic ticket if you attend a
driver's education class to get updates, though you can only do
this once every few years. So at least there have been some
attempts to get people to update their learning.
hamdingers wrote 1 day ago:
This only happens if you get a traffic ticket, which is rare
and getting rarer.
Ironically this means the people with the cleanest driving
record are least likely to know the current ruleset.
Detrytus wrote 23 hours 21 min ago:
Which, ironically, would mean that knowing the current rule
set is not needed to drive safe.
jefftk wrote 1 day ago:
> You're going to write a lot of code in blood this way.
Waymo has been doing a lot of driving, without any blood. They
seem to be using a combination of (a) learning a lot from close
calls like this one where no one was hurt even though it still
behaved incorrectly and (b) being cautious so that even when it
does something it shouldn't the risk is very low because it's
moving slowly.
warkdarrior wrote 1 day ago:
Waymo operates in San Francisco, Phoenix, Los Angeles, Austin,
and Atlanta, so I am sure they have encountered school buses by now
and learned from those encounters.
themafia wrote 1 day ago:
Waymo operates in a very limited scope and area. I would not
attempt to extrapolate anything from their current performance.
kelnos wrote 23 hours 33 min ago:
I absolutely would, since operating in a slowly growing limited
scope and area is a part of the safety strategy.
andoando wrote 1 day ago:
Very limited scope and area is now the whole of a few major
cities. [1] This is actually the one technology I am excited
about. Especially with the Zoox / minibus / carpool model, I can
see these things replacing personal cars entirely, which is
going to be a godsend for cost, safety and traffic.
HTML [1]: https://support.google.com/waymo/answer/9059119?authus...
seanmcdirmid wrote 1 day ago:
> Waymo operates in a very limited scope and area. I would not
attempt to extrapolate anything from their current performance.
This is less and less true every year. Yes, it doesn't drive in
the snow yet, no, I don't drive in the snow either, I'm ok with
that.
mastax wrote 1 day ago:
If you were trying to evaluate that code deployed willy nilly
in the wider world, sure. But that code exists within a
framework which is deliberately limiting rollout in order to
reduce risk. What matters is the performance of the combined
code and risk management framework, which has proven to be
quite good.
Airbus A320s wouldn't be very safe if we let Joe Schmo off
the street fly them however he likes, but we don't. An A320
piloted within a regulated commercial aviation regime is very
safe.
What matters is the safety of the entire system including the
non-technological parts.
dmix wrote 1 day ago:
I'm just curious to see how they handle highways more broadly
which is where the real danger is and where Tesla got in
trouble in the early days. Waymo avoided doing that until
late last year, and even then it's on a very controlled
freeway test in Phoenix, not random highways
HTML [1]: https://waymo.com/blog/2024/01/from-surface-streets-...
thechao wrote 12 hours 49 min ago:
They drive on highways here, in Austin, all the time. They
do just fine. My kids love to wave to Waymo.
lclarkmichalek wrote 22 hours 36 min ago:
Highways are pretty safe. The road is designed from start
to finish to minimise the harm from collisions. That's
not true of urban streets.
trollbridge wrote 1 day ago:
... assuming the GiantCorp running the robotaxis cares about
complying with the law, and doesn't just pay a fine that means
nothing to them.
whimsicalism wrote 1 day ago:
the discourse around "corporations" has gotten absolutely
ridiculous at this point, especially on this website.
terminalshort wrote 13 hours 46 min ago:
It really has turned into a bitter losers bitch fest in here
dmix wrote 1 day ago:
South Park had a good satire on this sort of generic
anti-corporation comment. Paraphrasing:
"Corporations are bad"
"Why?"
"Because, you know, they act all corporate-y." [1] (sorry, Google's
first result was TikTok)
HTML [1]: https://www.tiktok.com/@plutotvuk/video/7311643257383963...
paganel wrote 1 day ago:
I agree, much of the people here are still way too lenient when
it comes to big corps.
renewiltord wrote 1 day ago:
We feared the advent of LLMs since they could be used as
convincing spam tools. Little did we know that humans would often
do the same.
varenc wrote 1 day ago:
It's ironic given this forum began as a place for aspiring
startup (Delaware C-Corp) founders.
paganel wrote 1 day ago:
Some of us came here because we were finding
programming.reddit.com too mainstream (after all this thing was
written in Arc! of which almost no-one knew any details for
sure, but it was Lisp, so Lisp cool), for sure we weren't
visiting this place in order to become millionaires.
Even though I agree, there was a time and a place (I'd say
2008-2010) when this forum was mostly populated by "I want to
get rich!" people, maybe that is still the case and they've
only learned to hide it better, I wouldn't know.
saalweachter wrote 11 hours 24 min ago:
I feel like crypto has absorbed a lot of the "I want to get
rich!", so that instead of posts about "Look at my
burrito-as-a-service React app!" it's all "Invest in my
Burritocoin!", which all kind of fades into the background
like the ad banners our eyes pass over without seeing them.
d4mi3n wrote 1 day ago:
It's not an unreasonable take given historic behavior. Rather
than decrying the cynicism, what steps can we take to ensure
companies like Tesla/Waymo/etc are held accountable and
incentivized to prioritize safety?
Do we need harsher fines? Give auto regulators as many teeth as
the FAA used to have during accident investigations?
Genuinely curious to see how addressing reasonable concerns in
these areas can be done.
trollbridge wrote 1 day ago:
Right. We have a precedent for how to have a ridiculously safe
transportation system: accidents are investigated by the NTSB,
and every accident is treated as an opportunity to make sure
that particular failure never happens again.
terminalshort wrote 1 day ago:
Why isn't allowing people to sue when they get hurt and general
bad PR around safety enough? Did you see what happened to
Boeing's stock price after those 737 crashes?
d4mi3n wrote 22 hours 26 min ago:
I'd counter that with the Equifax breach, where their stock
price rose when it became clear they weren't being fined
into oblivion. Suing is also generally only a realistic
option if you have money for a lawyer.
bluGill wrote 1 day ago:
The first fines should be meaningless to the company. If the issue
isn't fixed the fines should get higher and higher. If the company
fixes one issue but there is a second discovered quickly we should
assume they don't care about safety and the second issue should
have a higher fine than the first even though it is unrelated.
Companies (and people) have an obligation to do the right thing.
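A toy schedule for what that escalation could look like (the base
amount and multiplier are arbitrary, purely for illustration):

    # Each repeat safety violation multiplies the fine, so the first
    # is nominal but ignoring the problem gets expensive fast.
    def fine(nth_offense: int, base=1_000, multiplier=10) -> int:
        return base * multiplier ** (nth_offense - 1)

    for n in range(1, 5):
        print(f"offense {n}: ${fine(n):,}")
    # offense 1: $1,000 ... offense 4: $1,000,000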
lawlessone wrote 1 day ago:
>The first fines should be meaningless to the company.
Why?
hyghjiyhu wrote 7 hours 41 min ago:
The goal should be to make them appropriately cautious. Not
careless but also not paralyzed by fear. Escalating fines have
the property that they are self-tuning. They basically say "go
ahead and try it! But if there are issues you have to fix them
promptly"
Dylan16807 wrote 20 hours 58 min ago:
Because 100 million dollars isn't a reasonable fee for a
traffic violation.
platevoltage wrote 21 hours 11 min ago:
Fines are completely useless if they are small enough to be
considered "the price of doing business".
AlotOfReading wrote 1 day ago:
What do you mean by "second issue"? A second instance of the same
underlying problem, or a different underlying problem? The way
you phrase it as unrelated suggests the latter to me.
It's pretty wild to jump straight to "they don't care about
safety" here. Building a perfect system without real world
testing is impossible, for exactly the same reason it's
impossible to write bug-free code on the first try. That's not a
suggestion to be lax, just that we need to be realistic about
what's achievable if we agree that some form of this technology
could be beneficial.
bluGill wrote 1 day ago:
The courts get to decide that. Often it is a "I know it when I
see it". The real question is did they do enough to fix all
possibly safety issues before this new one happened that was
different. If they did "enough" (something I'm not defining!)
then they can start over.
daseiner1 wrote 1 day ago:
> a fine that means nothing to them
Yes, this is often the case. In this instance, though, endangering
children is just about the worst PR possible. That's strong
leverage.
tanseydavid wrote 1 day ago:
This^^^ -- the impact of positive vs negative PR is unusually
huge with this type of tech.
xbar wrote 1 day ago:
Waymo seems more interested in delivering a true solution than
anyone else I have seen.
paxys wrote 1 day ago:
I'm not complaining, but like... maybe also do this for the vast
majority of human drivers who also flout these rules.
righthand wrote 1 day ago:
Yeah fair treatment for billion dollar corporations and robots and
all. Who could forget. Waymo is such a lovely person, why would
anyone ask them to do better?
bigstrat2003 wrote 1 day ago:
I mean, we do. The problem is that you need to be physically present
to catch and deal with those people, and you can only really deal
with one party (others will do their thing while a police officer is
dealing with the first driver they stop). Not to mention that drivers
change their behavior if they see the police around, so it's harder
to catch them in the act. So for a variety of reasons it's harder to
solve the human driver problem.
lotsofpulp wrote 1 day ago:
With how cheap high-definition cameras are, I don't see why
society needs a person to be physically present.
EvanDotPro wrote 12 hours 28 min ago:
They already have ticketing cameras on school buses in some
areas, at least around Syracuse, New York. Google "school bus
camera ticket" for details.
Unsurprisingly, the rollout was quickly followed by news of 40+
false tickets from buses that were parked at a school. My
understanding is that they were not loading or unloading kids and
did not have their stop sign extended or blinking lights on, but
just happened to be close enough to the adjacent street for the
ticket cameras to think the bus was stopped on the street and
issue tickets to the innocently passing cars.
Those tickets were dropped and they're apparently fixing that,
but it's not a confidence-inducing start, to say the least.
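Presumably the fix amounts to gating tickets on more than the
bus's position. A hypothetical sketch in Python, with every field
name invented for illustration:

    # Hypothetical validation gate for a school-bus ticket camera:
    # position alone is not enough; the bus must actually be engaged
    # in a stop before a passing car can be ticketed.
    def should_issue_ticket(bus) -> bool:
        return (bus.on_roadway             # not parked in a school lot
                and bus.stop_arm_extended  # stop sign deployed
                and bus.lights_flashing)   # red lights active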
r0m4n0 wrote 22 hours 24 min ago:
Yeah, as someone who lives on a busy road with daily visibility
into how many people flout the law, I basically did this to force
the city to make changes to the street. There really isn't much
you can do to the folks who break the law and drive away, but
high-def video of daily shenanigans is great ammo for other types
of solutions that force drivers into making better decisions.
andoando wrote 1 day ago:
I don't get why cities don't just put up a couple of drones.
terminalshort wrote 1 day ago:
I prefer the risk of death over constant surveillance
wffurr wrote 1 day ago:
If only we could live in completely separate jurisdictions.
whimsicalism wrote 1 day ago:
Probably the minority on HN, but I don't. I think traffic
enforcement cameras are good and should be expanded.
lotsofpulp wrote 1 day ago:
A) Everyone is already constantly surveilled via mobile
networks and license plate readers, so surveillance is a moot
point. We might as well get something out of it.
B) The system can be set up to purge and/or record only at
relevant times or during infractions.
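As a minimal Python sketch of option B, assuming a fixed
30-second retention window and an external infraction detector
(none of this describes any deployed system):

    from collections import deque
    import time

    RETENTION_SECONDS = 30  # assumed retention window, not a legal standard

    _buffer = deque()  # (timestamp, frame) pairs, held only in memory

    def on_frame(frame, infraction_detected, persist):
        """Buffer frames briefly; keep footage only around infractions."""
        now = time.time()
        _buffer.append((now, frame))
        # Purge anything older than the retention window.
        while _buffer and _buffer[0][0] < now - RETENTION_SECONDS:
            _buffer.popleft()
        if infraction_detected:
            persist(list(_buffer))  # save the buffered window as evidence
            _buffer.clear()

Footage that never leaves a short in-memory ring buffer is never
available to subpoena, which is the whole point.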
terminalshort wrote 8 hours 54 min ago:
That's kind of like justifying invading a country by saying
"we have this big military so we might as well get something
out of it."
BeFlatXIII wrote 7 hours 21 min ago:
"If you have a big gun, shoot it." (a.k.a. how the Great
War started)
hamdingers wrote 1 day ago:
If you want to operate a deadly vehicle in public, you need
to compromise, sorry.
terminalshort wrote 1 day ago:
Sounds like you're the one who needs to compromise because
most people agree with me, or we would already have such a
system.
hamdingers wrote 1 day ago:
Sounds like you're confused about the world you live in if
you believe there aren't millions of cameras, many with
ALPR capabilities, pointed at the street already.
I propose they be made actually useful instead of merely
surveillance for surveillance's sake, but I can see how that
would feel oppressive to drivers accustomed to getting away
with murder.
kjkjadksj wrote 1 day ago:
"Wasn't me in the car"
hamdingers wrote 1 day ago:
"Don't care, car is registered to you, pay up"
This is only an issue because traffic code violations are
treated like criminal acts instead of... code violations. We
don't have this issue with parking tickets, there's no reason
we should have it with automated red light and school bus
cameras.
BeFlatXIII wrote 7 hours 20 min ago:
> "Don't care, car is registered to you, pay up"
Answers like this are what drives the populace to support
domestic terrorism.
lotsofpulp wrote 1 day ago:
Hence high-definition cameras. Most states make windshield tint
and dark tint on the front windows illegal. Also, the license
plate is all that is needed: ticket the owner and they will
readily give up the driver.
Other countries have no issues with camera-based traffic law
enforcement.
kjkjadksj wrote 1 day ago:
At least in SoCal, camera-based traffic enforcement has basically
no teeth, and there are plenty of ways, like my quote, to weasel
out. You can actually ignore the ticket that is mailed to you;
I'm not even sure HD cameras would help here. You even have
options built into the ticket to say it wasn't you driving, or
that it was someone else you know of, in a
check-a-couple-boxes-and-mail-it-back fashion. However, if you
actually look up the status of your ticket with the ticket number
on the web portal, then it counts as being served, and you do
have to pay or show up in court.
Seems the way the law works is that it needs some piece of
two-way communication; it doesn't seem to work on a one-way basis
like it might in other countries. Maybe that's because most of
our laws concerning technology are still structured for an analog
world. E.g., in this case the old ritual of acknowledging the
ticket, by the cop writing it and handing it to you, is preserved
by you having to show you've actually received the ticket and
consent to its validity by viewing its status online.
basisword wrote 1 day ago:
Human drivers can be seen and stopped by police and given an
appropriate punishment. Self-driving cars have nobody to take
accountability like that, so you need to go back to the source.
mitthrowaway2 wrote 23 hours 36 min ago:
Their license to operate can be taken away, which is what happened
to Cruise.
some_random wrote 1 day ago:
Yeah, but in many cases they're not: traffic enforcement went way
down during Covid and it's still down.
daseiner1 wrote 1 day ago:
In most large cities I've lived in, general traffic enforcement
essentially only exists on that month's or quarter's designated
ticket-writing day, i.e., when highway patrol and city police
just write speeding tickets all day to juice revenue.
some_random wrote 1 day ago:
I don't know what large cities you've lived in, but that's not
what any expert seems to be saying in any piece I've ever
read.
HTML [1]: https://www.nytimes.com/interactive/2024/07/29/upshot/...
GiorgioG wrote 1 day ago:
I'm going to call bullshit on this. Most human drivers do not flout
these rules.
criddell wrote 1 day ago:
I think maybe they meant that the majority of vehicles that flout
the rules are human-driven.
trollbridge wrote 1 day ago:
No kidding. Try doing this once or twice and the driver will record
your information, and you'll get a nice visit from the police.
kotaKat wrote 1 day ago:
Out here in rural nowhere it doesn't; it just gets the
sheriff on the local news begging people to stop, instead of
solving the actual problem at hand by placing patrols on the
routes.
HTML [1]: https://www.wwnytv.com/2025/02/12/absolutely-terrifying-...
trollbridge wrote 1 day ago:
Out here in rural nowhere it most certainly does. The school
bus driver will record your number plate, and school buses have
the equivalent of dashcams now.