HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Tesla Autopilot Hits a Deer (and I think it will happen again.)

iPad Rehab · Youtube · 41 HN points · 3 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention iPad Rehab's video "Tesla Autopilot Hits a Deer (and I think it will happen again.)".
Youtube Summary
Human brains are "too excellent" at filtering out cognitive noise. This is a huge problem for Tesla's imperfect autopilot technology.
Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Oct 02, 2021 · 41 points, 92 comments · submitted by shaicoleman
temikus
If only we paid the same attention to human driver accidents. It's good to keep the company and technology accountable but I still believe that autopilot is _miles_ better than your _average_ driver.

Like in this particular case - it was on a bend, and there was an oncoming car approaching - should the autopilot have braked at full force and risked swerving into the oncoming car? Should it have swerved around the deer and hit the oncoming car? I would probably have slowed down a bit but still hit it, as human reaction time is not that fast in reality. The deer becomes clearly visible at ~3:53 and contact is made at ~3:56. Studies[1] show that the average reaction time is 1.5 seconds. That leaves 1.5 seconds to come to a full stop. Would you be able to come to a dead stop within that timeframe at 70mph?

Every time a Tesla thread comes up I see so many "couch racers" in comments that pretend they're formula 1 drivers with split second reactions.

In reality this is a life-saving technology - imagine how much death, loss and suffering self-driving cars will prevent by taking drunk drivers off the road alone.

[1] https://roads-waterways.transport.nsw.gov.au/saferroadsnsw/s...
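[Editor's note: the stopping-distance arithmetic in the comment above can be sketched quickly. The 1.5 s reaction time comes from the comment; the 0.8 g braking deceleration is an assumed figure for a passenger car on dry pavement, not a number from the source.]

```python
# Back-of-the-envelope stopping distance: reaction distance at constant
# speed, plus braking distance from v^2 = 2 a d.

MPH_TO_MS = 0.44704
G = 9.81  # m/s^2

def stopping_distance(speed_mph, reaction_s=1.5, decel_g=0.8):
    """Distance covered from first sighting to standstill, in metres."""
    v = speed_mph * MPH_TO_MS
    reaction_dist = v * reaction_s                 # constant speed while reacting
    braking_dist = v ** 2 / (2 * decel_g * G)      # hard braking phase
    return reaction_dist + braking_dist

print(f"Total stopping distance at 70 mph: {stopping_distance(70):.0f} m")
print(f"Distance covered in 3 s at 70 mph: {70 * MPH_TO_MS * 3:.0f} m")
```

Under these assumptions the full stop needs roughly 110 m, while the car covers only about 94 m in the ~3 seconds the deer was visible - which supports the comment's point that a clean stop was not physically on the table at that speed.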

leereeves
Moving too fast to react isn't an excuse for a human driver; that's just driving too fast. Applying the same standard to Autopilot, if the Tesla was going so fast that it didn't have time to stop for an obstacle as far away as that deer was at 3:53, it was going too fast.

But she says the Tesla didn't brake at all, which suggests it just didn't recognize the deer or doesn't have a rule saying that an animal walking toward the road might be dangerous.

temikus
That’s not how this works - cars travel at the speed limit set for the road they’re on. The speed limit is selected based on the safety factors present on that road.

By your logic, one needs to move 30mph on the interstate to make sure they can react to whatever jumps out at them. Or am I misunderstanding something here?

timerol
Speed limits are a maximum, not a minimum. With limited visibility in fog or snow, it is extremely common to see cars going 10+ mph under the speed limit.

The speed limit is supposed to be the maximum appropriate speed in ideal conditions. In practice almost no one treats it that way, but that's the philosophy by which it was designed. They take into account the clear area on the side of the road, sight lines, and the curve of the road itself.

leereeves
I'm not talking about situations where something jumps out of nowhere or a meteor falls out of the sky onto the road. This deer didn't jump out onto the road. It slowly walked out onto the road; it was visible near the edge of the car's vision; and it was easy to anticipate that it might walk in front of the car.

I've encountered many situations like this without hitting the animal. I would slow down and move aside to give the deer room, starting as soon as I saw the deer at 3:53. If the deer had moved faster and taken the whole lane I'd have passed it in the other lane. The oncoming car wasn't that close.

But you must also be able to stop in this distance, because the obstacle might not be a deer you can steer around, it might be a fallen tree that blocks the whole road.

You're certainly right that many human drivers would not do better than the Tesla here, but we should expect the Tesla to be better than a bad human driver.

mkr-hn
It's interesting to watch the continued narrative shift away from the original sales pitch of self-driving as a solution to human fallibility even at low autonomy levels to stuff like "you wouldn't expect a human to do better in the same situation." It's like that scene in The Simpsons with Homer chasing the grill as the pig atop it connects with increasingly nasty things as he yells "it's still good!"
indecisive_user
The actual recording of hitting the deer starts at 3:50

https://youtu.be/jrWH_0YA5XM?t=230

shaicoleman
I updated my video description comment to include the timestamp
geoduck14
Soo... I'm an idiot. Heads up for other people: this contains a video of a car killing a deer.
leahbarton
Isn't this the exact behavior we are taught as human drivers?
yjftsjthsd-h
I'm pretty sure humans are supposed to at least slow down if they can possibly do so, which in this case appears to have not happened
Ekaros
Brake hard, and swerve if possible - that is, if there's no oncoming traffic. Pretty much standard training in driving school here, where we expect deer and moose to be on the road.
poniko
Small animals you drive through; bigger animals you try to avoid by braking; elk/moose you actively try to aim right and behind its path if you have time to react, mainly because those things will kill you when they slam through the window.
mkr-hn
Leaving a dead small animal will draw out larger animals who see an easy meal and do real damage when they connect. It's bad advice.
wongarsu
The car would have had time to come to a stop if it had started braking when the deer became visible. That said, the collision only became inevitable when the deer dashed forward in the last second (literally the last second of its life).

The car could have reacted better and likely prevented the hit, but if a human had been the only driver nobody would have even suggested that the driver was at fault.

jeroenhd
Honestly, I disagree. A human could've spotted the deer from far away and could've made the decision to slow down or stop so that the deer could cross safely.

When there are other cars around, or when the animal would've jumped in front of the car before you could react, I'd say the driver can't be blamed for hitting the animal because there was no reasonable alternative. In this case, braking and turning on the hazards would've been a perfectly fine solution. This deer didn't have to die, and a human driver could've (should've) avoided a collision.

The deer is just an example of the system failing, though. It shows that Tesla's wannabe self-driving is incapable of dealing with animals near the road. A small deer like this could be hit safely, but when a car doesn't even try to brake for something bigger and heavier, people could easily die. You run over birds and bunnies, but you don't even try to drive through a horse. That thing is going straight through your windshield when you hit it, so you'd better risk getting rear-ended when a runaway horse walks onto the road. You're going to need one of those death mobiles like a Cybertruck, which doesn't do those fancy "crumple zones" or "safety glass", to make it through anything larger than an adult boar without risking a crash, so any full self-driving system should be able to anticipate large animals near the road (or shut down in areas where large animals are known to cross roads).

InTheArena
I've totaled a car because of a deer jumping in front of my vehicle. I swerved, and flipped the car right over. That's not out of the realm of possibility in this video. If the car had swerved and hit a decline, it takes remarkably little to flip a car.

Well, not a Tesla - you have to really work to flip them - but the point stands.

This was despite being taught about the danger of animals, and living in a place with a large wildlife population.

Your brain does not process images outside of the immediate distance in front of you nearly as fast as you think it does. I had a good four seconds in which I could have "seen" the deer, but it was only when it was directly in my path that I realized what was going on.

When you are watching YouTube, you are staring right at it. When you are driving a car, chances are you're looking straight ahead.

smoyer
I've hit several deer but only killed one animal precisely because I slow down when I see them walking. I've NOT hit dozens that have wandered into my lane so I'd say I have a pretty good track record.
dogorman
No!? You're not supposed to swerve, but you sure as shit are supposed to brake! If you can't safely slam on the brakes because somebody was tailgating you or because the road surface is too slippery, then you already fucked up before the deer appeared!
dzhiurgis
The purpose of ABS is to let you safely brake AND swerve. It has saved my ass many times, especially in icy conditions. ESC is also magical.
dogorman
As I understand it, the purpose of ABS is to let you slam on your brakes and NOT swerve. If your wheels lock up then you lose all steering authority. ABS lets you slam your brake to the floor and still be able to keep your car on the road using the steering wheel. If your car doesn't have ABS, then you should know that and know to "pump your brakes" instead, to maintain control of the car.

If a child jumped in front of my car, I'd swerve even if that meant risking a tree. If I ran over a child I don't know that I'd be able to live with myself anyway, so the choice is clear: I risk myself to improve the odds of the child making it away unscathed. But a deer? No way in hell. I'll slam on the brakes, since my car has ABS, but I would not take my chances with swerving.

InTheArena
I was taught that unless you are really, really sure that there are no impediments and/or cliffs on either side of you, to hit it straight on, and only to dodge if you are sure you can make it.

Of course, they also said to brake as much as possible, but hit it straight on. You would total the car, but cars are designed for that, versus your body not being able to take a deer going through the windshield if you clip it on the side. The engine block is padding; use it.

Looking at the video, it looks exactly like the videos of horrible car crashes I saw in school (less the blood). As usual with Tesla, the headlines are sensational and the Hacker News comments breathless, but it's a fairly straightforward physics and recognition problem. 300+ people a year die in these collisions, and billions of dollars in cars are destroyed each year. A neural network mimics the way human brains work. If an animal goes in an unexpected direction (or if it wasn't recognized at all), it's going to be a problem.

heavyset_go
> As usual with Tesla, the headlines are sensational

The headline is "Tesla Autopilot Hits a Deer" which is as bland and unsensational as you can get while being true.

> A neural network mimics the way human brains work.

This is like saying a drawing of a circle mimics the way the sun works. NNs are vaguely inspired by a high-level neuron topology, but the similarities pretty much end there. Gradient descent and backpropagation don't take place in the brain at all, and we don't know enough about the brain to mimic the way it works.

farmerstan
Lol you were taught to drive right into deer? I wonder what the Yelp reviews are for that driving school.
jacquesm
Yes, you don't swerve to avoid an animal, that can turn a major problem into a fatal one, for you and the other occupants of the vehicle.
farmerstan
What about slowing down? The car in the video wasn't going that fast. I live in an area with a lot of deer and the first thing you do is slow down.
iab
For sure; alternative headline: "Tesla swerves into tree to avoid deer".
dogorman
Cars have this handy pedal on the floor that lets you slow down the car without power-sliding it. You should give it a try.
klyrs
It doesn't rain or snow where you live, then? I guess you've never had a deer jump out in front of your car, either.
dogorman
Of course it does, and that's why I was taught that if the condition of the road would make slowing your vehicle unsafe at the speed you are moving, then you were going too fast. The speed limit is not a "safe in all conditions" speed; you are expected to drive substantially slower than the speed limit if road conditions are poor. Didn't they teach you that? Do you even have a driver's license?

I grew up in rural PA, learned to drive there and lived there for several years afterwards. I've seen rain and snow. I've seen more deer than you can shake a stick at. I've never driven into one, though once one slammed into the side of my car. It isn't rocket science; if the road or visibility conditions are poor, slow the fuck down. Poor visibility includes brush or tall grass by the side of the road, because deer hide in that shit. If you see it, particularly at night, slow the fuck down.

Edit:

Did you even watch the video Jacques? The deer did not jump into the side of the vehicle, the deer jumped well in front of the car, with plenty of time to stop. I swear people on this site will pretend to be complete retards to defend Tesla. "Use your brakes to avoid a collision? I had no idea that was considered a best practice!" No. Fuck off.

jacquesm
> I've never driven into one, though once one slammed into the side of my car.

Clearly you weren't using your brakes in a timely manner.

Seriously, I don't get the 'holier than thou' attitude here and in other comments when it's clear that even you did not manage to avoid all close encounters with deer. The deer that hit your car in the side could have just as easily ended up in front and you'd have been unable to avoid it. This stuff simply happens and it is obvious that in some cases the driver isn't at fault but the deer is, unfortunately they aren't quite smart enough to understand the situation enough to keep themselves safe. So we humans will do what we can, but accidents can and will happen.

iab
Great answer - thankfully that’s always possible right?
dogorman
Yes, if you are driving responsibly then using the brakes is always an option. All cars have brakes; they're required to by law. If using the brakes is not an option then you already fucked up (e.g. you were going too fast on a slippery road.)
iab
This shows me that you have little real-world experience driving in challenging conditions
jacquesm
I've lived in deer country for five years. It's quite literally impossible to brake every time a deer hits the road in front of you because they have the ability to teleport into and out of the space in front of you with a speed that is frankly unbelievable until you have it happen to you. I managed to avoid all but one of these encounters and I chalk it up to luck rather than reaction speed or driving skill and that was at fairly low (< 40 mph) speeds.

Deer standing safely by the road side will happily panic and jump in front of your car at the last possible moment. In theory there is no difference between theory and practice, but in practice there is.

tialaramex
"Smash into a large animal, dying is good" ?

No. You should brake. The intended and authorised safety regime for automobiles assumes you will maintain a situation where you can reasonably brake before foreseeable impacts.

This is quite different from say, an airliner or a railway train which couldn't possibly brake, and so those need completely different safety environments. "Absolute block" is an example for railways, there is intended to be no possible way for two trains to be in the same section of track, so, you always have at least an entire section (it will often be more on fast trains) to brake even if you can't see far at all.

In a car you don't have that environment, you should always be prepared to stop in the distance you can see. If you're thinking, "But nobody drives like that" then you're part way to understanding why so remarkably many people die on roads.

jacquesm
Small correction: if impact is going to happen, you should brake only until just before it, because at that moment you want your car's nose up as high as possible - otherwise you might get a double helping of windshield+deer into your car.
falcolas
1) Deer can and do jump.

2) The difference between the nose of a car while braking and while coasting is an inch or two. Won’t make a lick of difference.

Source for 1) Having a deer jump over my car, leaving only fur in the windshield wiper blades. 2) Mythbusters, measuring for a “Wanted” car jump myth.

jacquesm
Thanks. This was advice dispensed by multiple locals in Canada, where the deer were so thick on the road that these encounters would happen at least once a week; I didn't actually set up a scientific side-by-side test to see what the difference would be. Braking hard transfers a few hundred pounds of load (depending on the weight of the vehicle, typically about 25% or so) to the front axle, and how far the nose dips depends on how stiff your suspension is. A typical family car (say, a Caravan or Windstar) will drop quite a bit more than an inch or two, but even then I agree it may not make much of a difference because of the way the nose of the vehicle is shaped.
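[Editor's note: the load-transfer claim above can be put in rough numbers with the standard quasi-static formula dF = m * a * h / L. The vehicle figures below (mass, centre-of-gravity height, wheelbase) are assumed values for a typical minivan, not measurements from the source.]

```python
# Rough estimate of how much load shifts onto the front axle under
# hard braking, using the quasi-static load-transfer formula.

G = 9.81            # m/s^2
KG_TO_LB = 2.20462

def load_transfer_kg(mass_kg, cg_height_m, wheelbase_m, decel_g):
    """Extra load shifted to the front axle while braking, in kg."""
    # dF = m * a * h / L, expressed as an equivalent mass
    return mass_kg * decel_g * cg_height_m / wheelbase_m

shift = load_transfer_kg(mass_kg=2000, cg_height_m=0.70,
                         wheelbase_m=3.0, decel_g=0.9)
print(f"Load shifted forward: {shift:.0f} kg ({shift * KG_TO_LB:.0f} lb)")
print(f"As a share of vehicle weight: {shift / 2000:.0%}")
```

With these assumed numbers the shift works out to around 400+ kg, roughly 20% of the vehicle's weight - in the same ballpark as the comment's "few hundred pounds / about 25%" figure, with the exact value depending on how hard the braking is and how high the centre of gravity sits.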
InTheArena
It's worth noting that when I was taught, they also said not to brake.

Ironically, I did actually end up having an encounter with a deer. I did brake, and swerve, but at the time I had a car without anti-lock brakes. I lost control of the vehicle and flipped it (which is also why they tell you not to swerve unless you have good awareness).

smoyer
In fact, the law requires those behind you to be able to brake to avoid you regardless of what you do in front of them. If the car in front of you skids to avoid a family of ducklings, you're going to be the responsible individual if you rear-end them.

With large farm animals such as horses and cows, or with larger wildlife like moose and elk, your life is at stake - there are plenty of drivers killed just hitting moose (look at the size of the mother in the embedded video and you can easily imagine her sliding up the hood and through the windshield: https://www.adfg.alaska.gov/index.cfm?adfg=livewith.drivingm...). This video also shows something that's true of elk and deer as well - they have very little traction on pavement.

rvz
It didn't even see it. So no night vision either? I thought this software had object detection even on Autopilot.

Sounds like a very dangerous robo-taxi if that was going to be released as "Level 5" for 2020.

jacquesm
This is one of those cases where cameras alone simply don't cut it; other manufacturers add LIDAR for a reason: it works well when the cameras do not.
Ekaros
Infrared actually should work wonders here. At least some tech demos of such cameras seem good enough. Then again, it just might not be in the training data...
rasz
It saw it all right - pixels don't lie, and there was no problem with visibility. The problem was with the vision algorithms. It's programmed to ignore things it doesn't "recognize". Anything that isn't a car (unless it has police blinkers on, apparently) or an upright human is clear road to a Tesla. Example video given by someone actually defending Autopilot :o

https://www.youtube.com/watch?v=q0_eKr8jnNo

@2:20 exact same situation as in Jessa Jones's clip: "all of a sudden a deer ran out onto the road but autopilot didn't react at all, Gracie applied the brakes only moments before impact but it was too late". Living creature hit, a $4500 bill and 2 weeks without the car.

@6:30 plows into road debris roughly size and shape of a human lying on the road.

@9:00 finally correctly recognizes human on a crosswalk.

@11:00 Coyote, zero reaction and human takes over to avoid accident.

heavyset_go
I wouldn't be surprised if, like the self-driving Uber system that killed a pedestrian[1], Tesla's Autopilot is programmed to ignore some detected objects and obstacles in order to have a "smoother" ride.

[1] https://www.nytimes.com/interactive/2018/03/20/us/self-drivi...

Ekaros
Maybe we should actually fix what people understand autopilot to be.

We try to make it out to be some magic AI system, but instead it is a dumb system that keeps speed and route, potentially even stays between the lines, and might even make turns. But that is pretty much it.

Now, FSD as marketed by Tesla should be something the whole company is punished for, with massive daily fines, until they make it actually work or stop using the term and allow anyone to return the vehicle for the full price.

mkhpalm
Tesla's autopilot is 100% image-based or something? I have 2 vehicles with radar and I have no doubt both of them would have attempted to stop for this.
InTheArena
With LIDAR or RADAR?

No production car (AFAIK) uses LIDAR. Some have RADAR, but it doesn't usually help a lot here. This Tesla actually has radar in it. Tesla is moving to pure vision because they think they can respond faster with an all-vision system and have fewer conflicts from spurious radar returns (apparently one of the reasons for phantom braking events in Autopilot), but also apparently because radar interference between cars is starting to become a thing.

clouddrover
> No production car (AFAIK) uses LIDAR

Various brands are putting LiDAR on their cars:

- Lucid: https://www.autocar.co.uk/car-news/new-cars/lucid-air-1065bh...

- Toyota and Lexus: https://global.toyota/en/newsroom/corporate/35063150.html

- XPeng: https://insideevs.com/news/537039/xpeng-p5-lidar-video/

- Volvo: https://www.cnet.com/roadshow/news/electric-volvo-xc90-lidar...

InTheArena
None of those are shipping en masse yet.

But boy, we do need that technology to come down in price, stat.

orwin
I've been in WV for a month; not encountering deer when night falls is what's surprising. Also, a herd looking at your slow-moving car when the moon is hidden makes for a surreal vision (for some reason, their eyes are _bright_ in the dark).
istjohn
Their eyes are so bright because they are retroreflective[1]. Just like road signs, they reflect light back in the direction it came from.

1. https://en.m.wikipedia.org/wiki/Tapetum_lucidum

purplejacket
Are. You. Fucking. Kidding me?
Geee
There could be strategies to unhabituate driver's brain. For example, randomly swerving the car into oncoming traffic, just for a short moment. Make it seem unsafe, while keeping it safe. Make it random enough to avoid habituation.
tablespoon
> There could be strategies to unhabituate driver's brain. For example, randomly swerving the car into oncoming traffic, just for a short moment. Make it seem unsafe, while keeping it safe. Make it random enough to avoid habituation.

Then what's the point of self driving? Thrills? No one would want it.

My Honda does something similar with its lane-centering feature (it will disengage after maybe 15 seconds of no driver input), but that feature is so limited it's not annoying.

gregmac
This has a strong possibility of making things worse by conditioning the human driver to react with "oh, it's just the auto-pilot testing me", and thus not reacting properly when it does actually malfunction.
tra3
While your specific example is a terrible idea, the overall thought makes sense.

Maybe limit autopilot so that for every mile, or hour spent on autopilot you have to drive an hour manually? Make it random?

Also stop calling it autopilot because it implies self driving capabilities that are not yet fully baked.

Geee
Yeah, that was kind of tongue in cheek, but I feel it should randomly scare you to actually be effective. It's probably a good idea if you had to drive manually for a few minutes randomly, but the switch should happen quite suddenly so that you need to be paying attention constantly.
shaicoleman
A few quotes from the video:

* "My Tesla was on autopilot and it plowed into a deer in my lane, there were no cars behind me, it did not attempt to slow down or stop."

* "As one Tesla autopilot mile becomes 50, 500, 5000, the behavior of your brain changes - it offloads the cognitive burden of driving to the autopilot, even when you have your hands on the wheel looking straight ahead"

* "Your brain cannot turn off habituation, it will put too much trust in autopilot, you can't do anything about that and that can be disastrous"

* Mentioned article: Tesla Drivers Using Autopilot Watch the Road Less, MIT Study Shows - https://www.caranddriver.com/news/a37727214/tesla-drivers-au...

* The footage when it hits a deer is at the 3:50 timestamp - https://youtu.be/jrWH_0YA5XM?t=230

c7DJTLrn
I don't get it - is the person advocating against using Autopilot, then? What do they want Tesla to actually do? You can't have it both ways - you can't pin it on Autopilot for making you less attentive and then continue using it, only to point the finger when it doesn't compensate for your lack of attention.
JshWright
It is irresponsible for Tesla to market something as "autopilot" when it definitely isn't that. Tesla is the one trying to have it both ways (selling something by giving the impression that it offers full "autopilot" performance, then passing the buck back to the driver when anything goes wrong).
Ekaros
I'm not against calling Tesla's system autopilot. In the end, an autopilot in reality is a dumb system. Cruise control is also an autopilot, and so is lane assist. The problem comes when people think autopilot is actually smart, when it really only does pretty basic things...

Selling FSD, however, is a pure scam.

chucksta
That's a pretty loose definition of autopilot:

> noun: automatic pilot; plural noun: automatic pilots. A device for keeping an aircraft on a set course without the intervention of the pilot.

Cruise control only maintains velocity, and lane assist still requires intervention when turns need to be made or lines are unclear. That sounds like intervention to me.

"Without the intervention of the pilot" is a pretty pivotal part of everyone else's understanding of the word.

c7DJTLrn
I agree with you, but Tesla are very clear that the person in the driver's seat is 100% responsible for the vehicle when Autopilot is active. Until the law catches up with innovation, that will continue to be the case.

Most planes today can land themselves, yet pilots still often land manually to keep their attention and skills sharp. Tesla drivers who make use of Autopilot should consider doing the same -- since ultimately it's their ass on the line for now.

heavyset_go
> I agree with you, but Tesla are very clear that the person in the driver's seat is 100% responsible for the vehicle when Autopilot is active.

So when Tesla says[1]:

> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

The "legal reason" for having a person in the driver seat is so that they can act as a liability shield for Tesla?

[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...

DarmokJalad1701
You keep posting the same comment.

That is a specific marketing video from five years ago. The current system has basically nothing in common with what is shown there, hardware or software wise.

> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

Does it say that in the Autopilot terms of use? Maybe the owner's manual?

heavyset_go
> You keep posting the same comment.

No, I made two distinctly different comments, but used the same quote from Tesla in them, which is the same quote you've just quoted yourself.

> That is a specific marketing video from five years ago. The current system has basically nothing in common with what is shown there, hardware or software wise.

So it got worse over the last 5 years? Because Tesla said the car was driving itself, but now you're saying that's not true, and there's nothing in common with them.

LilBytes
Like one of the GP comments said:

> "Your brain cannot turn off habituation"

The entire argument falls apart, and quickly, if the statement 'it's the driver's responsibility' relies on us completely ignoring human behaviours and patterns.

People are fallible, and autopilot encourages people not to concentrate and to put their trust in it - and with every kilometre travelled, people concentrate less and less. Who is at fault when the car misbehaves and breaks away from expected behaviour?

mkr-hn
Habituation is even a problem in ordinary cars. There's a word for it: highway hypnosis. First identified 100 years ago.

https://en.wikipedia.org/wiki/Highway_hypnosis

falcolas
Aircraft pilots - especially those who fly jets with auto-landing features - are exceptionally well trained and paid to keep that attention and skill sharp. And that’s not counting their backup - the co-pilots - who are typically just as well trained and compensated as the lead pilots.

I highly doubt Tesla will start putting drivers through several hundreds of hours of training every year (let alone pay them 6/7 figure salaries) to have properly safe and trained drivers in their cars.

userbinator
The other thing that people who bring up the aviation analogy don't often mention is the difference in scale of reaction times. Aircraft are moving much faster, but the distance between them (and the ground below) is also much greater, more than proportionally so; when the autopilot disconnects, there are at least a few seconds to react to the situation. In a car, reaction/handover times to avoid a collision are sub-second.
josephcsible
Do you agree with these two statements?

1. The feature marketed as "autopilot" on airliners is accurately named

2. The human pilots of airliners need to keep paying attention even when autopilot is active

Isn't that exactly analogous to how it works in a Tesla? Why do you then say "it definitely isn't that"?

heavyset_go
For reference, Tesla's marketing[1] is clear about what they mean when they say "Autopilot" and "Full Self-Driving":

> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...

josephcsible
They're saying that in those videos, the driver didn't need to provide any inputs to the car. They didn't say that you never will, or that he didn't have to pay attention and be ready to.
sidibe
There is absolutely no excuse for using that video to market Autopilot/FSD (it's still the main feature on tesla.com/autopilot). The driver is clearly not just there for legal reasons. Given that the video is a few years old and FSD beta is still screwing up every couple minutes in the most recent Youtube videos, that driver probably bailed it out in many bad takes before they got this one.
tuatoru
With commercial airliners, they moved away from auto-landing systems to head-up displays: assistive technology for the pilots.

To me it seems likely that putting in HUD-type assistance would be safer than attempting self driving, which is more complicated than landing a plane.

tialaramex
HUDs don't seem like they really impact auto-land. They do impact conditions in which take off is authorised because they can add a synthetic centre line so that pilots are comfortable taking off in low visibility.

For landing existing CAT III landing will complete the entire landing and rollout. Failures (e.g. loss of a flight computer or sensor problems) that would lock out CAT III autoland would likewise rule out attempts in a HUD.

The reason you don't see much real CAT IIIc bad weather landing isn't that it can't work or HUDs would be better, the problem is that sure, now you're safely on the ground but with zero visibility you're stuck on a runway, you can't very well taxi a jetliner when you can't see the concrete you're on let alone the signs and lights. Even a follow-me car won't help, you can't see it.

Garmin's emergency autoland for GA pilots has the same problem, but since it's an emergency feature "Oh dear we took an entire airport out of service for an hour" is an acceptable consequence. The alternative may well have been fatal in that case.

Anyway, people have added e.g. enhanced visibility and highway edge marker HUDs to high end vehicles before. But, that's going to be something you need to train with in order to use it effectively. An airline would send crew on the recommended two week course before they're allowed to fly the modified plane, but we know even private pilots routinely avoid recommended manufacturer training, and cars would definitely be worse. My guess: Tesla would sell a thousand cars with a HUD, fifty drivers would take the free training, ten of them would come away having learnt anything. All one thousand owners would be confident they can now drive at 60mph in dense fog even though the training said "You definitely must not drive at 60mph in dense fog".

tuatoru
Good points!
carlivar
Autopilot is just poor quality. Period. If it can't even try to avoid a deer like this, it's bad software (and hardware). What should Tesla do? Make better software and hardware, and not let people pay for it and use it until it is good. This isn't early access to a Steam game, but that's how Musk seems to treat it. Then again, his stock and reputation make it a one-way road; there's no going back now.

That leaves it to consumers and regulators. As consumers we can stop getting caught up in Musk's persona, and stop buying a dangerous product. And regulators should start doing their job for the first time in many years.

c7DJTLrn
Sure, I can acknowledge that the software should have reacted here, but so should've the driver. The driver is responsible for the vehicle and Tesla are clear about this. I don't think it's valid to blame inattentiveness on Autopilot either.
userbinator
The deer is already hit by the time the driver reacts.
carlivar
So you're ignoring all the points of the video? That autopilot lulls the human brain slowly away from attention even with knowledge of its necessity?

You're describing a flawed overall system. It has to be nearly perfect or it would be better not to exist at all. Middle ground is incompatible with our minds. You are asking for the impossible.

mjgoeke
Oh gosh no. I can see it now: a Google self-driving car with a built-in "captcha" that checks your response time to the "fire drill" alert tone to take over driving. That'll train us. Probably the perfect time to show an advert too, while our attention is pulled and the adrenaline is pumping…

Or would that just train us to ignore the ‘take over tone’?

chucksta
What is your point? If the car is actually self driving, then there is no need for human interaction. The issue is this is not, therefore the driver still needs to be engaged enough to respond
mjgoeke
Sorry, no strong point. I was just pondering that the issue is inattention, and imagining Google, who spends the most money, ever, on capturing human attention, taking the next dystopian step.
sidibe
Google dropped this approach a long time ago precisely because of this problem: their employees were zoning out
Oct 02, 2021 · rasz on Rivian S-1
> Given that all Teslas are capable of fully autonomous driving

accidents. All Teslas are capable of fully autonomous accidents.

https://www.youtube.com/watch?v=jrWH_0YA5XM 'Tesla Autopilot Hits a Deer (and I think it will happen again.)'

oxplot
You will never hear about all the times it's saved people's lives and avoided accidents. So until you show me that it has in fact increased accident rates, a link to a video means nothing — unless you are a journalist and your newspaper is on the verge of going under, in which case you need some juicy drama to sell ads.
rasz
Video of a Tesla emergency braking or avoiding a pedestrian that ran in front of it would go viral instantly. Can you show me a few? Maybe one?
oxplot
Sure thing:

https://www.youtube.com/watch?v=VttbZAr1TBc

https://www.youtube.com/watch?v=e7otagz2NS0

https://www.youtube.com/watch?v=qC1PoqyZc_U

https://www.youtube.com/watch?v=xbw2ZB0Lbk8

oxplot
These are FSD Beta videos — I posted the saves videos in a sibling reply.
oxplot
Oh whoops, I posted the wrong list of videos (those are FSD Beta ones). For emergency braking and "maneuvering" out of the way, Whaam Baam Tesla Cam channel has you covered. Here are a few videos with some examples (and no, they do not go viral - no accident and no drama does not sell ads):

https://www.youtube.com/watch?v=HSeIOg4SyOg

https://www.youtube.com/watch?v=krKkuIEMXHI

https://www.youtube.com/watch?v=ZebkknqGkxE

https://www.youtube.com/watch?v=q0_eKr8jnNo

https://www.youtube.com/watch?v=pHM6xY2Ys4U

Oh, and I have personally avoided an accident so far when a whole bunch of cars on the highway slammed on the brakes and Autopilot sounded an alarm and braked. So there you go.

rasz
Seems you didn't understand me. "Pedestrian" is not a car. I meant avoiding hitting living things, not property.

1 Mountain lion runs across; I can't see the car slowing down at all, despite what the description text says. The cat barely makes it and the car keeps going.

2 Bird almost hits the windshield. The owner's own description says he took over.

3 Property and "funny home videos"

4 Well, what do you know. The EXACT SAME situation as in Jessa Jones's clip happens at @2:20: "all of a sudden a deer ran out onto the road but autopilot didn't react at all, Gracie applied the brakes only moments before impact but it was too late". Living creature hit, a $4500 bill, and 2 weeks without the car. @6:30 it plows into road debris roughly the size and shape of a human lying on the road. @9:00 finally a human, on a crosswalk, 50 meters in front. @11:00 Coyote, zero reaction, and the human takes over to avoid an accident.

5 @3:30 At first it looks like the Tesla finally detects an animal on the road, but no. It simply reacts to the car in front (which is braking for the animal) and starts accelerating as soon as the car in front moves on, almost driving over the small deer in the process.

Tesla seems to be really good at spotting cars, and might even have a special case for humans on crosswalks, but every single clip with an animal ends with an accident or the human taking over. 4 @6:30 shows what would happen if an unconscious person fell on the road.

Off topic: I love clip 4 @11:50. Hitting a small wooden post at 20mph = totaled Tesla :o

oxplot
You are literally confirming what my grandparent post said:

Where do you hear about all the times Teslas have saved lives? You don't. Even with evidence thrown in their faces, people only list the fuck-ups.

I rest my case.

rasz
Not driving into a pedestrian slowly walking on a crosswalk is not saving lives. That was the _only_ example of a Tesla reacting correctly to a living thing; all the others ended in an accident, a death, or the human taking over. And those were your "good" examples.
oxplot
> And those were your "good" examples.

That was me searching for a minute. If you actually care to learn something, you'd spend a few hours of your own time looking for what's actually true. If I was to go out of my way to show good examples (which I'm not going to do, for your lazy ass), you would say something like "oh great, my 10 year old nephew can do that".

So I'm not gonna bother with you, pal. I'll convince people on the fence; they'll take care of convincing you.

Enjoy your video games.

rasz
It's not about humans being able to "do the same"; it's about Tesla not being programmed _at all_ to avoid anything other than cars and maybe pedestrians on a crosswalk. It doesn't consider any other obstacles on the road and plows into them.
Marciplan
You drank quite a lot of that kool-aid, huh
oxplot
That's what well informed looks like from the outside!
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.