HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Elon Musk: Self-driving is way harder than I thought | Lex Fridman Podcast Clips

Lex Clips · YouTube · 31 HN points · 0 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Lex Clips's video "Elon Musk: Self-driving is way harder than I thought | Lex Fridman Podcast Clips".
YouTube Summary
Lex Fridman Podcast full episode: https://www.youtube.com/watch?v=DxREm3s1scA
Please support this podcast by checking out our sponsors:
- Athletic Greens: https://athleticgreens.com/lex and use code LEX to get 1 month of fish oil
- ButcherBox: https://butcherbox.com/lex to get offers & discounts
- InsideTracker: https://insidetracker.com/lex and use code Lex25 to get 25% off
- ROKA: https://roka.com/ and use code LEX to get 20% off your first order
- Eight Sleep: https://www.eightsleep.com/lex and use code LEX to get special savings

GUEST BIO:
Elon Musk is CEO of SpaceX, Tesla, Neuralink, and Boring Company.

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Clips playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41

SOCIAL:
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- Medium: https://medium.com/@lexfridman
- Reddit: https://reddit.com/r/lexfridman
- Support on Patreon: https://www.patreon.com/lexfridman
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Jan 02, 2022 · 31 points, 25 comments · submitted by tux1968
__m
so are the hyperloop, digging cheaper tunnels, electric trucks, solar shingles, making Starlink economically viable, cave rescue missions, ventilators, going to Mars, getting rid of COVID, intercontinental rocket travel, etc.
sschueller
This is what happens when you ignore everyone, including the experts. How much money and how many resources have been wasted because of this man's ego, his conviction that he knows better than anyone else?

And this man, according to his fans, is going to save the planet? What a joke.

otabdeveloper4
Elon's day job is making sure the stock quote goes up, not rockets or AI. Keep this in mind.
netizen-936824
Which is super disappointing when it's coming from a physicist.
Iolaum
Is it? He still ends up getting funding to execute on his goals. Tesla may be behind on QA for their cars, but they are ahead of competitors in EV technology and in supply chain. Which of those is easier to fix?
netizen-936824
I would hope that someone with research training would be able to distinguish tech that might actually be viable now or in the near future from BS that only serves to inflate stock prices.

Maybe that's just my ideological bias, but I'm quite disappointed in Elon's ventures.

QuadmasterXLII
Forget the planet, the US only survives the next 30 years if we focus our brightest minds back on manufacturing instead of finance and selling ads on social media.
sschueller
Yes, if these moonshot ideas are as old as electricity and have been attempted by thousands of people over decades. Thinking that physics doesn't apply doesn't make it so, no matter how much money you throw at it.
djohnston
You act as if he's some sort of scam artist who didn't just turn aerospace on its head.
pcbro141
Is money invested in moonshots instead of sitting in bonds really 'wasted'?
andrepd
It was not invested in moonshots; it was raised through misleading hype from the marketing genius of the moment: Elon Musk. Honestly, I cannot stop being astounded by the guy. He literally goes online, mumbles incoherently about "full autopilot in 6 months" or some similarly insane declaration, and Tesla stock jumps. It's amazing.
interstice
At least I can go out today and order a Tesla, or point a telescope at the sky and see some sparkly satellites, that’s more than can be said about most vapourware these things are being compared to.
mikewarot
There's an ever-changing list of things I keep in mind while I'm driving... the positions of all the vehicles around me (including the occluded ones), and other large objects that might kill or injure me. Also on the list are things that might be hiding: deer, moose, police, kids walking past the truck, pedestrians, etc. Then there are the constraints: lanes, crossings, signals, speed limits, etc.

All updated in real time, as a background process once you get experienced enough... sometimes you forget where you're going, and end up at Work, or some other place. ;-)

---

Elon doesn't seem to have an adversarial team working against the AI. He should. This would help make up for the fact that situations requiring evasive action are incredibly infrequent, and thus underweighted in the training data by orders of magnitude.
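The imbalance described here, evasive-action scenarios being vastly rarer than routine driving frames, is a classic class-imbalance problem. One common mitigation is inverse-frequency sample weighting. A minimal sketch, with the scenario labels and counts invented purely for illustration:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each example by 1 / (count of its class), so rare
    scenarios (e.g. evasive maneuvers) get sampled as often as
    common ones during training."""
    counts = Counter(labels)
    return [1.0 / counts[lab] for lab in labels]

# Hypothetical driving-log labels: nine routine frames,
# a single evasive-action example.
labels = ["cruise"] * 9 + ["evasive"]
weights = inverse_frequency_weights(labels)

# The lone evasive example carries 9x the weight of any cruise frame.
print(weights[-1] / weights[0])  # → 9.0
```

In a real training pipeline these weights would feed a weighted sampler (e.g. PyTorch's `WeightedRandomSampler`); the sketch only shows the weighting rule itself.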

WaffleIronMaker
Ignoring Elon Musk, I have been continuously impressed with Andrej Karpathy's work on autopilot.

The amount that the Tesla Autopilot engineering team has accomplished is really impressive, and I don't think it's worth giving up hope just yet.

I wouldn't anticipate Tesla shipping with worldwide FSD capability in the next 10 years, but I think within 20 to 40 years, this approach to self driving cars will be the most reliable and flexible, assuming development continues.

In terms of delivering a product, many LIDAR based self driving solutions will absolutely beat a fully AI driven design to market. However, the potential for adaptability in the Computer Vision approach is much greater than current alternatives.

Overall, I don't think it's worth dismissing Tesla entirely, but it's also important to keep in mind the long timeline for the development of adequate computer vision technologies.

jmakov
Lidar is coming back.
sAbakumoff
The other day Elon Musk said that "wokeness" is one of the biggest threats to modern civilisation. The guy is so full of sh*t.
redis_mlc
None
djohnston
Wokeness is the coffin of the west.
more_corn
Obvs
iamleppert
What’s crazy to me is people are letting a car drive itself around with nothing more than a collection of consumer quality single board cameras. There are literally webcams with better specs (resolution, frame rate, dynamic range) on the market right now than what you get on a Tesla.
jmpman
If the Tesla vehicle were required to take a standard drivers vision test, would it score the required 20/40 uncorrected vision?

Right now, the camera app on the Tesla will only show the side cameras and the rear camera when actively driving. I’ve wondered, if they showed the front facing camera, would I be able to drive by looking solely at the camera output? Is the resolution sufficient, and the dynamic range sufficient for me to drive safely at all times that I can with my human eyes looking out the window? I suspect not.

And yes, the dynamic range is a huge issue. I've had my Tesla discontinue autopilot when driving a curvy mountain road at sunrise/sunset, as the car enters and exits areas with direct sun at a low angle. I've wondered how Tesla ever plans to achieve their self-driving goals with the currently deployed hardware if it can't pass that simple test. It wasn't discontinuing autopilot because the software failed; the sensors appeared incapable. As a human, the driving situation sucked, but I, like every other driver on that stretch of road at the time, was able to continue driving with just my eyeballs as input. (I should have forced a camera save so I could analyze the camera feeds later.)

It seems like someone could prove that these specific real-world scenarios exist, ones Tesla cannot and will not be able to solve, and launch a class action lawsuit. In theory, Tesla could counter that they'll provide a camera upgrade to resolve the limitations… and the charade would continue.
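The 20/40 vision question above can be roughed out with arithmetic: 20/40 acuity corresponds to roughly 15 cycles per degree, and a Nyquist-style rule of two pixels per resolvable cycle then demands about 30 pixels per degree. The camera figures below are illustrative assumptions, not Tesla's actual specs:

```python
def pixels_per_degree(h_resolution_px, h_fov_deg):
    """Angular sampling density of a camera across its horizontal FOV."""
    return h_resolution_px / h_fov_deg

# 20/40 acuity ≈ 15 cycles/degree; Nyquist needs ≥ 2 px per cycle.
REQUIRED_PPD = 30.0

# Hypothetical camera: 1280 px wide over a 50° horizontal FOV.
ppd = pixels_per_degree(1280, 50)
print(round(ppd, 1))        # → 25.6
print(ppd >= REQUIRED_PPD)  # → False: below the 20/40 bar
```

Under these assumed numbers a moderately wide-angle camera falls just short of the 20/40 threshold; a narrower FOV or higher resolution would clear it.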

tails4e
It's not the cameras that worry me; it's that AI is not remotely intelligent. Image recognition is step 1, and NNs are pretty good at it, but they can still be fooled. However, taking those images, interpreting them without error, keeping track of temporarily occluded objects, handling unusual circumstances, and then making decisions is not something I trust NNs to do, as there just is not enough training data, despite the valiant efforts to gather it.

One difference between true intelligence and what we have in NNs today: if you showed an image or two of an object to a person, they could identify that object in other orientations and lighting conditions pretty well. To do the same with NNs requires thousands of examples and counter-examples to train on. That's why I wonder: with driving, how could you ever have enough examples of strange situations to ensure the car responds safely?

The only advantage I see today for cars driving themselves is that they don't get bored or distracted or drunk, but they certainly are not intelligent.
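A common partial mitigation for the data hunger described above is augmentation: synthesizing new orientations and lighting conditions from a handful of examples. A toy sketch on a tiny 2D "image" (a real pipeline would use a library such as torchvision; everything here is illustrative):

```python
def rotate90(img):
    """Rotate a 2D image (list of rows) 90° clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def brighten(img, factor):
    """Scale pixel intensities, clipping at 255."""
    return [[min(255, int(p * factor)) for p in row] for row in img]

# One labeled 'example' becomes several training variants.
img = [[10, 20],
       [30, 40]]
variants = [img,
            rotate90(img),
            brighten(img, 1.5),
            brighten(rotate90(img), 0.5)]
print(len(variants))   # → 4
print(rotate90(img))   # → [[30, 10], [40, 20]]
```

Augmentation multiplies coverage of orientations and lighting, but, as the comment argues, it cannot manufacture genuinely novel driving situations.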

md2020
> if you showed an image or two of an object to a person they could identify that object in other orientations and lighting conditions pretty well. To do the same with NNs requires thousands of examples and counter examples to train.

There’s work from 2019 on gauge equivariant NNs that looks to solve this, there’s a paper and explainer video from Qualcomm Research about it. Also, Tesla presumably does have thousands of examples for most driving scenarios. There’s also lots of work in the ML community on generalization—that’s arguably what all the research is about in some way or another. I don’t intend to be flippant, but I would bet that these are issues that the researchers at Tesla are well aware of.

tails4e
I agree: if an amateur like me can articulate an issue, I've no doubt the experts are well aware of it. However, it does not change the fact that there is no real intelligence; it's fancy pattern matching that works when given enough examples, so I'd question what happens for edge cases it has not 'seen' before.
audunw
I've worked on CMOS camera sensors for this market. I'm not sure what's in the newer Teslas right now, but I'm quite sure there aren't consumer sensors in there. I believe Tesla was a customer for one of the products I worked on.

First, you have to manufacture to automotive grade standards. So that right there is going to set you back a generation or two. You can't just pick some consumer grade sensor off the shelf.

Second, you now have to conform to ASIL safety standards. A lot of the work we did on these sensors was to build in continuous self-testing circuitry. For every frame, the sensor would inject known values into the physical pixels, read them out and process them like a real picture, and compare against the reference before sending data out to the ISP.
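That per-frame self-test can be sketched in software terms: push a known pattern through the same readout path as real pixels and flag a fault on any divergence from the precomputed reference. The function names, gain model, and test values are all illustrative:

```python
def readout(pixels, gain=2, offset=1):
    """Stand-in for the sensor's readout/processing chain."""
    return [gain * p + offset for p in pixels]

TEST_PATTERN = [0, 16, 64, 255]
REFERENCE = readout(TEST_PATTERN)  # expected output, computed once

def frame_self_test(readout_fn):
    """Run the known pattern through the live chain each frame;
    any mismatch means the readout path is faulty."""
    return readout_fn(TEST_PATTERN) == REFERENCE

print(frame_self_test(readout))                       # → True (healthy chain)
print(frame_self_test(lambda p: readout(p, gain=3)))  # → False (simulated drift)
```

The real circuitry does this in silicon, per frame, before data ever leaves the die; the sketch only conveys the compare-against-reference idea.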

I'd also like to know which webcams that aren't insanely expensive have better dynamic range. Is that really true? You list it as the third spec, but it was the most important spec to design for after safety, in part because dynamic range is itself important for safety. And it has to be high dynamic range without too many artifacts. The sensor I worked on could do three exposures simultaneously on the same sensor and either stitch them together on-die or stream all three exposures out to an external ISP. I've never heard of a consumer camera sensor with that kind of capability, but then I don't follow the market much anymore.

Sensitivity is obviously also important. I think the Bayer pattern was RGBW, sacrificing color accuracy for sensitivity. Higher resolution would compromise sensitivity for a given die size.

Resolution and frame rate could be better, but frankly it's about good enough for what they're trying to do.
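The multi-exposure stitching described above can be sketched with a simple per-pixel rule: prefer the longest exposure that hasn't saturated, then normalize by exposure time to get a common radiance estimate. This is a toy winner-take-all model; real HDR merges use weighted blends, and all values here are invented:

```python
SATURATION = 255  # illustrative clipping level

def stitch(exposures):
    """exposures: list of (pixel_value, exposure_time),
    ordered longest exposure first. Returns a radiance
    estimate (value / time) from the best usable exposure."""
    for value, t in exposures:
        if value < SATURATION:
            return value / t
    value, t = exposures[-1]   # all saturated: fall back to shortest
    return value / t

# Bright pixel: long and medium exposures clip, short one is usable.
print(stitch([(255, 8.0), (255, 4.0), (200, 1.0)]))  # → 200.0

# Dim pixel: the long exposure is unsaturated and has the best SNR.
print(stitch([(80, 8.0), (40, 4.0), (10, 1.0)]))     # → 10.0
```

Doing this on-die, as the comment describes, avoids streaming three full frames to an external ISP at the cost of flexibility.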

ithinkso
What does ISP mean in this context?
giobox
In context of camera sensors, commonly means “Image Signal Processor”.

https://en.wikipedia.org/wiki/Image_processor

Much of the advancement in modern digital cameras is at the ISP layer, for example Apple invest massively here each year to improve the iPhone’s cameras.

At the very least, the sensors on the Tesla appear to have no IR filter on key cameras (hence the orange cast on the side cameras, etc.), which improves low-light performance. Virtually all consumer-grade webcams have an IR filter, so it's not quite a matter of grabbing a crappy webcam and shoving it in the car, as some comments suggest!

Tesla also have a custom ISP; it's part of the custom silicon in their standard-fit "FSD computer" stack:

https://en.wikichip.org/wiki/tesla_(car_company)/fsd_chip

iamleppert
The $200 Logitech cameras have 3-exposure HDR support with high frame rate and sensitivity, including IR. No one knows the specs of the sensors in the car, but there are physical limits on what you can do with purely electronic exposure control (fixed iris).

If the sensors on the Tesla were really that good, why did they blame them for the accident where the car could not distinguish a white tractor trailer from the horizon? In that case plenty of light was available, so it points to saturation or low dynamic range. If the sensor is saturated and has no way to physically limit the amount of light incident on it, no amount of clever software or filtering can provide anything other than noise, but I'm sure you know this. It's also a function of heat: all that light heats up the sensor, and the heat has to be removed and dissipated somehow (and quickly) before the next reading is taken. The more heat, the more noise, and depending on the noise process model, the less overall range. High sensitivity and high range are conflicting goals, subject to design trade-offs that, when both are necessary, increase the overall system complexity.
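The saturation failure mode described here is easy to illustrate: once incident light exceeds the sensor's full-well capacity, the recorded value clips, and two very different scenes (white trailer vs. bright sky) become indistinguishable. The numbers below are toy values, not real sensor specs:

```python
FULL_WELL = 1000  # illustrative full-well capacity, in electrons

def record(photons):
    """A saturated pixel clips everything above full well."""
    return min(photons, FULL_WELL)

trailer_side = 5000   # hypothetical photon counts for the same pixel
bright_sky   = 8000

# Both scenes clip to the same value: the contrast is unrecoverable.
print(record(trailer_side) == record(bright_sky))  # → True
```

No downstream software can recover contrast that was lost at the pixel, which is the commenter's point: past saturation, filtering only reshapes noise.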

The whole "humans can drive via vision" argument is an oversimplification, because human eyes are extraordinarily sensitive, ultra wide angle, stabilized (basically like having a gimbal), accurately temperature-controlled by your body, and don't have a fixed iris. To make that claim, you'd first need to develop a sensor that at least meets or exceeds the specs of the human eye, which no one has really done yet, at least not in a package economical and reliable enough to put on a car.

Given enough time, money, and equipment it could be done, but the sensor would look (and cost) more like something you'd see on an expensive astronomer's telescope.

HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.