Hacker News Comments on
Deep Learning State of the Art (2019) - MIT
Lex Fridman · YouTube · 56 HN points · 0 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

⬐ tomahunt
The main sections of the talk are:

- BERT and natural language processing
- Tesla Autopilot Hardware v2+: neural networks at scale
- AdaNet: AutoML with ensembles
- AutoAugment
- Training deep networks with synthetic data
- Segmentation annotation with Polygon-RNN++
- DAWNBench: training fast and cheap
- BigGAN: state of the art in image synthesis
- Video-to-video synthesis
- Semantic segmentation
- AlphaZero and OpenAI Five
- Deep learning frameworks
⬐ sounds
Was good to hear Lex Fridman's take on where we're at.

Honest question: how have people's experiences with OpenAI Five been so far? I haven't had time to check it out in detail, so I'm paying close attention to what others are saying.
⬐ visarga
TL;DW: BERT and BigGAN

⬐ cs702
Nice work. I can think of only two things that are missing:

* Normalizing flows - e.g., https://arxiv.org/abs/1605.08803 and https://arxiv.org/abs/1807.03039 , among many others
* ODEnets and continuous normalizing flows - https://arxiv.org/abs/1806.07366
⬐ grej
Yeah, strange that ODEnets were left out; I'm glad you mentioned them. They have the potential to be a transformative approach, enabling more efficient training and much better performance on time-series problems.
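For readers unfamiliar with the normalizing-flow papers linked above: the core idea is the change-of-variables formula, where an invertible transform maps data to a simple base density and the log-determinant of its Jacobian corrects the likelihood. Here is a minimal one-dimensional sketch using a single affine transform; the function name and parameters are illustrative, not from any library.

```python
import numpy as np

# Minimal illustration of the change-of-variables idea behind
# normalizing flows (RealNVP, Glow, etc.): for an invertible
# f with z = f(x) and a base density p_z,
#   log p_x(x) = log p_z(f(x)) + log |det df/dx|.
# Here f(x) = a*x + b, so the Jacobian is just the scalar a.

def affine_flow_logprob(x, a, b):
    """log p_x(x) when a standard-normal base density is pulled
    back through the affine transform f(x) = a*x + b."""
    z = a * x + b
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))  # log N(z; 0, 1)
    log_det = np.log(np.abs(a))                   # log |df/dx|
    return log_base + log_det

# With a=1, b=0 the flow is the identity, so we recover the
# standard normal log-density at 0, i.e. -0.5*log(2*pi).
print(affine_flow_logprob(0.0, 1.0, 0.0))  # ≈ -0.9189
```

Real flows stack many such transforms (with learned, nonlinear parameterizations) and train by maximizing this log-likelihood; the continuous-flow paper replaces the discrete stack with an ODE whose log-density change is integrated over time.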