HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Linus Torvalds - Nvidia F_ck You!

Jim Feig · YouTube · 11 HN points · 13 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Jim Feig's video "Linus Torvalds - Nvidia F_ck You!".
YouTube Summary
Aalto Talk with Linus Torvalds - Nvidia F_ck You!

Full Length Video: http://www.youtube.com/watch?v=MShbP3OpASA
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
This is a good technical summary of the impact, but I think it misses the somewhat emotive history here.

See things like this infamous clip from Torvalds [0] for more context on the community sentiment around Nvidia in general.

[0] https://www.youtube.com/watch?v=IVpOyKCNZYw

jackosdev
Don't even have to click to know what that is - it was the first thing that came to mind when I saw this headline.
You are not wrong

"Nvidia has been the single worst company we have ever delt with so Nvidia F_ck You!" -- Linus Torvalds [1]

[1] https://www.youtube.com/watch?v=IVpOyKCNZYw&t=86

Yeah, if you even need a dedicated GPU at all - it's usually gamers who are looking for those, and they're usually on Windows systems anyway (really looking forward to installing Windows 7 on the laptop I mentioned, if I do get it this year - I kinda mean that both ironically and not-ironically...).

The built-in Radeon Vega 8 or 10 solutions on all mobile Ryzens are usually better and more efficient than their current Intel counterparts, so if you don't need a dGPU, that's one more reason to root for Team Red.

EDIT: In case anyone reading isn't aware, nVidia is a biaaatch when it comes to drivers for Linux. Anyone looking for powerful GPUs in Linux machines, no matter desktop or laptop, has an easier life with AMD solutions. That's the reason there's an infamous clip of Linus giving nVidia the finger: https://www.youtube.com/watch?v=IVpOyKCNZYw

llukas
Nvidia works for me. Works for Amazon (https://docs.aws.amazon.com/dlami/latest/devguide/gpu.html) and others as well.
klingonopera
Yeah, but you're referring to "super-computing" situations, where the OS of choice is Linux and nVidia would be shooting themselves in the foot if they only developed drivers for Windows.

When it comes to normal consumer hardware, it seems that nVidia drivers for Linux in 2019 are still hit-and-miss:

https://www.reddit.com/r/linux_gaming/comments/bhfjnb/nvidia...

https://www.reddit.com/r/linux_gaming/comments/bol5uo/nvidia...

EDIT: Maybe my formulation of "powerful GPUs" led you on to that. Though I did say "no matter desktop or laptop" to clarify that...

llukas
> Maybe my formulation of "powerful GPUs" led you on to that.

No, "nvidia is a biaaatch" led me. I just stated that my experience is different. How about you share your actual experience?

I might be biased, as I've used GPUs for scientific computations, and Nvidia reached the point where it "just" works on Linux many years ago.

Also:

* Steam and a few games I have on my laptop (Ubuntu) with Nvidia mobile graphics work fine.

* My home workstation (Gentoo) and workstation at work (Ubuntu) also work fine.

y4mi
AMD "just works" even without installing the binary blob from the vendor.

as a matter of fact, there is no need to install said binary as its most likely already upstreamed to your distribution of choice.

but nvidia still reigns supreme if we're talking actual performance ... at least after you've installed said binary blob ;)

and i can say from personal experience that the nvidia 10x0 drivers were terrible in the first ~6 month after their release. fan kept jumping between 10-80% for example. Haven't had any issues in at least a year though, but i'd expect the same kind of issues on any new chipset by nvidia, as they havent open sourced their drivers

llukas
https://www.phoronix.com/scan.php?page=news_item&px=Radeon-V...
igneo676
Is there a decent thin-and-light AMD laptop? I'm looking to buy the successor to my Dell XPS 13 and would strongly prefer AMD if I can find something ultrabook-like with a QHD-or-above-resolution screen.
YMMV, I suppose. Some people seem to have good experiences with their driver, but mine have been terrible. I've used Ubuntu as my daily driver for 7 years now, and Nvidia's driver has been the #1 source of crashes and configuration problems. Half the time I can't even get 3D accel working with it turned on; I spend hours trying different versions that are reported on some forum to have fixed some problem, but they don't, and I can't figure out why things aren't working. Upgrading from 16.04 to 18.04 borked the whole system and I had to switch to Nouveau. The list of Nvidia driver pain goes on forever, across multiple installs of Ubuntu.

All my experience has been on laptops and there are lots of reports of Optimus related problems on Linux so there's that. I gave up on the hope of having Optimus actually work years ago. I would like to just be able to enable the latest version of Nvidia's driver and see my laptop start using the discrete gpu. That would be a big step forward from the past 7 years.

FWIW there's Linus' famous Nvidia F-bomb :) "Nvidia has been the single worst company we've ever dealt with" https://www.youtube.com/watch?v=IVpOyKCNZYw

Doxin
You should give the SGFXI [1] script a try. It's a bit of a hassle to use video drivers from outside the apt repos, but that script basically always manages to set up a working driver for me.

[1] https://smxi.org/docs/sgfxi-manual.htm

rincebrain
FWIW, it's quite possible that's not Linux-specific; a bunch of people have problems with NVIDIA driver crashes on, say, Windows too.

There's a reason Microsoft shimmed the ability for the display driver to crash and restart without taking out the rest of Windows...

What's really stopping us here? For quite some time it was graphics driver support: https://www.youtube.com/watch?v=IVpOyKCNZYw

Is this still the case? I hear from many people that, at least on Ubuntu, that's no problem anymore. Are the graphics cards still overheating? Any experience in that area?

And yes, Valve could've done things better, but it's not so bad that you couldn't do anything with it. As said, at least on Ubuntu with Steam, gaming should be fine. And if we got one thing from Steam, it's that more and more games are being released for Linux as well.

Really, why hasn't it taken off by now? Is it really that users are so stupid that they don't see that a $0 offer is better than a $100 offer for an OS? Hard to believe.

grenoire
Not really. That $100 offer comes with better performance, support, and GPU drivers; users do care enough about those.
erikb
Is that still the case? Last I checked, the graphics support was there. In the time since the SteamOS announcement, they've pushed NVidia and friends to provide better Linux support as well.
flukus
Can't speak for Nvidia, but AMD has been improving rapidly since they released fully open-source drivers. Ubuntu 16.04 won't work well with newer AMD cards, but 17.04 works extremely well, and there are further improvements coming in the future.

Anyone wanting to play games on linux should be buying AMD at this stage.

sgift
Yes, it's still the case. The Linux support is far better than it was a few years ago, but still not even remotely comparable - NVidia (and I think AMD too, but I have an NVidia card) provide a new Windows driver with specific optimizations for almost every AA-to-AAA game. For example: the last Linux driver in the "short lived" branch came out on May 9; the last Windows driver, on June 29.
erikb
I see. Sad to hear that.
Pica_soO
I always assumed the best approach would actually be to have console-hardware emulation for gaming.
You're asking a question with a very complex answer.

I'll start with a short answer - AMD wants at minimum a reasonably good baseline of functionality on linux, out of the box. If that answer doesn't make much sense, please read on for the complex bits.

To provide some historical context, Linus gave a VERY public, VERY brutal rant on NVIDIA and their drivers four years ago.[0] At the time, NVIDIA's closed drivers were an opaque blob which basically reimplemented the entirety of OpenGL. These collided with everything else.

The open drivers (nouveau) were slow and far behind in features. The situation was bad enough that you couldn't necessarily even install a linux distro on a system with an NVIDIA chip, because the open drivers wouldn't work well enough for X and/or the desktop environment to initialise properly, let alone remain up and functional.

In effect, if you wanted to run linux, you were best off without NVIDIA.

Fast forward a year and a half. NVIDIA had come out and committed to improving the linux driver situation. They still couldn't open source their current-generation drivers, but they had realised that the horrible reputation of their hardware on linux was going to be a persistent PR nightmare, and thus an existential threat to their growing mobile division - a space where they were up against Imagination Technologies. (Don't get me started on ImgTec, please...)

NVIDIA needed to get their hardware and software processes aligned in a way that they would work reliably out of the box on linux, everywhere. Even if the user only needed to keep the freshly installed or upgraded system up long enough for them to download and install the closed drivers, the system really should not break. Incidentally this meant that even the open drivers should be "good enough".

Over the past 3+ years, the situation has improved. NVIDIA has managed to shed their reputation of being completely broken on linux, they have worked closely with kernel folks to get their more recent hardware supported sensibly out of the box and in the process (I believe) they have managed to reduce the code delta between their windows drivers and linux drivers.

It helps that during the same 4 years we have had OpenGL on mobile drive the separation of duties between EGL and GLES (v2+). These changes, with all the refactorings, have also provided a cleaner split on desktop - to the point that it is no longer absolutely necessary to provide a full OpenGL implementation. You can, for the most part, expect that EGL just works; that DRM and KMS both just work; and that your highly optimised GLES implementation can happily live on top of these layers.

As a result, your closed driver offering has less to override. Of course it's going to be less unstable!
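
To make that split concrete, here is a minimal sketch (my own illustration, not from the thread) of a client initialising EGL without ever asking the driver for a full desktop OpenGL implementation - roughly the "EGL just works" layer described above. Build with cc egl_probe.c -lEGL (the file name is arbitrary).

    /* Minimal EGL bring-up sketch: no desktop OpenGL required from the driver. */
    #include <EGL/egl.h>
    #include <stdio.h>

    int main(void) {
        /* Ask for the default display; the EGL implementation decides what
         * that maps to (X, Wayland, or bare DRM/KMS). */
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        if (dpy == EGL_NO_DISPLAY) {
            fprintf(stderr, "no EGL display available\n");
            return 1;
        }

        EGLint major, minor;
        if (!eglInitialize(dpy, &major, &minor)) {
            fprintf(stderr, "eglInitialize failed\n");
            return 1;
        }
        printf("EGL %d.%d, vendor: %s\n", major, minor,
               eglQueryString(dpy, EGL_VENDOR));

        /* A GLES2+ context would be created from here via eglChooseConfig /
         * eglCreateContext - none of which needs full OpenGL from the driver. */
        eglTerminate(dpy);
        return 0;
    }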

Disclosure: in my previous job, I helped integrate a couple of EGL+GLES driver stacks with Wayland on mobile systems. I learned to hate mobile GPU drivers with a passion.

P.S.: I haven't read enough about Vulkan to know if it improves things or not.

0: https://www.youtube.com/watch?v=IVpOyKCNZYw

aseipp
> In effect, if you wanted to run linux, you were best off without NVIDIA.

Depends on your choices. I guess if you throw Intel in there as an option, then maybe. But as of 4 years ago, if you were running a desktop, Nvidia was pretty much the only acceptable choice if you wanted discrete graphics of any form on Linux, IMO, and driver quality was a massive part of why that was true.

Optimus, though - that's what was really the debbie-downer for Linux/Nvidia. Optimus is what inspired the question that led to Linus's "Fuck you" rant, because Optimus support was so bad, and it is still bad.

But the mainstream cards have worked fine for a long time, and, at least for me and everyone I know -- were the only acceptable ones for Linux, until relatively recently, and most certainly as of ~5yrs ago. As opposed to AMD, where I don't think I ever heard of a single instance of fglrx ever being anything but a nightmare.

mslusarz
You don't say it directly, but by saying that "The open drivers (nouveau) were slow and far behind in features." and then that the situation improved because of Nvidia's involvement, you are not being honest.

Nvidia hasn't improved performance or contributed features to nouveau.

Their changes are limited to code that is mobile-chip-specific, and from time to time they contribute some reliability fix that affects non-mobile chips, but only if it benefits mobile.

Since Maxwell 2 (9xx+), their chips are designed to be hostile to nouveau by requiring firmware loaded by the driver to be signed by Nvidia (the hardware refuses to load firmware that wasn't signed by them). That means that without Nvidia's blessing, nouveau can't, for example, change the fan speed (but it can still change clocks - how ridiculous is that?).

Nvidia contributed signed firmware loading for non-mobile chips (so-called SecureBoot), but only because it's also required by mobile. And they still have not released enough firmware for desktop cards to be usable...

bostik
> You don't say it directly, but by saying that "The open drivers (nouveau) were slow and far behind in features." and then that the situation improved because of Nvidia's involvement, you are not being honest.

> Nvidia hasn't improved performance or contributed features to nouveau.

Fair point, I didn't realise it could be read that way. Thank you.

Nvidia has contributed enough fixes to make their hardware sort of respond out of the box. Yes, primarily for devices in the mobile space, and occasionally for non-mobile when the same fixes happen to apply. I believe this is a direct result of the same hardware designs being used across the board.

I didn't mean NVIDIA were being particularly nice, although I didn't know about the active hostility against nouveau. (IIRC their driver employees are contractually prevented from contributing to nouveau, but I can't find a reference. At least there I can understand the reason.)

Some background. https://www.youtube.com/watch?v=IVpOyKCNZYw

Shouldn't be a problem anymore under Linux, as most distros today install the Nouveau drivers by default. https://nouveau.freedesktop.org/wiki/

shmerl
Nouveau has no reclocking, so it's practically useless as is. If you want a working open driver - use AMD.
greydius
Nouveau has no CUDA support, unfortunately.
zanny
It can't, Nouveau is not developed by Nvidia.
valarauca1
CUDA is proprietary to Nvidia.

CUDA only exists because Nvidia is attempting to pretend OpenCL, Vulkan, and DX12 don't exist [1]. These require hardware scheduling on the GPU to switch shaders, rather than dedicating X amount of chip hardware to Y shader for Z ms.

It should be noted that for GPGPU compute, Nvidia is not the correct choice. The AMD RX 480 has 5.8 TFLOPS @ $200 ($37/TFLOP) vs. the Nvidia GTX 1080's 8.9 TFLOPS @ $600 ($67/TFLOP). In reality, you should be doing your GPU programming in OpenCL so you are GPU-agnostic. You can switch vendors or platforms seamlessly (in most cases, if you avoid proprietary extensions) and even target AMD64, ARM, and POWER8/9 hardware.
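
To illustrate that vendor-agnosticism (a minimal sketch of my own, not from the thread; build with cc list_cl.c -lOpenCL, file name arbitrary): the identical host code below enumerates whatever AMD, Nvidia, Intel, or CPU OpenCL devices happen to be installed.

    /* Vendor-agnostic OpenCL device discovery: the same host code runs
     * unchanged against AMD, Nvidia, or Intel OpenCL implementations. */
    #define CL_TARGET_OPENCL_VERSION 120
    #include <CL/cl.h>
    #include <stdio.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        clGetPlatformIDs(8, platforms, &nplat);

        for (cl_uint p = 0; p < nplat; p++) {
            char pname[256];
            clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                              sizeof pname, pname, NULL);
            printf("platform: %s\n", pname);

            cl_device_id devs[8];
            cl_uint ndev = 0;
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
            for (cl_uint d = 0; d < ndev; d++) {
                char dname[256];
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                                sizeof dname, dname, NULL);
                printf("  device: %s\n", dname);
            }
        }
        return 0;
    }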

That being said, I own a boatload of Nvidia stock because their marketing is excellent. Really, marketing is all 80% of people pay attention to, and CUDA has some great marketing around it. In reality, CUDA is slower than OpenCL (even on Nvidia's own platforms) and no easier to work in.

[1] https://postimg.org/image/vsnidk8p5/

maksimum
I agree with your point about avoiding vendor lock-in, something I experienced for myself with MATLAB. I also happened to buy an RX 480 recently, so I'm happy to hear it's good for GPGPU.

But I'm curious how the FLOPS on these cards were measured. For example, one concern I have is that these two cards presumably have slightly different levels of parallelism, so it may be more or less difficult to extract the full performance from a particular card due to parallelism overhead. Then there's driver overhead, ease of programming, etc.

valarauca1
FLOPS is always calculated via the simple formula

      FLOPS = F * f * 2

Where F is the # of FPU front ends (SIMD and scalar). This is wrong because scalar math is often slower than SIMD, and compute kernels rarely run on the scalar pipeline anyway.

Where f is, well... the clock rate in Hz (cycles per second). This is wrong because stalls happen - memory transfers, cache misses, etc. It is also wrong because the clock rate is throttled, and you are not always at the maximum boost clock.

Then multiply by 2 for FMA (fused multiply-add). This is wrong because not every operation is a one-cycle FMA; division can take many cycles (>100). Also, scalar pipelines don't have FMA.

Ultimately all vendors use the same crappy calculation, so we are comparing apples to apples. Just rotten apples to rotten apples. It gives you an ideal circumstance you can optimize towards but never actually attain.
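
As a sanity check, plugging public spec-sheet numbers for the two cards discussed above into that formula reproduces the headline figures (a worked sketch; the shader counts and boost clocks below come from the vendors' spec sheets, not from this thread):

    /* The vendor FLOPS formula, applied to the two cards above. */
    #include <stdio.h>

    int main(void) {
        /* RX 480: 2304 stream processors, 1266 MHz boost clock. */
        double rx480 = 2304 * 1.266e9 * 2;
        /* GTX 1080: 2560 CUDA cores, 1733 MHz boost clock. */
        double gtx1080 = 2560 * 1.733e9 * 2;

        printf("RX 480:   %.1f TFLOPS\n", rx480 / 1e12);   /* ~5.8 */
        printf("GTX 1080: %.1f TFLOPS\n", gtx1080 / 1e12); /* ~8.9 */
        return 0;
    }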

Kubuxu
There is a difference in job scheduling between AMD and Nvidia, so if you want to optimize your OpenCL applications you can either do it for only one of them or do it twice.

The same applies to integer math, long double math, and so on.

nitrogen
Do the power consumption numbers cancel out the up-front price advantage?
valarauca1
The RX 480 consumes less power than the GTX 1080, so that would amplify the initial price advantage.
nitrogen
Is that total power, or power per GFLOP?
valarauca1
Are you seriously asking me to do division for you? Do you own a calculator, cellphone, or computer? Or are you actually that helpless?

AMD RX 480: 5.8 TFLOPS @ $200 ($37/TFLOP) @ 231 W peak (25.1 GFLOPS/W)

Nvidia GTX 1080: 8.9 TFLOPS @ $600 ($67/TFLOP) @ 318 W peak (28.0 GFLOPS/W)

See the Furmark benchmark for wattage values [1].

A typical US electricity price is $0.12/kWh [2]. So the delta cost of the GTX 1080 vs. the RX 480 would be mitigated by the GFLOPS/W efficiency savings in 4 years, 4 months - meaning that on a typical 2-, 3-, or even 4-year hardware replacement cycle the extra cost will NEVER be recouped.

[1] http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1... for load wattage numbers

[2] http://www.npr.org/sections/money/2011/10/27/141766341/the-p...
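
A sketch of that break-even arithmetic under one possible load model (the assumptions are mine, not the parent's: both cards grind the same fixed workload 24/7, and each card's power draw scales linearly with the fraction of its peak FLOPS used). The exact break-even is very sensitive to these assumptions, but under any plausible model it lands well past a 2-4 year replacement cycle:

    /* Break-even sketch: how long until the GTX 1080's power savings on a
     * fixed workload pay back its price premium. Assumed load model only. */
    #include <stdio.h>

    int main(void) {
        const double workload_tflops = 5.8;   /* fixed job: the RX 480's peak  */
        const double usd_per_kwh     = 0.12;  /* typical US price quoted above */
        const double price_delta     = 600.0 - 200.0;

        /* Watts to sustain the workload, scaled from the Furmark peaks above. */
        double rx480_w   = 231.0 * (workload_tflops / 5.8);
        double gtx1080_w = 318.0 * (workload_tflops / 8.9);

        double kwh_saved_per_year = (rx480_w - gtx1080_w) * 24 * 365 / 1000.0;
        double usd_saved_per_year = kwh_saved_per_year * usd_per_kwh;

        printf("GTX 1080 saves $%.0f/year in power\n", usd_saved_per_year);
        printf("break-even on the $%.0f premium: %.0f years\n",
               price_delta, price_delta / usd_saved_per_year);
        return 0;
    }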

nitrogen
Thanks for the info.
nhaehnle
It's worth pointing out that ROCm is basically AMD's answer to CUDA. Similar programming model and everything.

Let's hope it gets picked up by machine learning frameworks etc., because this market badly needs the competition, as your comparison of per-dollar raw performance numbers shows.

Linus gives the best answer: https://youtu.be/IVpOyKCNZYw

tl;dr: Nvidia, fuck you.

Apr 06, 2015 · 11 points, 0 comments · submitted by dataker
NVidia has a history of not playing nice with Linux. Allow me to point to a small comment Linus Torvalds made about Nvidia a while ago (NSFW):

http://www.youtube.com/watch?v=IVpOyKCNZYw&t=1m41s

All I can think of is when Linus told Nvidia "F... you!" and flipped the bird.

http://youtu.be/IVpOyKCNZYw

This isn't just an isolated incident either, but I'm no shining example of acting professionally.

HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.