HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
SGI's $250,000 Graphics Supercomputer from 1993 - Silicon Graphics Onyx RealityEngine²

Dodoid · Youtube · 312 HN points · 4 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Dodoid's video "SGI's $250,000 Graphics Supercomputer from 1993 - Silicon Graphics Onyx RealityEngine²".
Youtube Summary
An in-depth look at my Silicon Graphics Onyx RealityEngine², a $250,000 graphics supercomputer from 1993. Includes some history and background regarding the Onyx, a physical overview of the machine, a teardown and look at the hardware, and some games and demos.

TIMESTAMPS:
0:00 - 2:01 Musical Introduction Segment
2:02 - 3:10 Spoken Introduction
3:11 - 5:16 Onyx Background Information
5:17 - 11:11 Physical Overview
11:12 - 17:13 Teardown and Hardware
17:14 - 19:27 Software, Demos, and Games
19:28 - 21:38 Spoken Outro
21:39 - 21:52 Standard Dodoid Outro

That RealityEngine paper: http://go.dodoid.net/realityenginepaper
IRIX.cc: https://irix.cc

Intro Song: AdhesiveWombat - Distortotron

--- Contact Me ---
Please don't use any YouTube contact features; I never read them. If you need to get in touch, please contact me on Reddit as a private message to /u/DodoDude700 or comment on one of my videos.

--- How I make my videos ---
I use Final Cut Pro on my custom built Xeon E3 Hackintosh, and film with a Canon EOS 60D. I use a pair of large fluorescent studio lights for most of my work, but may use various other types if filming away from the room I usually film in. I have a really overfilled lab, which is usually where I dig up the tech seen in my videos.

--- ALL TRADEMARKS AND IMAGES BELONG TO THEIR RESPECTIVE OWNERS ---
I do not claim to own any of the trademarks mentioned in my videos. Some images may be obtained from third party sources. If you need to contact me for legal reasons, please use one of the above contact methods.

--- Music ---
The music used in the Dodoid and Dodoid Advent Calendar 2016 intros is AdhesiveWombat - Bombs. The music used in the Dodoid Advent Calendar 2017 intro is AdhesiveWombat - Tinybit. Other AdhesiveWombat songs are sometimes used.
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
SGI is Silicon Graphics - they were the pioneers in 3D animation, especially for movies, research, etc. Jurassic Park was entirely rendered on SGI workstations.

These workstations were incredibly powerful and extremely expensive (think > $250,000 for the higher end gear in the 1990s) but pretty amazing.

Also, for the time, it had very advanced user interfaces. SGI IRIX was the name of the OS which ran on these machines.

https://www.youtube.com/watch?v=Bo3lUw9GUJA

Fun vid that might bring back some memories :) https://www.youtube.com/watch?v=Bo3lUw9GUJA "SGI's $250,000 Graphics Supercomputer from 1993 - Silicon Graphics Onyx RealityEngine²"
Jan 05, 2020 · 6 points, 0 comments · submitted by turrini
Sep 02, 2018 · 306 points, 138 comments · submitted by bane
deerpig
I have a slightly newer Infinite Reality R10000 sitting beside my desk here in Phnom Penh. A professor at Berkeley gave it to us and somehow we managed to get it to this side of the world. It uses enough power to run an apartment block and yes, it is noisy. I will eventually gut it and install a six-node Linux cluster inside.

I used these boxes in Hong Kong in the early 90's, then later in Japan in the late 90's for a number of projects and I was still using an Indy as my main desktop until 2001.

SGI created hardware that almost took your breath away, you knew you were seeing a future that not many people had the privilege to see back then in person. To me, having the box sitting next to me every day, with "infinite reality" label on the top reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...

spilk
please don't gut rare vintage hardware to install commodity x86 hardware.
Annatar
Yes, please don't do it, for no primitive intel-based system has ever come close to any SGI hardware in terms of elegance. It would be a sacrilege.
jsjohnst
10000% agree. Either sell it or even better, donate it to a computer history museum somewhere.
AriaMinaei
I wonder if it's still possible to put together hardware that's a few years ahead of current high-end products.

Like, can you arrange, say, ten flagship graphics cards for realtime rendering? Do we have game engines that can scale to that number?

jamesfmilne
The company I work for builds machines like that for film post production. I’ve been writing software for one with 8 Titan Xp GPUs, and with its disk arrays and GPU expansion chassis it’s about the size of an Onyx, and draws considerably more power :-)

Unfortunately I wouldn’t say it feels like the future, more like a normal CentOS Linux desktop.

You’ll struggle to get a PC whose BIOS can handle much more than that too.

We used to build clusters for the same thing in the past, but that was largely standard supercomputing stuff, very similar to how the InfiniteReality machines were used. I believe our software once ran on Onyx machines in the dim & distant past.

So in short I wouldn’t say having loads of GPUs is enough to make it feel futuristic.

erikpukinskis
I would approach it like this:

1) find some graphics problems which people say are not possible on any near-term hardware

2) study the algorithms and identify low level calculations which, if you could do orders of magnitude more of them, would allow you to solve the problem.

3) get a bunch of FPGAs and try to design a machine which can (very slowly) run that architecture

4) once you’ve got it working, slowly replace the FPGAs with ASICs

5) build a box with 16-64 of everything.

I would avoid polygons, since the current architectures are all extremely good at filling polygons. SDFs and raytracing are where you may find the “not on current gen” problems.
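
For context on the SDF suggestion: the inner loop of raytracing an SDF is sphere tracing, which advances each ray by the scene's distance value until it hits a surface or escapes. A minimal Python sketch, assuming a single analytic sphere (the sdf_scene and sphere_trace names are just illustrative):

    # Minimal sphere tracing of a signed distance field (SDF):
    # march a ray forward by the distance to the nearest surface
    # until it hits (distance ~ 0) or escapes.
    import math

    def sdf_scene(p):
        # SDF of a unit sphere at the origin: distance from p to its surface.
        x, y, z = p
        return math.sqrt(x*x + y*y + z*z) - 1.0

    def sphere_trace(origin, direction, max_steps=64, eps=1e-4, max_dist=20.0):
        t = 0.0
        for _ in range(max_steps):
            p = tuple(o + t * d for o, d in zip(origin, direction))
            d = sdf_scene(p)
            if d < eps:
                return t          # hit: distance along the ray
            t += d                # safe step: nothing is closer than d
            if t > max_dist:
                break
        return None               # miss

    # One ray fired down the -z axis from z = 3 should hit the sphere at t = 2.
    print(sphere_trace((0.0, 0.0, 3.0), (0.0, 0.0, -1.0)))
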

a1369209993
> SDFs and raytracing

An easy one would be: have each GPU raytrace a (say) 320x240 scene, each offset by fractions-of-a-pixel[0] or multiples-of-a-screen from each other, then have a final GPU stitch them together into a full-res video.

0: If you do this with 60x1080 resolution, you might be able to replace the final GPU with a dumb hardware multiplexer, though that would make compositing painful at best.
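
For the multiples-of-a-screen variant, the final GPU's job is plain tile placement. A minimal Python sketch, assuming each GPU hands over its 320x240 tile as a nested list of pixel values (the stitch helper is hypothetical):

    # Stitch per-GPU 320x240 tiles (offset by multiples of a screen) into one frame.
    TILE_W, TILE_H = 320, 240

    def stitch(tiles, cols, rows):
        """tiles[r][c] is a TILE_H x TILE_W list of pixel values rendered by one GPU."""
        frame = [[None] * (cols * TILE_W) for _ in range(rows * TILE_H)]
        for r in range(rows):
            for c in range(cols):
                tile = tiles[r][c]
                for y in range(TILE_H):
                    row = frame[r * TILE_H + y]
                    row[c * TILE_W:(c + 1) * TILE_W] = tile[y]
        return frame

    # Example: a 2x2 grid of dummy tiles, each filled with its GPU's id, gives a 640x480 frame.
    tiles = [[[[gpu] * TILE_W for _ in range(TILE_H)] for gpu in (2 * r, 2 * r + 1)] for r in range(2)]
    frame = stitch(tiles, cols=2, rows=2)
    print(len(frame), len(frame[0]))  # 480 640
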

jamesfmilne
That’s literally what we used to do for our cluster colour grading systems for films.

We had hardware that would merge DVI from up to 8 GPUs, in separate nodes, and produce a single image.

gammatrigono
Step 0) is: be Intel or AMD. Nobody else has the money to do what you're talking about.
rys
Take a look at the game Claybook. Fully raytraced visuals at 60fps against a dynamically updated SDF scene representation, complete with fluid sim for the main game mechanics. All on current hardware, including stuff most people consider “slow” these days (Xbox One).
octorian
As others have said, the whole point of these machines is that they were NOT simply creative combinations of current high-end commodity products. The whole system architecture, from the CPU, memory, I/O, to the graphics hardware, was custom designed. That's something you really don't see very much of these days.
wowtip
And adding to the magic, SGI machines were something most people would never get their hands on. A maxed out graphics workstation today is still just an "ordinary PC" but more expensive. A (big) upgrade, but not something with magic dust.
matt2000
Not in PCs, but I wonder if the iPhone would count as a largely custom designed computer. They do custom silicon for a lot of the system, and get much higher performance than commodity phones as a result. Perhaps the spiritual successor to these high end workstations in an odd way.
pasabagi
I think Linus mentioned something along those lines: how the work on using Linux for supercomputers was the foundation of Linux for phones. It's a point Tanenbaum makes in Modern Operating Systems: what is old is new, and what is new is soon to be old; as the bottlenecks move, ideas that were abandoned get resurrected.
pvg
You could but it's unlikely to impress as much - some fundamental transitions only happen once. I think in this case the big advance is realtime 3d graphics with lighting and texture mapping. Perhaps a really fancy VR setup might have a similar effect although to me, it's less obvious that VR will be something every computer has, the way it seemed obvious that 3d graphics is something every computer will get much better at, in the early 90s.
whywhywhywhy
Not quite 10, but here’s a blog post on making a 7-GPU machine for 3D path-tracing rendering: http://tomglimps.com/7_gpu_workstation-1000_octanebench/
brianpgordon
Supposedly you could play video games on one of Nvidia's more recent data center products:

https://www.nvidia.com/en-us/data-center/hgx/

It shows up to the host computer as one really big GPU. Of course, you're going to get worse performance than just a single Titan V because it can handle any game already and there's inevitably going to be latency added by doing work over NVLink/NVSwitch. Those massive GPU products are targeted toward offline rendering or machine learning applications, not so much realtime simulation.

tachyonbeam
I don't think most software can scale to 10 GPUs out of the box. AFAIK, it would be hard to even find a motherboard that would fit them. However, a company could conceivably buy a workstation with 256GB RAM, two 32-core Threadripper CPUs and four Nvidia 2080Ti. That would definitely put you a few years ahead of the average "consumer PC" or next-generation console.

Sidenote: I've read that John Carmack and id Software liked to develop on workstations that were "ahead of the curve" that way. It gave them an edge, in that they were able to develop future games for hardware that didn't yet exist, but knowing that consumer PCs would eventually catch up.

I think what made these SGI computers really amazing is that there was no such thing as accelerated 3D graphics in the consumer market at the time (or much real-time 3D for that matter). They also had a cool Unix operating system with a UI that was way ahead of anything you could get on a consumer PC. I can also imagine that it was a much, much more comfortable development environment than developing on, say, MS-DOS, which didn't even have multitasking.

nickpsecurity
That's a good point but underselling it. My favorite thing about them was the single system image that removed bottlenecks and complexity for developers at the same time. PCs were using slow or redundant buses to connect high-speed components. SGI removed redundancies, used fast interconnects (GB/s), made them low-latency (microseconds vs milliseconds), and NUMA-enabled them. The last part meant that sending data to nodes didn't take middleware like MPI: you might just do a load and store instruction like on a single node. Hardware took care of communications with cache coherency. You did have to design for good locality to minimize moving data across nodes, though.

Add the other features like reliability (esp hot-swapping), serviceability, and security (Trusted IRIX) to have some incredible machines. I always wanted inexpensive hardware with hot-swap, RAID, and something like NUMAlink connecting it. Never quite got that. One company did make a NUMA for AMD and Intel:

https://www.numascale.com/Scale-up-products/
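
A loose analogy for that programming-model difference, with threads and a queue standing in for nodes and an interconnect (this is only an illustration, not how MPI or NUMAlink actually work):

    # Contrast, as an analogy, message-passing vs shared-memory access:
    # with a single system image, a remote node's data is reached with ordinary
    # loads and stores; with middleware, it has to be packed, sent, and unpacked.
    import threading, queue

    data = list(range(8))          # pretend this lives "on another node"

    # Message-passing style: explicitly request and receive a copy.
    def mpi_style():
        q = queue.Queue()
        def owner():                # the node that owns the data
            q.put(sum(data))        # packs a result into a message
        threading.Thread(target=owner).start()
        return q.get()              # wait for the message

    # Shared-memory style: just read the remote memory directly;
    # cache coherency (in hardware, on a NUMA machine) keeps it consistent.
    def numa_style():
        return sum(data)

    print(mpi_style(), numa_style())   # 28 28
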

tachyonbeam
I guess that's the difference between a workstation that's designed for performance and versatility before cost, and a PC, which is made to be affordable first. When the PC industry started, it was very much about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000. Things have changed quite a bit since, but many of the do-it-cheap, rushed-to-market compromises are still with us.
kragen
Sure, the PCs that started the PC industry — things like the Apple I and the MITS Altair — were indeed "about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000." But, long before 1993, most CPUs and components used in PCs were being produced specifically for PCs, with uses in things like lab equipment, industrial control, and workstations a rather smaller secondary market.
spitfire
Do you have contact information?
bcaa7f3a8bbc
>To me, having the box sitting next to me every day, with "infinite reality" label on the top reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...

In the last 3-5 years, there has been a clear revival of the cyberpunk subculture online. Many related hobbyist websites have appeared, lots of new cyberpunk-inspired independent art, music and games are being composed, new communities are formed, etc.

Themes include a general nostalgia for the 80s, especially vintage computers, and also the early-90s, pre-Web-1.0 era.

The reason? We can clearly see. The lost future that never comes...

spitfire
Links? I loved Neuromancer, Snow Crash and others of the time.
SSLy
More aesthetic oriented: http://reddit.com/r/outrun/
currymj
The recent game EXAPUNKS is definitely like this, and worth playing, although it's assembly language programming puzzles so might be a sort of busman's holiday for a lot of HN readers.
bcaa7f3a8bbc
Another example is Neocities, the old 90s Geocities personal webpage culture brought to life again.
nailer
Check out Daniel Suarez's Daemon and Freedom. Not saying more as I don't want to spoil anything.
aaaaaaaaaab
Are you talking about vaporwave?
SSLy
That's just one face of the movement.
erikpukinskis
It’s coming. Birthing something like that requires patience. And it always takes longer than you think.

It will come slowly at first, and then all at once.

bcaa7f3a8bbc
>The future is already here — it's just not very evenly distributed.

Nobody is denying it; indeed we see many related developments.

However, it's not only about "the lost future that never comes"; I think it's also due to people being increasingly alienated by the current state of computing. Primarily, it's about the lost mentality of the future.

Cyberpunk promised a future where computing is the disruptive technology. Since 2006, the ever-increasing clock speed has come to a halt. Since 2013, the general performance of Intel processors has remained constant. PC sales keep declining. No major breakthroughs in practical operating systems have been made beyond Unix (http://herpolhode.com/rob/utah2000.pdf).

Cyberpunk promised a future where megacorps and governments become increasingly oppressive to the population with technology; that has occurred as predicted. Cyberpunk also promised that "cyberspace" would be the electronic frontier where technologists reclaimed liberty... Today, the damn web browser that runs all the crap written in Electron and JavaScript is hardly the "frontier", and neither are Facebook, Twitter and the endless timeline of buzz, which make up 90% of Internet activity. Also, from 2000-2013, decentralization was killed, not built. More importantly, today we don’t even know how to waste time on the Internet anymore (https://news.ycombinator.com/item?id=17068138). Though it has still arguably occurred to some extent, it progressed slowly; even basic tech like HTTPS was only deployed after Snowden.

A future where anything seemed possible and all of it was magical is definitely gone. But the new generation of developers, armed with decentralization, P2P, cryptography, and trustless systems, whether it turns out to be successful or not, would bring the Internet back to its cyberpunk ideals, revive the dream and set history forward.

corysama
I was very fortunate. This is the machine on which I originally learned OpenGL. It was an undergrad lab job where I was paid $5/hour to eventually write http://paulbourke.net/geometry/polygonise/marchingsource.cpp But, in order to do that I had to learn how to 3D at all.

The most fun I had with the machine was playing GlQuake 1 at 1280x1024 on a T1 line and bragging about it on the chat where everyone thought I was lying. Of course, a year later the 3DFX Voodoo 1 would come out and deliver approximately 1/4 the fill rate for 1/1000 the price.

Besides that, it was a fine machine. Lots of cores that I didn’t know how to utilize. Occasional bugs in the OpenGL implementation that might have been me accidentally triggering edge cases. Command line compiles, gdb, lots of core dumps.

SGI really lost its way after this. Years later they would be showing at SIGGRAPH with multi-wall-sized displays, but without the programmable shading that would be all the rage not just right then, but for the next two decades... They went so far as to publish papers showing that with hundreds of passes and temporary buffers using fixed-function shading, you could eventually get mostly the same results as the new-fangled programmable BS...

The brilliant engineers of SGI soon moved on to become founding members of Nvidia, ATI, Imagination and other graphics tech companies.

pandaman
>a year later the 3DFX Voodoo 1 would come out and deliver approximately 1/4 the fill rate for 1/1000 the price

You are probably misremembering things. GLQuake came out in 1997, after 3DFX Voodoo in 1996, not the other way around.

theandrewbailey
ATI was founded in 1985, long before this machine. Maybe you meant ArtX? They eventually got bought by ATI, and later, AMD.
sureaboutthis
If you mean in 1993, you mean GL, not OpenGL, I think. I worked at SGI in 1992.
analbeads01
OpenGL was introduced in '92 and was basically IrisGL with bits removed. Anyway, by the time Quake was released in 1996, OpenGL was very much a thing.
TomVDB
> The brilliant engineers of SGI soon moved on to become founding members of Nvidia, ATI, Imagination and other graphics tech companies.

It’s called “the University of Silicon Graphics.”

jaysonelliot
The VRML demo at 18 minutes in really took me back: https://youtu.be/Bo3lUw9GUJA?t=18m

I remember how convinced some people were that VRML was going to be the future of the Web. 3D graphics in your browser, all coded with a simple markup language! How could it fail?

Ultimately it was just too complicated for the average web surfer to navigate, especially in the '90s. For a while, though, it really looked like the visions of virtual worlds that cyberpunk fiction loved might show up in your Netscape browser.

twic
VRML! I wrote a modest amount of VRML by hand in the early '00s. One project was a 3D model of a castle, which I worked on in large part because it was something I could do together with my then-girlfriend - she would describe how the castle should look, and I would code it up. I learned a lot about how to use parametric prototypes, so I had a kit of parts I could rapidly use, to speed up the feedback loop.
hpcjoe
I wrote a code called genvrml which converted molecular models from any format (using Babel at the time for conversion) to VRML for display. I converted a bunch of my models for this. I still have them here. And if I dig around, I might even have genvrml as well.

I did believe that this was fairly game changing tech. I showed my 3D models sticking 2m out from a rear projected Reality Wall powered by an IR engine in the Detroit office.

But SGI was beset with inane leadership, and we began our terminal decline mere weeks after I was hired ABD from grad school. I finished up my thesis and developed an autoinstaller code for Irix at the same time. Writing the manuals in LaTeX.

ipsum2
I remember reading an old copy of a VRML book from the library, marveling at the syntax (computer couldn't run it).

There's some modern VRML-esque things now, A-Frame and ReactVR come to mind.

mattlondon
I also have fond memories of marveling at VRML scenes on TV when my computer and dialup internet connection weren't really ready to experience it.

Thanks for the link to A-Frame though! This looks great - I am loving the declarative nature of it, and the scene-inspector seems like a really useful tool.

narrator
Virtual Reality is so weird. It's a future that never really got here. Even now that the tech has caught up, it still just doesn't get a lot of enthusiasm from the market.
twingie
Well, if you look at any form of interactive 3D multiplayer game, it really is here, it's just that there's so much to pull apart, in terms of the nuance of online interaction with humans, that we're reluctant to move beyond the cartoonish worlds of video gaming.

We have tons of really popular game titles, and millions of people playing or watching or chatting as tournaments unfold. We fill convention centers with people playing games, and stream a webcam feed of people at conventions playing the games. But it's all still games: World of Warcraft, Fortnite, Minecraft, EVE Online.

So, what if it wasn't a game? What if it was more than video conferencing? What if there was more going on, and an immersive system of interaction closer to a first-person gaming engine were as relevant, in social terms, as Facebook, Twitter, Bitcoin, Uber, Amazon, Google, GitHub, along with email, chat, and file sharing, all as a single service?

What if all your home automation and mobile devices were integrated? What if it gave your GPS location to the world, and authenticated you selectively as context sensitive pseudonyms, or conversely your authentic legal identity, securely and as preferred?

How much of that is possible? And how much of it do we actually want? But most of all, if corporations collaborating with governments, or an open community of enthusiasts, put this in your hands, would you trust it? Would you put all your eggs in one basket like that?

How many IP addresses would a system like this require, to keep each context secure from another, to preserve pseudonyms, and not leak GPS location information? What about background audio, and geolocation by timing attacks? Plus the basics of platform security, knowing what we see every day in terms of zero days getting dropped, and leaving people out in the open?

It's cool to think about, but our technology as it exists, is a drafty outhouse, compared to the brick shithouse we really need.

jonhendry18
Part of it is that interfaces that require "walking" through a space or looking around for things in a 3D space are terrible if there's no inherent justification for the movement (as there is in a game of exploration).

I don't want to have to navigate a space in order to find something. Navigating space is a limitation of physical objects; replicating it in an online environment is an error. It's like if automobiles were designed so that steering and speed were controlled with reins.

nabla9
>it really looked like the visions of virtual worlds that cyberpunk fiction loved might show up in your Netscape browser.

It was exactly that.

I participated in one SGI demo around 1995. It included a technology demo and the grand vision from some guy I don't remember.

They had a few of these bad boys (bigger, refrigerator-sized models) and very fast internet running the Netscape browser. They wanted us to start developing for this new thing called Java that would allow code to run everywhere.

They demoed what they thought internet would look like after 3-5 years:

* 3d browsers with VRML. HTML is replaced with 3d virtual rooms with 3d surround audio.

* you can attach pieces of Java code to the VRML and freely interact with objects

* people would buy stuff on the internet using micropayments, possibly with fancy new digital currencies like DigiCash (this was still hand-waving). There was also this thing called E-gold a few years later.

* the demo included a virtual kitchen design app, virtual rooms with water fountains and Java-controlled robots, and fish that responded to their environment.

It was all mindblowing technologically, but none of the ideas were new to me because I had been reading Snow Crash (1992) and others. Jaron Lanier and VPL Research were also known to work with these things.

In fact, it looks like we are repeatedly trying to bring the same vision alive. Every wave of attempts gets us a little bit closer but always falls a little bit short.

In the current wave, Nvidia has replaced Silicon Graphics and there are new technologies doing the same things in VR and AR. Now we have cryptocurrencies, JavaScript, Wasm, WebGL, mobile GPUs, 5G. Most of the current bleeding edge will crash when the next recession hits around 2020 or so and some of it will survive. I'm looking forward to the next generation of technology emerging around 2025 ...

harel
Flashback scene begins.

When I was much younger my dad (who has a print business) was tasked with printing panels for an F-16 fighter jet cockpit, to be used in a simulator of the aircraft. The simulator was developed by a private company. One Saturday he took me to their office. They had one of these and gave me a demo of it. Until then all I knew of was standard consumer computers (Commodore, PC, Macs, etc.). In another pretty big room they had a lot of Amiga computers used to make the 3D models. I then had a go at the simulator: a 180-degree screen, a real cockpit, and pretty amazing 3D graphics for the time. Flew my jet through a mountain tunnel. I think the only thing that impressed me as much as the simulator itself was the SGI. I've never experienced that kind of futuristic computing power.

Flashback scene ends...

mveety
I remember seeing Boeing's simulator at their factory in the early 2000's that was powered by a room filled with SGI machines running the graphics, controlling the hydraulics, and lying to the avionics. It totally blew me away.
sureaboutthis
I was the systems engineer for SGI, assigned mainly to McDonnell Douglas in St. Louis in 1992 for their flight simulators. If you think what you saw was something ...
harel
I'm sure my early teen eyes saw only a glimpse. The one I was shown was designed for the Israeli air force. This was the closest I got to sitting in an F-16 cockpit, and the closest I got to supercomputing power even films couldn't show me at the time. I do however remember them touting the $250K price tag.
sigvirt
Also around 1993, the Onyx with RE II was used by Namco and a company called Magic Edge to build actuated flight sim pods [1]. Coincidentally, also in 1993, NASA published a guide [2] to their Vertical Motion Simulator [3]. At the time, the VMS used several SGI IRIS machines as auxiliary display processors, but custom machines for primary.

[1] http://www.system16.com/hardware.php?id=832 - Namco Magic Edge Simulator Hardware at System-16, The Arcade Museum

[2] https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/199400... - Vertical Motion Simulator Familiarization Guide (pdf) (1993)

[3] https://www.aviationsystemsdivision.arc.nasa.gov/facilities/... - NASA Vertical Motion Simulator

One day, the din of the machinery paled and withered under a thunderous earth-shaking draconian roar - whiplashing my head out of the rack toward the windows - an F/A-18 tilted sideways at the edge of ground effect ripping a minimum radius turn back to from whence it came ... without ever leaving the airfield. Sigh. There is no simulator for That.

slobotron
Always loved the IRIX Motif and 4dwm aesthetics, and was lucky to have had access to some Indigo machine in school.

Was really impressed by how responsive the file manager was wrt. the immediate appearance of icons for files created from terminal. It also kept track of which folders were open in other windows, giving them different appearance - felt like one was interacting with actual objects.

Tried numerous times to get the same feel on Linux; there used to be a project called 5dwm to recreate the Indigo Magic Desktop experience, I should check if it is still around for additional kick of nostalgia...

mixmastamyk
I still move scroll bars and expect the indentation (to show where it was) to be there.
Fnoord
I don't know about the SGI Indigo, but the SGI Indy had relatively low latency [1]

[1] https://danluu.com/input-lag/

ChuckMcM
Fun times. Could have been the envy of all your engineer buddies with one of those bad boys all to yourself. What a difference 25 years makes.

Personally I find that learning about these machines is a good way to better understand computer architecture, because each one was, during its reign, some really smart bunch of people's idea of what was state of the art. You can see the evolution of ideas that way.

blattimwind
What I think is quite interesting about this era of machines is where the power goes. First off, there aren't actually many heatsinks in the machine, and those few that are used are really tiny. This means that the power used must be distributed rather evenly among all components, even on the CPU boards. The CPUs can at most pull 10-15 W with those heatsinks, meanwhile the big ASICs spread around everywhere can probably sink perhaps 3-5 W with no extra heatsink. Compare this to modern systems, where the (single chip) chipset has a TDP of ~5 W, while the CPU(s) run anywhere from 50 to 100+ W.

So the gap between high-power and low-power components was way, way smaller back then, and I'd wager that a huge chunk of power in these SGI machines went into their high-speed bus systems.

Another big difference is how the power system was laid out. Here are the specs of an SGI Onyx2 PSU [1]:

    1750 W @ 230 V
    3.45 V, 375 A
    5 V, 85 A
    12 V, 22 A
    5 V, 1 A (probably standby power?)
In that day and age, most of everything in there would've run off 3.3 V. So the PSU directly, centrally provides that voltage and due to the large currents droop and such will have to be considered carefully in the entire system design (and it almost certainly uses remote sensing at the backplane, perhaps even sampling various points in the system). This makes sense, because there are many components and the difference in power between them is kinda small.

Nowadays, a computer PSU provides essentially two voltages: 12 V and 5 V standby. It still provides "legacy" power, like 5 V and 3.3 V rails, but these are derived with simple buck-converters from the 12 V rail(s) in modern designs. Because a modern computer has few components requiring a lot of power, each of them has their own local converter sourced by 12 V to reduce losses.

[1] https://web.archive.org/web/20170314114558/http://sgistuff.n... https://web.archive.org/web/20170314160011/http://sgistuff.n...
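
As a quick back-of-the-envelope on those rail ratings (a small Python sketch; note the per-rail maxima add up to more than the 1750 W overall rating, so they cannot all be drawn at once):

    # Worst-case power available on each rail from the specs listed above.
    rails = {"3.45 V": (3.45, 375), "5 V": (5.0, 85), "12 V": (12.0, 22), "5 V standby": (5.0, 1)}
    for name, (volts, amps) in rails.items():
        print(f"{name:12s} {volts * amps:7.1f} W")
    print("sum of rail maxima:", sum(v * a for v, a in rails.values()), "W")  # 1987.75 W > 1750 W
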

ebikelaw
When Intel released the Pentium with mandatory active heatsinks that was called a big scandal by the industry magazines. Now of course we have smaller CPUs drawing 150W or more.
duncanawoods
Great video!

Not one for nostalgia, but I have found myself cooing over old computers the way most react to babies. A BBC B can bring a tear to the eye. If you are in the UK then http://www.tnmoc.org/ is well worth a visit. It's like a theme park for nerds.

Working at one of those noisy brutes must have been harsh. It brings back flashbacks of years spent in server rooms. Kids with their cloud computing don't know what they missed!

mothsonasloth
I'm a member of TNMOC and go maybe 2-3 times a year. The Acorn machines' RISC OS always fascinates me.
duncanawoods
Excellent idea, definitely something I want to support. I have just joined too.

I remember using an Archimedes after years of fighting to get my games to run well on a BBC Micro. Wow - it felt like cheating!

gdubs
When I was about 13 years old I had my mom call up SGI to ask if they could send us any information. This was the time of dialup internet / AOL, and info came to our house primarily via snail-mail. A few weeks later a thick package came in the mail, full of beautifully printed brochures and booklets on their machines of the day; Indigo, Indy, Onyx... I treasured those pamphlets, and dreamed of maybe one day getting to actually use an SGI.
mark_l_watson
I used two of these in the 1990s: one for Nintendo video game development (thanks Nintendo!) and one for a Virtual Reality system for Disney (thanks Disney!). Awesome tools, but the functionality is now more or less commodity gaming PCs.
martyvis
We started using SGI Personal IRIS systems, which I think were using 33MHz MIPS R3000 processors, in the late 80s. I think they might have been able to address a heady 32Meg of RAM. We were developing a graphical operation control system at the steelworks where I worked as an engineer. I remember them costing something like $25-30K each. So to enable enough developers we had to add serial terminals and X terminals. It was actually this aspect that got me involved in finding open source X Window solutions that allowed us to repurpose standard Intel PCs as terminals. Fun times. But the SGIs were ahead of everything else out there. While we didn't have a real work need for their 3D capability, it did let you see what the future could hold.
dev_dull
The scanlines on the CRT monitor reminded me how good the refresh rates were on that hardware back then. How did we move backwards so spectacularly with LCD panels? It’s only now that reasonably priced 144hz monitors are hitting the market.
fiddlerwoaroof
A low refresh rate on an LCD is a lot more bearable than on a CRT
rasz
What good would high refresh LCD do you in 1995-2005? From no acceleration at all to barely being able to run modern games at 40 fps (cod2 on 7800GTX in 2005).

CRTs need high refresh because they flash the picture content at you. LCDs do not flash; they maintain a steady picture.

saltcured
The main reason for 144 Hz refresh back then was not to tame normal CRT flicker. It was to get 72 Hz refresh of a stereoscopic pair. You had to use really fast phosphors to make this work and not have severe blurring/ghosting of the left and right images. Monitors intended for non-stereoscopic use could have slower phosphors which helped reduce the flickering effect at lower refresh rates.

Having this rapid refresh helped give persistence of vision for each eye, where if you had subdivided 72 Hz into 36 Hz cycle rates for a pair, you would have very annoying strobe effects visible to most people.

No application was rendering at 144 Hz. They might be lucky to sustain 20-30 Hz, but the framebuffer held both a left and right image and the video output switched back and forth between them on each frame scan. Complex visualizations often got down to 12 Hz or so and considered that acceptable as people would still perceive motion as in an animated cartoon.
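
A rough sketch of that field-sequential scheme, assuming an illustrative 24 Hz render rate (the comment above quotes 20-30 Hz): the output alternates eyes every scan at 144 Hz while the same rendered pair stays in the framebuffer for several scans.

    # Field-sequential stereo: the video output alternates left/right buffers
    # every scan at 144 Hz (72 Hz per eye), while the application only swaps in
    # a newly rendered stereo pair at its own, much lower rate.
    SCAN_HZ, RENDER_HZ = 144, 24

    for scan in range(12):                 # first 12 scan-outs (~83 ms)
        t = scan / SCAN_HZ
        eye = "L" if scan % 2 == 0 else "R"
        frame = int(t * RENDER_HZ)         # which rendered pair is in the framebuffer
        print(f"t={t*1000:5.1f} ms  scan out {eye} image of rendered frame {frame}")
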

localhost
We had one of these in Bryan Jones' laboratory in the Chemistry Department at the University of Toronto. We also had the 3D polarized shutter-based glasses as well - it was cool to see small molecules docking into active sites of enzymes right in front of your face. Of course, none of that stuff amounted to anything, and in hindsight it was a gigantic waste of money.

In my lab, we only had one of the SGI Indy machines - it was fun surfing the web on the original NCSA Mosaic browser on that pizza box machine.

mstade
Man, this brings back memories. I still have an O2 and an Octane system in storage. Been meaning to take them out and set them up again at some point.

My father was generous enough to give me an O2 in the late 90s so I could learn 3D graphics, since I was interested in making special effects for movies. We didn’t have any software then but found this thing called Blender which was available for free and worked on IRIX. Still have a bunch of blender books and manuals from back then. Good times!

XnoiVeX
:-) I have a fully working O2 setup somewhere in the attic. Could not get myself to get rid of it. Also have an Indy that won't boot up.
Keyframe
I had the pleasure of working with various SGI machines back then (video and animation). Being around an Onyx was always a special feeling, even though I preferred the look of the older Crimson chassis. SGI even made a poster calendar, which was set like 40-odd years into the future or something, taking advantage of the periodicity of calendar years. Nice gimmick, nice poster, which I lost, sadly.
PeterStuer
At one point our lab bought an O2 and an Indigo. Mostly these were fashion statements as we didn't do much graphics work. I can't remember them being remarkable performance-wise for normal workloads compared to the Sun machines that were our staple.

The machines most certainly stood out in a time where the only options you got for an exterior finish were beige and beige.

gcb0
SGI was the preferred way for departments to blow their year/quarter budget so it wouldn't be cut next year.
blincoln
In the mid-90s, a friend of mine described an SGI demo where the player flew a speederbike through a forest. Apparently it was quite impressive for the time. Every time I come across footage of old SGI software, I hope it's in there, but it never is. Has anyone run across a video of this?
a-dub
I had an Indy on my desk when I was in my first paid job in high school. I loved that thing. We also had a Challenge named challenger. I blew it up once.

Years later I interviewed at Google. One of my interviewers turned out to be one of Jim Clark's original grad students. That was pretty crazy.

russellbeattie
Years ago, Weird Stuff Warehouse (R.I.P.) had a shell of the larger closet-sized Onyx system sitting in the back, and I always thought it would be fun to do something with it, but couldn't think of anything. With a shell of a Cray, you could at least make a couch out of it...
narrator
Unfortunately, the power consumption alone makes these computers not worth using for anything practical no matter how cheap they are.
jsjohnst
It’s possible that was my old machine. I donated a lot of old SGI hardware to Weird Stuff in 2008-2009 before moving from SV up to SF.
mnutt
My father worked in sales for SGI through most of the 90s; what started as him bringing me to the office a few times on weekends turned into me asking to go to the office most weekends. ("don't you have any work you need to finish up?")

Every employee had an Indy or Indigo2 in their office and most (us included) had one at home too. Very few of the machines at the office had passwords. I never really got very far past playing with the 3D tech demos or surfing the web on Mosaic and Netscape, but it was a pretty formative part of my career in technology.

some1else
Could anyone use an SGI Indigo 2 Impact (the purple one)? I've got one in Slovenia, ready to part with it for 100€ + shipping. My mother threw away the screen, keyboard and mouse (thinking I won't need them), so it's just sitting around. The box is still fully operational, with Photoshop for Irix installed.
wslh
The main goal of going to exhibitions like COMDEX [1] was playing a little bit with these computers. Those were strong feelings!

[1] https://en.m.wikipedia.org/wiki/COMDEX

deng
Oh, who could forget the "not PS/2 connectors". I also vividly remember the sysadmin telling me to never ever disconnect the keyboard while the machine is running, or something inside might blow. Not sure if he was just paranoid, but I also never tried... I also remember one German shop buying a slightly used Onyx for half a million Deutsche Mark, and realizing they did not have the key for turning it on, so they had to short the thing. And it was so loud you had to wear earplugs...
abcd_f
Speaking of keyboards - I can't believe how well preserved the one in the video is. No discoloration or fading, as if it were made just last month.

Is it an SGI thing or was it just sitting in a box as a spare all this time?

sp332
Early Macs didn't have hot-pluggable keyboards and mice either. It was one of those things where it would usually work, until one time some fluctuation would just brick the controller chip.
redm
Interestingly, my Indy IRIX “learning” workstation, circa 2001, is selling for more today on eBay than I bought it for back then. I guess it really has become a collector's item.
uptown
I used these in the computer lab at RPI back in the 90s to do 3D animation. It’s crazy to think how much smaller and faster today’s tech is compared to these behemoths.
Annatar
And yet, this smaller tech comes directly from SGI: lots of former SGI hardware engineers ended up at NVIDIA.
herf
CMU's graphics lab had a Crimson when I was there as an undergrad - many hours late at night! These models had an accumulation buffer (a way to combine/average screenspace image buffers), which was fun for all kinds of tricks, like motion blur. At the time, it felt amazing to get to use all that fill rate.

Just five years later, the 3dfx voodoo2 did most of what that machine did (without the accumulation buffer).

jacebot
When I was younger I was really into CGI. I remember thinking, man, if I ever get to work on one of those to do some 3D rendering, that will be something.
Annatar
Had the server version of this system as my private server in my basement at home; the electricity bill was peppery.

I must have torn it apart and put it back together about ten times, studying the hardware, and must have installed IRIX 6.5.30 on it just as many.

eveningcoffee
It is about $425,000 in inflation adjusted dollars. If we adjust it to the real estate prices then even more.

This price is no wonder considering the amount of silicon it contains.

What I do wonder is what would be the performance of the same amount of the silicon today.

coldtea
>What I do wonder is what would be the performance of the same amount of the silicon today.

Well, aren't there several comparable models from IBM and co?

shimfish
I have a vivid memory of having to carry one of these up a flight of stairs.

I used the machine as an undergrad, doing some coding as part of the university's driving simulator project.

kilroy123
So what would be the modern-day equivalent to this machine?
rasz
CPU and 3D capabilities were at late-1999/early-2000 gamer PC level.
akhilcacharya
I wonder how usable this would be with a modern browser.
dis-sys
I tried to search online to see whether DOOM has been ported to Onyx RE2, but couldn't find any video proof. That is really strange.
sigvirt
Near Y2K, as development slowed, several of the hands-on QA teams were found to be astonishingly good at Quake. Later, as labs high and low were decommissioned, a couple of sysadmins consolidated equipment and absolutely dominated the SETI at Home rankings for many months, and likely the PG&E [1] rankings as well.

[1] Pacific Gas and Electric, a California utility company

s369610
https://youtu.be/Bo3lUw9GUJA?t=1042 shows a DOOM icon
spilk
DOOM shipped on the IRIX install CDs for many years.
saltcured
I remember seeing Quake ported to the CAVE environment and demonstrated on an Onyx (well actually an Origin with reality engines) with a 3-wall and floor projection room...
rwmj
Is the backplane 32 bit VMEbus with an additional central connector? [Edit: 32 bit, not 64 bit]
kyberias
How can we reach this guy? His site currently doesn't work and has a security problem.
wincy
I think he mentions in the video that his site is down and how to get ahold of him but I don’t care to watch it again to verify.
jsjohnst
irix.cc is the website mentioned.
pcurve
Can you imagine companies spending an insane amount of money on these only to have them become obsolete in 3 years?
rasz
No, because it didn't happen; commodity hardware got there ~7 years later. You were buying a window into the future, allowing your researchers to do things mere mortals could only read about in science fiction books.
browsercoin
I'm shocked by how crazy the graphics were for back then...I mean seeing the metal logo, I couldn't quite believe that this was possible in 1993. I still have the CGW magazine from this era.
liftbigweights
We had these and a few SUN systems in our computer lab storage room. We called them the dinosaurs. It's amazing how things changed from 1993 to mid 2000s. From technology rock stars to useless junk in a little more than a decade.

The "lab master" bribed us with pizza to help him clear the storage room. Quickly found out how heavy these things were and what a raw deal we got.

wowtip
I heard somewhere SGI put a big chunk of iron/lead inside the case to make them heavier and add to the expensive impression when handling the machines.

Not sure if that was the case, or if it is just an urban legend?

qubex
They used cast iron extensively.
saikiran91
Old News. I saw that video a week back.
qaq
Could be converted to a cool mini-fridge
walrus01
Something that looks cool and is cube shaped, but way less useful, like an SGI Crimson, would make a better fridge.
deerpig
SGI famously converted one of their Crimsons (which was a commercial failure) into a fridge and beer tap...
sigvirt
The Silicon Graphics crew did like their case mods. Adding to the list: a demo team had a deskside O2K with a top air-intake and side exhaust pipes like a dragster engine.
monocasa
It still runs. Let's keep the fridges to "for parts" eBay purchases for computers with that combo of old and interesting.
qaq
I was not suggesting this particular system.
voxadam
Or a toaster oven.
Keyframe
Or the legendary Espressigo https://youtu.be/NPvIs0I-gmg
qaq
This is really cool
abcd_f
Damn. This should fit our Rancilio Silvia just like that... brb.
octorian
Let's please stop doing that to interesting vintage computers. Well, at least to working (or fixable) ones. I cringe every time someone shows off that sort of project.
qaq
While I was not suggesting doing that to this particular model, I would imagine a good chunk of them end up in a landfill.
corerius
We had a maxed out one of those in the company lab. I seem to remember that it was on the loud side.

What a tremendous waste of money.

iamgopal
The true victory and underappreciated skill of first-world countries is their ability to earn a profit from massively overpriced machines at scale.
justinator
Why is the dude wearing untied board shorts and no shoes? Is that his signature Thing?
pvg
It's traditional in some parts of Canadia.

https://www.youtube.com/watch?v=6vgoEhsJORU

mattl
This looks good. A previous video on this channel said there was no WWW in 1993 and I stopped following things. Have things improved?
dodoid
Hi there!

I am Dodoid, the creator of this video. I created an account here to ask you this: where did I make that mistake? I'm fully aware that there was a World Wide Web in 1993, and if I made the mistake of saying that there was not, I have a post-release correction system and can add a correction to the video.

Anyway, I hope things have improved.

Thanks for the interest, -Dodoid

axilmar
Thanks for the awesome video!!!!
danieltillett
I had the fun of using this system when I was doing my undergraduate thesis. I had done some protein modelling work and ran straight into the problem that there was no way to get the beautiful models into my thesis - we had no printer.

I came up with what I thought was the genius idea of photographing the monitor; the problem was the CRT screen was so curved that my models were all distorted. I ended up wheeling the whole SGI computer out into the hall and setting my camera up with a 500mm lens (borrowed from the graphics unit) at the other end of the hall (maybe 50m away). Worked great.

swypych
I love this story, great digital to analog conversion.
yaakov34
These early SGI machines did not have good support for printing. Even if you did have a printer, the only way to print was to take a screenshot, which had much lower resolution than laser printers and did not look good.

My first paid job - this was after the 10th grade in high school - was writing a kind of printer driver for one of these machines. One of the researchers at a local university had the same problem you did, and heard about this kid who was supposed to be good with computers :)

The SGI GL manuals (this was before it became OpenGL) included all the mathematical formulas they used to display the graphics - the rotation and perspective matrices to transform the coordinates, the vector cross products to calculate shading, and so on. I took these and implemented a subset of GL which outputted PostScript commands to a text file. This was then sent straight to a laser printer (Apple, I think). I didn't implement everything that SGI did, of course - no smooth shading and I think I could handle only the simplest types of occlusion. But it was good enough to handle the models that the researchers needed to print.

I still remember the SGI manuals in big 3 ring binders. This is what I learned linear algebra from - thanks guys.
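
That pipeline (rotate, shade with a cross product, project, emit PostScript) is small enough to sketch. A minimal Python illustration, not the original driver; the constants and helper names are made up:

    # Rotate a triangle, flat-shade it with a cross-product normal,
    # perspective-project it, and emit the PostScript a laser printer understands.
    import math

    def rotate_y(p, angle):
        x, y, z = p
        c, s = math.cos(angle), math.sin(angle)
        return (c * x + s * z, y, -s * x + c * z)

    def project(p, d=4.0):                  # simple perspective: scale by depth
        x, y, z = p
        w = d / (d - z)
        return (x * w, y * w)

    def flat_shade(a, b, c, light=(0.0, 0.0, 1.0)):
        u = tuple(b[i] - a[i] for i in range(3))
        v = tuple(c[i] - a[i] for i in range(3))
        n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
        length = math.sqrt(sum(k*k for k in n)) or 1.0
        return max(0.0, sum(n[i]/length * light[i] for i in range(3)))  # Lambert term

    tri = [(-1.0, -1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 1.0, 0.0)]
    rotated = [rotate_y(p, math.radians(30)) for p in tri]
    gray = flat_shade(*rotated)
    pts = [project(p) for p in rotated]

    # Emit PostScript: set the gray level, trace the projected triangle, fill it.
    ps = [f"{gray:.3f} setgray", "newpath"]
    ps.append("%.2f %.2f moveto" % (300 + 100*pts[0][0], 400 + 100*pts[0][1]))
    for x, y in pts[1:]:
        ps.append("%.2f %.2f lineto" % (300 + 100*x, 400 + 100*y))
    ps += ["closepath", "fill", "showpage"]
    print("\n".join(ps))
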

timthorn
I spent a week at SGI UK in 1994. Everyone had at least an Indy workstation as their machine, even for secretarial purposes. But still there were a couple of 486 PCs acting as print servers.
sureaboutthis
I worked at a remote office for SGI in 1992. I called my boss at the home office complaining I didn't have a computer to test software with. He apologized for having dropped the ball and immediately ordered a 16 processor system for me.

I don't remember which model that was but I do remember almost falling out of my chair as it was the top of the line system at the time.

danieltillett
I haven't thought about the binders in ages. There was nobody in my department who knew much about the SGI machines and my only tutor was those binders. Looking back I wonder how I accomplished anything at all.
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.