Hacker News Comments on
Seymour Cray's Only Surviving Talk: "Cray-1 Introduction" (1976, LANL)
UC Berkeley Events · YouTube · 123 HN points · 1 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video.

This is the Watson "janitor" letter: http://www.computerhistory.org/revolution/supercomputers/10/...

Here is a talk of his: https://www.youtube.com/watch?v=vtOA1vuoDgQ
Bill Norris, mentioned in the talk, was the CEO of CDC.
⬐ hownottowrite
Note: The audio gets much better about 8 minutes in.
⬐ teddyh
At exactly 8:30, to be precise.
⬐ mml
Amazing he mentions trade schools. Wish they still existed.
⬐ empressplay
This video is totally Cray ;)
⬐ frik
What about https://www.youtube.com/watch?v=xW7j2ipE2Ck ? (one of the YouTube video suggestions)
⬐ dang
We removed "only surviving talk" from the title.
⬐ nsxwolf
Yep... unless that doesn't count as a "talk" for some reason.
⬐ gjkood
If you haven't already done so, I highly recommend reading "The Supermen" by Charles J. Murray. http://www.amazon.com/Supermen-Seymour-Technical-Wizards-Sup...
⬐ mark-r
I had the pleasure of touring the Cray plant in Chippewa Falls and meeting the man himself, not too long after this talk. My expectation was that we would be on one side of a glass window and the computer would be on the other, but no - we walked right up to one that was being tested for delivery, serial #5 I believe it was. They were obviously proud of the cooling system; we were invited to put our hands on the side panels to feel how cool it stayed. The wire maze on the inside of the "C" was incredible, one trip would have caused an immeasurable amount of damage.

Unfortunately we only met Seymour for a short time; obviously he was quite busy. Even more unfortunate is that I don't remember much of that meeting. The only tidbit I remember clearly is him pointing to a pumpkin on his desk. His daughter had grown it in the garden and, knowing he was already thinking about a Cray-2 machine, decided it should be named the "Cray-3".
⬐ at-fates-hands
My father recently told me he used to share lab time with Cray when he was in college. He said people would go crazy when he was on campus and follow him wherever he went.

He said he was like the Steve Jobs of his day - if you were into tech, you knew who he was and just wanted to soak up what he was saying.
⬐ GuiA
Tangentially related, but I had to share:

Another favorite pastime [of Seymour Cray] was digging a tunnel under his home; he attributed the secret of his success to "visits by elves" while he worked in the tunnel: "While I'm digging in the tunnel, the elves will often come to me with solutions to my problem."
⬐ pyglow
⬐ kabdib
I honestly think this is a metaphor for a DMT trip. Without getting into too much detail, "machine elves" (http://lmgtfy.com/?q=machine%20elves) are a common name within the culture for beings that give you guidance during the trip.
⬐ frik
I read about it in the "Supermen" book [1]; he dug extensive tunnels. Repetitive work probably makes it easier to daydream. That's how he designed his Cray supercomputer architectures in his head.

[1] http://www.amazon.com/The-Supermen-Seymour-Technical-Superco...
⬐ kjs3
Some geniuses take long walks for inspiration; some dig tunnels and talk to elves. I don't judge.

We had a Cray at Apple. I got an account on it and mucked with it for a while, but nothing serious. Its main use was to simulate plastic flow in molds for the cases of Macintosh computers (molds are pretty expensive, $500K to $1M, and you want to make sure they'll work well before you commit to making them).

Turned out that Seymour used MacDraw to design parts of the Cray-3. So we were using each other's computers to design our own computers...
⬐ guiambros
I had the opportunity to use a Cray Y-MP 2E at university in the early '90s. Even though this was in Brazil, you had to get special approval from the US government to be able to run your code. I got a free pass, as I was limited to 10 minutes of CPU time per week. Shell usage didn't count, so 10 minutes was plenty of time to play with UNICOS.

It was a beautiful machine. The C-shaped purple cabinet was impressive, and the refrigeration unit huge. And we had some nice SGI Indigo workstations as the front end, so you could optimize your (Fortran) code before moving to the Cray.

Amazing to think of all that paranoia for 666 megaflops. My not-so-new Core i7 does 150,000+ Mflops, not counting the GPU.
⬐ kjs3
I got to use a Y/MP-48 (COS with VAX/VMS front ends) in the late '80s. Excellent learning experience. The professor offered something like 10 bonus points on the final for anyone who could solve a particular problem (I don't recall which one) in a faster time than he was able to achieve. Having done the best I could, I called the local Cray office looking for help. It was a sales office, so the person I talked to could only say he'd see what he could do. A couple of days later they called and asked if I could come to the office for a bit. It turned out one of the senior compiler engineers was in town, and he took something like 2 hours out of his trip to teach this college kid all the black-magic and voodoo compiler switches and optimization techniques that weren't covered in the class. Very generous, and great customer service. I ended up about 12% faster than the shocked professor, and got my A.
⬐ guiambros
Love the story, thanks for sharing it.
⬐ dang
This is splendid! But definitely not Cray's "only surviving talk", since YouTube recommends another:
⬐ alayne
The only good book on Cray I've found is The Supermen: The Story of Seymour Cray and the Technical Wizards Behind the Supercomputer, by Murray. Besides two or three videos, The Supermen, and tangentially related books about the history of CDC, it's hard to find many details about Cray's work. I think the University of Minnesota has some Cray-related papers, but I'm not sure what's in them.
⬐ frik
⬐ RustyRussell
I loved the Supermen book as well. It would be great if some of the internal design documents of the Cray supercomputer architectures had survived somehow, and someone published them on archive.org or the like.
⬐ pico303
I loved Supermen. A little light on the technical side, but a really fascinating read, particularly if you're into the history of computing.

I'd love a transcript of this talk; the audio is poor...
⬐ ChuckMcM
It is very amusing to turn on Google's automatic closed captioning, though :-) (my way of agreeing with this request)
⬐ cl8ton
Great seeing this again. I remember that in its day Cray was like a mythical beast we only heard of, since the government was the only one with enough money to buy one.

The C configuration with hard-wrapped wires was awesome. Hard wrapping was the fastest path for electrons from point A to B.
⬐ hcarvalhoalves
⬐ danjayh
What is "hard wrapping"?

PS: No kidding, I can't Google it. I only find articles about jewelry instead.
⬐ cl8ton
Back in the breadboarding days, hard wrapping meant using a tool to physically wrap a hair-thin insulated wire from one pin on an IC to another pin on an IC. Instead of using the PCB foil electrical path, you made as direct and short a wire connection as possible between the two points.
⬐ mikeash
Is this another term for wire wrap, or is it something different?
⬐ cl8ton
Yes, wire wrapping and hard wrapping are the same thing. "Hard wrapping" was what the mags of the day (Popular Electronics, for one) called it.
⬐ crshults
Do you mean wire wrapping? http://en.wikipedia.org/wiki/Wire_wrap

What an amazing speaker. He brings what should be boring technical details to life in an engaging, humorous dialogue. Obviously a very technically intelligent man and an excellent public speaker; that seems like a rare combination.
⬐ localhost
Amazing what ~40 years can do. I still find it astounding that my Snapdragon 800 powered phone, a Lumia 1520, is ~800x faster than the Cray-1 (130 GFLOPS vs. 160 MFLOPS) at ~1/24,000th the mass (5.5 tons vs. 209 g). Truly a "supercomputer in your pocket". Which plays various Flappy Bird clones. Or Wordament.
⬐ protomyth
I still think two of the great tragedies of modern computing were the loss of Seymour Cray and Jay Miner.

[edit: I still have a picture of me standing in the middle of a Cray-2 - that was a fun machine]
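The back-of-the-envelope ratios in the phone-versus-Cray-1 comparison above check out. A quick sketch, taking the figures from the comment and assuming "tons" means US short tons:

```python
# Figures as quoted in the comment; "5.5 tons" assumed to be US short tons.
cray1_mflops = 160
phone_gflops = 130
speedup = phone_gflops * 1000 / cray1_mflops
print(speedup)               # 812.5 -- the "~800x" in the comment

cray1_mass_g = 5.5 * 907_185  # short ton = 907,185 g
phone_mass_g = 209
mass_ratio = cray1_mass_g / phone_mass_g
print(round(mass_ratio))      # 23873 -- the "~1/24,000th" in the comment
```

Using metric tonnes instead (5,500,000 g) gives a ratio of about 26,300, so the "~1/24,000th" figure suggests short tons were meant.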
⬐ tom_jones
Thanks for posting. Seymour Cray is one of my heroes. A great man with great vision. Such a great loss to the world.
⬐ kylek
This talk is an absolute gem. Thank you.
⬐ kichu37
nice
⬐ hf
Cray's rather famous anti-parallelization quip ("If you were plowing a field, which would you rather use: two strong oxen or 1024 chickens?" [0]) reminds me somewhat of one of Donald Knuth's statements: "During the past 50 years, I've written well over a thousand programs, many of which have substantial size. I can't think of even five of those programs that would have been enhanced noticeably by parallelism or multithreading. Surely, for example, multiple processors are no help to TeX." [1]

[0] https://en.wikipedia.org/wiki/Seymour_Cray#SRC_Computers
[1] http://www.informit.com/article/article.aspx?p=1193856&_
⬐ protomyth
"Two strong oxen or 1024 chickens?"

At the time he said it, and given the types of problems people were using his machines for, the chickens would never have finished. He built vector machines, and that is where his ideas on parallelism were.
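The oxen-versus-chickens point can be made quantitative with Amdahl's law (my framing, not Cray's): if any fraction of the job is serial, piling on slow workers stops helping quickly. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Best-case speedup from n_workers when only parallel_fraction of the job parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_workers)

# 1024 chickens barely help unless nearly everything parallelizes:
print(amdahl_speedup(0.50, 1024))  # ~2.0 -- half-serial work caps the win at 2x
print(amdahl_speedup(0.99, 1024))  # ~91  -- even 1% serial work wastes most of the flock
```

For the dusty-deck scientific codes of the era, with substantial serial portions, two fast oxen really did win.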
⬐ akira2501
Well, he built his empire on emitter-coupled logic (ECL). He didn't have the limitations of CMOS to deal with, and so parallelism wasn't as necessary for continued computational power.
⬐ weland
⬐ frik
I... I'm not sure I understand what you mean, but it feels rude to downvote based on that. Can you elaborate, please?
⬐ sp332
No one is arguing that parallelism is easier or more convenient than serial processing. (I think.) But clock speeds top out eventually, because power usage goes up with the square of the clock rate and eventually your chip just melts. So the only way to get more performance out of a piece of silicon is to put processors side by side.
⬐ weland
I'm not arguing that either; it's the nonsense about ECL and CMOS above that I can't understand.
⬐ kjs3
It's not nonsense. By committing to ECL logic (read: cost is basically no object), Cray didn't need to bother with parallelism to get the best performance possible at the time, and so it was reasonable marketing for him to discount it. If he'd been confined to CMOS (or, to a somewhat lesser extent, bipolar) logic, he'd have had a very different set of trade-offs to get the performance he needed.
⬐ weland
> By committing to using ECL logic (read: cost is basically no object)

I don't disagree with your assessment of ECL, but I think you're holding the historical account upside down :-).
> Cray didn't need to bother with parallelism to get the best performance possible at the time
Let's put things into context first: the Cray-1 was a parallel computer. It had vector processing. That's the most trivial kind of computational parallelism (at least from a mathematical point of view), and the Cray-1 had it! Cray not only "bothered" with parallelism; his supercomputers were about as parallel as they came.
However, that wasn't new. There had been vector machines before, like CDC's STAR-100. One of the reasons they failed to gain traction, though, was precisely that they sacrificed serial performance for it. That turns out to be a really bad, bad, bad idea for scientific computing, because 90% of the problems that involve cranking matrices involve cranking the same matrices over and over again.
So "the best performance possible at the time" (much like today, ironically!) turned out to require not only the parallel processing facilities offered by vector instructions, but also good serial performance, which basically boiled down to SQUEEZE MOAR CYCLES!! Remember, this was still five years before RISC became a thing; it was around the time Cocke began designing the IBM 801 (which only became available in 1980!).
At the time, CMOS simply wasn't up to it in that regard. It's not that Cray made a conscious choice to avoid wrestling with CMOS's limitations; there were literally no CMOS logic ICs that moved that fast. Not using CMOS was no more a design decision than not using relays was! This was happening at a time when pretty much all serious computing was done with bipolar logic, and it was still a good two years before even the faster companies in the field (like DEC) abandoned the bipolar logic boat. IBM continued building mainframes with bipolar logic (albeit using TTL, not ECL) well after that (the 4300 was retired in the early 1990s, although I don't know if it was still being manufactured in TTL by then; it definitely was in 1979 when it was introduced!).
When CMOS was later adopted for high-performance computing, it wasn't because of cost! ECL's power-dissipation demands made it technically, not economically, infeasible to build faster logic circuits.
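The point above about needing both vector parallelism and serial speed maps directly onto modern SIMD/array programming. A toy sketch of the classic vector-machine kernel (SAXPY), using NumPy as a stand-in for vector hardware:

```python
import numpy as np

# SAXPY (alpha*x + y): the canonical kernel that vector machines like the
# Cray-1 were built to pipeline.

def saxpy_scalar(alpha, x, y):
    # One multiply-add per loop iteration: what purely serial hardware does.
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = alpha * x[i] + y[i]
    return out

def saxpy_vector(alpha, x, y):
    # The whole operation issued as array ops: what vector units execute in bulk.
    return alpha * x + y

x = np.arange(1000, dtype=np.float64)
y = np.ones(1000)
assert np.allclose(saxpy_scalar(2.0, x, y), saxpy_vector(2.0, x, y))
```

The two functions compute the same thing; the vector form is what the hardware (then vector registers, today SIMD units) can keep fed, while everything around such kernels still runs at serial speed.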
⬐ kjs3
So when shown to be wrong, you redefine the question so you get to be right. Have fun with that.
⬐ weland
What was the question? I don't see any question mark in the posts above. And how exactly did I redefine it?

I pointed out that this is wrong:
> Cray didn't need to bother with parallelism to get the best performance possible at the time
because Cray not only bothered with parallelism, but all Cray computers were parallel. And that this is wrong:
> By committing to using ECL logic (read: cost is basically no object)
because cost is hardly the only challenge when working with ECL. And that this is not only a truism, but also unenlightening from a historical perspective:
> If he'd been confined to CMOS (or to a somewhat lesser extent bipolar) logic, he'd have had a very different set of trade-offs to get the performance he needed.
First, ECL is bipolar logic, like anything built with BJTs. Second, that would have also been true if he'd been confined to relays or cranks. Of course you get a different set of trade-offs if you use a different technology.
But no computer company at the time was building minicomputers, let alone mainframes or supercomputers, in CMOS! The most massively parallel computers of the time, from the C.mmp to the ILLIAC IV, were ECL or TTL. Cray's empire was built on precisely the same technological basis as that of every other computer company in the seventies.
Is there a discussion board with like-minded people?

We had related discussions about that the other day: https://news.ycombinator.com/item?id=7658842 and https://news.ycombinator.com/item?id=7658275
Btw, Knuth is still going strong (he is doing research on SAT solving for his book series). I met him at the University of Linz last year, where he also gave a two-hour lecture on his recent research: http://www-cs-faculty.stanford.edu/~uno/news13.html