Hacker News Comments on
Indistinguishable From Magic: Manufacturing Modern Computer Chips
Andor Gafotas · YouTube · 229 HN points · 55 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video: https://youtu.be/NGFhc8R_uO4
⬐ Aardwolf That's super informative, much better than any other video or article I found, thanks!
⬐ Hikikomori It's one of those presentations I come back to and re-watch every now and then; I wish there was a 202x version of it.
⬐ bhedgeoser Is there an updated 2022 version of this talk?
⬐ ggm As long as there is a directional effect, and something morally akin to what a triode valve and a transistor do amplifying signals, it's not magic. But does it look like magic when it's quantum effects? Hell yea. What beats me is the same thing in pentode+ valves and modern-day transistors/gates: reuse of the same elements of the physical structure for different purposes at the same time.
MEMS. Now that's magic. You're down in the space where van der waals is bigger than other forces, and you can make gears and levers work.
Fibre optics is pretty magical. The modern day fibre bundle is doing very odd boundary effects on lots of signals in one complex structure.
Magic (at least as far as I'm concerned). I would love to see an updated version of this video: https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ murkt I have a basic understanding of a generic process. I mean specifically 200+ layers. Do they "just" deposit layer after layer after layer, 200-something times?
⬐ alyandon I am not going to pretend to be a process engineer (I am a chemical engineer by education) but I can't imagine any other way to do it than as you described. Would really love for someone more knowledgeable to chime in though.
⬐ wtallis They deposit half the layers, etch holes through them and form vertical strings of memory cells, then repeat all that to build a second deck of layers on top. The interface between decks is a potential source of problems, but is more manageable than trying to drastically increase the aspect ratio of the holes that need to be etched through the stack. Etching through lots of layers at once is why 3D NAND flash memory is actually economical, compared to basically every other kind of 3D chip where layers have to be built one at a time.
Also "Indistinguishable from Magic: Manufacturing Modern Computer Chips" https://www.youtube.com/watch?v=NGFhc8R_uO4 is a presentation from a manufacturing engineer which goes into some nitty-gritty
⬐ russellbeattie Seconded. Such a well-done presentation, absolutely packed with info and lots of inside-baseball comments which were interesting as well. I'll never be within 10 miles of a fab, but it's still totally worthwhile knowledge to have.
It's infotainment, not information. I really like the 'indistinguishable from magic' video: https://www.youtube.com/watch?v=NGFhc8R_uO4
lots longer, but far more informative and less fluffy.
This video is in the same vein, and does a great job putting this machine into perspective in the chip making process, but it's sadly a decade old now.
If you have an hour, this decade-old video blew my mind: "Indistinguishable from Magic: Manufacturing Modern Computer Chips" https://www.youtube.com/watch?v=NGFhc8R_uO4
HN thread from a couple years ago: https://news.ycombinator.com/item?id=16175949
⬐ dataflow Awesome, thanks!
⬐ dataflow I'm watching it and it's completely blowing my mind! Thanks so much for sharing. He has a funny slip too: at 20:51 he says Xeon instead of Xenon!
⬐ ta988 Previous discussions: https://news.ycombinator.com/item?id=27376192
Yes, that's pretty similar to what is used. Each mask takes days to get simulated. It is getting "old", but this great video covers it: https://youtu.be/NGFhc8R_uO4
We talk to the piece of sand with lightning (USB Keyboard 5V interface), and then we hear its response back with lightning (HDMI Port interface). And somewhere in the middle, there's a source of lightning that's making it think on its own (Power supply and VRMs).
EDIT: More seriously though: https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ carlhjerpe Video saved to watch later, thanks mate. Though I must say I like your first unedited answer too, so I'll put it back here.
> Its pretty simple. We zap sand with lightning until it starts thinking for us.
⬐ dragontamer Lol, thanks. I think I kept working the joke in my brain, and it was too brutally simple at first. And then I worked it over, and now it reads too complicated. Ah well.
I think I was overthinking the joke.
A frequently posted reference https://www.youtube.com/watch?v=NGFhc8R_uO4
Take a watch of this: https://www.youtube.com/watch?v=NGFhc8R_uO4
The talk is a few years old, but that doesn't really matter because this chip was fabbed at 40 nm.
The Larrabee cores were intentionally simpler than even most 2010 CPUs.
Yeah, modern semiconductor fabrication is pretty much the pinnacle of human achievement. My favorite video on the subject: https://www.youtube.com/watch?v=NGFhc8R_uO4
Just doing some napkin math to give myself some other context in my own head with something much smaller: die size for the Apple A14 isn't out yet, but the A13 (found in the iPhone 11 series and SE 2) is around 98mm^2, manufactured on TSMC's 2nd-gen 7nm N7P. A14 might be something like that, because while it's a smaller process, Apple boosted transistor count by 38% as well. Lazily/conservatively taking off 10mm on the edge for fitting squares into a circle, that'd be about 630 chips per N5 wafer, a high enough count that chip loss rate shouldn't be too far from defect rate. Even assuming they can get 90% yield right out of the gate (maybe generous), that's still about $30/chip, just for manufacturing without any R&D amortization, packaging, etc. That actually seems not insignificant going by historical iPhone BOMs [1], so even for smartphones with their far, far smaller SoCs, 5nm at launch isn't nothing.

The continued progress in silicon is just mind-blowing when you dig into the details in any real way. I was blown away already watching the "Indistinguishable From Magic: Manufacturing Modern Computer Chips" [2] presentation years ago, which still seems incredible, and yet that's all talking about now-ancient tech. The increasing costs and design challenges, though, are also going to have interesting effects on the industry, as who/what has the margins to afford fabbing aggressive chips on the cutting edge might start to diverge a bit more. I also wonder how longevity will hold up; I remember speculation that smaller processes would eventually have to deal with more risk of earlier breakdown even in more normal shielded settings. And speaking of that, in higher-rad environments it's normal to use older, bigger processes since they're more resistant to disruption. I wonder if small enough processes ever make that more of a consideration even at sea level.
----
1: iPhone Xs Max for example: https://technology.informa.com/606680/iphone-xs-max-costs-ap...
2: https://www.youtube.com/watch?v=NGFhc8R_uO4 (highly recommended if you haven't seen it)
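The napkin math above can be replayed as a short Python sketch. The ~$17,000 N5 wafer price is my own assumption (roughly what was publicly reported around that time), not a figure from the comment; the edge-exclusion trick is the same crude one described above.

```python
import math

# Crude dies-per-wafer estimate: usable wafer area divided by die area,
# lazily dropping a 10 mm ring at the edge rather than packing rectangles.
def dies_per_wafer(die_area_mm2, wafer_radius_mm=150, edge_exclusion_mm=10):
    usable_radius = wafer_radius_mm - edge_exclusion_mm
    return int(math.pi * usable_radius**2 / die_area_mm2)

# Manufacturing cost per good die, before R&D amortization, packaging, etc.
def cost_per_good_die(wafer_price, die_area_mm2, yield_fraction):
    return wafer_price / (dies_per_wafer(die_area_mm2) * yield_fraction)

dies = dies_per_wafer(98)                       # A13-sized die
cost = cost_per_good_die(17_000, 98, 0.90)      # assumed wafer price, 90% yield
print(dies, round(cost, 2))                     # prints: 628 30.08
```

This reproduces the "about 630 chips per wafer, about $30/chip" figures in the comment.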
⬐ ajross > Even assuming they can get 90% yield right out the gate (maybe generous)
That seems outrageously generous. Remember that Apple doesn't (at least hasn't so far) bin their parts to sell the ones with faults via different configurations. The only way to get a chip yield like that for a single device configuration is with a ton of redundancy.
I certainly don't know what the yield rate is of Apple silicon, but I definitely don't believe 90%.
⬐ ncmncm ⬐ mrtnmcc Usual yields are quoted at 60-75%. More for smaller chips. But can you even tell how many cores are actually running in your iPad? Maybe they do fuse out a faulty one.
⬐ avianes On modern chips, we are not only "fusing out" faulty cores. We also put more memory cuts than necessary into the silicon, and we fuse out the defective ones to increase the yield. You can look at Memory BISR (built-in self-repair).
⬐ ggm If Apple wanted to bin the chips, they have devices crying out for low power draw, albeit at a slower clock. They can sell a tonne of earphones, watches, HDMI-enabled USB-C dongles. They could re-purpose chips which can clock 10x slower with 1/2 the cores, and still have too much CPU. But free. They can use them for the Apple touchpad, the keyboard, the on-mobo controllers which currently consume non-general-purpose chips, if they want to. If they displace a 10c part with an otherwise redundant $30 part, they are still ahead, if it works.
⬐ grandinj ⬐ fomine3 Not sure if they still are, but for a while they were shipping Apple TVs with Apple CPUs which had half the number of working cores, but they were running them with higher thermal limits.
⬐ solarkraft > If they displace a 10c part by an otherwise redundant $30 part, they are still ahead, if it works.
I am no expert, but my intuition says that even die size alone would present a problem.
⬐ manquer The power draw for a full-blown chip like the A13 is vastly different from that of the typical ones you find in the devices you mentioned. The form factor could also be limiting for many of the smaller devices.
⬐ ggm Yes, I realised this was stretching the argument after I posted. Probably, repurposing terminates in the slower, lower-spec devices the A13 targets anyway. Maybe in something like the Apple TV? It is true that compared to Intel they have far fewer binning opportunities.
⬐ Fredej One way to bin it would be: Macbook -> iPad -> iPhone -> Apple TV
Aren't the non-Pro iPhone and cheaper iPad using slower-clocked chips?
⬐ xoa > That seems outrageously generous.
Probably, but until we actually get some A14 die shots I don't know how aggressive Apple has been with caches and the like, with some redundancy that can be fused off for yield. Again though, all the numbers are napkin math trying to give some fudge factor in both directions: conservative on the number of chips total (I didn't try to calculate exactly how many full squares fit, I just took off an entire outer radius) and generous on the yield. Mainly I was trying to reason about the question of "even if everything goes great, and even if it's something smaller, not a monster desktop GPU or the like, is it still a non-trivial cost?", to which it seems like the answer is yes. If the yield is actually lower, that just makes it even more expensive, but yield itself won't be a silver bullet.
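For readers wondering where yield figures like the 60-75% quoted above come from, a common first-order approximation is the Poisson yield model, Y = exp(-D0 * A). The defect densities below are illustrative assumptions, not TSMC or Apple numbers.

```python
import math

# Poisson yield model: probability that a die of area A (cm^2) catches
# zero defects, given a defect density D0 (defects per cm^2).
def poisson_yield(defect_density_per_cm2, die_area_cm2):
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

a13_area = 0.98  # cm^2, per the die size discussed above
for d0 in (0.1, 0.5, 1.0):  # assumed defect densities, for illustration
    print(d0, round(poisson_yield(d0, a13_area), 3))
# prints: 0.1 0.907 / 0.5 0.613 / 1.0 0.375
```

It also shows why smaller dies yield better: the exponent scales with area, so halving die area roughly squares-roots the loss.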
> And speaking of that, in higher rad environments it's normal to use older bigger processes since they're more resistant to disruption.
This was mostly debunked in https://news.ycombinator.com/item?id=24430234 with some nuance.
Each additional step requires solving a lot of new engineering problems, and usually some new physics too, and it's not a linear process: many techniques are proposed that become cost-effective only when previous techniques hit their size limits; then a lot of research is done, most techniques never get working at scale, and just a few pan out and become the next process.
This is a great talk on the history of chip processes: https://www.youtube.com/watch?v=NGFhc8R_uO4
This is about some of the challenges that had to be solved for a current modern process: https://www.youtube.com/watch?v=f0gMdGrVteI
Semi-related, he kind of covers it in here: https://www.youtube.com/watch?v=NGFhc8R_uO4
If you think this is cool, watch this video. I've linked the portion relevant to this comment, but the entire video is amazing: https://youtu.be/NGFhc8R_uO4?t=2065
P.S. if anyone knows a more modern version of this talk, I would love to see it.
I found this to be basically an ad.
My favourite video on them is this one from many years ago; it's a presentation by an engineer not being paid by a fab: https://m.youtube.com/watch?v=NGFhc8R_uO4&t=3261s
Most of it is lost on me but there is a lot of detail in this video https://youtu.be/NGFhc8R_uO4
Sounds like this video: Indistinguishable from Magic: Manufacturing Modern Computer Chips
Just amazing. I wish a newer video explaining more recent technologies was available.
Here's a talk that's pretty mind-boggling, too:
Indistinguishable From Magic: Manufacturing Modern Computer Chips
If you're interested in chip fabrication, I heartily recommend watching https://youtu.be/NGFhc8R_uO4 - it goes over the evolution of how they make transistors all the way up to 2009 state of the art, with a follow-up talk in 2013 (https://youtu.be/KL-I3-C-KBk).
Both of them mention how Intel is sinking billions of dollars into ASML to try and get this process working, and how impossible everyone thinks it is, so I'm skeptical that they finally got everything squared away now :)
⬐ awalton > Both of them mention how Intel is sinking billions of dollars into ASML to try and get this process working, and how impossible everyone thinks it is, so I'm skeptical that they finally got everything squared away now :)
Intel claims EUV is now at production volume and ready for manufacturing introduction on the 7nm node, even at 75-80% EUV machine uptime.[1] At the kind of production volumes Intel is capable of sustaining, and ASML's projected machine throughput, that means they've got more than a dozen of these machines operating in Fab 42 (likely closer to 40-50 if they want to hit a projected 100k wafer starts/month target, with extra machines to make up for the excess downtime; UV litho machines can easily run with 90+% uptime). That should be an enormous win over the current quad patterning they're using at 10nm, cutting out tens of process steps.
This stuff is incredibly challenging to get right, but they've been working on it for over a decade and the machines ASML are building now are production machines, not prototypes. And now from the linked article, it seems Trumpf's also made it to production equipment.
Once everyone's had more time to operate these machines at scale and really crunch out the bugs, we'll see how far we can continue to push physics on making these tiny etched transistors. 5nm shouldn't be an impossible step with EUV, but the stochastics after that node could make 3nm and smaller logic very questionable.
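The machine-count estimate above can be sketched as a parameterized calculation. The throughput (wafers per hour per machine) and the number of EUV layers per wafer below are assumptions for illustration; the real figures are closely held.

```python
import math

# Machines needed = total layer-exposures per month divided by what one
# machine can expose per month at the given uptime.
def euv_machines_needed(wafer_starts_per_month, euv_layers, wph, uptime):
    hours_per_month = 24 * 30
    exposures_needed = wafer_starts_per_month * euv_layers
    exposures_per_machine = wph * hours_per_month * uptime
    return math.ceil(exposures_needed / exposures_per_machine)

# e.g. 100k starts/month, 12 EUV layers (assumed), 100 wph (assumed),
# 77.5% uptime (midpoint of the 75-80% quoted above)
print(euv_machines_needed(100_000, 12, 100, 0.775))  # prints: 22
```

Varying the assumed layer count and throughput is what moves the answer between "more than a dozen" and the 40-50 range mentioned in the comment.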
⬐ LegitShady I've watched the first one before and enjoyed it; never seen the follow-up, thanks!
⬐ deepnotderp TSMC and Samsung are also exploring EUV, so it's not just Intel. New lithography is necessary to continue shrinking, otherwise mask costs will rise exponentially (literally).
⬐ tlb Can you explain the exponential rise in mask costs with shrinking? My naive understanding is that you simply can't resolve features below some fraction of the wavelength. How do more expensive masks get beyond that?
⬐ testvox What you can do is basically use partially overlapping beams to energize an area smaller than any of the individual beams (think Venn diagram). It's a similar technique (except in 2 dimensions) to radiation therapy, where multiple beams of radiation are passed through healthy tissue such that their intersection point is on cancerous tissue. That way only the cancerous tissue receives a highly damaging dose of radiation while the healthy tissue receives significantly lesser dosages.
⬐ deepnotderp That doesn't usually work because the diffraction-limit PSF is usually Gaussian and therefore the overlapping region can have lower fluence than the peak regions.
⬐ deepnotderp We are not concentrating light below the diffraction limit; instead we are printing features below the diffraction limit with multiple exposures. See here: https://en.m.wikipedia.org/wiki/Multiple_patterning
Edit: the exponential rise happens because to go 2x smaller I now need to double my # of masks. Of course, in practice overlay accuracy could kill you first.
⬐ blp I can split any small-pitch exposure into separate, larger-pitch exposures. For instance, if I need 40nm pitch, and my system is limited to 80nm, I can do 4 exposures. The next shrink requires 8.
To figure this out, draw a square, and each node is an exposure.
This process works forever in theory, but is limited in practice by the mutual registration of the exposures.
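The doubling argument above can be sketched as a toy calculation: each time the target pitch halves below the tool's optical limit, the number of interleaved exposures per layer doubles. This is a simplified 1D picture of multiple patterning; real flows differ and, as noted above, overlay (registration) accuracy is what kills you first.

```python
# Exposures needed to print target_pitch on a tool limited to litho_limit,
# assuming each pitch-halving splits the pattern into twice as many
# interleaved exposures (toy model, not a real litho flow).
def exposures_needed(litho_limit_nm, target_pitch_nm):
    exposures = 1
    pitch = litho_limit_nm
    while pitch > target_pitch_nm:
        pitch /= 2
        exposures *= 2
    return exposures

for target in (80, 40, 20, 10):
    print(target, exposures_needed(80, target))
# prints: 80 1 / 40 2 / 20 4 / 10 8
```

The exposure count (and hence mask count) grows as a power of two with each halving, which is the "exponential (literally)" cost rise mentioned above.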
If you haven't watched it, there was a fantastic 2009 talk (given during HOPE09) on what goes into fabrication called "Indistinguishable From Magic: Manufacturing Modern Computer Chips" [1]. Keeping in mind things have advanced even farther over the subsequent decade, even at that point it was mind-blowing what goes into the hardware at the core of what we all do. The level of purity and consistency needed for a lot of basic inputs is measured in atoms; really digging into the details, it's amazing some of this works at all, let alone with high yields. Helps give some context of how an incident like this can happen and how easily the cost could rapidly rise to eye-watering levels.
I think Andor has posted on HN since too with some other high-level talk, though not further updates per se, since unsurprisingly details around fabrication are very heavily guarded trade secrets. I still rewatch it every year or two; it remains a real source of wonder to me what we've pulled off there.
----
⬐ tecleandor Hi, Andor here. I'm the uploader of the video. I'm not Todd Fernandez, the speaker of that HOPE09 talk. I happened to give a talk at the same congress (Digital Security in Healthcare Institutions, pretty outdated as of today), watched Todd's talk, and loved it.
My upload gets posted every once in a while on HN or Reddit, but I'm not the author, just a, ahem, freeloader.
I uploaded the talk with the intention of subtitling it for my deaf teacher in particular and Spanish-speaking people in general, but I lost control of the account; I think it was related to some sort of Google+ problem.
So that's my explanation.
⬐ drbawb Thank you so much for posting that video. I watched it a long time ago and have been wanting to revisit it, but was never able to find it again.
⬐ downer52 The best part is the brief addendum about some of the bunny-suit inhabitants of the chip fabs (~20 minute mark), and how they are weird, but that the gender distribution of their children is, in fact, "normal", or at least not specifically a product of their occupation.
⬐ Joona Definitely one of my favorite talks. I believe this is a higher quality version of the same talk a few years later: https://www.youtube.com/watch?v=KL-I3-C-KBk
If you have time: https://www.youtube.com/watch?v=NGFhc8R_uO4
It's great.
⬐ avs733 Always flattered when this gets posted. Always wish I could go back and re-record this talking about 50% as fast.
⬐ parimm Thank you for making and giving the talk. It got me interested in lithography.
Do you have any plans to give an updated talk? It would be interesting to know about what comes after 7nm.
⬐ avs733 ⬐ spearo77 Unfortunately I'm not the guy to do so... I left the industry for a lot of reasons, including my own mental health :)
⬐ bflesch Also thanks for this interesting talk, really appreciate that you spent the time to create and share it. Which industry are you in now?
⬐ avs733 I ended up a professor at an engineering school. Found a passion for education.
⬐ bflesch That's great to hear. I am convinced you have a talent for teaching. Wish you all the best.
Great talk. Thanks for making it available. Soooo, it's now 2019.. is EUV working?
⬐ taxicabjesus > Always wish I could go back and re-record this talking about 50% as fast.
I'm listening to this at 1.5x, which is my usual listening speed for non-musical videos. Your speed is fine.
Toastmasters [0] has helped me reduce my use of filler words: ahh, uhm, and, okay, so, you know, like, well, [pregnant pauses], [double clutches]. It's hard for me to listen to politicians anymore, because they're always gumming up their speech with linguistic crutches. I didn't used to notice.
[0] https://www.toastmasters.org/
"When we find ourselves rattled while speaking — whether we’re nervous, distracted, or at a loss for what comes next — it’s easy to lean on filler words. These may give us a moment to collect our thoughts before we press on, and in some cases, they may be useful indicators that the audience should pay special attention to what comes next. But when we start to overuse them, they become crutches — academics call them disfluencies — that diminish our credibility and distract from our message." [1]
[1] https://hbr.org/2018/08/how-to-stop-saying-um-ah-and-you-kno...
⬐ avs733 Thanks!
Indistinguishable From Magic: Manufacturing Modern Computer Chips -- https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ Joona Same talk (slightly updated perhaps?) in higher quality: https://www.youtube.com/watch?v=KL-I3-C-KBk
⬐ andreareina Didn't know about that, thanks!
They've talked about skipping it since 2012: https://youtu.be/NGFhc8R_uO4?t=3072
Indistinguishable From Magic: Manufacturing Modern Computer Chips [0]. This talk is delivered really well and always leaves me with a sense of awe at how CRAZY it is that humans figured out integrated circuits. It's a bit out of date, but it gives enough of a peek under the hood to understand why Intel has had such difficulties going to 10 and 7nm processes [1].
[0]: https://www.youtube.com/watch?v=NGFhc8R_uO4
[1]: https://www.tomshardware.com/news/intel-cpu-10nm-earnings-am...
I highly recommend watching this; it's one of the best videos available on how chips are actually built and gives a great overview:
"Indistinguishable From Magic: Manufacturing Modern Computer Chips" - https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ bumholio It's just incredible how complex and advanced the hardware got, but at the same time, how crappy and inefficient the software on top of it is. I know it's simple economics and the chip foundry has an economy of scale that the Electron developer lacks. But it still feels like we're all eating garbage out of golden plates. By the end of the next decade we will probably reach the end of what is physically possible in device fabrication, so competition will have to move to design and software efficiency.
⬐ bigcostooge Improvements in efficiency may only be needed in build and runtime systems. You may still be able to write Electron apps.
⬐ ants_a That presumes the existence of a Sufficiently Smart Compiler.
⬐ brian-armstrong I thought at first you were going to talk about how EDA software sucks. Now /that's/ garbage.
⬐ bumholio It's a good example of the same underlying economic incentives. An EDA title is used by a few tens of thousands of professionals, at best. Large investments into making it more user-friendly and stable are not really worth it in the grand scheme of things. It needs to be just the minimal viable product that can get the chip out the door; it will be complex and feature-packed rather than usable, and the engineers will learn to navigate around its idiosyncrasies and bugs just like they solve other technical problems; it's what engineers do. The chip itself though, that's a whole nother thing. Especially the process that works across multiple designs and becomes an asset for the company. Any nanosecond shaved here improves the capacity of the foundry to compete and helps a billion people complete the task 1% faster or have 1% more memory. The customers have no reason to work around a less performing product, they will just buy from the competition.
So you get this strange combo of supernatural hardware assembled with atomic precision by garbage software written in Perl by an outsourcing guy with 6 months total programming experience.
⬐ vgy7ujm What does Perl have to do with this? It wouldn't be any better if the same guy wrote it in crap Python, now would it?
⬐ sanxiyn By the way, the lingua franca of EDA is Tcl.
https://youtu.be/NGFhc8R_uO4
This talk on YouTube is the most in-depth explanation of IC manufacturing I have seen. I highly recommend it to anyone, no matter their background. You really gain an appreciation of how insanely complex modern computers are.
It's from 2012 but he talks extensively about how (2012-)future chips will be made. And most of the information is still applicable.
Reminded me of this: Indistinguishable From Magic: Manufacturing Modern Computer Chips https://www.youtube.com/watch?v=NGFhc8R_uO4
Indistinguishable From Magic: Manufacturing Modern Computer Chips: https://www.youtube.com/watch?v=NGFhc8R_uO4
Indistinguishable from Magic: Manufacturing Modern Computer Chips
⬐ godelmachine This looks good.
⬐ avs733 Hi... presenter here. This has been posted a couple times and it's always flattering and a bit nerve-wracking. Happy to answer questions; there are a couple of errors and misspeaks. Note I'm not in that field anymore (and, thankfully, I'm in much better physical shape).
⬐ tecleandor ⬐ news_to_me Great! I'm the uploader (to YouTube, not HN). I was present at the time and also gave a talk at that HOPE. I loved your talk and I'm happy to see it surface again here by surprise. I uploaded it at the time as I wanted to subtitle this for my deaf (and also non-English-speaking) teacher, but I lost the password for that YouTube account mid-progress. Maybe I should try to reset it.
Thank you!
⬐ peterburkimsher ⬐ Nerada Could you send the subtitles to me too? That would be really interesting! (And if your teacher is a non-English speaker, a translation to Chinese would be amazing if you have that.)
⬐ avs733 I'm happy to send you the actual slides if that helps as well.
⬐ tecleandor I'm trying to find the access to that old YouTube account. I can't remember how much I got to subtitle, maybe around 20-30%, but I'd like to finish that work :)
⬐ avs733 REALLY!? 1) thank you for the effort to make cons more inclusive 2) would love to help... would having the slides help?
I'm just curious as to what you're doing now, if you're no longer in that field.
⬐ avs733 ⬐ aphextron Sorry, wasn't great about checking this. Right now, finishing up a PhD in engineering education, where I study the development of engineering students' beliefs about knowledge and how those interact with their development of entrepreneurial competence.
What's the single best way to get involved in the industry? Should I bother getting a degree or just go straight for hands-on manufacturing experience? Are there any universities known for good programs?
⬐ peterburkimsher I studied Electronic Systems Engineering at Lancaster University in the UK, and graduated with a Masters in 2011. Now I'm working for a tech company in Taiwan, who do the next process step: dicing the wafers into individual dies, and packaging the dies into chips. There's a trolley downstairs with over 1 petabyte - stacked high with trays of 16GB memory cards from the testing machine.
Unfortunately I only managed to get a job writing monitoring software, equipment control drivers, and business planning software, so I don't get to handle the wafers myself.
In my opinion, you'll need to learn Chinese if you want to work for the lithography companies, or work for the German/Japanese/Swiss companies making the machines that take in the FOUPs. If you want to simply be a technician, most of the migrant workers who operate the machines here come from the Philippines and get paid very little while working very long hours with no path to promotion (all the line managers are Taiwanese). On the bright side though, the technicians speak better English, so it's easier to talk to them directly!
I enjoyed the circuit design/FPGA programming courses in university, but the projects to design ASICs usually take several years, so I couldn't get work experience through summer jobs (which is how I ended up in software).
⬐ pkaye I used to be in that industry back 20 years ago, helping design one of the major machines still in production. Since then I've moved onto many different things. The best way to get into this field is to get a materials science/engineering degree. Sadly there has been lots of consolidation among fabs, plus most of them moved overseas, so it's hard to get any hands-on experience. They also have lots of technician roles in the fabs, but with automation and improved reliability, maybe not so easy to get. Your best shot for hands-on exposure would be an equipment manufacturer like Applied Materials or Lam Research.
⬐ ambrood Back when I was majoring in ECE at UIUC, we got to play around with fab labs :). Yeah, you'd pretty much need an advanced degree in EE or Physics to be a semiconductor engineer, I'd reckon.
⬐ aphextron Surely there are entry-level technician jobs though?
⬐ avs733 For those you are typically looking at an AS or even a BS. They look for strong reliability, precision and attention to detail, and hands-on skills. Strong evidence of an ability to learn is also gold. Plus most of those guys work 3 or 4 12-hour shifts a week... great schedule. No work goes home with you.
It depends on what you want to do. Do you want to just monitor tools running production? Do you want to actively work on the tools in the fab? Want to be a process engineer? Be a device design engineer? Work in process development?
Each is different... a college degree will never hurt, and a master's is pretty normal for fab engineers and a PhD for process developers.
As for schools it definitely depends in a similar way, but most top technical schools will get you recruited. A standout is Rochester Institute of Technology, which has an undergrad degree in microelectronics.
Tell us more about clean rooms!
What I remember from class is that it's a fan on the roof, blowing air through a very fine filter.
The most magic part for me is that this still works even though the PM2.5 pollution outside is terrible (the AQI is over 150 today, which causes me to sneeze and get red eyes, and is known to increase the risk of cancer, so everyone wears dust masks outside).
⬐ avs733 ⬐ Zarathust Enormous spaces with laminar flow. So cool! Most relaxing place in the world, wonderful background white noise for getting work done. Most cleanrooms are now class 10 (meaning 10 half-micron-sized particles per cubic foot) compared to a class 1,000 operating room. The wafers are all sealed in the 'cleanroom within the cleanroom' spaces of tools and FOUPs.
I have heard of fabs having problems from sources as strange as nearby farms and earthquakes halfway around the world. The filtering (ultra-HEPA filters) will absolutely clear up your allergies almost overnight.
You have a bunch of nice images. Are those taken with "standard" microscopes?
Also you mentioned a background in metrology. How do you inspect something a few atoms thick?
⬐ avs733 ⬐ barsonme The device-level images are a mix of Scanning Electron Microscope (SEM) and visual microscope images.
A bunch of different ways. Depends on the material being measured, but measuring sheet resistance (effectively, you can back-calculate the thickness by measuring the resistance of a square of known size) and light scattering off the surface are the most common.
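The sheet-resistance trick described above rests on the relation R_sheet = resistivity / thickness for a uniform film, so measuring R_sheet of a square of known material gives the thickness. The copper resistivity below is the bulk value and the measured R_sheet is a made-up example; real thin films deviate from bulk, so treat this as a sketch.

```python
# Back-calculate film thickness from measured sheet resistance:
# R_sheet = rho / t  =>  t = rho / R_sheet
def film_thickness_m(resistivity_ohm_m, sheet_resistance_ohm_per_sq):
    return resistivity_ohm_m / sheet_resistance_ohm_per_sq

rho_cu = 1.68e-8     # ohm*m, bulk copper resistivity
r_sheet = 0.168      # ohm/square, hypothetical measured value
t = film_thickness_m(rho_cu, r_sheet)
print(round(t * 1e9, 1), "nm")  # prints: 100.0 nm
```

Note the thickness cancels out of the square's geometry, which is why sheet resistance is quoted in "ohms per square" regardless of the square's size.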
Jeez, it really is like magic. Awesome presentation, thanks! (Also, congrats on getting in better shape.)
⬐ avs733 Thanks :) not working 14hrs a day 6 days a week helps.
⬐ 1ris Any thoughts on germanium (not GeSi, but Ge without or with little Si) or carbon-based chips?
⬐ avs733 Unfortunately, not really. I'm not super familiar with germanium. I have some experience with gallium arsenide.
⬐ coreyp_1 Great presentation!
⬐ madeuptempacct Did you make bank? How did you get into it?
⬐ ben174 I'd be very interested in hearing a quick summary of the last six years. Particularly how they compare to your predictions in the future slide at the end of your presentation (about 51 minutes in). "We're making things smaller than the wavelength of light [that] we're using to make the features" - incredible.
⬐ danbruc ⬐ snake_plissken Without wanting to diminish the technological achievement behind that in any way, but it is really not that uncommon to create features that are smaller than the tools we use to make them. An obvious example are the details on a sculpture, which can be much smaller than the chisel used to create them. Maybe not a perfect analogy because you only really use the edge and don't just throw the chisel in the rough direction of your marble block. Anyway.
⬐ kevhito A better analogy might be reading regular-sized braille writing by poking it with a broom handle. Or writing an essay with the pencil tool in mspaint while using a 50-radius sized brush.
Awesome presentation.To me the most magical part is not the manufacturing process, although that is very special, but how a team of engineers designed something which is made up of hundreds of millions of transistors, each in a specific location for a specific reason. How is all of that done? How do they keep track of things? Is there special software?
⬐ FPGAhacker⬐ saganusThere are a number of disciplines involved. Engineers at the foundry will range from those focused on keeping the equipment aligned to produce functional structures on/in silicon over time, to engineers that design standard libraries of components to work at a given 'process' (the steps in going from wafer to chip for a given set of parameters/attributes).
The structures in silicon (or other semiconductor material) are mainly the transistors. These are created by embedding impurities in specific locations and patterns in the silicon that change its conductive properties. Embedding these impurities is called doping.
The structures on top of the silicon are conductors and insulators used for interconnecting all the transistors.
There will be engineers at the foundry (or a company that works with that foundry) that take a "netlist" from the customer and convert it into files that can be used by a mask company to make masks for the layers of conductors and insulators that are laid on top of the transistors, and also the patterns for doping the silicon.
Engineers at "tool vendors" will build software that takes a high level description of the desired functionality of the chip and turn it into the netlist that foundry engineers can use. The netlist is a decomposition of the high level description (such as x <= a * b) into the components in the "standard library" for that foundry. This library will contain basic components like AND gates and OR gates, but also more complicated things like IO pads and RAMs, and things like multipliers.
Then you have design and verification engineers at a given company that wants to build a chip. These tend to stay at the higher level, working in VHDL, Verilog, and SystemVerilog, which are all pretty archaic by software-language standards. And what is considered high level in this arena is about the C language level or below, so not so high from a software point of view.
I just spat this out off the top of my head, so apologies if I made mistakes.
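The netlist idea in the comment above can be made concrete with a toy sketch. Everything here (the cell names, the tuple format) is invented for illustration; real netlists are structural Verilog or similar, with cells drawn from the foundry's standard-cell library. The example decomposes a half adder into two library cells and simulates it.

```python
# Toy "standard library": cell name -> boolean function.
STD_LIB = {
    "AND2": lambda a, b: a & b,
    "XOR2": lambda a, b: a ^ b,
}

# Toy netlist for a half adder (sum = a xor b, carry = a and b):
# each entry is (cell, input nets, output net).
HALF_ADDER = [
    ("XOR2", ("a", "b"), "sum"),
    ("AND2", ("a", "b"), "carry"),
]

def simulate(netlist, inputs):
    """Evaluate a netlist whose entries are already in dependency order."""
    nets = dict(inputs)
    for cell, ins, out in netlist:
        nets[out] = STD_LIB[cell](*(nets[n] for n in ins))
    return nets

nets = simulate(HALF_ADDER, {"a": 1, "b": 1})
print(nets["sum"], nets["carry"])  # -> 0 1
```

A synthesis tool does essentially this decomposition in reverse: it takes `x <= a * b` and emits thousands of such cell instances.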
I've watched this video several times (probably 5+) and it does not cease to amaze me. Both the presentation style and the content itself are really engaging.
⬐ kiddicoI know that there is very high-tech development going on around the world at huge organizations, and that how advanced it is would blow my mind. But watching this made me feel like the computer I'm typing this on is some space-age alien tech. It's a little mind-blowing how inexpensive it is to purchase stupidly complex hardware.
⬐ kowdermeisterI can't find a long video animation depicting the process. It was a bit old, but it was so great at showing the process of coating a layer with masks and etching it with acid. If I remember correctly, it was repeated 5 or 7 times. After watching it, I concluded we are gods for manipulating matter at this level :)
⬐ jacksmith21006Love this video and highly recommend it. Love the idea also of the slides via a barcode.⬐ avs733thanks! It's something I'm a big fan of. A talk like this is inherently a ton of information in a short time, and I try to be inclusive and accessible by being conscious of how others process information.
⬐ derefrEven old computer chips were magic. Really, photolithography is magic—it jumped us way ahead in our ability to miniaturize. It really seems like the kind of thing that, in fiction, would be introduced to a culture by aliens. I've always wondered: could we have discovered/invented photolithography earlier, and thereby started the integrated-circuit revolution earlier? As far as I can tell, there was nothing holding someone back from inventing it in the 1800s, even: they had access to all the relevant chemicals, and (in some countries) lenses were already precision-ground enough to serve the purpose down to some pretty tiny process nodes. We wouldn't have had digital logic (no transistors), but we could have been making "electronic" watches, tiny AC-to-DC power converters, and other analogue ICs even back then.
But heck: suppose we had known the principles behind semiconductive materials back then, too. We could have passed right by the vacuum-tube era and started in on inventing digital ICs (and Boolean algebra). What could the 1800s have looked like then?
(Is there any well-known science-fiction story exploring this premise? I might give a shot to writing it, if not...)
⬐ philipkglassIMO there were enough critical precursor technologies to start making discrete semiconductor devices from silicon by about the late 1920s, had a large industrial backer known the useful properties of such devices. They had most of the pieces for "how" by then but didn't discover the "why" until a generation later. And of course people spent a lot of time pursuing dead ends before discovering which industrial techniques would successfully combine to make silicon suitable for semiconductor devices, with properties surprisingly different from silicon of only 99.5% purity.- The electric arc furnace, from the late 19th century, enabled the large scale production of crude silicon.
- The crystal bar process, from 1925, is similar to the Siemens process developed in the 1950s for refining purified volatile silicon compounds to electronic-purity elemental silicon.
- The Czochralski process, from 1915, permitted the growth of large single crystals of purified silicon.
There's a great review article from 1981, available through sci-hub, if you want more details about the history of purified silicon electronic devices:
"Twenty Five Years of Semiconductor-Grade Silicon"
DOI: 10.1002/pssa.2210640102
⬐ workthrowaway27I don't think there was much incentive to make things smaller before the transistor was invented. Also, transistors are used in most analog electronics; you can't do much without them, since they let you amplify signals and apply negative feedback. But to answer your second question: I don't think we could have skipped the vacuum-tube era. It's because we already had applications for those sorts of electronics (and unreliable mechanical relays that were replaced with transistor switches) that researchers realized the potential of transistors in the first place. Even after the transistor was invented there was still a lot of research necessary to make them feasible in practice. The first one was germanium, if I recall correctly, and it took a while before silicon transistors were viable.
Edit: Also, sort of unrelated, non silicon semiconductors are still used a lot because they can switch faster, and different color LEDs use different semiconductors because they emit different colors of light, which is also related to why different color LEDs have different voltage drops.
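The LED point above has a tidy physical basis: the emitted photon energy roughly matches the semiconductor's band-gap energy, and the forward voltage drop sits in the same ballpark. A quick sketch (the wavelengths are illustrative, and real forward voltages differ somewhat from the bare photon energy):

```python
# Photon energy for a given wavelength, in electron-volts:
# E = h*c / (lambda * e)  ~=  1239.84 / lambda_nm  eV.
H = 6.62607015e-34          # Planck constant, J*s
C = 2.99792458e8            # speed of light, m/s
E_CHARGE = 1.602176634e-19  # elementary charge, C

def photon_energy_ev(wavelength_nm: float) -> float:
    return H * C / (wavelength_nm * 1e-9 * E_CHARGE)

# A red LED (~650 nm) emits ~1.9 eV photons; a blue LED (~450 nm) ~2.8 eV,
# which is why blue LEDs need a noticeably higher forward voltage.
print(f"red:  {photon_energy_ev(650):.2f} eV")
print(f"blue: {photon_energy_ev(450):.2f} eV")
```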
⬐ dfox⬐ jdietrichOne thing that everybody seems to gloss over is that photolithography made the manufacturing of discrete transistors practical, and the step from that to integrated circuits is not that big. Edit: what you buy as a typical discrete transistor is in fact an integrated circuit which contains tens to thousands of parallel-connected transistors.
Also, I believe that photolithography-like processes were used before semiconductors (and I would not be too surprised if the production of at least some mass-manufactured valves involved photolithography).
⬐ dmitrygrThis is true for FETs but not for BJTs⬐ dfoxI originally intended to use the IRF640 as an example, as for a HexFET it is painfully obvious that there are separate parallel-connected transistors. For typical BJTs it depends on how you define "separate transistor". The structure in a typical BJT has one contiguous junction, but its geometry is non-trivial enough that you can well think of it as multiple paralleled transistors.
I can highly recommend this documentary from 1943, which shows the manufacturing of piezoelectric quartz crystals. The amount of skilled manual labor involved in the process is quite staggering. https://www.youtube.com/watch?v=b--FKHCFjOM
Photolithography in general substantially pre-dates the development of the IC. We were using it to produce printing plates in the 19th century. Breakthroughs in chemistry made photolithographic PCB manufacture possible by the early 1940s. It took us another 20 years and a huge array of discoveries and inventions to finally start producing useful ICs.
Development of IC manufacturing required a very broad tech tree - better chemistry, better optics, better materials science, better process engineering and, most crucially, the invention of p-n junctions. Without the p-n junction, any effort to etch a practical integrated circuit is pretty much futile.
⬐ ameliusDon't know what you mean by analogue ICs without semiconductors. Anyway, perhaps the problem was more that (almost) nobody back then saw the potential of this technology.
⬐ InclinedPlane⬐ JumpCrisscrossThink of a PCB, except at microchip scale. If you don't have semiconductor technology you can still lay down layers of metal using photolithography, and thus make tiny circuits. I'm not sure what the limits of such technology are, but it seems interesting.> What could the 1800s have looked like then?If it happened before the Battle of Jena, we might have found ourselves in a world of technologically-ensconced monarchies.
⬐ leggomylibro⬐ afandianHey now, we still might. It's not exactly what you're looking for, but in that vein The End of the World has a similar theme.⬐ jablWell, in the 1800s steam locomotives were considered awesome high-tech. The famous HMS Rattler vs. Alecto trial, which proved the superiority of propellers over paddles for ship propulsion, was in 1845. Even on the basic-science level, Maxwell published his seminal treatise on classical electromagnetism in 1873, and it wasn't until 1881 that Heaviside published the equations in the form we know them today, which sort of made people understand them. Semiconductors were sort of known already in the early 1800s, but it wasn't until, say, the first half of the 20th century that we really started to understand them, with the development of solid state physics (both theory and experimental methods such as X-ray diffraction, etc.).
So yes, while some aspects of lithography could have been developed in the 1800s, I think there was quite a huge gap between that and a whole lot of other things needed.
⬐ sehuggWe didn't even have a practical vacuum-tube AND gate until 1930, nor a formal treatment of logic design until 1936-1937. It's doubtful the need for ICs was well understood. And they would have been expensive until there was a big enough market to produce them at scale.⬐ TheOtherHobbesWe didn't have the semiconductor science needed to make effective ICs until we started building them. The logic is relatively easy, but the materials science needed to decide what to lithograph, using which materials, and in what order, is post-WWII, and probably couldn't have been invented much earlier.
It takes a lot of work to work out ideal doping factors and diffusion rates, estimate charge distributions across junctions, calculate propagation speeds, and eliminate leakage.
But more - the chemistry of semiconductor doping and deposition is often rather nasty, and it would have been a huge challenge to industrialise it before WWII.
⬐ TaniwhaWell, most chips these days are done with FETs, which don't really need semiconductors in the way we usually think about them ... and FETs were invented (and largely forgotten) before bipolar transistors ... but doped semiconductors do make it easy to make FETs⬐ avs733I think this is a pretty reasonable answer... summed up, you could have all the steps, but you were missing two things: 1) the theoretical knowledge of how the devices worked, which enabled the creation of steps in the process that resulted in fully working devices
2) the ability to do each step repeatably, precisely, and reliably.
I can't even begin to comprehend the complexity in a contemporary CPU. Recently I saw this video [1] and only juuust started to get an inkling as to how big the problem space is.
Side note, for anyone interested in how chips are produced, this is one of my favorite videos: Indistinguishable From Magic: Manufacturing Modern Computer Chips
This is strikingly similar to the process used in modern offset printing. As a young child, I grew up in a family printing business, though I didn't learn all the details (I didn't really have much interest in the printing side of the business and spent most of my time in the then still very early digital prepress department, playing with the computers). My earliest memories of it were marked by specialists doing digital typography on specialist computers: one monitor was a green-screen where some kind of markup was entered, and the other was a raster monitor where the output was rendered. Each block of text was "printed" on photographic paper, cut, and then sent to the "layout" department where it was taped or wax-adhered to a page. (If this is hard to picture, imagine taping each block of text from this page [1] onto another piece of paper in order to lay out the entire page.) The process looked a bit like this [2]
Then the final page was photographed in a photography room and the negative developed. On occasion a photographic "proof" would be developed to show the final page layout and make sure there weren't any artifacts from all these tiny little pieces of paper that were used to make up the text blocks.
Then the negative was used as a lithographic mask to "burn" the image of the page onto a metal sheet which then went through some kind of chemical bath to etch the image onto the sheet.
From that point the process was basically what's in the rest of this video [3]. And one more [4]
My parents ended up selling their company in the early 2000's because the cost of going "fully digital" from pre-press to print was just too expensive. At that time, the process had completely eliminated the tape and wax steps and the layout was being done entirely digitally in software, then printed onto regular paper which was then sent to film. The rest of the process was simply unchanged.
Big competitors had invested millions of dollars into equipment which apparently was capable of eliminating parts of the lithography process, and color-separated artwork could go almost directly from the pre-press department to paper. According to the videos I've linked to, the industry is now using laser etching to etch the image onto the plates.
Many years later, this knowledge became useful when, in another setting, I was being interviewed by a gentleman with deep experience in microprocessor lithography and we hit it off when I mentioned my family's business and the lithography process we used in printing. Nothing so exotic as what's used in CPU production, but the basics are almost exactly the same. There's been some great videos about CPU lithography posted here recently. Here's one of them [5].
The guys who actually did the printing were an interesting sort. Small businesses being small businesses, we needed cheap help. So my family first started hiring Vietnamese immigrants escaping right after the war and training them. When that ran out, they eventually switched to a variety of people, many of whom were ex-cons. I learned a tremendous number of life-lessons from watching those guys, many of whom ran presses as artists, on how to and how not to wreck my life. The guys who were reliable were worth their weight in gold. The ones who weren't suffered from an unimaginable litany of self-imposed life problems, mostly drug or alcohol related.
At any rate, thinking back, this is really how I got my start into computers, hanging out as a kid in the prepress department. Some more to read [6]
1 - https://marketplace.canva.com/MABXLyRjyco/3/0/thumbnail_larg...
2 - https://3v6x691yvn532gp2411ezrib-wpengine.netdna-ssl.com/wp-...
3 - https://www.youtube.com/watch?v=5LMU-zB8Sro
4 - https://www.youtube.com/watch?v=pNZb7CXUjs0
⬐ anfractuosityThat's very cool, I'd not heard of offset printing before. It's a shame your family had to sell their business, as from what you mention it sounds like not all of the printing method changed.
I was wondering, with the kitchen method, how you'd get a photo onto the foil, but as you mention a laser could be used. One thing I was slightly confused about is: is it always necessary to use a coating on the aluminium for that approach?
I guess you could also coat the foil with the photosensitive emulsion you use for PCBs too, and then etch that.
⬐ baneYeah, I guess I omitted that step as I'm not entirely sure how the actual lithography was performed. I remember a large "camera" where the negative would go into the top and the sheet would lie underneath. It looked vaguely like the reverse of an old acetate overhead projector. How the image was etched in that process I simply don't remember, but I do remember the sheet going through a number of chemical baths and washings (and being reminded to stay clear of the chemicals as a child).
So there may have been some phase where the sheet was treated with a photosensitive chemical before going into the camera, I simply don't remember the detail.
This link [1] seems to have some details.
It looks like the plates run for a little over $1/plate for the smaller ones. [2]
1 - http://www.offsetprintingtechnology.com/sub-categories/offse...
2 - https://www.valleylitho.com/acatalog/Valley_Litho_Supply_Pla...
⬐ HeyLaughingBoyGuessing it was something like an Agfa photostat camera. We had one of those at my first job. We used it both for making camera-ready art for ads (we were a tiny business) or positive film to send off to the PCB house to get boards made.IIRC, the film was standard Kodak 8.5x11 size negatives and the processing was pretty simple. I would do everything from initial tape & wax puppets to the final artwork. Not what I expected to be doing in my first engineering job, but it was fun.
I took a printmaking class sometime later. I think the litho "stone" is made photosensitive and etched, but that aspect of the process is fuzzy. Most of the class was about positive prints and I spent most of my time doing intaglio and lino block printing.
⬐ baneYeah, that looks basically like the device!
Think of them as relative indications of feature size and of spacing between identical parts (arrays, if you want a software analogy): even if an actual transistor will not be 10 nm or 14 nm, their relative sizes will relate on one axis as 10 nm to 14 nm. Keeping to the numbers from a single manufacturer will definitely aid in the comparison. There is a ton of black magic going on here, with layers being stacked vertically and masks not having any obvious visual resemblance to the shape they project on the silicon, because of the interaction between the photons/X-rays and the masks: the required resulting image is small relative to the wavelength of the particles used to project it.
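The "relative indications" point above can be made concrete with a bit of arithmetic: under ideal scaling, linear dimensions shrink by the ratio of the node numbers and transistor density grows by its square. (In practice modern node names are marketing labels, so treat this as the idealized comparison, valid mainly within a single manufacturer's numbers.)

```python
# Idealized node-to-node comparison: 14 nm -> 10 nm.
old_nm, new_nm = 14.0, 10.0

linear_shrink = new_nm / old_nm        # each axis scales by ~0.71
density_gain = (old_nm / new_nm) ** 2  # ~2x transistors per unit area

print(f"linear: {linear_shrink:.2f}x, density: {density_gain:.2f}x")
```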
There is a super interesting youtube video floating around about this that I highly recommend, it's called 'indistinguishable from magic':
https://www.youtube.com/watch?v=NGFhc8R_uO4
It's up to date to 22 nm. Highly recommended.
⬐ saganusI've seen that video several times now and it just doesn't cease to amaze me. It's really a must-see for anyone interested in processor technology.
Oh!, there's a new video! awesome!
⬐ deepnotderpIf you think that lithography is challenging, your brain is going to invert looking at the litho technology for 7 nm and onwards :)⬐ whackedspinachHere is the talk a few years later: https://www.youtube.com/watch?v=KL-I3-C-KBk
Always relevant for informational purposes when chip manufacturing is discussed:
I'm super curious about which areas you feel these breakthroughs will be in. Tunnel effects are real and very hard to reduce, even at lower temperatures. The band gap can't get much smaller, and supply voltages are about as low as we know how to get away with.
There are solutions in terms of exotic materials with even more exotic fabrication methods.
I linked to a nice video the other day, see if that interests you:
https://www.youtube.com/watch?v=NGFhc8R_uO4
That's the state of the art as of 2012; not much has changed since then, though there has been some incremental improvement and optimization, as well as larger dies for more cores.
⬐ pegasus3D circuits/chips is one possibility. There are others that are being researched. Then there are the unknown unknowns...⬐ jacquesm⬐ aptwebapps> 3D circuits/chips is one possibility.Yes, but we've been aware of that one for decades; it just isn't going to happen for anything other than maybe (and that's a small maybe) memory. Removing the heat is hard enough with 3D cooling infrastructure and 2D chips. Removing the heat from 3D chips, at a level that keeps the interior of the chip below permissible values, is not possible unless you clock the whole thing down to where there are no gains.
> There are others that are being researched. Yes, but nothing that looks even close to ready to mainstream.
> Then there are the unknown unknowns...
They've been 'unknown' for a long time now.
Really, this is as far as I can see more hope than reason.
⬐ pkolaczkThe human brain is 3D and has no problems with TDP and cooling, while being extremely fast at parallel processing of data like images or sound, to the point that the fastest supercomputers ever built still can't compete. Your logic is useless against my feelings. ;) Seriously, I really don't have a solid argument here, it's just a fuzzy idea.
There are some exotic processes that give better performance, but the economies of scale aren't there yet, and the cost of production is probably always going to be much higher than that of silicon, which is after all a very abundant element on Earth. There was a fantastic video about IC manufacturing linked here on HN a while ago; I'm not sure if I'll be able to find it.
edit: Hah, found it :)
... and almost all of it is done and/or coordinated in a CPU which is smaller than a stamp and consumes about the same power as a light bulb. It's a bit dated now, but I still find the video "Indistinguishable from Magic"[1] to be quite amazing.
⬐ Cyph0nYou could probably run it (and other stuff!) on a CPU that consumes 1/30th the energy of a light bulb. https://www.cs.virginia.edu/~skadron/cs8535_s11/ARM_Cortex.p...
I've seen great videos for learning about stuff. Here's one of my favorites, just as an example: https://www.youtube.com/watch?v=NGFhc8R_uO4
What I have not seen is a great video used to support an argument made online. I've seen plenty of "this is awesome, check it out" that panned out. I've not once seen "you are wrong, this video explains why" that did.
You're asking us to invest an hour in your argument based on nothing but your insistence that it's worth it. That's not at all the same as asking a student to invest an hour in a lecture for a class they signed up for as part of an ongoing educational effort involving lectures, books, and practical work.
Indistinguishable From Magic: Manufacturing Modern Computer Chips. Explains a lot of recent mass-market innovations that keep the semiconductor manufacturing industry rolling, and goes into detail about the many tricks used to ensure scaling down to the 22 nm node.
QueueTard's Manufacturing Modern Computer Chips at HOPE number nine: https://www.youtube.com/watch?v=NGFhc8R_uO4Guy Steele's How to Think about Parallel Programming: Not! at Strange Loop 2011: https://www.infoq.com/presentations/Thinking-Parallel-Progra...
There's a brilliant talk, given in 2012 at HOPE Number Nine, NYC, titled "Indistinguishable From Magic: Manufacturing Modern Computer Chips". At roughly 36 minutes, there's a cross-section of a (then) modern chip. The topmost layer is the shiny rainbowy metal layer you see on typical (from the top) die photographs. The very, very small comb-looking bottommost layer is the individual transistors, and structures there are of the size that gives a "xxx nm" process its name.
https://youtu.be/NGFhc8R_uO4?t=36m8s
I very much recommend everyone to watch this video, to get a sense for the complexity that goes into producing a modern CPU.
To come back to the thread I'm answering here: The logic analyzer on the Alto (probing microcode) connected to interconnects of individual gates probably would be connected to signals somewhere in the lower middle of the stack in a modern CPU, I guess.
If you forgive the shameless self-promotion, I did a talk on the scale and economics and so forth a while back that you might find interesting: https://www.youtube.com/watch?v=NGFhc8R_uO4 It has been posted on HN a few times before... the technology still puts a tingle in my spine. [insert criticism of how I need to be a better public speaker here]
⬐ KeyframeBy the powers bestowed upon me by Hacker News user account creation, I absolve you from your sin of shameless self-promotion. Seriously though, I am as remote as possible from that industry (film and TV content creation!), but it absolutely fascinates me. I devour each and every article and paper (that I can understand) about this and HPC as well. You did well in the video; we should do a documentary together on this theme!⬐ avs733email me... tmf7811 on gmail.
I do not have the answers to your questions (and I don't think anyone can share actual failure rates), but I would direct you to this video, which goes over a lot of modern chip fabrication techniques, circa 2012: https://www.youtube.com/watch?v=NGFhc8R_uO4
It's crazy stuff.
There are wafer test machines which will interface with the wafer directly and do some testing (which are $$$$), JTAG type tests, which access parts of the chip out of band, and functional testing. Some products, like SD Cards actually have a microcontroller on board that will provide the test routines and error correction without the need of an expensive machine. Design for test is extremely important.
I'm by no means an expert, however; I mostly deal with JTAG and functional tests.
Can you explain the mechanism for the bandwidth increase? Do the photovoltaics act as logic gates, doing calculations?
If the mechanism that is controlling the optics is electric, both transmitting and receiving, it would seem the throughput would be the same as pure electric.
I understand fiber optics because light will travel a longer distance faster than an electric field propagated through wire, but these chips are built using atomic-scale welding techniques (i)
(i) https://www.youtube.com/watch?v=NGFhc8R_uO4&feature=youtu.be...
Although it requires watching, I'm a fan of https://youtu.be/NGFhc8R_uO4
Not a documentary but this talk from HOPE has a lot of details: https://www.youtube.com/watch?v=NGFhc8R_uO4
For the most part, manufacturing advances drive microarchitecture advances. Smaller feature sizes mean more transistors can be stuffed into the same area. Those transistors can be used to make larger reorder buffers, more registers, more caches, better branch predictors, and more functional units. If you want to know about specific architectural changes in x86 over the years, I strongly recommend Cliff Click's talk: A Crash Course in Modern Hardware.[1] A lot of the specifics of semiconductor manufacturing are closely-guarded secrets, but Todd Fernandez gave a glimpse in an informal talk titled Indistinguishable From Magic: Manufacturing Modern Computer Chips.[2]
1. http://www.infoq.com/presentations/click-crash-course-modern... (starts about 4 minutes in)
Great talk that describes how modern (as of 2011) computer chips are manufactured: https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ signa11excellent talk. here is another SA article 'the first nano-chips' : http://community.nsee.us/courses/376-0_nanomaterials_sp04/na...⬐ thoughtsimpleThanks for this. Great presentation. I learned a lot.⬐ rndnThis EEVBlog episode on silicon chips is also great: https://www.youtube.com/watch?v=y0WEx0Gwk1E
If you didn't watch this video before... https://youtu.be/NGFhc8R_uO4?t=15m50s
⬐ deckar01This is a great presentation on the evolution of transistor manufacturing processes. I didn't know how a transistor worked before watching this video, yet was engaged by the technical narrative of the semiconductor industry overcoming one physical constraint after the other.
If anyone is interested in the manufacturing of these chips, this is an old but fascinating video: https://www.youtube.com/watch?v=NGFhc8R_uO4
⬐ pierreThanks! It was really instructive.⬐ chiiWow, that was super interesting.⬐ rwmjAnd here's a video from the 1970s, probably the first and last time Intel let anyone inside one of their fabs with a camera:⬐ TazeTSchnitzel26 minutes in, and this may be the best documentary on the then-future use of computers I've seen.
This video, mentioned here, from HOPE, is amazing: https://m.youtube.com/watch?v=NGFhc8R_uO4
⬐ 0x0Great video, worthy of a separate HN story submission!⬐ davidb_Wow, that's like the entire microelectronics class I took in undergrad compressed into an hour, just without any of the math and (obviously) light on theory. Impressive.⬐ spyderAnd here is the direct link to the part about the wear on the lenses:
⬐ mutagenThis excellent talk was linked at the bottom of the CPU backdoor article (https://news.ycombinator.com/item?id=8999507). Worth a watch, though unfortunately the slides appear to be offline.
Summary of the links shared here:http://blip.tv/clojure/michael-fogus-the-macronomicon-597023...
http://blog.fogus.me/2011/11/15/the-macronomicon-slides/
http://boingboing.net/2011/12/28/linguistics-turing-complete...
http://businessofsoftware.org/2010/06/don-norman-at-business...
http://channel9.msdn.com/Events/GoingNative/GoingNative-2012...
http://channel9.msdn.com/Shows/Going+Deep/Expert-to-Expert-R...
http://en.wikipedia.org/wiki/Leonard_Susskind
http://en.wikipedia.org/wiki/Sketchpad
http://en.wikipedia.org/wiki/The_Mother_of_All_Demos
http://io9.com/watch-a-series-of-seven-brilliant-lectures-by...
https://github.com/PharkMillups/killer-talks
http://skillsmatter.com/podcast/java-jee/radical-simplicity/...
http://stufftohelpyouout.blogspot.com/2009/07/great-talk-on-...
https://www.destroyallsoftware.com/talks/wat
https://www.youtube.com/watch?v=0JXhJyTo5V8
https://www.youtube.com/watch?v=0SARbwvhupQ
https://www.youtube.com/watch?v=3kEfedtQVOY
https://www.youtube.com/watch?v=bx3KuE7UjGA
https://www.youtube.com/watch?v=EGeN2IC7N0Q
https://www.youtube.com/watch?v=o9pEzgHorH0
https://www.youtube.com/watch?v=oKg1hTOQXoY
https://www.youtube.com/watch?v=RlkCdM_f3p4
https://www.youtube.com/watch?v=TgmA48fILq8
https://www.youtube.com/watch?v=yL_-1d9OSdk
https://www.youtube.com/watch?v=ZTC_RxWN_xo
http://vpri.org/html/writings.php
http://www.confreaks.com/videos/1071-cascadiaruby2012-therap...
http://www.confreaks.com/videos/759-rubymidwest2011-keynote-...
http://www.dailymotion.com/video/xf88b5_jean-pierre-serre-wr...
http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hic...
http://www.infoq.com/presentations/click-crash-course-modern...
http://www.infoq.com/presentations/miniKanren
http://www.infoq.com/presentations/Simple-Made-Easy
http://www.infoq.com/presentations/Thinking-Parallel-Program...
http://www.infoq.com/presentations/Value-Identity-State-Rich...
http://www.infoq.com/presentations/We-Really-Dont-Know-How-T...
http://www.slideshare.net/fogus/the-macronomicon-10171952
http://www.slideshare.net/sriprasanna/introduction-to-cluste...
http://www.tele-task.de/archive/lecture/overview/5819/
http://www.tele-task.de/archive/video/flash/14029/
http://www.w3.org/DesignIssues/Principles.html
http://www.youtube.com/watch?v=4LG-RtcSYUQ
http://www.youtube.com/watch?v=4XpnKHJAok8
http://www.youtube.com/watch?v=5WXYw4J4QOU
http://www.youtube.com/watch?v=a1zDuOPkMSw
http://www.youtube.com/watch?v=aAb7hSCtvGw
http://www.youtube.com/watch?v=agw-wlHGi0E
http://www.youtube.com/watch?v=_ahvzDzKdB0
http://www.youtube.com/watch?v=at7viw2KXak
http://www.youtube.com/watch?v=cidchWg74Y4
http://www.youtube.com/watch?v=EjaGktVQdNg
http://www.youtube.com/watch?v=et8xNAc2ic8
http://www.youtube.com/watch?v=hQVTIJBZook
http://www.youtube.com/watch?v=HxaD_trXwRE
http://www.youtube.com/watch?v=j3mhkYbznBk
http://www.youtube.com/watch?v=KTJs-0EInW8
http://www.youtube.com/watch?v=kXEgk1Hdze0
http://www.youtube.com/watch?v=M7kEpw1tn50
http://www.youtube.com/watch?v=mOZqRJzE8xg
http://www.youtube.com/watch?v=neI_Pj558CY
http://www.youtube.com/watch?v=nG66hIhUdEU
http://www.youtube.com/watch?v=NGFhc8R_uO4
http://www.youtube.com/watch?v=Nii1n8PYLrc
http://www.youtube.com/watch?v=NP9AIUT9nos
http://www.youtube.com/watch?v=OB-bdWKwXsU&playnext=...
http://www.youtube.com/watch?v=oCZMoY3q2uM
http://www.youtube.com/watch?v=Own-89vxYF8
http://www.youtube.com/watch?v=PUv66718DII
http://www.youtube.com/watch?v=qlzM3zcd-lk
http://www.youtube.com/watch?v=tx082gDwGcM
http://www.youtube.com/watch?v=v7nfN4bOOQI
http://www.youtube.com/watch?v=Vt8jyPqsmxE
http://www.youtube.com/watch?v=vUf75_MlOnw
http://www.youtube.com/watch?v=yJDv-zdhzMY
http://www.youtube.com/watch?v=yjPBkvYh-ss
http://www.youtube.com/watch?v=YX3iRjKj7C0
http://www.youtube.com/watch?v=ZAf9HK16F-A
⬐ ricardobeat
And here they are with titles + thumbnails:
⬐ waqas-
how awesome are you? thanks
⬐ Expez
Thank you so much for this!
⬐ X4
This is cool :) Btw. the first link was somehow (re)moved. The blip.tv link is now: http://www.youtube.com/watch?v=0JXhJyTo5V8
"Inseparable from Magic: The Manufacture of Modern Semiconductors" — an overview of semiconductor fabrication (and its current challenges) by a former Intel engineer. http://www.youtube.com/watch?v=NGFhc8R_uO4
"The Atomic Level of Porn", by Jason Scott — a history of low-bandwidth pornography, from ham radio to telegraphs to BBSes. http://vimeo.com/7088524
How to build your own X-ray backscatter imager (aka "airport body scanner") by Ben Krasnow http://www.youtube.com/watch?v=vUf75_MlOnw
"The Secret History of Silicon Valley" by Steve Blank. Other, more recent versions of this talk exist, but the audio quality is poor in them. https://www.youtube.com/watch?v=ZTC_RxWN_xo#t=1m42s
⬐ brzed
"Inseparable from Magic: The Manufacture of Modern Semiconductors"
Hey, that's me! Very cool and VERY humbling to be mentioned in such estimable company. I tried to cram way, way too much into 50 minutes...