Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this course.
A whole lot of people know how to make a CPU. It's not as hard as you probably think it is. What's hard is making them as complex, fast, and small as we do nowadays. But if you can get by with something not as complex or fast, and that takes up most of your desk, thousands of people know how to do it.
Got 120 hours and want to become one of them? Take these two courses from EdX, which are 60 hours each:
"Computation Structures - Part 1: Digital Circuits" 
"Computation Structures 2: Computer Architecture" 
The first teaches "[...] digital encoding of information, principles of digital signaling; combinational and sequential logic, implementation in CMOS, useful canonical forms, synthesis; latency, throughput and pipelining". In the homework and labs you design and implement (in a simulator) a 32-bit ALU.
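For a feel of what that lab builds toward, here's a toy software model of a combinational ALU (the op names and interface here are my own illustration, not the course's actual lab material):

```python
MASK = 0xFFFFFFFF  # keep every result to a 32-bit word

def alu(op, a, b):
    """Toy 32-bit ALU: the output is a pure (combinational) function of the inputs."""
    a &= MASK
    b &= MASK
    if op == "ADD":
        return (a + b) & MASK
    if op == "SUB":
        return (a - b) & MASK  # two's-complement subtraction wraps around
    if op == "AND":
        return a & b
    if op == "OR":
        return a | b
    if op == "XOR":
        return a ^ b
    raise ValueError("unknown op: " + op)

print(hex(alu("SUB", 0, 1)))  # 0xffffffff, i.e. -1 in two's complement
```

In the course you build the same thing out of gates, where "pure function of the inputs" stops being a given and propagation delay starts to matter.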
The second covers "[...] instruction set architectures and assembly language, stacks and procedures, 32-bit computer architecture, the memory hierarchy, and caches". In the homework and labs you design and implement (in a simulator), at the gate level, a 32-bit RISC CPU, except for memory. Memory for registers and the program is given as a black box: by this point you know enough to design it yourself, but doing so would add a lot of components and complexity, slow down the simulation, and distract from the topic of this part of the course. It fits better with the first part anyway.
I've taken these, and can say they do a good job of teaching what they say they teach.
There's also a third course in the series:
"Computation Structures 3: Computer Organization" 
That covers "[...] pipelined computers, virtual memories, implementation of a simple time-sharing operating system, interrupts and real-time, and techniques for parallel processing".
In the homework and labs for that one, you optimize your CPU from the second part for size and speed, and make it support a time-sharing operating system.
I've not taken this one.
⬐ acqq It’s not the logic but the technology that is not widely known: the new fabs cost billions.
I notice that the things you list are things you can learn online from YouTube and video games.
- Geography Now: https://youtu.be/DxxZOsfsIUM
- Crash Course Computer Science: https://www.youtube.com/playlist?list=PL8dPuuaLjXtNlUrzyH5r6...
- Hearts of Iron, Europa Universalis, Sid Meier’s Pirates, Railroad Tycoon, bunches of kids' games teach geography or arithmetic
Whereas there isn’t really a good digital experience you can build to teach brain-management skills... or is there?
⬐ indigochill One of Coursera's top courses is a course on learning how to learn, which is really about self-discipline (both how to practice it and why it works). It works not just for learning, but also for time management and other things.
Don't know about online courses for things like emotional health, but I see no reason there couldn't be.
⬐ afarrell You know, that's a good course to reference. Giving it a second think, a lot of CBT is skills-based, so I think I'd like to agree. Though I suspect that such a MOOC for anxiety does not yet exist.
⬐ gowld Not sure why this is downvoted. Learning facts is the easiest thing to do in any setting. Learning skills like "how to read and manage workloads, and advice on what to read" is what needs guidance and training.
• MITx "Introduction to Solid State Chemistry" . I've never been good at chemistry, but this course managed to make it clear to me.
• MITx "Circuits and Electronics"  (three links because they have split it into three courses since I took it). Most electronics courses have not worked well for me. Some fail by using analogies that don't work for me. The analogies are either to things I don't understand, or to things I understand too well compared to the target audience for the course.
The latter might seem odd--how can understanding the analogous system too well cause a problem? It's because there usually isn't a perfect match between behavior of the analogous system and electronics. The more you know about the analogous system, the more likely you are to know about those places that don't match. If the author expects the students will not know about those parts, they won't mention the limitations from those parts. So you can end up expecting too much of the analogous system to apply.
Other courses have not worked for me by being too deep and detailed. For instance at one time I knew, from a solid state physics intro I took, how a semiconductor diode worked at a quantum mechanical level. I could do the math...but the course gave me no intuition for actually using the diode in a useful circuit.
The "Circuits and Electronics" course struck for me a perfect balance.
• MITx "Computation Structures" . At the end of this three part course (of which I only took the first two parts), you will know how digital logic circuits work at the transistor level, and you will know how to design combinatorial and sequential logic systems at the gate level, and you will know how to design a 32-bit RISC processor...and you will have done all those designs, using transistor level and gate level simulators.
As I said, I only took the first two parts (didn't have time for the third). In the first two parts we did cover caching and pipelining, but we didn't use them in our processor. I believe that in the third part those and other optimizations are added to the processor.
• Caltech "Learning From Data" . The big selling point of this course is that it is almost the same as what Caltech students get when they take it on campus. The only watering down when I took it was the homework was multiple choice so it could be graded automatically.
The most outstanding thing about this course was Professor Abu-Mostafa's participation in the forums. He was very active answering questions. I don't know if he still does that now that the course is running in self-paced mode.
⬐ sizeofchar Also did Computation Structures from MITx, and I think it was the best of the roughly 20 MOOCs I took. Too bad so few people seem to have done it.
In the third part of the course, the content moved to the software connecting to the BETA, the processor we built in earlier parts. The last problem set was to build a very simple OS, in assembly, with interrupts, privileged mode, and up to 3 concurrent processes, all in less than 1000 instructions, macros included.
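The heart of that time-sharing exercise is a timer interrupt that rotates among saved process states. A rough sketch of the idea (the real problem set is in Beta assembly with actual interrupt hardware; this Python and all its names are just my illustration):

```python
# Toy round-robin time sharing: on each timer "interrupt", save the
# interrupted process's state and hand back the next process's state.
NUM_PROCS = 3
processes = [{"pc": 100 * i, "regs": [0] * 4} for i in range(NUM_PROCS)]
current = 0

def timer_interrupt(saved_pc, saved_regs):
    """Save the interrupted process, pick the next one round-robin."""
    global current
    processes[current]["pc"] = saved_pc
    processes[current]["regs"] = list(saved_regs)
    current = (current + 1) % NUM_PROCS
    nxt = processes[current]
    return nxt["pc"], nxt["regs"]

# Process 0 gets interrupted at pc 104; process 1 is resumed where it left off.
pc, regs = timer_interrupt(104, [1, 2, 3, 4])
print(pc)  # 100
```

Doing that in under 1000 instructions of assembly, with the privileged/user mode distinction handled correctly, is what makes the problem set fun.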
For software people who find this interesting and would like a great introduction, consider taking the course "Computation Structures" from MITx on EdX.
It's in three parts. The next run of part I, "Digital Circuits", starts September 6th. Here's the link: https://www.edx.org/course/computation-structures-part-1-dig...
Part I covers basic logic gates, at both a high level (how you use them) and a lower level (how you build them on a chip), their characteristics (propagation delay, contamination delay, and things like that), combinational logic, sequential logic, state machines, pipelining, and probably more that I don't remember. The labs are done via a browser-based simulator, and by the end of the course you will have designed and implemented a 32-bit ALU with add, subtract, logical and arithmetic shifts in both directions by up to 31 bits, all the boolean operators, and the usual comparison operators.
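That "shift by up to 31 bits" unit is typically built as a barrel shifter: cascaded rows of 2-way muxes, where row k shifts by 2^k when bit k of the shift amount is set. A quick software model of one direction (my own sketch, not the course's code):

```python
MASK = 0xFFFFFFFF  # 32-bit word

def shl_barrel(x, amount):
    """32-bit left shift built as 5 mux stages (shift by 1, 2, 4, 8, 16)."""
    x &= MASK
    for k in range(5):
        if (amount >> k) & 1:          # stage k: one row of 32 2-way muxes
            x = (x << (1 << k)) & MASK  # either pass through or shift by 2**k
    return x

print(hex(shl_barrel(1, 31)))  # 0x80000000
```

One direction alone is 5 rows of 32 MUX2s; handling both directions plus arithmetic fill multiplies that, which is why the shift unit ends up dominating the mux count when you tally gates for a discrete build.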
Part II builds on that, taking you through designing and implementing a full 32-bit processor. Caching is discussed in the lectures, but not used in the processor.
Part III (which I did not have time to take), I believe, adds caching and pipelining to the processor, and covers parallel processing and device handling, and also operating system stuff.
For a while I wanted to actually build the processor from Part II using discrete 7400 series logic, but the chip count came in too high for me. My gate counts were: 295 AND2, 8 AND3, 3 NOR2, r OR2, 96 OR3, 20 OR4, 226 XOR2, 6 NOT, 563 MUX2, 161 MUX4. (That's not counting whatever I'd need for the control ROM and the 32 x 32-bit register file).
At 4 MUX2s per chip, and 4 AND2s per chip, that's 215 chips. Another 81 for the MUX4s and 57 for the XOR2s brings it up to 353. Without even tossing in the rest, I'm way over my limit.
I could cut this down quite a bit by taking out the shift unit (which uses 353 MUX2s), making the shift instructions generate an illegal instruction trap, and having the trap handler emulate them. That would save 88 chips. (Well, not quite 88 chips...I think I'd have to add a "logical right shift by 1" instruction so this approach wouldn't be too slow, but a dedicated "logical right shift by 1" unit is a lot simpler than a "shift logical or arithmetic in either direction by any amount" unit.)
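The chip-count arithmetic above checks out mechanically. This sketch assumes standard 7400-series packaging (quad 2-input gates, quad 2-way muxes as in the 74157, dual 4-way muxes as in the 74153), which matches the numbers in the text:

```python
import math

# Gate counts from my Part II design (only the types tallied in the text).
gates = {"MUX2": 563, "AND2": 295, "MUX4": 161, "XOR2": 226}
per_chip = {"MUX2": 4, "AND2": 4, "MUX4": 2, "XOR2": 4}  # gates per package

chips = {g: math.ceil(n / per_chip[g]) for g, n in gates.items()}
print(chips["MUX2"] + chips["AND2"])  # 215 (141 + 74)
print(sum(chips.values()))            # 353

# Dropping the shift unit removes 353 of the 563 MUX2s:
saved = chips["MUX2"] - math.ceil((563 - 353) / per_chip["MUX2"])
print(saved)                          # 88 chips
```

The register file and control ROM would still be on top of all this, which is why the project stayed hypothetical.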
The cool thing, though, is that I, a software guy, could actually make those hardware changes now. A lot of things about computers make a lot more sense to me. I highly recommend it to those curious about what goes on at a lower level than we software guys normally deal with (even if we are writing software that interfaces with devices).
⬐ tptacekAlso highly, highly recommend buying the book; this is one of those cases (unlike, for me, linear algebra) where you can soak up a pretty good fraction of the whole topic just by plowing through the text, and this particular text is pretty great.
I read "Code" a couple of times and did Nand2Tetris. I also found the edX MIT 6.004 course extremely useful for filling in some of the gaps. I'm waiting for part three to start in a couple of weeks.