HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition

Frederick Brooks Jr. · 23 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition" by Frederick Brooks Jr.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
Few books on software project management have been as influential and timeless as The Mythical Man-Month. With a blend of software engineering facts and thought-provoking opinions, Fred Brooks offers insight for anyone managing complex projects. These essays draw from his experience as project manager for the IBM System/360 computer family and then for OS/360, its massive software system. Now, 20 years after the initial publication of his book, Brooks has revisited his original ideas and added new thoughts and advice, both for readers already familiar with his work and for readers discovering it for the first time. The added chapters contain (1) a crisp condensation of all the propositions asserted in the original book, including Brooks' central argument in The Mythical Man-Month: that large programming projects suffer management problems different from small ones due to the division of labor; that the conceptual integrity of the product is therefore critical; and that it is difficult but possible to achieve this unity; (2) Brooks' view of these propositions a generation later; (3) a reprint of his classic 1986 paper "No Silver Bullet"; and (4) today's thoughts on the 1986 assertion, "There will be no silver bullet within ten years."
HN Books Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
I can give you the names of a handful of books that might be useful. Some are more technical, some less so. Some are more about personalities, some about the business aspects of things, some more about the actual technology. I don't really have time to try and categorize them all, so here's a big dump of the ones I have and/or am familiar with that seem at least somewhat related.

The Mythical Man-Month: Essays on Software Engineering -

Hackers: Heroes of the Computer Revolution -

The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage -

Where Wizards Stay Up Late: The Origins of the Internet -

Open: How Compaq Ended IBM's PC Domination and Helped Invent Modern Computing -

Decline and Fall of the American Programmer -

Rise and Resurrection of the American Programmer -

Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can't Get a Date -

Softwar: An Intimate Portrait of Larry Ellison and Oracle -

Winners, Losers & Microsoft -

Microsoft Secrets -

The Friendly Orange Glow: The Untold Story of the PLATO System and the Dawn of Cyberculture -

Troublemakers: Silicon Valley's Coming of Age -

Hard Drive: Bill Gates and the Making of the Microsoft Empire -

Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture -

The Supermen: The Story of Seymour Cray and The Technical Wizards Behind the Supercomputer -

Bitwise: A Life in Code -

Gates -

We Are The Nerds -

A People's History of Computing In The United States -

Fire In The Valley: The Birth and Death of the Personal Computer -

How The Internet Happened: From Netscape to the iPhone -

Steve Jobs -

The Idea Factory: Bell Labs and the Great Age of American Innovation -

Coders -

Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software -

The Pentagon's Brain: An Uncensored History of DARPA, America's Top-Secret Military Research Agency -

The Imagineers of War: The Untold Story of DARPA, the Pentagon Agency That Changed the World -

The Technical and Social History of Software Engineering -


"The Mother of All Demos" by Doug Engelbart -

"Jobs vs Gates" -

"Welcome to Macintosh" -

"Pirates of Silicon Valley" -

"Jobs" -

And while not a documentary, or meant to be totally historically accurate, the TV show "Halt and Catch Fire" captures a lot of the feel of the early days of the PC era, through to the advent of the Internet era.

And there's a ton of Macintosh history stuff captured at:

"On the criteria to be used in decomposing systems into modules" (1972) - because the core principles of modularity haven't changed []

"The Mythical Man Month" (1975) - because human nature hasn't changed []

"The History of Fortran I, II, and III" (1979) - because this historical piece by the author of the first high level language brings home the core principles of language design []

"The Unix Programming Environment" (1984) - because the core basics of the command line haven't changed []

"Reflections on Trusting Trust" (1984) - because the basic concepts of software security haven't changed []

"The Rise of Worse is Better" (1991) - because many of the tradeoffs to be made when designing systems haven't changed []

"The Art of Doing Science and Engineering: Learning to Learn" (1996) - because the core principles that drive innovation haven't changed [] []

"xv6" (an x86 version of Lions' Commentary, 1996) - because core OS concepts haven't changed [] []

I recommend reading The Mythical Man Month[1] to anyone interested in proper SWE project management. There are so many wannabe managers out there who will just throw more people at a project that is slowing down.

To quote the book: "Adding manpower to a late software project makes it later." There's some really good stuff in there, even for those of us who are less interested in management.
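Brooks's law rests on simple combinatorics: every person added to a team adds pairwise communication channels, and the number of channels grows as n(n-1)/2, much faster than headcount. A quick sketch of that growth (my own illustration, not code from the book):

```python
def channels(n: int) -> int:
    """Pairwise communication channels among n people: n choose 2."""
    return n * (n - 1) // 2

# Doubling a 5-person team nearly quadruples the coordination burden.
for n in (3, 5, 10, 20):
    print(f"{n:2d} people -> {channels(n):3d} channels")
```

This is why onboarding cost plus coordination overhead can swamp the labor a new hire adds to an already-late project.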


I keep recommending it to my younger colleagues but they just make polite noises and forget it.

Some of Brooks' suggestions for the makeup of a development team are interesting. In particular I have always liked the idea that there should be a toolmaker. Most of us are less efficient than we could be because we lack specific tools to do the job and we don't personally have time to make them. If we had someone whose only job was to make tools, I think a lot of things could go faster. Of course this only applies to moderately large teams.

I'm my team's toolmaker. There's so much opportunity lost by not having custom tools for repetitive tasks; it's strange that you would leave that on the table.

But I'm not getting paid for it; my manager doesn't order it. He's kind of happy to see us getting more efficient with fewer bugs, but it would be good to have some support.
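For a flavor of what a team toolmaker produces, here is a deliberately tiny, hypothetical example of the kind of repetitive-task tool one could write in an afternoon: a TODO/FIXME reporter for a source tree (the script and its names are my own sketch, not taken from the comments above):

```python
import re
import sys
from pathlib import Path

MARKER = re.compile(r"\b(TODO|FIXME)\b.*")

def scan(root: Path, pattern: str = "*.py"):
    """Collect (file, line number, marker text) for every TODO/FIXME found."""
    hits = []
    for path in sorted(root.rglob(pattern)):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            m = MARKER.search(line)
            if m:
                hits.append((path, lineno, m.group(0).strip()))
    return hits

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    for path, lineno, text in scan(root):
        print(f"{path}:{lineno}: {text}")
```

A chore that once meant grepping by hand becomes a one-command report, which is exactly the kind of compounding saving the toolmaker role is meant to capture.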

Mythical Man-Month, Fred Brooks [0]. Very informative series of essays on his experiences and lessons learned with IBM. If nothing else, helps to properly frame my expectations on projects with respect to resources needed to properly coordinate with others, and the pros and cons of adding people to projects at different stages (and in different roles).

Getting Things Done, David Allen [1]. Useful toolkit for getting things out of my head and onto paper (or org-mode or OmniFocus) so that I can properly focus and prioritize my time on the things I need to get done.

Communicating Sequential Processes, C.A.R. Hoare [2]. Strongly influenced the way I think about programs in general, but specifically in the embedded field where I work. (NB: I've not actually read or worked through the full text, but mainly taken what was needed to properly communicate ideas in my designs or to analyze designs and systems others have produced. This is a task for myself for early next year.)

Moonwalking with Einstein, Joshua Foer [3]. I've always had a good memory, I actually picked this up to give to a girlfriend who had a terrible memory and read it in a couple days before giving it to her (she was out of town when it arrived). Helped to explain methods that I'd somehow developed over the years, and gave me concepts and a better understanding of other methods of memory acquisition (for either short or long term purposes). If you really want to improve your memory, there are probably better resources to learn specific techniques, but this was an informative and entertaining overview. WRT work, we have to keep large systems in our minds all the time, and potentially dozens of different systems written in different languages. Memory is critical for this, even if it's just the memory of where to find the information and not the information itself.

Fluent Forever, Gabriel Wyner [4]. This one is my current read. Goes back to Moonwalking with Einstein. While the book is itself about language acquisition, it's actually given me quite a bit to think about with respect to general learning and memory acquisition (in this case, specifically for long term retention and recall). We have a couple training programs (we need more) for our new hires on development and testing. There are some concepts in here and in related readings that I think would greatly improve how we teach these folks what they need to know and in a way that would improve their retention of that information. We have a lot of people retiring in the next 1-3 years, so this is actually quite critical right now, though management is quite lackadaisical about it.








The Toyota Way, Jeffrey Liker [5]. I grokked Lean from this. Hardware focused, but the concepts can be (and have been) generalized to other process focused fields. This has helped with understanding what business processes really need to be codified, what feedback mechanisms need to be present for improvement, the criticality of bottom-up feedback and improvement (employee investment in the company/product cannot be overvalued if you want quality and good craftsmanship).

The Little Schemer, Friedman & Felleisen [6]. Going back to the comments on Fluent Forever. The structure of this is fantastic for conveying and helping students retain information. The Socratic method is very useful, and structuring courses and introductory material in this format is useful, this happened to be my introduction to it (well, I'd heard it before, but my first time really encountering it in practice). It's a useful tool for solo-study of a topic (pose your own questions and construct answers), and as a method of guiding someone to a conclusion or better understanding. Also useful in debugging software or decoding software you didn't write, after a fashion.



Or The Mythical Man Month ( )

Honestly I'm kinda surprised that didn't come up sooner...but only half-surprised.

This one should be read (and understood) by anyone who fancies him/herself a manager.
I second that observation. I wish the manager at my day job read ANYTHING, at the very least some infomercial-laden industry publication.

In my case, I'd like him to read Fred Brooks' "The Mythical Man-Month", to make him understand how a programming system costs a lot more than a simple module.


In second place, I'd place "Peopleware", so he'd understand the importance of communications, and how the current office arrangement is losing the company a lot of money:


Amazon link to both:

I think reading this book[1] is even more important than learning to use a keyboard (and mouse) with the intention of creating software or any complex system.

Knowing your limits is one thing, but understanding why/how they are being manipulated by outside forces (e.g. overestimating ability) is another. And how to counter those forces is also included in these pages.

Thanks for the sanity and the well-managed-project advice, Fred!


Not sure if this is complete:

Joe Armstrong's paper on the history of Erlang (of which he was one of the authors) is superb (though it's less about corporate culture than about the language):

There's The Mythical Man-Month:

Showstopper, the book about the development of Windows NT, is great:

Thank you kindly. I knew about The Mythical Man-Month, but the others are completely new to me.
Indeed, the history of the development of very large software systems has been littered with disasters. Don't people read The Mythical Man-Month any more?

Apparently not. Communication between team members has a huge cost factor ... not only in terms of dollars/time but also in terms of the quality of the final product.
Amazon links for the lazy:

"The Mythical Man Month" -

"Peopleware: Productive Projects and Teams" -

"The Pragmatic Programmer" -

"Managing the Unmanageable: Rules, Tools, and Insights for Managing Software People and Teams" -

"Joy Inc: How We Built a Workplace People Love" -

"A Lapsed Anarchist's Approach to Being a Better Leader" -

"Becoming a Technical Leader: An Organic Problem-Solving Approach" -

"Code Complete" -

Thanks for the shortened URLs. Question: I always obtain those by hitting "Share" and then copying the link. Is there a better way?
I'm a huge believer in going back to primary texts, and understanding where ideas came from. If you've liked a book, read the books it references (repeat). I also feel like book recommendations often oversample recent writings, which are probably great, but it's easy to forget about the generations of books that have come before that may be just as relevant today (The Mythical Man Month is a ready example). I approach the reading I do for fun the same way, Google a list of "classics" and check for things I haven't read.

My go to recommendations: - The Structure of Scientific Revolutions, Thomas Kuhn, (1996) - The Pragmatic Programmer, Andrew Hunt and David Thomas (1999)

Things I've liked in the last 6 months: - How to Measure Anything, Douglas Hubbard (2007) - The Mythical Man-Month: Essays on Software Engineering, Frederick Brooks Jr. (1975, but get the 1995 version) - Good To Great, Jim Collins (2001)

Next on my reading list (and I'm really excited about it): - The Best Interface is No Interface, Golden Krishna (2015)

Your list of primary sources ends at 2001 :)

No classics beyond that date?

The top was, "choose your own adventure" advice, the bottom was, "here's some clickable immediate gratification" advice. But point well taken, haha.
That reminds me, I'm due a re-read of The Pragmatic Programmer. I re-read it about once a year, and every time seem to get something new from it.
Apr 02, 2015 · lostphilosopher on Silver bullets
If you're interested in this topic, might I recommend:

No Silver Bullet: Essence and Accidents of Software Engineering, Frederick Brooks (1987) -

No Silver Bullet Refired, Frederick Brooks (1995)

Both can be found (along with lots of other good ideas) in The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition -

Fred Brooks, author of No Silver Bullet[0] which GP is referring to, and the fairly famous Mythical Man Month[1].



You might be interested in this book. It's required reading in many CS curricula:

Someone already brought it up in a sibling comment.
Note: in the 20th anniversary 2nd edition of The Mythical Man Month, in one of the new chapters, Brooks backs off of his original "Build one to throw away" advice.
It's been a long time since I've read either, but I can state in general that software engineering concepts don't tend to get outdated. The state-of-the-art stuff I learned in the late 1970s is still almost entirely valid, although when it comes to low-level stuff you'd want to e.g. adapt the advice about goto statements to other non-local things we are confronted with nowadays.

So reading an old classic like The Mythical Man Month isn't going to be a waste of time at all and is strongly recommended, but later really good stuff like these two books has additional lessons we've learned since then. So read the 20th anniversary version of that book, in which the author revisits issues and e.g. modifies his advice to "plan to throw one (version) away, because you will". The potential state of the art has gotten better; you can often get away with not doing that.

I did the same; taught myself css, php, javascript and quit my FTJ last Christmas. Best thing I ever did.

I also sent the following as advice to someone wanting to get into web dev:

"I was just thinking of 'easy ins' to the world of web development. A good source of information is the writing of people who work in the world of tech startups.

Also - if you want to do PHP dev, the key things to learn are: software engineering techniques and practice - object-oriented development and abstract patterns are key to thinking about good development; database design and development (1st normal form, 3rd normal form, etc.); and SQL (SQL for Dummies or similar is good for the basic commands and syntax) - it's the best source of help for software development on the internet. Read books, the ones that come up again and again when people talk about learning to program:

also - look at - that's where programmers keep their source code.

Learn about Object Oriented Programming, Design Patterns, MVC (which is a design pattern) - specifically useful for web development.

Also - demand for javascript programmers will increase over the coming years because of things like jQuery and Ajax.

That's my starter for ten - if you are interested in a career as a web programmer.

If you want to focus more on the html/css design side you could do worse than focusing on one CMS - such as WordPress - and learning it inside out; you could then offer that to clients and it's a good way to provide web sites cheaply with very little effort."

Nov 27, 2012 · robomartin on I’m writing my own OS
OK, if you don't have any real experience in low-level embedded coding (relevant to device drivers), RTOS or OS design in general, file systems, data structures, algorithms, interfaces, etc. And, if you have "hobby level" experience with Assembler, C and C++. And, if your intent is to write a desktop OS, from the ground up, without making use of existing technologies, drivers, file systems, memory management, POSIX, etc. Here's a list of books that could be considered required reading before you can really start to write specifications and code. Pick twenty of these and that might be a good start.

In no particular order:

54- ...well, I'll stop here.

Of course, the equivalent knowledge can be obtained by trial and error, which would take longer and might result in costly errors and imperfect design. The greater danger here is that a sole developer, without the feedback and interaction of even a small group of capable and experienced programmers, could simply burn a lot of time repeating the mistakes made by those who have already covered that territory.

If the goal is to write a small RTOS on a small but nicely-featured microcontroller, then the C books and the uC/OS book might be a good shove in the right direction. Things start getting complicated if you need to write such things as a full USB stack, PCIe subsystem, graphics drivers, etc.

The guy doesn't want to write a perfectly designed OS, nor does he want to design an RTOS, by the way ... He didn't even specify whether his OS would be multitasking. If not, no need for any scheduling at all, huh ... he just "writes his own OS". It's a fun experience, very rich and very instructive. But at least now we can all admire the breadth of all your "unlimited knowledge", especially in terms of titles of books. After all, there's no need to create a new OS if you get all your inspiration from things that already exist. You will end up with Yet Another *nix ... Not sure he wants a YanOS, though.

Without talking about filesystems, drivers, memory management or existing norms and standards (which it seems he wants to avoid ... which is not that stupid ... all having been designed 40 years ago ... who knows, maybe he'll have a revolutionary idea, beginning from scratch), going from scratch with no exact idea of where you're going could be a good thing. Definitely. You solve the problems only when you face them. Step by step, sequentially. Very virile, man.

I would advise the guy to start, obviously, with the boot loader. Having a look at the code of GRUB (v1), LILO, or better yet at graphical ones (BURG, GAG, or this good one, XOSL, which is completely GUI-driven - mouse pointer, up to 1600x1200 screens, and many supported filesystems, written in ASM ... it could actually be a hacking basis for a basic OS in itself) could be a great source of inspiration, beginning to play directly with the graphical environment.

You could also have a look at these things and of course at Kolibri OS

Blah, blah, blah ...

Better advice would be: Try, make mistakes, learn from them, continue.

Way, way, way less discouraging, pessimistic and above all pedantic.

> If the goal is to write a small RTOS on a small but nicely-featured microcontroller, then the C books and the uC/OS book might be a good shove in the right direction. Things start getting complicated if you need to write such things as a full USB stack, PCIe subsystem, graphics drivers, etc.

I've always wondered if some way could be created to skip this step in [research] OS prototyping, by creating a shared library (exokernel?) of just drivers, while leaving the "design decisions" of the OS (system calls, memory management, scheduling, filesystems, &c. - you know, the things people get into OS development to play with) to the developer.

People already sort of do this by targeting an emulator like VirtualBox to begin with--by doing so, you only (initially) need one driver for each feature you want to add, and the emulator takes care of portability. But this approach can't be scaled up to a hypervisor (Xen) or KVM, because those expect their guest operating systems to also have relevant drivers for (at least some of) the hardware.

I'm wondering at this point if you could, say, fork Linux to strip it down to "just the drivers" to start such a project (possibly even continuing to merge in driver-related commits from upstream) or if this would be a meaningless proposition--how reliant are various drivers of an OS on OS kernel-level daemons that themselves rely on the particular implementation of OS process management, OS IPC, etc.? Could you code for the Linux driver-base without your project growing strongly isomorphic structures such as init, acpid, etc.?

Because, if you could--if the driver-base could just rely on a clean, standardized, exported C API from the rest of the kernel, then perhaps (and this is the starry-eyed dream of mine) we could move "hardware support development" to a separate project from "kernel development", and projects like HURD and Plan9 could "get off the ground" in terms of driver support.

A lot depends on the platform. If the OS is for a WinTel motherboard it is one thing. If, however, the idea is to bypass driver development for a wide range of platforms it gets complicated.

In my experience one of the most painful aspects of bringing up an OS on a new platform is exactly this issue of drivers as well as file systems. A little google-ing quickly reveals that these are some of the areas where one might have to spend big bucks in the embedded world in order to license such modules as FFS (Flash File System) with wear leveling and other features as well as USB and networking stacks. Rolling your own as a solo developer or even a small team could very well fit into the definition of insanity. I have done a good chunk of a special purpose high-performance FFS. It was an all-absorbing project for months and, realistically, in the end, it did not match all of the capabilities of what could be had commercially.

This is where it is easy to justify moving into a more advanced platform in order to be able to leverage Embedded Linux. Here you get to benefit and leverage the work of tens of thousands of developers devoted to scratching very specific itches.

The down-side, of course, is that if what you need isn't implemented in the board support package for the processor you happen to be working with, well, you are screwed. The idea that you can just write it yourself because it's Linux is only applicable if you or your team are well-versed in Linux dev at a low enough level. If that is not the case you are back to square one. If you have to go that route you have to hire an additional developer who knows this stuff inside out. That could mean $100K per year. So now you are, once again, back at square one: hiring a dev might actually be more expensive than licensing a commercial OS with support, drivers, etc.

I was faced with exactly that conundrum a few years ago. We ended up going with Windows CE (as ugly as that may sound). There are many reasons for that, but the most compelling one may have been that we could identify an OEM board with the right I/O, features, form factor, price and full support for all of the drivers and subsystems we needed. In other words, we could focus on developing the actual product rather than having to dig deeper and deeper into low-level issues.

It'd be great if low level drivers could be universal and platform independent to the degree that they could be used as you suggest. Obviously VM-based platforms like Java can offer something like that so long as someone has done the low-level work for you. All that means is that you don't have to deal with the drivers.

To go a little further, part of the problem is that no standard interface exists to talk to chips. In other words, configuring and running a DVI transmitter, a serial port and a Bluetooth I/F are vastly different even when you might be doing some of the same things. Setting up data rates, clocks, etc. can be day and night from chip to chip.

I haven't really given it much thought. My knee-jerk reaction is that it would be very hard to create a unified, discoverable, platform-independent mechanism to program chips. The closest one could possibly come to this idea would be if chip makers were expected to provide drivers written to a common interface. Well, not likely or practical.

Not an easy problem.

> It'd be great if low level drivers could be universal and platform independent to the degree that they could be used as you suggest. Obviously VM-based platforms like Java can offer something like that so long as someone has done the low-level work for you. All that means is that you don't have to deal with the drivers.

Another thought: if not just a package of drivers, then how stripped down (for the purpose of raw efficiency) could you make an operating system intended only to run an emulator (IA32, JVM, BEAM, whatever) for "your" operating system? Presumably you could strip away scheduling, memory security, etc. since the application VM could be handling those if it wanted to. Is there already a major project to do this for Linux?

The Mythical Man Month, although old, has some great tips for estimating software engineering work.

Does The Mythical Man Month by Brooks

qualify as a programming book for this thread (maybe not)? It has no information on how to write "Hello, World" in any language, and little how-to information about coding, but a lot of information about effective programming, and it is a very interesting, readable book.

By Fred Brooks, author of The Mythical Man-Month. I'm reading the book right now, it's great.

Here's a link to MMM in case someone hasn't read it:

See p. 7 of Brooks' The Mythical Man-Month, for "The Joys of the Craft"

He lists 5: creation; usefulness; intricacy; constant novelty; tractability.

Some of these are in common with graphic design; but the "constant novelty" perhaps addresses your "boredom". Turing said that programming need never become boring, because any repetitive coding (or concept) can be captured in a function or module. Once you've worked out the solution to a problem, you write a reusable module that deals with that problem and embodies your understanding of it, and you don't have to do it again. So it's always new problems.

Now, in practice, it isn't always that easy. Code that can be reused is generally much harder to write than code for one specific case (the literal meaning of ad hoc: "for this"). The hardest part is specifying what it should do, not coding how. This approaches AI: to go from a problem that initially you cannot even understand, to one that you can automate 100%, is transcendent. Almost Frankensteinian... (aka The Modern Prometheus).

For me personally: I get a simple pleasure from making something happen on the screen (like any act of creation); but I actually don't like programming much for its own sake. I enjoy solving problems, and making them real. It's easy to dream something; but to do it is a real accomplishment. And anything on the road of that journey becomes equally important.

I don't know what stage you're at, but it's possible that you're not yet up to wrapping up the common parts of your coding, so you don't have to do them again. If you keep on typing the same predictable, mechanical things, that would be boring. Computers are ideal for this kind of mechanical work.
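As a toy illustration of the point about capturing repetitive coding in a function (my own example, not from the comment above): once the mechanical steps are worked out, you write them once, and retyping them is never necessary again.

```python
def normalize(values):
    """Scale a list of numbers so they sum to 1 -- a once-repetitive
    chore captured as a reusable function."""
    total = sum(values)
    return [v / total for v in values]

print(normalize([2, 3, 5]))  # -> [0.2, 0.3, 0.5]
```

The interesting work shifts from retyping the mechanics to deciding what the function's contract should be, which is exactly the "specifying what it should do" problem mentioned above.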

HN Books is an independent project and is not operated by Y Combinator or
~ [email protected]