HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Coders at Work: Reflections on the Craft of Programming

Peter Seibel · 21 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Coders at Work: Reflections on the Craft of Programming" by Peter Seibel.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
This is a who's who in the programming world - a fascinating look at how some of the best in the world do their work. Patterned after the best-selling Founders at Work, the book represents two years of interviews with some of the top programmers of our times.
HN Books Rankings
  • Ranked #29 all time

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
At Apple? Maybe not. But elsewhere?

Maybe no articles are written about them, but there are books or chapters ([0], [1]) written by them.

In the spotlight? You betcha. I think watching talks by Bryan Cantrill is highly entertaining, Rich Hickey's talks are highly regarded on HN (making a mental note to watch them), and there's Carmack talking at QuakeCon for hours about many different things.

(edit) formatting and links

[0] https://www.amazon.com/Coders-Work-Reflections-Craft-Program...

[1] https://www.amazon.com/Masterminds-Programming-Conversations...

Uehreka
> At Apple? Maybe not.

Chris Lattner (inventor of LLVM and Swift) was pretty prominent when he was at Apple. He got stage time during at least one keynote and was well known in the macOS/iOS community. Not to mention that their engineering leads in general get to present their work at the WWDC sessions every year, to the devs who will be using it.

scns
Glad to be proven wrong! I learned some iOS development years ago but am out of touch with what is happening in the Appleverse at large; no snark intended.
Yeah I wondered about the origin too. One thing I did during quarantine was re-read some old books on my bookshelf, including 2009's Coders at Work:

https://www.amazon.com/Coders-Work-Reflections-Craft-Program...

The Brendan Eich interview was really interesting, because it talked about bridging research and software engineering, specifically with regard to memory safety in C++ and debugging tools.

I think he was talking about manual annotations on C++. It seems clear that this interest morphed into the sponsorship of Rust. I believe Rust had been a side project since 2006 and was "announced" around 2010 (?).

And Mozilla also sponsored the "rr" reversible debugger, which is likewise some very impressive engineering. (Sadly it seems to get less attention than Rust!)

Anyway, for PL nerds, I recommend reading this interview.

----

I think this work on software engineering tools is great, but as a Firefox user of 15+ years, it would be hard for me to argue that Firefox itself received sufficient attention. It feels like the situation where the engineers work on the tools for too long and then management cancels both projects (which I've seen happen in my career).

It's about even more than that: it's about the complexity.

There's an inherent complexity in the problem being solved.

And there is an "accidental complexity" in the implementation of the solution.

When throwing everything away, people typically believe they can avoid handling a lot of the "inherent complexity." But there is usually a good reason why that complexity was addressed in the previous version of the program, and there's a big chance that the new "from scratch" designers will have to relearn and rediscover all of it, instead of building on the existing knowledge that is encoded in the previous version.

For anybody interested in the topic, I recommend the case studies presented in:

https://www.amazon.com/Search-Stupidity-Twenty-Marketing-Dis...

"In Search of Stupidity: Over Twenty Years of High Tech Marketing Disasters"

See the account of the WordStar rewrite simply not having the printer drivers that the previous version had, along with other features people already expected, leading to WordStar's demise.

Or what Zawinski names the "Cascade of Attention-Deficit Teenagers" (search the internet for that; the link here wouldn't work!)

"I'm so totally impressed at this Way New Development Paradigm. Let's call it the "Cascade of Attention-Deficit Teenagers" model, or "CADT" for short."

"It hardly seems worth even having a bug system if the frequency of from-scratch rewrites always outstrips the pace of bug fixing. Why not be honest and resign yourself to the fact that version 0.8 is followed by version 0.8, which is then followed by version 0.8?"

Or an interview with Jamie Zawinski from Seibel's "Coders at Work."

https://www.amazon.com/Coders-Work-Reflections-Craft-Program...

... "even phrasing it that way makes it sounds like there’s someone who’s actually in charge making that decision, which isn’t true at all. All of this stuff just sort of happens. And one of the things that happens is everything get rewritten all the time and nothing’s ever finished. If you’re one of those developers, that’s fine because there’s always something to play around with if your hobby is messing around with your computer rather than it being a means to an end — being a tool you use to get whatever you’re actually interested in done."

If one is able to cover all the complexity, and it is not destructive to the goal, a rewrite is OK. Otherwise, one should be critical of the idea of a rewrite, as it could secretly be motivated by nothing more than (jwz again): "rewriting everything from scratch is fun (because "this time it will be done right", ha ha)"

Aug 17, 2019 · bhntr3 on Arcs of Seniority
Yes. The article is a bit tautological. It's not defining seniority in terms of time spent. He actually says this at the end. Instead he's defining a "senior developer" as someone who has a necessary mix of the qualities he talks about. It would be more accurate to say "These are the traits your manager will tell you to develop to get promoted."

Personally, my favorite book about old coders is Coders At Work (https://www.amazon.com/Coders-Work-Reflections-Craft-Program...). It's a bunch of interviews with programmers who have created programming languages and lots of the fundamental software we all rely on. It showed me what the journey to being a great programmer can look like.

To me, senior is a corporate term. Great programmers build great things. Senior developers get promoted. I sometimes ask young programmers this: Do you care about the craft or the career? I think being a great programmer will make a person money. There aren't that many truly great programmers. But if they're impatient or they don't think they can be great, they can probably be senior.

Becoming senior is easy: Just help your boss accomplish their goals. Pay attention and develop skills that will help you do this no matter where you are and what you're working on. If you over-specialize in a specific organization's or person's needs, you become the expert beginner: you can't leave, or you will struggle.

Some of the things that help a person be senior can also make them great. But the path to being great is a very different and personal one (at least that's the impression I got from Coders At Work. I make no claims for myself.) Jeff Dean is undoubtedly a great programmer and was also the most senior developer at Google for a long time. They made new levels just for him. So they can overlap. If someone is lucky and their job is great, the things that make them senior can also make them great. If their job sucks and management is terrible, they'll have to choose every day between doing something great or doing something to get promoted (being senior.)

My favorite article on the journey of a software engineer is this one: https://medium.com/@webseanhickey/the-evolution-of-a-softwar... . To me it's the story of someone who started off trying to be senior but then started to become great.

scarface74
> Do you care about the craft or the career?

I would say both. But the field is littered with idealistic developers who "care about their craft" while their employers care only about profit, and who are surprised years later to discover that they have been taken advantage of and underpaid.

All the guys from "Coders at Work" [1]:

- Jamie Zawinski
- Brad Fitzpatrick
- Douglas Crockford
- Brendan Eich
- Joshua Bloch
- Joe Armstrong
- Simon Peyton Jones
- Peter Norvig
- Guy Steele
- Dan Ingalls
- L Peter Deutsch
- Ken Thompson
- Fran Allen
- Bernie Cosell
- Donald Knuth

[1] https://www.amazon.com/Coders-Work-Reflections-Craft-Program...
You might enjoy Steven Levy's Hackers: Heroes of the Computer Revolution[1]. It's not too focused on specific people or companies, although you'll encounter some well-known people like Bill Gates, Steve Jobs, and Richard Stallman in the book. It's an interesting read because it gives you a great background that helps you understand how we ended up with the tech culture and environment we have today.

In the reply to another comment, I also mentioned Coders at Work[2]. I found that it provided some great insight into the early days of some fascinating companies from a technical perspective.

[1] https://www.amazon.com/Hackers-Computer-Revolution-Steven-Le... [2] https://www.amazon.com/Coders-Work-Reflections-Craft-Program...

I know you didn't ask for books, but here are some interesting ones. The first two cover individuals and the last two cover the works of others.

Coders At Work (https://www.amazon.com/Coders-Work-Reflections-Craft-Program...)

Founders At Work (https://www.amazon.com/Founders-Work-Stories-Startups-Early/...)

The Architecture of Open Source Applications (https://www.amazon.com/Architecture-Open-Source-Applications...)

The Architecture of Open Source Applications - Vol 2 (https://www.amazon.com/Architecture-Open-Source-Applications...)

Oh wow, I was just reading about this yesterday in Coders At Work[0]. Douglas Crockford[1] worked on this at a company that was trying to do some distributed computing work in the '90s-'00s. They originally based it on the JVM, but Sun had issues with that, so they turned it more into what he described as a scripting language, "which is what we have today."

[0]: http://www.amazon.com/Coders-Work-Reflections-Craft-Programm... [1]: http://javascript.crockford.com/

evgen
So in the beginning Electric Communities was going to do large-scale, distributed virtual worlds (think MMORPGs as shopping malls...) with real security. The original plan was to use Joule, but negotiations didn't work out, and while waiting around a couple of engineers created a reasonable subset in this new language called Java. This subset eventually became the original E language. Crock was the slides-and-presentations guy of the core team, so after knowing him for a while as mostly a non-coder it was odd to see him pop up again and have such a big impact on the JavaScript world.
There are two excellent books that will answer most of your questions. The second book is harder to obtain (more expensive), as it is older. Each of the interviews is pretty detailed, down to the nitty-gritty, often mundane details of the craft of programming.

http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

http://www.amazon.com/Programmers-at-Work-Susan-Lammers/dp/0...

nekopa
Susan Lammers has put the Programmers at Work interviews up online:

https://programmersatwork.wordpress.com

Oct 29, 2015 · spion on Problems with Go's design
I keep seeing this argument all the time. Let's put up a counter-argument-by-authority: Fran Allen [1] thinks that C was a huge step backwards in language design [2], and there is no reason to think that Go didn't repeat the same pattern.

Interestingly, the article you quoted mentions functional programming and immutable data as the step to go from 200K to 2M lines. Go is fundamentally incapable of functional programming* and its builtins allow pervasive mutable state.

[1]: https://en.wikipedia.org/wiki/Frances_E._Allen

[2]: http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

* It's impossible to support FP without generics. Even the most basic higher-order functions require type variables.
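(To make that footnote concrete: the comment predates Go 1.18 generics. A minimal sketch, using modern Go purely for illustration, of why even the simplest higher-order function needs type variables; without them, this can only be written with interface{} and runtime assertions, or duplicated per element type.)

  package main

  import "fmt"

  // Map is the most basic higher-order function. Its signature needs
  // two type variables, T and U; pre-generics Go could not express it.
  func Map[T, U any](xs []T, f func(T) U) []U {
      out := make([]U, 0, len(xs))
      for _, x := range xs {
          out = append(out, f(x))
      }
      return out
  }

  func main() {
      lengths := Map([]string{"go", "haskell"}, func(s string) int { return len(s) })
      fmt.Println(lengths) // [2 7]
  }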

ngrilly
The guarantees offered by functional programming and immutable data are great, but they mostly evaporate when you move from a single-process application to a multi-process system, distributed across many machines and communicating through IPC.

Are you aware of a 2M-line code base that fully relies on FP and immutable data?

kibwen
Erlang's very existence is a pretty solid refutation of the claims you're putting forth. :P
ngrilly
I don't think so. An Erlang process has no mutable state. But it can communicate with another Erlang process to set/get some state. This is how a lot of mutable structures are implemented in Erlang.
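(A sketch of the pattern being described, translated into Go since that's the language under discussion: the "mutable" state lives in a single goroutine, the moral equivalent of an Erlang process, and can only be set or read by sending messages. The names are illustrative, not from any real codebase.)

  package main

  import "fmt"

  // counter owns its state; like an Erlang process, the only way to
  // change or observe the state is via messages (channels).
  func counter(ops <-chan func(int) int, reads <-chan chan int) {
      state := 0
      for {
          select {
          case op := <-ops:
              state = op(state) // "mutation" by message
          case reply := <-reads:
              reply <- state
          }
      }
  }

  func main() {
      ops := make(chan func(int) int)
      reads := make(chan chan int)
      go counter(ops, reads)

      ops <- func(n int) int { return n + 1 }
      ops <- func(n int) int { return n + 1 }

      reply := make(chan int)
      reads <- reply
      fmt.Println(<-reply) // 2
  }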
spion
Do they, now? Or do they only evaporate in Erlang?

Since most FP codebases do the same with a few orders of magnitude less code, no, I'm not aware. Though I imagine that if they had to copy their code for every single instance of their generic typeclasses, they would quickly amass many millions of lines.

deepnet
I beg to differ: the guarantees functional programming and data immutability offer are exactly the sorts of guarantees needed for scalable parallelisation.

LOC is not a useful metric for the utility of a program; in fact, the reverse is true in terms of maintainability and places for bugs to hide.

This chap claims to write very robust, entirely functional programs that are worth billions in their operation.

http://logicaltypes.blogspot.co.uk/2015/08/pure-functional-p...

ngrilly
I agree that "the guarantees functional programming and data immutability offer" are very useful for "scalable parallelisation".

But that is not what I was responding to.

I was responding to the idea that FP and immutability are useful when you scale from 200K to 2M lines of code:

> the article you quoted mentions functional programming and immutable data as the step to go from 200K to 2M lines

In my opinion, this argument is irrelevant.

deepnet
The argument, that non-functional programming has fundamental limits to its scalability, seems like a very important question as the complexity of code begins to grow exponentially.

Large imperative codebases are inherently fragile, hence the need for test-driven development - good batteries of tests offer demonstrable reliability.

Functional programming offers provable reliability.

Provability is not just for scaling complexity but also for much simpler software that absolutely must not fail.

Imperative code becomes hard to reason about before 200K LOC is reached.

ngrilly
The counterargument is that very large codebases tend to be composed of multiple services. Each service lives in its own process, maybe on its own machine, and communicates with other services through IPC. At that point, you've lost the benefit of functional purity and immutability, because each service can maintain some "hidden" state.

You're right, above some threshold, it becomes hard to reason about imperative code with mutations. Your preferred solution is to adopt functional programming and immutability. Another solution is to decompose your system in multiple communicating services. Google's codebase is a very well known example of the latter approach.

spion
This is false. Well, it may be true in Erlang, but it's false in Haskell, where there is always a clear separation between equations and effects/state, no matter the source (IO to other services or local state). Not to mention that there are tools such as STM that let you deal with state in a much safer and less error-prone way.

Separating the state between multiple communicating services is basically the same strategy as OOP (objects that communicate via messages and encapsulate state) but with stronger encapsulation requirements (access only via API, you can't just "reuse" methods willy-nilly all over the place but you have to come up with a sensible library or a third service, etc etc).

It works, yes, but it works about as well as OO does. Which is... not very much. And it also comes with its own tradeoffs (good luck getting atomic changes across multiple microservices).

Peter Norvig often covers really interesting stuff. A long time ago here on HN I read his article on the probability of there being no set in the card game SET [1]. I had never heard of SET then, but picked it up based on the article. It's a simple game, but has interesting properties. Analytical answers are hard to find for a lot of the cases, so simulation to the rescue. I ended up running more simulations than Peter did, with some interesting results [2].

Also, the interview with Peter Norvig in the book "Coders at Work" [3] is great - one of my favorites in the book (actually, the whole book is great).

[1] http://norvig.com/SET.html

[2] http://henrikwarne.com/2011/09/30/set-probabilities-revisite...

[3] http://www.amazon.com/gp/product/1430219483/

norvig
Very nice write-up on SET, Henrik. I hadn't seen it before, and wasn't aware of your interesting results.
Coders at Work: Reflections on the Craft of Programming - Peter Seibel

http://www.amazon.com/gp/product/1430219483

Great book: easy reading, inspiring and insightful. Highly recommended!

Jun 02, 2013 · danso on Learn C
For anyone who hasn't browsed through Peter Seibel's "Coders at Work," one of his subjects is Fran Allen... it's kind of funny because I do agree that learning C has been valuable to the high-level programming I do today (but only because I was forced to learn it in school). But there's always another level below you that can be valuable... Allen says C killed her interest in programming... not because it was hard, but because, in her opinion, it led engineers to abandon work in compiler optimization (her focus was in high-performance computing):

(Excerpted from: Peter Seibel. Coders at Work: Reflections on the Craft of Programming (Kindle Location 6269). Kindle Edition: http://www.amazon.com/Coders-Work-Reflections-Craft-Programm... )

Seibel: When do you think was the last time that you programmed?

Allen: Oh, it was quite a while ago. I kind of stopped when C came out. That was a big blow. We were making so much good progress on optimizations and transformations. We were getting rid of just one nice problem after another. When C came out, at one of the SIGPLAN compiler conferences, there was a debate between Steve Johnson from Bell Labs, who was supporting C, and one of our people, Bill Harrison, who was working on a project that I had at that time supporting automatic optimization...The nubbin of the debate was Steve's defense of not having to build optimizers anymore because the programmer would take care of it. That it was really a programmer's issue....

Seibel: Do you think C is a reasonable language if they had restricted its use to operating-system kernels?

Allen: Oh, yeah. That would have been fine. And, in fact, you need to have something like that, something where experts can really fine-tune without big bottlenecks because those are key problems to solve. By 1960, we had a long list of amazing languages: Lisp, APL, Fortran, COBOL, Algol 60. These are higher-level than C. We have seriously regressed, since C developed. C has destroyed our ability to advance the state of the art in automatic optimization, automatic parallelization, automatic mapping of a high-level language to the machine. This is one of the reasons compilers are ... basically not taught much anymore in the colleges and universities.

habitue
I think she might have given up on programming a bit prematurely. The pendulum has obviously swung completely the other way, with high-level languages like Haskell pushing forward compiler optimization, and JIT VMs pushing forward in other directions. It's actually an exciting time for "smart compilers".
anuraj
There is no pendulum swing - Haskell remains a niche - C and its offspring are still mainstream. Have you ever checked the statistics?

http://www.tiobe.com/index.php/content/paperinfo/tpci/index....

http://www.langpop.com/

C still remains the only accessible low-level language for systems programming. And C compilers have been optimizing for a long time.

irahul
> I think she might have given up on programming a bit prematurely. The pendulum has obviously swung completely the other way with high level languages like Haskell pushing forward compiler optimization

C compilers were optimizing long before Haskell. From her interview, I don't understand why she couldn't work on optimizers even if someone else advocated optimization being the programmer's responsibility.

1123581321
I believe she worked on optimizers her entire career as a logician, scientist and manager rather than as an implementing programmer.

http://www-03.ibm.com/ibm/history/witexhibit/wit_hall_allen....

http://amturing.acm.org/award_winners/allen_1012327.cfm

habitue
> C compilers were optimizing long before Haskell

Right, and they've come up with some excellent tricks too. But the reason she felt optimization wasn't going to progress as far is that C is a lower-level language than some of the other languages out at the time, and there are necessarily fewer tricks you can do in a lower-level language, because you have to infer the intent of the programmer more, and rely on optimizing idioms, etc., rather than optimizing actual constructs of the language.

How does a compiler optimize a single "goto"? There isn't much it can do unless, for instance, it notices that the goto is found in an idiomatic pattern that results in a loop. Then it can make a decision whether to unroll the loop or not. If the language gives you the loop construct, it can skip the "recognize the idiom" step (and the associated risk of guessing wrong), and go right to optimizing loops. Similarly, in higher-level languages than C, the programmer can express their intent more directly, and therefore the compiler can take fewer risks when guessing "Ah, I see what you're trying to do, here's the fastest assembly that accomplishes that".
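(A toy version of that point, sketched in Go rather than C since Go also has goto: both functions compute the same sum, but only the second states the loop as a language construct instead of leaving the compiler to recognize the idiom.)

  package main

  import "fmt"

  // The loop exists only as an idiom the compiler must recognize.
  func sumGoto(xs []int) int {
      i, total := 0, 0
  loop:
      if i < len(xs) {
          total += xs[i]
          i++
          goto loop
      }
      return total
  }

  // The loop is a first-class construct; the intent is explicit.
  func sumRange(xs []int) int {
      total := 0
      for _, x := range xs {
          total += x
      }
      return total
  }

  func main() {
      xs := []int{1, 2, 3, 4}
      fmt.Println(sumGoto(xs), sumRange(xs)) // 10 10
  }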

waps
To convince C users why Haskell has the potential (but currently only the potential) to optimize better, it is best to just point to one example of something that C can never hope to optimize: deforestation.

http://book.realworldhaskell.org/read/profiling-and-optimiza...

C will never be able to do that. Before optimization: the programmer requests a list to be created, fills it by calling functions, and then passes the completed data structure (a list or tree) along to another function, which executes commands according to what the list contains.

After optimization there is no more list. Instead the function the list is passed to will call a generated function that generates exactly the needed elements of the tree just-in-time. Result: no list, no memory (aside from 1 element on the stack), no allocation, no clearing of memory afterwards.

Of course the downside is that it's very tempting (and encouraged) to write programs that don't contain these optimizations you'd have to do manually in C/Java/... and just have them run. What you'll miss as a Haskell programmer is that the program is effectively dependent on those optimizations for its complexity (for example: the optimized program is O(n), the program as written is O(n^n). Then you insert what looks like a tiny change, say, sorting the list, which prevents the optimization from happening, and boom, your binary switches from O(n) to O(n^n). All tests will obviously pass, yet your boss is unlikely to be happy... At this point it is extremely hard to figure out what just happened).
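(The transformation can be written out by hand in an imperative language; here is a sketch in Go, standing in for C/Java. GHC derives the second form from the first automatically, which is exactly the optimization, and the fragility, being described.)

  package main

  import "fmt"

  // Naive version: materialize the intermediate list, then consume it.
  func sumSquaresList(n int) int {
      squares := make([]int, 0, n)
      for i := 1; i <= n; i++ {
          squares = append(squares, i*i)
      }
      total := 0
      for _, s := range squares {
          total += s
      }
      return total
  }

  // "Deforested" version: producer and consumer fused by hand; the list
  // never exists, so there is no allocation and no memory to clear afterwards.
  func sumSquaresFused(n int) int {
      total := 0
      for i := 1; i <= n; i++ {
          total += i * i
      }
      return total
  }

  func main() {
      fmt.Println(sumSquaresList(100), sumSquaresFused(100)) // 338350 338350
  }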

mc-lovin
While I'm very sympathetic to Allen's viewpoint, it seems that there was, and still is, a large class of problems where C-like languages (where the programmer does the optimizations) are better than high-level languages with optimizing compilers.

The best example for her case would be Fortran, where the language is both higher-level and faster, because the compiler can make many more assumptions (my understanding is that the restrict keyword somewhat evens the playing field for C, but that is kind of a hack).

However, plenty of numerical work is also done in C, in spite of Fortran's availability.

munin
Writing optimizers for C is much harder than for Fortran because of reduced abstractions and the need to reason about memory and pointers. In Fortran they were working on automatic multi-threading optimizations, where you would write Fortran and it would auto-parallelize into multiple threads... before C was invented.

C did kill a lot of work on making computers easier to program and safer. It was a devil's bargain for speed, and we paid for it with decades of crappy code full of buffer overflows and shared-state race conditions. It's tricky to know if it was worth it.

foobarbazqux
Actual song from PLDI'07 where she received the Turing Award, which we all sang to the tune of Take Me Out to the Ball Game:

  Let’s all sing to Fran Allen
  For the great things she’s done.
  PTRAN and Blue Gene and E C S
  Fran, we’ve gathered to toast your success.
  As we ponder all you’ve accomplished,
  Our colleague extraordinaire,
  Here’s to you, Fran Allen, you’re truly beyond compare

  Optimizing compilers
  Parallel transforms too
  Intervals, call graphs, and data flow
  Keep our programs from running too slow
  So we root, root, root for Fran Allen
  Her heart, and spirit, and voice
  For she’s won the Turing Award and we all rejoice!
She said that she believes functional languages and programs are key for scalable parallelism on massively multicore processors, because of the unavoidable performance problems associated with synchronization on shared data dependences in imperative languages.

The back story you quoted about C motivates her position a little, so thanks. It seemed like few of the hardcore compiler people that I spoke to at the conference seriously believed functional languages were going to be the future because of performance reasons, although personally it doesn't seem like an entirely unreasonable proposition.

tracker1
Using a functional paradigm with immutable types does take care of a lot of the complexities of parallel computing... It allows you to make a lot of assumptions (from the underlying platform's perspective) and to break up work in very interesting ways, not just across CPU cores, but even in distributed computing environments. If you combine this with a scriptable language, you can even carry the workload along with the code that operates on it, which takes things a bit further. You could create a literal farm of workers that grab a bit of data, process it against the defined load, and then return the result with the next step to run against the data...

I've wanted to create such a system for a while. Right now the closest I could come up with would be to use Node.js with JSON and a message queue to handle requests/loads. Unfortunately, there's a pretty serious cost to the JSON serialization, among other issues... but it's an interesting idea that has merit. I think in the end such a system will need a quickly serializable binary representation of both the data and the work to be done.

The catch is that most developers I know aren't used to breaking work up in such a way that it can be heavily parallelized.
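(A minimal sketch of that "farm of workers" in Go; the job type carrying both the data and the function to apply is illustrative, and a real distributed version would need the serializable binary representation of the work mentioned above instead of an in-process closure.)

  package main

  import (
      "fmt"
      "sync"
  )

  // A job carries an immutable payload plus the (pure) work to apply to
  // it, approximating "carrying the workload along with the code".
  type job struct {
      data int
      work func(int) int
  }

  func main() {
      jobs := make(chan job)
      results := make(chan int)

      var wg sync.WaitGroup
      for w := 0; w < 4; w++ { // the literal farm of workers
          wg.Add(1)
          go func() {
              defer wg.Done()
              for j := range jobs {
                  results <- j.work(j.data) // no shared mutable state to guard
              }
          }()
      }
      go func() { wg.Wait(); close(results) }()

      go func() {
          for i := 1; i <= 8; i++ {
              jobs <- job{data: i, work: func(n int) int { return n * n }}
          }
          close(jobs)
      }()

      for r := range results {
          fmt.Println(r)
      }
  }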

sirclueless
The kicker for me, the thing that makes me think working too hard on easily parallelizable constructs isn't worth it, is that communication is expensive and single cores are actually pretty darn powerful on their own. The implication is that you don't want to be parallelizing at too low a level. You want to be doing it at a much higher level if you possibly can.

If you have an 8-core machine and thirty independent tasks at the top-level, then all the functional programming research and parallel algorithm wizardry in the world isn't going to make you want to parallelize anything except the top-level tasks. So even if that's a verbose and error-prone task due to procedural programming constructs, at least you only need to do it once. Heck, in my own (admittedly limited) experience 90% of the hard work I have needed done has been handled by my OS's process scheduler and Redis.

The fact is that business applications, the web, and consumer software are all perfectly happy to accept the current limitations of hardware -- 12 processes running on 4 cores is efficient and easy, even with no parallelization at all below the OS process level (with the possible exception of GUI threads in desktop applications).

The only really interesting work happens in areas like HPC where you have 1 task and 900 cores and if you can't parallelize you are dead in the water, and latency-critical applications where the sacrifices made to run intra-task parallelization pay big dividends.
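(The "parallelize only at the top level" approach is a few lines in most languages; a Go sketch using the comment's numbers: thirty independent tasks, with concurrency bounded by the core count.)

  package main

  import (
      "fmt"
      "runtime"
      "sync"
  )

  func main() {
      const tasks = 30                             // independent top-level tasks
      sem := make(chan struct{}, runtime.NumCPU()) // e.g. 8 cores

      var wg sync.WaitGroup
      for i := 0; i < tasks; i++ {
          wg.Add(1)
          go func(i int) {
              defer wg.Done()
              sem <- struct{}{}        // take a core
              defer func() { <-sem }() // release it
              fmt.Println("task", i, "done")
          }(i)
      }
      wg.Wait()
  }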

_pmf_
You are missing the per-CPU parallelization via vectorization that lies at the core of every modern video decoding library. The conundrum with C is that it is one of the very few languages that allow (via assembler callouts) direct use of CPU vectorization instructions, but at the same time it does so in a manner that absolutely prevents automatic optimizations (i.e., by observing data dependencies).
tracker1
I think with the increasing availability of more, weaker CPUs with a decent thermal envelope, and efficient processors like ARM, distributing work makes even more sense. In an SOA system, where a lot of processing may well be IO-bound, why not break out the load even more? I think the future will be thousands of compute nodes/workers along with hundreds of service nodes handling millions of simultaneous requests.

Functional constructs make such scaling nearly effortless once the practical issues of breaking up work are handled. Yes, communication has a cost, but there are faster channels available than what is typically used... and distributing data persistence into redundant, sharded clusters can yield a lot of other benefits.

Granted, most line-of-business applications are fine on current hardware... the problem is scaling to 10x or 100x the workload. You can do this by creating a system that can scale horizontally, or by being more performance-oriented with current hardware. One solution gains you a single generation of increased output; the other gets you N-scale expansion.

I don't think it's just macro parallel tasks... but also many micro tasks that can work within such a system.

Assault is a loaded word to describe an honest observation based on the needlessly graphic picture you painted. You might not need help to sort out your anger issues, but you certainly need some work on your politeness skills. Here's a tip on the importance of that: politeness is appreciated as much as rudeness is abhorred.

Now, on to your claims about open source developers from Brazil, Russia, India, Poland, and China. I'm going to ignore the fact that you only backed them up with your imagination, and focus on the interesting part: your assumption. You assumed that all these fellow programmers didn't have access to books, older programmers, or even computers in the seventies, despite the fact that programmers from these countries have been consistently shipping great software for decades. What really staggers me is that you assume, on behalf of all these people, that they lack culture, just to prove your point.

Do humanity a favor, and go read a book [0]. Or at least try to leave home so you can talk to people, and finally work on your poor social skills.

[0] http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

guard-of-terra
Why should I be polite to someone who doesn't support his claims with any facts? And I'm still waiting for any. I still haven't seen any that aren't based on ancient history.

You don't help your point by stating that some people had access to computers in non-first-world countries. Sometimes they built those computers before programming them. You also have to prove they had exposure to line-counting culture, which you haven't yet. Note: you can quote books, but not just refer to them.

wildranter
A delusion is a belief held with strong conviction despite superior evidence to the contrary.[1] Unlike hallucinations, delusions are always pathological (the result of an illness or illness process).[1] As a pathology, it is distinct from a belief based on false or incomplete information, confabulation, dogma, illusion, or other effects of perception. [0]

You're delusional. Get help.

[0] http://en.wikipedia.org/wiki/Delusion

guard-of-terra
I'll count that as a forfeit.
Sadly I don't really read quite as much as I used to, but the following are the books I read this year (though none of them were released this year).

- Founders at Work: Stories of Startups' Early Days

http://www.amazon.com/Founders-Work-Stories-Startups-Problem...

Excellent book covering interviews with founders of companies that became really big. I thought this book was really insightful and inspirational.

- Coders at Work: Reflections on the Craft of Programming

http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

I just started this book, but already like it - the format is the same as the Founders at Work book but on the developer side of things.

- World Changers: 25 Entrepreneurs Who Changed Business as We Knew It

http://www.amazon.com/World-Changers-Entrepreneurs-Changed-B...

It was a good book, but not as inspirational as the Founders at Work book. Some of the stories are good, but since the majority of the people are not in my sector, the book just wasn't as interesting to me.

- Ready Player One

http://www.amazon.com/Ready-Player-One-Ernest-Cline/dp/03078...

An excellent story that really made me nostalgic for my younger years - definitely recommend this one.

- The Mystic Arts of Erasing All Signs of Death: A Novel

http://www.amazon.com/Mystic-Arts-Erasing-Signs-Death/dp/034...

I have a weak spot for Charlie Huston books - he's not the best author (sorry Charlie), but his books are really easy to approach. This is one of his best ones and is about crime scene cleaners - a nice departure from all the Joe Pitt vampire novels.

- World War Z: An Oral History of the Zombie War

http://www.amazon.com/World-War-Oral-History-Zombie/dp/03073...

It's OK... I read it halfway through and then, once I got busy, I just couldn't get myself to pick it up again. I will finish it eventually... just not yet.

- Hyperion

http://www.amazon.com/Hyperion-Dan-Simmons/dp/0553283685

A friend recommended this book to me - I could not get past the first chapter.

ericfrenkiel
I'm a huge fan of Dan Simmons, and his Hyperion series is well worth the time. Once you make it beyond the first few pages of Hyperion, I guarantee you'll be hooked. There are a total of 4 books in the series and each is truly a masterpiece in science fiction.

I would also recommend "The Terror," a historical fiction piece loosely based on Franklin's lost expedition in search of the Northwest Passage.

Simmons has won multiple awards in Science Fiction and leaps across categories with aplomb. I highly recommend any of his work!

nhebb
I read Hyperion a few months ago, and I only read to the end so that I wouldn't be left with that nagging feeling of leaving a book unfinished. Much like the Grammys and the Emmys, I've found that book awards may be a good indicator of what other people like, but often not a good indicator of what I like.
Original article: (http://gawker.com/5520339/mac-genius-slams-his-google-job)

Which quotes a little chunk from the book Coders at Work (http://www.amazon.co.uk/Coders-Work-Reflections-Craft-Progra...)

(Is there any way I can trim that Amazon URL into something nicer without using URL shorteners?)

On a similar note, but it comes off very differently (to me at least):

Coders at Work http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

I'm more of a fiction kind of guy so I'll have to recommend this:

Cryptonomicon http://www.amazon.com/Cryptonomicon-Neal-Stephenson/dp/B004R...

(The Kindle edition costs more than the paperback, but if you do want to travel light the Kindle edition will definitely be worth the extra bucks, as this is one of those thick thousand-page mass-market paperbacks.)

mindcrime
+1 for Cryptonomicon. It isn't the easiest book to get through, but it's very worthwhile.

Another couple of possibilities might be:

The Soul of a New Machine - Tracy Kidder

http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...

The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage - Clifford Stoll

http://www.amazon.com/Cuckoos-Egg-Tracking-Computer-Espionag...

Hackers & Painters - Paul Graham (yes, that Paul Graham)

http://www.amazon.com/Hackers-Painters-Big-Ideas-Computer/dp...

My strong anti-C++ bent comes from having used it in demanding production situations for 12 years. It is a beast. Consider that Scott Meyers, of http://www.amazon.com/Effective-Specific-Addison-Wesley-Prof... fame, read a book on C++ template programming and was surprised, no, astonished! at some of the things done there, and attempted to write auto_ptr and failed at least twice.

It takes way too long to learn (probably 2 years for a developer working with it 8 hours a day), and the grown-ups don't like it either: http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

This isn't even an ugly chick.

I don't necessarily know of any one book that meets all of your friend's requirements, but...

Tracy Kidder's The Soul of a New Machine might be good for your friend.

http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...

Another good option might be Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Or, how about Coders at Work?

http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

Another one that I have (but haven't had time to read yet) is Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg. It might have something that your friend would find interesting.

http://www.amazon.com/Dreaming-Code-Programmers-Transcendent...

Another one that may be inspirational, although it's more about personalities than computer science per se, would be Steven Levy's Hackers: Heroes of the Computer Revolution.

http://www.amazon.com/Hackers-Computer-Revolution-Steven-Lev...

pgbovine
Thanks for the references! I really appreciate you taking the time to reply to my question.

BTW, "Dreaming in Code" is the only one of those that I've read, and I don't think it's a good fit for my friend because it's basically the story of software project management gone awry ... hardly inspirational for someone aspiring to learn about the beauty of CS :)

Jul 27, 2010 · jmatt on You Can’t Take It With You
I think Coders at Work and Founders at Work did a pretty good job with this.

http://www.amazon.com/gp/product/1430219483/

http://www.amazon.com/gp/product/1430210788/

Programming Language Pragmatics by Michael L. Scott: The explanations of many things I'd read in other sources are no less than fantastic; I now understand a bunch of things I had only superficially "got" previously. http://www.amazon.com/Programming-Language-Pragmatics-Third-..., check out the overview and reviews.

Coders at Work by Peter Seibel: By far the best of this type of book (well, not counting the '80s classic Programmers at Work, which I haven't read since then). One of the best Lisp authors interviews in depth a lot of really interesting and/or important people, from Jamie Zawinski to Donald Knuth, with JavaScript, static FP, and PARC people, Guy Steele, Peter Norvig, Ken Thompson, Fran Allen (a really important interview, which points out how C/C++ to the exclusion of truly high-level languages have been a disaster when used beyond their proper niches), etc. All are masters who've gotten their hands dirty; many are theory people as well. http://www.amazon.com/Coders-at-Work-Peter-Seibel/dp/1430219...

Garbage Collection by Jones and Lins: Pretty much the only book in the field (except for the forthcoming Advanced Garbage Collection sequel in the middle of this year), covering the territory as of the mid-90s. Much more fun than trying to track down 100 individual papers and trying to make sense of it all. The exposition is clear and you get a real feeling for the subtleties of the field (especially when you try fun things like generational and/or concurrent GC). http://www.amazon.com/Garbage-Collection-Algorithms-Automati...

HN Books is an independent project and is not operated by Y Combinator or Amazon.com.