HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations

Nicole Forsgren PhD, Jez Humble, Gene Kim · 9 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations" by Nicole Forsgren PhD, Jez Humble, Gene Kim.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
Winner of the Shingo Publication Award. Accelerate your organization to win in the marketplace. How can we apply technology to drive business value? For years, we've been told that the performance of software delivery teams doesn't matter―that it can't provide a competitive advantage to our companies. Through four years of groundbreaking research, including data collected from the State of DevOps reports conducted with Puppet, Dr. Nicole Forsgren, Jez Humble, and Gene Kim set out to find a way to measure software delivery performance―and what drives it―using rigorous statistical methods. This book presents both the findings and the science behind that research, making the information accessible for readers to apply in their own organizations. Readers will discover how to measure the performance of their teams, and what capabilities they should invest in to drive higher performance. This book is ideal for management at every level.
HN Books Rankings
  • Ranked #5 this year (2022)

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
You can skip the tar pit of the Scrum Industrial Complex (e.g. "here! you MUST take our certificate program!") and the drag of setting pointless sprint boundaries, and focus on just repeatedly, successfully shipping high-quality software. The Accelerate book is a very good starting point -
You mean this book, right?

Yes, I read it. Yes our org put it in use.

Actually, today (2022) the book is not such great news compared to when it was released (2018). By now, almost everyone understands that this is the way software development is done (at least, the right software development process).

In 2018 this was an excellent book for me; it changed the way our organization positioned itself regarding software development, and we are very grateful we did that. At that time, before 2018, the "modern" software development companies were talking about "digital transformation". We had already been in the agile world for some years, but the book gave us a name to go for: "accelerate".

It's an excellent book and I agree with almost everything that is written in it (I have worked in the software development business for over 26 years).

Hope I helped with this comment.

And I have absolutely read that evil lizard people invented COVID to inject us with microchips. Doesn't make it true.

What I mean by this is that there will be many opinions and feelings posed as fact. About testing, too.

Which is why I dislike the OP's article, as I pointed out elsewhere in the comments: the things he poses as "rules" (earlier on he calls them tips, which is better imo) aren't rules: they are highly controversial opinions. As would be the statement you read.

Daily TDD is more of an "art" than a science. It requires experience, and Fingerspitzengefühl.

Though the science on TDD and BDD is strong and convincing, summarized in "Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations"[1]: TDD and BDD significantly help to build better software.


> we need to figure out another way of working that isn’t agile,

Isn't devops the next generation? (I mean devops as promulgated by Accelerate, not 'smoosh the devs and ops teams together, add Jenkins and hope for the best'.)

I'm not convinced it's materially different. At scale it looks similar to large agile organizations of today.

And you'd be wrong.

First, I am not aware of any operating systems that are developed using agile methods, and second the empirical evidence is in that agile improves quality, so it would be better if they were.

It was a joke. Your position looks rather like the true Scotsman's fallacy to me.
1. The fallacy is called the "No true Scotsman". The "No" is rather important. :-)

2. And no, my position has nothing whatsoever to do with the "No true Scotsman". I am not saying that they couldn't possibly be using agile because they have bad quality. I am saying that I am not aware of any of them using agile, and I was closely associated with at least one of them for a while.

And I also separately cite the evidence that has now been produced that agile improves quality, and conclude that if there are quality problems as you claim, a more agile approach would likely help, at least according to the empirical evidence.


Given your rather patronising replies, I suspect the original joke was lost on you. The issue is not Agile per se, it is organisations assuming that "Agile" will solve all their problems, where "Agile" usually means vendor products, forced rituals and tick boxes. Yet more Agile evangelism does not address this problem.

What organisations need more than anything is open minded individuals that are not slaves to established dogma. The Agile manifesto actually says that all methods should be constantly re-evaluated to see if they fit the particular organisation/team. Your empirical "evidence" likely comes with a lot of context and is unlikely to be applicable to all organisations and/or domains.

And all that was supposed to be read into

> Is there any operating system left now that isn't riddled with bugs? I blame Agile.


I think not.

If anything, that deserved even more patronizing replies.

> And all that was supposed to be read into.

Of course not, I am however attempting to explain in detail why such sacrilegious "blame Agile" comments might be made.

> If anything, that deserved even more patronizing replies.

I am highly suspicious of anyone who is so certain of their position, we should all be riven by doubt!

Yes, we're in agreement. A coworker of mine pointed me to the book Accelerate [1], which I admittedly still haven't read, which also mentions that organizations where management tries to make sure engineers are at 100% utilization end up being dysfunctional, partly for the reasons you pointed out.


It's one of those things that works for a quarter or two, but then you start making your engineers hate their job and productivity drops quickly when morale is suffering.

I prefer studies over anecdotes and found this:

According to this study, you can measure a team's progress and ability with:

* number of production deployments per day (want higher numbers)

* average time to convert a request into a deployed solution (want lower numbers)

* average time to restore broken service (want lower numbers)

* average number of bugfixes per deployment (want lower numbers)
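As a rough illustration, the four measurements above could be computed from deployment and incident logs along these lines. This is a minimal sketch; the record shapes and field names are my own assumptions, not anything defined in the study or the book.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

# Hypothetical record types; the field names are illustrative assumptions.
@dataclass
class Deployment:
    requested_at: datetime   # when the underlying request/ticket was filed
    deployed_at: datetime    # when the change reached production
    bugfix_count: int        # bugfixes later shipped for this deployment

@dataclass
class Outage:
    started_at: datetime
    restored_at: datetime

def _hours(delta):
    """Convert a timedelta to fractional hours."""
    return delta.total_seconds() / 3600

def team_metrics(deployments, outages, window_days):
    """Compute the four measurements over a reporting window of window_days."""
    return {
        # want higher numbers
        "deploys_per_day": len(deployments) / window_days,
        # want lower numbers (request -> deployed solution)
        "avg_lead_time_h": mean(_hours(d.deployed_at - d.requested_at) for d in deployments),
        # want lower numbers (time to restore broken service)
        "avg_restore_time_h": mean(_hours(o.restored_at - o.started_at) for o in outages),
        # want lower numbers (bugfixes per deployment)
        "avg_bugfixes_per_deploy": mean(d.bugfix_count for d in deployments),
    }
```

The point of keeping these as simple averages over a window is that they are cheap to recompute continuously, so a team can watch the trend rather than a single snapshot.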

I am curious about other studies in this area and if there is overlapping or conflicting information.

Haven't read the linked study, but it seems like all of those metrics (except time to restore broken service) can be very easily gamed - thinking about the recent SO blog post

Why would you want to game metrics that honestly can help improve your business outcome? The study is worth reading; to put it simply, the metrics seem to make sense anyway, and if science proves that they do, that's even better.
> Why would you want to game metrics that honestly can help improve your business outcome?

Q: If your manager imposed OKRs/KPIs on your team, especially if there were a financial incentive linked to achieving "results"/"performance indicators", how would you feel?

Note that what is good for the business may or may not have much to do - at least in the short to medium term - with what is good for the employees.

Definitely a reason to try and game the metrics (and probably easy too). In my experience the metrics make sense when you use them to improve as a team, I would actually try to avoid using them as OKR or KPI.

Though I posted the study, I do want to see more work in that area to repeat the observed effects or to refine the KPIs. For example, I am personally skeptical of the notion of "number of defects per release" and want to see an exploration of "amount of time spent on defects per release".

Science doesn't tell us anything (that would be authority or religion), and it doesn't prove things. It is a process that can assist us in logically determining what is NOT true about a cause-effect relationship to the point where we can make more accurate and practical predictions about those causes and effects. More experiments refine the current beliefs.

So... shouldn't the focus be on fixing the fact that release cycles take several weeks?

This is well researched and discussed in Nicole Forsgren's book: Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations

Sometimes long test cycles are just the nature of the business, especially in hardware-software co-development, and especially in high-reliability development. Nobody will believe that running software tests on a workstation or server somewhere counts for release readiness if it gets deployed to a wind turbine, robot, or megawatt UPS.

The software team's automated tests indicate that the software is ready to start an expensive physical testing cycle, not that it's ready to go to the field.

If we can all agree that "shipping" is a feature[1] and that "shipping faster" is a competitive advantage, then "shipping fast" is probably one of the first features we should invest in as a team, from leadership to private chef.

If it takes 10 people and six hours to accomplish the release of even the smallest of features, then it sounds like there's a bug in your "ship fast" feature, and the team should invest some or all of its resources in fixing it. You probably can't do it overnight, but you gotta chip away at it over time at the very least.

If you are looking for justification backed up by real-world numbers, I highly recommend the book "Accelerate" by Nicole Forsgren.[2]



Yeah, I know. And I've read the book. But I work in enterprise devops. This is the reality of many, many teams. Which means there's a lot of unnecessary process that needs to get tossed and a lot of automation that needs to be built in order to do short iterations and get out of the 3-6 month window.

Me too! I feel your pain.

Have you considered giving it as a gift to your peers and leadership? I really think it's a great read and resource for anyone trying to sell organizational change.

I wonder if any of those "automation needs" could be turned into startup ideas?

Done that, too.

I actually tried a startup in the monitoring space, but sadly failed (it was great fun losing $100k or so, tho... well worth the experience!). But the automation needs problem is more a consultancy thing than a product thing. There are lots of products. Teams buy them, with the best of intentions. Lots of management wants to believe they can buy their way out of the hard problems of not having any discipline. Sigh.

HN Books is an independent project and is not operated by Y Combinator or
~ [email protected]