Hi Valentina,
I appreciate your work promoting modern engineering practices and agree wholeheartedly on their power. However, I believe widespread industry adoption remains far too slow.
I observe that the challenge isn't technical; it's organizational and educational. Developers often lack this training from universities or employers, leading to a reliance on individual drive (e.g., through channels like Tech Excellence) or the foresight of a few companies to provide a learning budget.
My central hypothesis is that this slow adoption stems from a failure to translate "Engineering Excellence" into a clear, compelling business case for top-level decision-makers.
The DORA/Accelerate metrics provide excellent general statements, but they don't give a board or CEO the specific, calculated ROI needed for investment approval.
I propose that the community needs an exemplary, quantifiable model.
Have you considered developing a case that directly compares the total cost of ownership (TCO) of a project, for example by:
- comparing Team A (fewer, highly skilled developers) with Team B (more, average developers), and
- calculating the long-term ROI of dedicated coaching/training investments?
(A rough sketch of such a comparison follows below.)
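To be concrete, here is a minimal sketch of what such a TCO comparison might look like. The team sizes, salaries, rework shares, and time horizon are purely hypothetical placeholders, not data from any real project:

```python
# Hypothetical TCO comparison: Team A (fewer, highly skilled developers)
# vs. Team B (more, average developers). All figures are illustrative
# assumptions, not measured data.

def total_cost_of_ownership(team_size, annual_cost_per_dev, rework_share, years):
    """Rough TCO: payroll over the period, inflated by the share of capacity
    lost to rework (regressions, bug fixing, fire-fighting)."""
    payroll = team_size * annual_cost_per_dev * years
    # The higher the rework share, the more it costs to get the same amount
    # of valuable work delivered.
    return payroll / (1 - rework_share)

team_a = total_cost_of_ownership(team_size=4, annual_cost_per_dev=140_000,
                                 rework_share=0.15, years=3)
team_b = total_cost_of_ownership(team_size=8, annual_cost_per_dev=90_000,
                                 rework_share=0.40, years=3)

print(f"Team A (small, highly skilled): {team_a:,.0f}")
print(f"Team B (large, average):        {team_b:,.0f}")
```

The point would not be the exact figures, but forcing the cost of rework and defect handling into the same equation as headcount.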
Such a document, based on a credible model, could be the key to unlocking executive buy-in.
Thank you for your thoughts on this interesting problem.
Best regards from Switzerland,
Peti in collaboration with Gemini Flash 2.5
Hi Peti,
Yes, it's very unfortunate that industry adoption is low. Many ideas that are decade(s) old are still not implemented in many companies.
The root cause is a lack of education. Developers don't learn about this at university. They also don't learn about it at their employer (many employers might not buy into training). So the only cases I've seen are where certain individuals go above and beyond to learn in their own time, make their own GitHub demo projects for practice, try it out at work, and be the driving force despite the resistance.
Even when they read the DORA book, or read case studies, they might see it as "But it won't work for us". And if we're asking for a complete overhaul (e.g., introducing adequate automated testing), the investment is too big, in terms of time and money.
That's why I do *not* advocate asking for big investments or doing big ROI calculations any more, because they're often seen as fluff.
Instead, I try to keep it simple, for example: "We already have 3 QA Engineers, but even with them working full time, it takes 2 weeks to do pre-release testing, and they often find regression bugs. Due to these problems, our last release was delayed by 4 months. If instead we had automated testing, we could significantly reduce the number of regression bugs and reduce QA testing. I recommend that we get set up with automated testing for at least the critical scenarios".
Then, my advice would be to build a basic pipeline architecture and get a handful of E2E tests running. That's it - nothing else: no Unit Tests, no Static Analysis, etc. Then, do a showcase to demonstrate how those E2E Tests helped something (e.g., they reduced manual testing time by XYZ for a specific team).
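To give an idea of how small that first increment can be, a single E2E test could look something like this minimal sketch (using pytest and Playwright as one possible choice; the URL, selectors, and credentials are hypothetical placeholders for whatever your own critical scenario looks like):

```python
# Minimal E2E smoke test for one critical scenario of a hypothetical web app.
# Requires: pip install pytest playwright && playwright install chromium
from playwright.sync_api import sync_playwright

BASE_URL = "https://staging.example.com"  # placeholder environment URL


def test_user_can_log_in_and_see_dashboard():
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(f"{BASE_URL}/login")
        page.fill("#username", "test-user")      # hypothetical selectors
        page.fill("#password", "test-password")  # and test credentials
        page.click("button[type=submit]")
        # The release-blocking check: the critical screen actually loads.
        page.wait_for_selector("text=Dashboard")
        browser.close()
```

A basic pipeline then just runs `pytest` on every commit; the whole first increment stays that small.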
In this way:
- We move incrementally, so our "ask" for resources stays small
- At every increment, we showcase our success by showing "our" metrics before and after (see the sketch after this list)
- Then, since we've built trust by demonstrating results, we have permission to make the ask for the next increment
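And as a sketch of what showing "our" metrics before and after could mean in practice, even something this simple is enough at first; the numbers below are placeholders to be replaced by the team's own measurements:

```python
# Hypothetical before/after numbers for one team's showcase.
# Replace with your own measured values.
manual_regression_hours_before = 80  # per release, before the first E2E tests
manual_regression_hours_after = 50   # per release, after the first E2E tests

saved = manual_regression_hours_before - manual_regression_hours_after
reduction = saved / manual_regression_hours_before
print(f"Manual regression testing: {saved} hours saved per release ({reduction:.0%} less).")
```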
Interested to hear what you've tried and your thoughts too...
Thanks a lot for your reply, Valentina 🙏
I really like the practical approach you describe. It reminds me of the Lencioni book, "Getting Naked: A Business Fable About Shedding The Three Fears That Sabotage Client Loyalty."
What I've tried in my current role at my current company includes:
- Sharing information about engineering excellence and practices, and sharing my own experiences with them.
- Bringing people with know-how and experience into the organization.
- Organizing trainings like https://alcor.academy/foundation-training-program and coaching programs like https://ase-coaching.com/coaching/fuer-entwickler/.
- Advocating learning time, so people can study good-quality learning materials.
- Organizing internal communities.
But now, I'm a bit exhausted.
I have also reflected on my activities.
So, my current plan is to lean more in the direction of the business case for education / a learning organization when speaking with decision-makers, and to move more into a coaching role (coaching as defined by organizations like the ICF) in general.
With best regards from Switzerland!
Peti
Based on your experiences in organizing trainings, coaching, learning time, internal communities, etc., I'm interested to hear your perspectives:
1. What obstacles or pushback did you see in getting those events to go forward?
2. What were the outcomes of the programs? What did the participants gain? And what were the limitations when it came to real-life practice?
3. "So, my current plan is to lean more into the direction of the business case for education / learning organization when speaking with decision-makers" Interesting direction, in what way do you plan to do that?
Regarding 1) and 2), I see several challenges. The first one is the "unknown unknowns": people often simply don't know about these things, so you make sure that people hear about them. Then the Dunning-Kruger effect hits: after watching a one-hour YouTube video, people think they are experts on TDD. So you do some practicing to go deeper, but then people realize this will take serious time to master, and they already know other ways to get things done: "I don't have time for that." These are the typical issues I see. There are at least two solutions for this: 1. actually working closely together on real tasks and demonstrating these skills, so people experience the benefits, and 2. deliberate practice as Anders K. Ericsson described in the book "Peak: Secrets from the New Science of Expertise" (highly recommended!!!).
Regarding 3), I set up fictional scenarios like "low-skilled Team A vs. highly skilled Team B" and ask if this is a plausible scenario, and then ask what the costs are for each of the scenarios. Or I do ROI calculations for learning and coaching time, which are an investment.
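For illustration, a calculation along these lines could look like the following minimal sketch; every input (rates, hours, coaching cost, assumed productivity gain) is a hypothetical placeholder rather than measured data:

```python
# Hypothetical ROI calculation for a training/coaching investment.
# Every input below is an illustrative assumption, not measured data.

def training_roi(devs, hourly_rate, training_hours, coaching_cost,
                 productivity_gain, productive_hours_per_year, years):
    """ROI = (gain - investment) / investment, with the gain modeled as the
    extra value delivered thanks to a modest productivity improvement."""
    investment = devs * training_hours * hourly_rate + coaching_cost
    annual_value = devs * productive_hours_per_year * hourly_rate
    gain = annual_value * productivity_gain * years
    return (gain - investment) / investment

roi = training_roi(devs=6, hourly_rate=80, training_hours=40,
                   coaching_cost=15_000, productivity_gain=0.05,
                   productive_hours_per_year=1_400, years=2)
print(f"ROI over 2 years: {roi:.0%}")
```

Even a modest assumed productivity gain can make the investment pay back; the discussion then shifts to whether that assumption is credible, which is exactly what the fictional scenarios are for.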
What's the #1 lesson you personally got from Anders K. Ericsson's book "Peak: Secrets from the New Science of Expertise"?
P.S. Later when you do get to doing ROI calculations, I'd be interested in knowing how it goes...
Awesome and tough question, Valentina! 😀👍
The #1 lesson from the book for me is the following: there is a clear recipe for how to become an expert or a peak performer in disciplines like chess or tennis, where there are a) clear and objective indicators of performance, b) known and well-working training methods, and c) good mechanisms for feedback. It's not about talent; it's about deliberate practice over an extended period of time (which is that recipe).
Now the follow-up question is the following: do a), b), and c) also apply to the discipline of Software Engineering? Sort of, I think. That's why I find the book important for everyone who is involved in learning.
Deadline Driven Development – definitely a trap too many teams fall into! Skipping TDD and ATDD means your system is always breaking and it feels impossible to fix.
Yes, it's a vicious cycle: the team is busy fire-fighting, so they have no time to introduce automated testing, and hence they continue fire-fighting.