Programming: "You've built the wrong thing!"
Developers are told they built the wrong thing - but they followed the user story exactly. So what’s going on?
👋 Welcome to the premium Optivem Journal. I’ll help you apply TDD to escape the nightmare of Legacy Code. Join our paid community of 160+ senior engineers & leaders for support on your TDD journey, plus instant access to group chat and live Q&As:
The dev team is told: “Build this.”
Then later they’re told: “You did it wrong - that’s not what we wanted.”
Then someone else says: “Actually, this other feature is the real priority. Can you start on that instead?”
Your team gets the user story. Everyone agrees it’s clear. The developers start building. The QA team starts testing. And then…
QA says: “This doesn’t work right.”
Developers say: “We followed the user story.”
Then the PO says: “This isn’t what I meant at all.”
So, why does this keep happening?
The problem with vague requirements
Everyone’s frustrated - but no one had the same picture to begin with.
When requirements are too vague, the developers, QA, and the PO each have their own idea of what the feature should do - and those ideas don’t always line up.
Developers are told the feature doesn’t work correctly, even though they built it exactly as they understood the business wanted it.
And they’re right - they built what they thought was requested.
But here’s what’s really going on: there are gaps in understanding. Between the PO and developers. Between assumptions and examples. Between what's said and what's meant.
We need example-based requirements.
But what if, instead of vague requirements, we had clear, concrete examples? Inputs and outputs. Real, testable scenarios.
If we defined specific examples before coding begins, instead of relying on vague user stories, we would eliminate these misunderstandings.
That’s what ATDD is all about.
User Story: Unclear vs clear Acceptance Criteria
User Story:
As a customer, I want to receive free shipping for large orders so that I am incentivized to make larger purchases.
Unclear Acceptance Criteria:
The business often doesn’t think in precise scenarios. They tend to communicate requirements in language that is straightforward but imprecise.
Free shipping for orders above $50
There are two problems:
What does “above $50” mean? We don’t know whether it means > $50 or >= $50, or something else. Developers, QA, and the PO may each interpret it differently.
What about orders that are not above $50? That case isn’t handled at all.
Sample conversation:
Developers say: “Mathematically, above means larger than $50”
QA says: “We think ‘above’ means from $50.01 upward - $50 itself doesn’t count”
Then the PO says: “Obviously the business means $50 is included”
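The disagreement above can be made concrete in code. This is a minimal sketch (the function names are hypothetical, chosen only to label each party’s reading) showing that the three interpretations of “above $50” agree everywhere except at the boundary value:

```python
# Three plausible readings of "free shipping for orders above $50".
def dev_reading(total):   # Developers: strictly greater than $50
    return total > 50.00

def qa_reading(total):    # QA: from $50.01 upward
    return total >= 50.01

def po_reading(total):    # PO: $50 itself is included
    return total >= 50.00

# The readings only disagree at exactly $50.00:
for total in (49.99, 50.00, 50.01):
    print(total, dev_reading(total), qa_reading(total), po_reading(total))
# 49.99 -> False False False
# 50.00 -> False False True
# 50.01 -> True  True  True
```

Until the team picks one reading and writes it down as an example, any of the three implementations looks “correct” to its author.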
Clear Acceptance Criteria:
At the start of the sprint, before we begin working on a user story, the team comes together to convert vague, generalized requirements into clear, executable examples.
Unlike general requirements - which are often “gray” and open to interpretation - these examples are black and white. They leave no room for guesswork and provide a clear definition of Done: what we build either passes or fails these examples.
Developers need to know what “done” actually looks like - before coding.
Scenario Outline: Free shipping for orders above $50
  Given the order total is <order_total>
  When the customer places the order
  Then the shipping cost should be $0

  Examples:
    | order_total |
    | 50.01       |
    | 51.00       |
    | 60.00       |

Scenario Outline: Standard shipping $5 for orders $50 or less
  Given the order total is <order_total>
  When the customer places the order
  Then the shipping cost should be $5

  Examples:
    | order_total |
    | 40.00       |
    | 49.99       |
    | 50.00       |
Here, the meaning of “above $50” is black and white. Developers, QA, and the PO can’t argue about it after implementation, because we aligned on the meaning before implementation.
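The scenarios above are directly executable as acceptance tests. Here is a minimal sketch (assuming a hypothetical `shipping_cost` function, not named in the article) where each row of the Examples tables becomes one assertion:

```python
# Sketch of the agreed rule: free shipping strictly above $50,
# otherwise $5 standard shipping.
def shipping_cost(order_total: float) -> float:
    return 0.00 if order_total > 50.00 else 5.00

# Rows from "Free shipping for orders above $50":
for total in (50.01, 51.00, 60.00):
    assert shipping_cost(total) == 0.00

# Rows from "Standard shipping $5 for orders $50 or less":
for total in (40.00, 49.99, 50.00):
    assert shipping_cost(total) == 5.00

print("All acceptance examples pass")
```

In a real project these rows would typically be wired up through a Gherkin runner such as Cucumber or Behave, but even plain assertions give the team the same black-and-white definition of Done.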
Want to Learn How?
How can you introduce ATDD & acceptance tests into your development process?
See the TDD in Legacy Code series, where we have a section ATDD in Legacy Code.
Want the full roadmap? I'm hosting a workshop breaking down exactly how to implement this. I’ll show you how to align the team regarding requirements and eliminate manual QA regression testing, so that you’ll reduce delivery time and reduce regression bugs.
🚀Join me on Wed 6th Aug (17:00 - 19:00 CEST), I’m hosting ATDD in Legacy Code Roadmap (Live Workshop). During this workshop you’ll learn:
Why ATDD will reduce software delivery time and reduce regression bugs, by aligning the team regarding requirements & replacing manual regression testing
How to introduce ATDD in Legacy Code in an incremental way, while still keeping up with existing feature delivery
How to convince Engineering Managers, Software Engineers and QA Engineers to adopt ATDD in Legacy Code
Let's transform your development process together!
P.S. Regular price: $97 (free access for Optivem Journal paid subscribers - see event description for details)
How do you usually help teams respond when the PO says “this isn’t what I meant” after the feature’s already built?
Yes, the Acceptance Tests give you the "Definition of Done," and Example Mapping makes the "implicit" things "explicit." This aligns with what Kent Beck said about Stories and the 3 Cs of Stories: Card, Conversation, and Confirmation.
You talk with the business, hear their stories, and write them down on index cards—that’s a promise for future conversations. The business then decides on priorities. Next, you dive into the selected stories, having detailed conversations to understand them thoroughly. After that, you write the acceptance test scenarios, which provide the confirmation for the story.
Before moving to Acceptance Tests, I believe a User Story Map is required. It builds a shared understanding among the team, helps prevent building the wrong thing, and helps decide the scope of the MVP. This way, we move fast not by writing code faster, but by eliminating the building of the wrong things!