I was wrong about "agile" software development
(This is a repost from my personal blog)
Way back in 2004, I did a short contract job with a certain major insurance company. It was starting the process of getting out of the old mainframe world. To help with this, they brought in a certain well-known consulting company to coach Agile software development. I capitalize it because they do, to this day, and when I use it this way, I'm referring to the more formalized process. But it isn't the same thing as being agile, which to me means that you move quickly, with focus on delivering stuff. These folks introduced me to a lot of formalized stuff like use cases, user stories, sprints and stand-ups.
By the time I got to Microsoft in 2009, the ideas, based around the Agile Manifesto, were starting to take root in the industry, though Microsoft was still arrogant enough to think it generally did stuff the "right" way without Agile. They were still handing out fancy "ship it" awards, a relic of shrink-wrapped software, which seemed wholly absurd on a team that "shipped" every few weeks. My team used these Agile methods, and I had been exposed to so many consultants and experts that I wholly bought into it all.
As time went on, I became less interested in Agile, and really started to question if it was adding value. A turning point was a contract that I did in 2013. There I encountered the whole dictionary of Agile terminology and ceremony, and I found it completely ridiculous, especially for a team of just four developers. They spent an entire day doing planning once every two weeks, and they did all of the things, like planning poker (if you know, you know), anonymous retrospectives, and deep analysis about why any particular thing didn't get finished inside the sprint boundaries. It was awful. I quit after just a couple of months.
An entire industry has grown up around Agile and coaching people on how to use it. This global consultancy business shows you how to use the tools and perform all of the ceremony generally associated with Agile. It has just become accepted, and people don't question it. But I do question it, because I think if you really get back to the manifesto, the values have been hijacked to enforce misplaced accountability and to do a lot of box checking around habits. I've been part of the problem.
So here's my manifesto, and I'll explain how it overlaps with and borrows some of the values of Agile, to make you agile.
- Everything is an assumption until proven otherwise. Whether you're deciding on a "user journey" (ugh) or technical design, every choice that you make is predicated on some assumptions about how you get to the outcomes. In practice, these assumptions are often wrong, and teams either commit anyway or they use the iterative feedback and data to go in a better direction. That's why you have to right-size your investment in planning. There are too many things that you don't know. Things will change.
- Estimation is a crude tool. Software people never get estimation right. I've sat in a million seminars that explain the right way to do it, and it doesn't matter. The entire industry sucks at it, and has for decades. Humans want the approval of others and believe in their own abilities, and that optimism always bleeds through. At one point, I even had engineers estimating in half-days, which some of them hated, and it turned out that others were using it as an accountability tool, which is not the point. As a rule, my macro estimation for a project is in rough developer-weeks, plus 25%, plus an understanding that there will be a hardening phase afterward to shore it up (there's a quick sketch of that arithmetic after this list). This has never failed me. For stories that developers work on, points are fine, but should only be used for allocating work in a sprint, not as accountability metrics.
- Context matters everywhere. It's easy to generalize about how to do anything, but humans are not cogs. Sometimes less experienced people need more structure, while more experienced people just need you to get out of the way. Hard problems with many unknowns take longer to solve, and don't fit conveniently into boxes. Embrace the need to adapt.
- The process is a product. We spend so much time iterating on the machine, but never iterate on the machine that makes the machine. You can and should get into a rhythm of using the process that works for you, but don't let habit hide better ways to do things. Be critical of habit: toss what isn't working, and look for ways to improve what is.
- Prototypes as early milestones. There is probably some core thing that you want to happen, and it's surrounded by UI or persistence or both. People often want to start at the ends instead of the middle, but doing so takes longer to prove that you've delivered anything useful. If you're inventing a car, you don't build the dashboard or the doors first; you build the motor to prove you can build a car.
- Delivery over timing. The usual business discussion involves asking when they can have the widget. See the previous bits about assumptions and estimations. Instead, ask what could be delivered in a timeframe. It might be enough to satisfy an outcome. This creates better focus and keeps you on the right side of the 80/20 problem.
- Perfect is the enemy of good. I obviously didn't make this up, but people need to be reminded of its truth. I've seen teams labor over decisions or be overly critical of design, when good enough allows you to deliver and validate what you've got. Few decisions are one-way doors, and when you need to go back through them, it just means that you're reconciling the assumptions with reality.
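Since the macro-estimation rule above is just arithmetic, here's a minimal sketch of it in Python. The 25% buffer comes straight from the rule; the size of the hardening phase is left as a parameter because I don't have a fixed formula for it, and the numbers in the example are made up.

```python
def macro_estimate(dev_weeks: float, hardening_weeks: float) -> float:
    """Rough project estimate: raw developer-weeks, plus 25%, plus a hardening phase."""
    # The 25% buffer is the rule of thumb; hardening is whatever you agree to reserve.
    return dev_weeks * 1.25 + hardening_weeks

# Example with made-up numbers: 8 developer-weeks of sketched-out work,
# with 2 weeks reserved for hardening afterward.
print(macro_estimate(8, 2))  # 12.0 developer-weeks
```

The point isn't the math, which a spreadsheet does fine; it's that the output is a rough planning number, not a commitment.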
What does this mean in practice? It varies from team to team, but you'll notice that, like the Agile Manifesto, I don't get into ceremony or mechanics. That's intentional. A lot of the ceremony and mechanics that folks do out of habit adds little value, or worse, gets in the way of delivering working software. So in my case, my current team really does what some call "scrumban," a cross between scrum and Kanban, but even that's an imperfect label. We do two-week sprints, starting and ending on Wednesday, but the cadence serves only to provide a consistent time for planning and grooming. I concede that specific routine is necessary because of time off, holidays and such. We don't make "commitments," we just load up each developer with enough work to last the two weeks, and if things carry over, that's fine. We don't have formal iterations or releases, and we continually integrate work into production as we go, most days (which makes the two-week time frame even more arbitrary). The product owner, a product manager in our case, reviews the backlog and determines the priorities with the team. We do a retrospective with names attached, so specific points can actually be discussed (context matters). We demo and celebrate completed stuff. And every morning, we spend under 10 minutes in a stand-up meeting.
A project starts out with requirements, which are socialized appropriately. A lead developer looks at the requirements when they're good enough, and sketches out a solution, bucketing the work into chunks and estimating them in developer-weeks. At this point, with the crude estimation in mind, the business can decide if the project is worth doing, or it can negotiate some subset of functionality to satisfy some outcome. Once that's socialized and given a green light, the team breaks up the work into the known stories that map to the buckets of work. After the work starts, unknowns are discovered and are added to the backlog. Over the course of the project, change happens: some work is cut, other things are added.
I can smell the questions, so let me answer a few.
- But Jeff, how do you measure developer productivity? I've never understood this question, because it's pretty obvious when someone isn't pulling their weight. "Points" are already arbitrary, and not very contextual, but you know when a developer isn't finishing work. Stories age, software is not delivered.
- And team productivity? The most abused thing in software engineering is managers weaponizing estimation and velocity. I've had it used as an accountability metric myself, and it's nonsense. The purpose of estimation is to help set expectations and create understanding around effort versus value, the ROI. It's all assumptions, remember? The only metric that ultimately matters is the delivery of working software. MBAs don't understand this. Nobody cares about the arbitrary numbers when you're delivering software. And when the team isn't delivering, you don't need the numbers to know that. Time tracking is extra terrible, and I regret ever being bullied into imposing it on my teams.
- So no planning poker or all-day planning meetings? No. In my experience, where these tactics have been used, they do not materially affect the delivery of software.
- But letting stories rollover to the next sprint is wrong! When you worry about that, here's what happens. A dev finishes their assigned stories a day early. Because of the arbitrary sprint timing, the product owner and the dev find some item that will fit in the time remaining. It is never the most important thing, it's just the size-appropriate thing. I'd rather they start the most important thing, and if that rolls into the next sprint, so be it. Also, sometimes you don't know where the bodies are buried, and stuff takes longer than you expected. That's not failure, it's change, and it shouldn't have to be justified.
- What about less experienced teams? Context matters, so you need to have the appropriate habits and process to accommodate that. My current team doesn't need to have everyone sit and watch each other write stories, so we try to do as much of that asynchronously as possible. With a less experienced team, I'd rather they spent more time together learning how to do that.
- What kind of artifacts do you work with? For projects, a decent requirements document is a starting point. It doesn't have to be perfect, but hopefully lays out at least 80% of the intent. Technical leads generally produce some diagrams that have a contextually appropriate amount of detail, not an amount of detail dictated by some process acronym or TLA. There are stories, which must have acceptance criteria, and probably notes for testing. There are always dashboards to measure the right things, both technical and business oriented, because they inform decisions as the product shapes into something useful. The dashboards generally link to runbooks that describe how to support the thing. The most important artifact is working software.
- This sounds a lot like winging it. Far from it. A project starts with a lot of ambiguity, and a technical leader and product leader work together to continuously reduce that ambiguity. That informs what to do next. The thing that you're always monitoring is delivery, instead of a bunch of metrics that aren't really actionable. Making software is a strange profession that feels like science but requires creativity. You can't "manage" its creation like you do durable goods or artifact creation. As long as you have written down, agreed upon outcomes, the process is about seeing how close you are to those outcomes. Solve the contextual problems blocking that path to outcomes.
It's hard for me to truly codify this approach, because it's not about ceremonial habits, which a lot of people believe are "Agile." Maybe those habits work for some people, but it seems like they have to spend a lot of time practicing them. That's energy that I feel distracts you from delivery. My earlier points are more cultural expectations than they are habits. So for whatever habits your team does develop, ask yourself whether they're counter to these cultural expectations.
Whatever you do, don't let developers spend 15 minutes debating the points to put on a story. Don't let them do it for more than 60 seconds. It won't matter, I promise.