Waterfall to Scrum: The Value of Agility

Podcast Ep. 132: Waterfall to Scrum: How to Measure the Value of Agility with Sam Falco


Episode Description:

This week, Dan and Sam are answering another fascinating listener question. This listener wrote in, “We are in an organization that has been going through an agile transformation. Our leaders have been asking for metrics to compare Waterfall vs. Scrum. They want to know if we are delivering more with Scrum. How do we measure this?”

Navigating how to measure values when transitioning from Waterfall to Scrum can be a difficult challenge. How do you measure whether something is a successful initiative or not? What are the key differences in what metrics you should be measuring between Waterfall and Scrum? How do you measure the value that Scrum brings? What are some of the key metrics you should be measuring? Tune in to find out!

If you have a question of your own (or would like to share your thoughts on this episode), you can email the podcast at Podcast@agilethought.com or Tweet @AgileThought using #AgileThoughtPodcast.


Key Takeaways

  • How do you measure the value your organization gets from Scrum vs. Waterfall?
    • Typically what is measured in Waterfall is “iron triangle” stuff (i.e. “Were we on time? Did we stay on budget? Did we complete the scope?”) However, these things don’t really indicate whether or not you are actually getting value out of the effort
    • “Too often, what we’re actually measuring in Waterfall was: ‘Did we get all the work done in a certain amount of time and within budget?’ And that’s not what we’re interested in, in an agile effort.” — Sam Falco
    • Find different ways to measure the value, whether it’s in actual dollars earned or cost savings or goals achieved
    • It is more important to measure the return on investment and the value it brought to customers rather than “Did we push it across the finish line?”
    • Measure based on value rather than on velocity
    • It’s not uncommon for those from a Waterfall world to measure how many tasks were completed rather than delivering valuable increments (which is critical with Scrum)
    • Regardless of whether it is Waterfall or Scrum, you might measure: Generated revenue, saved spending, etc.
  • Evidence-based management as a method for measuring the success of an agile effort:
    • Reference the Evidence-Based Management Guide from Scrum.org
      • It will provide you with a number of key value areas to help determine whether you are actually adding value, as well as some suggestions for key value measures for each of those areas
    • Listen to episode 107 of the Agile Coaches’ Corner, “Evidence-Based Management 101 with Sam Falco”
    • Think about the metrics you have in place now and whether or not they apply now (i.e. “Does this metric we’re using help us determine the current value for our products?”, “Does it help us determine time-to-market?”, “Does it help us determine our ability to innovate?” And if not, ask, “Do we really need it?”, “Are we measuring something that matters and drives results?”)
    • Measure lead time (if it shrinks dramatically, this is a good sign that this is going to work for your organization)
    • “Take a look at what metrics you’re measuring now. And if those were valuable in telling you whether or not your efforts were worthwhile, they might be appropriate now.” — Sam Falco
      • Even though your process is different, you can still measure using a lot of the same metrics — you just have to look at them in a different way
    • The most important things to measure are around the value (“Is the stuff we’re producing actually valuable to people?”)
  • What are some of the ways we can measure value?
    • Quality metrics (Has your quality increased?) — Agility tends to come with increases in quality because of a limit of work-in-progress and helping the teams focus
      • Look at trends; not single points in time
      • Have your escaped defects gone down? Has your technical debt gone down?
    • Look at things like, how long does it take you to get a release ready for production? Is it taking less time? 
    • One of the values of an agile way of working is that you build in feedback loops for looking at your own processes more frequently than in Waterfall
      • Through this, analyze trends and track process improvement 
    • There is no single way to measure output from Waterfall to Scrum
    • You have to always be aware of what the metric is actually telling you and evaluate not only your processes but whether you are using the right metrics (and whether they are telling you things that are helpful to you)
    • Releasing in Waterfall tends to be infrequent, painful, and fragile. Releases should be frequent and near-automatic in Scrum. Measuring the pain of releasing is an indicator of how easy or difficult it is to capture value
    • Measure deployment frequency (“How often do you actually deploy to production or customers and users?”)
    • Measure: “Are customers happy with the number of releases? Are they happy with the features they’re getting?”
    • Measure “mean time to first failure” — though it is an old metric in software development, it can still be valuable (i.e. “How long does it take for you to break something?” This should be getting longer and longer as your quality goes up)
    • Keep track of process improvements
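To make the lead-time and deployment-frequency measures above concrete, here is a minimal Python sketch. The dates, function names, and the 30-day normalization are illustrative assumptions for the sake of example, not anything prescribed in the episode or the Evidence-Based Management Guide:

```python
from datetime import date
from statistics import mean

# Hypothetical work-item records: (date the idea was proposed, date it shipped).
# All data below is made up purely for illustration.
work_items = [
    (date(2021, 1, 4), date(2021, 2, 1)),
    (date(2021, 1, 18), date(2021, 2, 15)),
    (date(2021, 2, 1), date(2021, 2, 22)),
]

def mean_lead_time_days(items):
    """Average days from proposal to delivery (one common lead-time definition)."""
    return mean((shipped - proposed).days for proposed, shipped in items)

def deployment_frequency(deploy_dates, window_days):
    """Deployments per 30 days, normalized over the observed window."""
    return len(deploy_dates) / window_days * 30

lead = mean_lead_time_days(work_items)
freq = deployment_frequency([shipped for _, shipped in work_items], window_days=49)
print(round(lead, 1), round(freq, 1))  # → 25.7 1.8
```

Tracked sprint over sprint (trends, not single points in time, as the episode stresses), a shrinking lead time and a rising deployment frequency are the kinds of signals these metrics are meant to surface.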

Mentioned in this Episode:

  • The Evidence-Based Management Guide (Scrum.org)
  • Agile Coaches’ Corner Episode 107: “Evidence-Based Management 101 with Sam Falco”
  • The Hard Thing About Hard Things by Ben Horowitz

Transcript [This transcript is auto-generated and may not be completely accurate in its depiction of the English language or rules of grammar.]

Intro: [00:03] Welcome to the Agile Coaches’ Corner by AgileThought. The podcast for practitioners and leaders seeking advice to refine the way they work and pave the path to better outcomes. Now here’s your host, coach, and agile expert, Dan Neumann.

Dan Neumann: [00:16] Welcome to this episode of the Agile Coaches’ Corner podcast. I’m your host, Dan Neumann, and today’s guest is frequently recurring guest Sam Falco.

Sam Falco: [00:27] How you doing Dan?

Dan Neumann: [00:28] I’m doing well. Looking forward to seeing just how this goes today. Sure. Yeah. We’ve, uh, we’ve clicked stop a couple of times today. I think take 19. And we have a listener question that’s going to be the subject of today’s episode. If you have a question, you can email us at podcast@agilethought.com. And Sam, do you want to read the question?

Sam Falco: [00:55] Yes, yes. The question: “We are in an organization, a bank, that has been going through an agile transformation. Our leaders have been asking for metrics to compare waterfall versus agile. We use Scrum. They want to know if we are delivering more with Scrum, et cetera. Do you have any ideas to share?”

Dan Neumann: [01:13] I think that’s a really tough question.

Sam Falco: [01:15] It is. And it’s a pretty common one. The first time I ever encountered it, I was working for an insurance company and they hit me with that. And my answer was something like, “Oh, well, it’s…” We can do a little bit better than that.

Dan Neumann: [01:35] Yes. I mean the first, well, the first part that comes to me is: what did you measure to know if you were getting anything from waterfall, right? Where were you?

Sam Falco: [01:44] Right. Uh, typically what we measure in waterfall efforts is iron triangle stuff. Were we on time? Did we stay within budget? Did we complete the scope? These don’t actually tell us if we’re getting value out of the effort that we were undergoing. We might measure things like: did this generate revenue, and how much? Did it save us money, how much, et cetera. And if you were collecting those metrics, then you can apply the same metrics from the previous effort to what you’re doing now. But too often, what we’re actually measuring in waterfall was: did we get all the work done in a certain amount of time and within budget? And that’s not what we’re interested in, in an agile effort. Those three things, I’m not saying they’re not valuable to look at, but they are less important than many other metrics we could use.

Dan Neumann: [02:54] Let’s talk about that one for a second. Um, because with agile is specifically with Scrum time is a big deal. We have right time box for the sprint. We have time boxes for the Scrum events that are in there. So time and delivering valuable increments is a big deal.

Sam Falco: [03:16] It is, but we sort of take that for granted, right? We have a Scrum length, a sprint length rather, that tells us what our budget is. We should know what our run rate for our team is. We should know a team costs, say, $50,000 per sprint. And so we can look at: well, it took five sprints to produce this release, it cost us $250,000. Was it worth the $250,000 that we spent? We can compare things that way, but we’re less interested in whether we spent $250,000 or not. Sometimes in waterfall organizations, we had that budgeting mentality too. That is: did you spend the money we budgeted for you? Because if you didn’t, you’re not going to get any next year. So that gives us an incentive to spend a lot of money and not worry so much about whether we delivered value. In an agile world, we care more about: did we provide value, and did we provide value commensurate with the expenditure that we expended (brilliant English sentence there)? And is that what we want to keep doing?

Dan Neumann: [04:29] Yeah. So finding different ways to measure the value, whether it’s in actual dollars earned or whether it’s in cost savings or goals achieved, whatever, whatever your goals are. Have you achieved those for, for the effort?

Sam Falco: [04:50] Yeah. In a waterfall world, we might say, “Oh, hey, we successfully created this app for our customers to use. Great.” But what we care about, what we should be caring about, is: are users happy with the application? Are they using it? Is it giving us the return on investment? Rather than: did we push it across the finish line at the last minute, often with defects to be fixed later, et cetera.

Dan Neumann: [05:18] I find, as we think through this, then, I think I’m going to make a meme out of it someday, because one of the managers said, “If I ever do your review based on the velocity of your team, you can leave a flaming pile of dog poo on my desk.” I think facilities management would have something to say about that, and I don’t know if that would activate the sprinklers. But it was nice to hear a manager say it’s not velocity. That is not going to be the measure of “did I get more with Scrum from this team this sprint versus a previous one.” So that’s a “what not to measure.”

Sam Falco: [06:04] Right. And it’s not uncommon for folks coming from a waterfall world into agile to glom on to metrics like that, because it makes sense to them. Right? “I know how to measure whether people did a lot of things.” But is that what you really need to be measuring? And it takes some effort from a coach or a Scrum Master to work with leaders who have that mentality, to understand where they’re coming from. Remember, we want to have empathy. This is usually not done in an “I want to destroy people’s lives” sort of way. It’s “I want to understand. I want to help. I want to make sure that my area is successful.” And then we can say, well, velocity doesn’t really help you measure that. Let’s talk about how we know whether or not this is a successful initiative.

Dan Neumann: [06:55] Well, let’s talk about it. Do you want to explore then Evidence-based management as one of the ways of measuring success of agile efforts?

Sam Falco: [07:04] I always want to explore evidence-based management.

Dan Neumann: [07:10] You ask a Scrum trainer a silly question, you get a silly answer.

Sam Falco: [07:14] Uh, the evidence-based management guide is something that you can find on the Scrum.org website. Of course, we’ll put the link in the show notes. And for those of you who haven’t listened to our evidence-based management episode, I highly encourage you to go back and take a listen to that one. We covered it in a little more detail there than we’re going to here. But the guide provides you with a number of key value areas that can help you determine whether you are actually adding value, and some suggestions for key value measures for each of those areas. It’s not the extent of it; it’s to get you started. You can certainly think about the metrics you’ve got in place now. Do they apply here? Does this metric we’re using help us determine current value (which is one of the areas) for our products, or unrealized value? Does it help us determine time to market? Does it help us determine ability to innovate? And if not, do we really need it? Are we measuring something that actually matters and drives results?

Dan Neumann: [08:25] Do we want to take maybe an example of one of these value areas and share it? So, in an organization that’s saying: hey, we used to be doing waterfall; we kind of knew if we were good at that or not based on achievement of expected features, for money spent, by date forecast. Historically, waterfall has not been good at that. Hence the reason for the agile manifesto, when people got together and said, “Hey, we’re doing something different and it seems to be working, right?” And that’s where the agility stuff came from.

Sam Falco: [08:59] Exactly. And some of the metrics that are mentioned there are still things that you will recognize from a waterfall world. Things like cycle time and lead time. These are often called Kanban metrics, but it’s not just Kanban; these are good metrics to look at. How long does it take from when we propose an idea to when we can actually put it in the customer’s hands? Lead time for changes, lead time for whatever subset of your effort. Cycle time is similar. But we can go beyond that.

Dan Neumann: [09:44] I suppose you could compare your lead time across waterfall approaches and agile methods. I know I’ve been on various sized projects that were highly predictive, and, often in a consulting place, it starts with a request for proposal. Then you do a requirements phase and a design phase and a build phase, and a test phase that, you know, gets compressed because the build phase ran over, and then there’s some piloting, and then some value. So you could measure your lead time: your customer desires something; that’s when the RFPs get put together. Potentially it’s actually way before that, because the organization has the idea and has to get budget for it before they even ask for proposals for solutions. So typically a pretty massive lead time.

Sam Falco: [10:34] That was used in one organization that was very skeptical about “this whole agile thing.” That’s a quote from one of the leaders; he would always say it with the air quotes, but was open-minded enough to say, “Well, we can try it,” even though he didn’t think it was going to work. And they had a little pilot project. They had picked something that was meaningful but not critical, so that if it did blow up, it wasn’t going to cause problems, and they looked at lead time. Well, the lead time shrank dramatically. And that was a really good sign that this was a good thing and that it was going to work for this organization. So, as we said at the beginning of the podcast, take a look at what metrics you’re measuring now. If those were valuable in telling you whether or not your efforts were worthwhile, they might be appropriate now. But how can we look at different ways of measuring? Because lead time doesn’t measure value. It’s just how fast you got stuff through the pipe. That’s great, but you can get stuff through the pipe that isn’t valuable and not be doing yourself any good.

Dan Neumann: [11:51] Although at least we’re getting it through, and we’re not lingering on the unvaluable thing that it is. It’s over and done, and then you can see if it’s valuable, right? Potentially there’s an upside there.

Sam Falco: [12:05] Yeah. Quality metrics are another thing that you might compare from your pre-agile to post-agile world. Has your quality increased? It should, if you are focusing on it. Agility typically tends to come with increases in quality, because we limit work in progress, we help the teams focus, et cetera, whether you’re using Scrum or some other agile framework. How’s your quality? Are escaped defects going down? And as always, you want to be looking at trends, not single points in time. But have your escaped defects gone down? Have the defects detected before you ship gone down? If you’re measuring how much technical debt you have, is that going down? These are all good things to be looking at.

Dan Neumann: [12:56] In a Scrum world where we’re trying to deliver increments, potentially many times a sprint, you’re limiting your work in process. So hopefully that is reducing your defects, because it’s increasing your focus. And then hopefully, when we do have an issue, it’s quickly resolved. So the time to resolve a defect hopefully would shrink too. ’Cause you’re not, hopefully, building queues that you review every month and doing a defect triage or something, like pre-agile, right? You’d collect all the bugs, and somebody would scrub them once a month and put them into a queue and schedule them. Instead, you can get a much faster turnaround time to remediate issues.

Sam Falco: [13:51] And you can look at things like: how long does it take you to get a release ready for production? So typically in a waterfall environment, you know, we code like crazy and then we hand it off to QA, who finds some stuff, and then it gets kicked back and forth for a while. Often that’s referred to as a hardening period. Sometimes people carry that forward into an agile world. But look: first of all, is it taking less time? And one of the values of an agile way of working is that you build in feedback loops for looking at your own processes more frequently than in waterfall. Not saying you never do it in waterfall; I’m saying we do it as a matter of course, and more frequently, in an agile way of working. So, all right, let’s look at it. What can we do to reduce the amount of time we have to spend fixing things, and improve quality during sprints, if you’re using Scrum? How do we do that? And again, look at trends, not just single points in time.

Dan Neumann: [14:54] And you could track process improvements: keep track of the number of process improvements, or the impacts of those. I know we did do retrospectives-ish when I was doing waterfall; we called them postmortems. And we typically did them once per project. In a Scrum world, since one of the Scrum events is the Sprint Retrospective, hopefully you’re doing that at least once a sprint, if not more. I’ve worked with Scrum teams where they’ve had the “pull the cord” approach, where if something goes wrong during the sprint, they stop and deal with it. I’ve heard of teams doing a daily retrospective as a checkout, just a really brief “How did it go? Anything we need to fix?”

Sam Falco: [15:37] Yeah. Lots of opportunities to improve. So there’s a lot of… we’re not directly answering our listener’s question. We’re not saying “here’s how you compare it,” although we’ve touched on that. But help me out here, Dan. I have no idea where I was going with that.

Dan Neumann: [15:55] Well, I think you are correct that there’s not a “this is the way” to measure output from waterfall to Scrum. I’ve always been interested in this. This is a thing Dan is interested in but, for 25 years now, hasn’t really gotten his head wrapped around. I remember being excited about function points back in the mid-nineties, circa 1996, ’cause function point estimating was one of those things that held the promise of being a concrete way of measuring developer output: if we could just figure out how many things the software has to do, or how many decisions it has to make, we could measure how many of those decisions we created in this time box. Either I’m too stupid, or, I think actually, it’s too complex, because it’s not going to be me. But that never became a standard that gets used for measuring any kind of output. All we have is the sense of: is that a good developer? Is that a bad developer? Are they producing enough? Are they fast enough? So I think it’s a tricky question.

Sam Falco: [17:06] Yeah, that’s a gameable metric, right? If you’re being measured on function points, by God, you’re going to create them.

Dan Neumann: [17:12] I know which ones are valuable and which ones aren’t.

Sam Falco: [17:15] Right, exactly. So you have to always be aware of what the metric is actually telling you, and evaluate not only your processes but whether you are using the right metrics. Are these telling us things that are helpful to us?

Dan Neumann: [17:31] We talked about the ease of releasing software. In waterfall, because we did it so infrequently, it was painful and it was fragile. So if you’re in a Scrum world where you’re embracing DevOps practices, releases ought to be frequent and darn near automatic. And so for me, measuring the pain of releasing is an indicator of how easy or difficult it is to capture value.

Sam Falco: [17:59] Yeah. And evidence-based management actually recommends that metric as part of the time-to-market key value area: deployment frequency. How often do you actually deploy to production, or to customers and users? It should be more frequent. And then you can also measure: are customers happy with the number of releases? Are they happy with the features they’re getting out of them? I mean, you can overwhelm a customer with too frequent a release, I think.

Dan Neumann: [18:36] I’ve got an app that does just that. I’m like, “Oh dear God, that’s the second time this week I’ve had to take an update.” So do it in a way that makes sense for your customers. I was at a place, it was a financial institution product, and the customers were like, “Please don’t send us that more than once a year. We have to back test. We have to make sure something didn’t break. Super expensive. We won’t take your software more than once a year.” So that was interesting. We still were able to use Scrum and make sure our increments were working every sprint, but as far as releasing to an individual customer: once a year was fine, thank you very much.

Sam Falco: [19:22] Right. Yeah. You have to know your customers, and that’s pretty typical in a financial world or an insurance world, where there’s an enormous cost to absorbing a new release. But one of the ways that you can bring down that cost is making sure that your quality is high, so they’re not likely to find that it breaks things. So those are some other things you can look at measuring. Again, that quality. What was the one we used to use? Mean time to first failure, right? Yep, it’s an old metric that’s been around forever in software development. Maybe look at how long it takes for you to break something; that should be getting longer and longer as your quality goes up. I’m not saying you have to measure that, just saying there’s a lot of these measures, now that we’re really kicking this idea around. At first I thought, ah, really, not many will transfer over. They do. You just have to look at them in a different way, and add new metrics that are around “is the stuff we’re producing actually valuable to people?” Because I think that’s the piece that waterfall project management and planning often misses. We had the project; we completed the project; it’s good that we completed the project.

Dan Neumann: [20:37] We didn’t have to ask for more money, and we’re done. Yeah. I know I’ve been on waterfall projects where there was a forecast of ROI for the project, but I can’t think of an actual instance where anybody went back and looked at whether we achieved that ROI. And if they did, there sure were no consequences for it.

Sam Falco: [20:58] It kind of ties into, you know, postmortems, sometimes called lessons learned: they get written up and no one ever looks at them again. Did we actually make these improvements? Did we measure them? Did they have the intended effect? So it’s a similar thing in the waterfall world. It really is typically about “did we do the amount of work we said we were going to do, in the amount of time we said we were going to do it?” Yay. Move on.

Dan Neumann: [21:24] So there’s another thing that a Scrum team could measure and keep track of: process improvements. If you’ve made a process improvement that increased your team’s ability to deliver features every sprint, that’s cool. If it made it easier to release because it automated a step that was manual: pretty sweet. How much automation do you have, so that you’re cutting the tail off the manual testing? Because that’s super expensive and hard to reproduce. That’s cool. So Sam, thanks for taking some time to explore the question that we got about different ways to compare waterfall and agile, specifically in this case using Scrum. It’s not an easy question. I mean, it’s a legitimate question, right? The answer is certainly not easy.

Sam Falco: [22:14] Absolutely.

Dan Neumann: [22:16] The evidence-based management guide is definitely worth picking up off of the Scrum.org website and looking through. In addition: are we delivering things on time, in this case against the forecast we maybe gave stakeholders? Are we delivering within a spend that makes sense for the value we’re going to get, kind of that concept of budget? And is the scope appropriate? I don’t want to say “we do all of the scope,” because sometimes you learn that there’s stuff in there that you shouldn’t do, that makes no sense to do. Cut it out. Don’t do it. So it’s not just going through a requirements doc that happens to be a story backlog. We’d love to hear from folks who have maybe done a comparison between their waterfall days and their new Scrum days; maybe people have some ideas to share. They can always email them to us at podcast@agilethought.com.

Sam Falco: [23:18] We love getting listener questions. I think whenever we get one, my comment to Dan in our internal Teams thread is always, “We got a live one!” It makes it fun for us to be talking about real problems real people are having. That’s super cool. We’ll keep doing that.

Dan Neumann: [23:37] I usually ask people what they’re reading, but it’s your turn to ask me. Yeah.

Sam Falco: [23:42] Are you reading anything?

Dan Neumann: [23:44] I am. I was introduced to the book The Hard Thing About Hard Things; the author’s last name is Horowitz. I’ve started reading through it using Audible, ’cause it’s hard to use a book while I drive, and that’s where I find myself. I look askance at people on the highway when they’re reading while driving, so I figured I should try not to. And I’m finding an interesting, almost emotional reaction to part of it, because when Horowitz goes through and talks about some of the scaling of businesses, some of it to me sounds like the awful part of the dot-com startup world. You know, a senior sales person yelling at another sales person in front of their peers because they transgressed somehow. Embarrassing people in front of a large group: not my style. Embarrassing people in front of a large group was the catalyst to work on my resume at one place. I’m like, “You know what? This is not my environment. Thank you. Goodbye.”

Sam Falco: [24:54] No, I’ve seen you operate and you’re perfectly happy to embarrass yourself in front of a large group of people.

Dan Neumann: [24:58] I will, I will. And on a podcast, apparently. And so there are several stories in there about working unsustainable paces and celebrating them. There are stories about yelling at an employee in front of a large group; again, I would put that into the toxic-environment bucket, not a behavior I hope gets emulated. And then at the same time, there are insights in there about, you know, one-on-ones for managers: as a manager, the one-on-one is not for you, it’s for the person that reports to you, and keeping that in mind. And so there are elements of the book where I’m like, “Ah, that seems like a reasonable tip. I can take that and put it into practice.” So I’m finding it all over the board in how I perceive its value: agree with it here, yell at the steering wheel there because I’m like, “That’s ridiculous, and I’d never work there.” So it’s definitely been all over the board. I don’t know, maybe humans are complex and that’s part of the reason. So yeah, it’s a little bit of a hit and a little bit of a miss, and it’s interesting. So for what it’s worth, that’s what I’m reading.

Sam Falco: [26:20] Cool.

Dan Neumann: [26:21] Let’s see if we don’t have more takeaways at some point.

Sam Falco: [26:22] All right. Well, I’m still reading what I was reading last time we recorded a podcast, ’cause it’s a really big book. So, something new next time.

Dan Neumann: [26:31] All right, well, I’m looking forward to next time.

Sam Falco: [26:33] Me too.

Outro: [26:35] This has been the Agile Coaches’ Corner podcast brought to you by AgileThought. The views, opinions and information expressed in this podcast are solely those of the hosts and the guests, and do not necessarily represent those of AgileThought. Get the show notes and other helpful tips for this episode and other episodes at agilethought.com/podcast.
