In today’s episode of the Agile Coaches’ Corner podcast, Dan Neumann is joined by George Dinwiddie, an independent software consultant and coach who works with organizations both large and small to develop software more effectively. He strives to help organizations, managers and teams solve the problems they face by providing consulting, coaching, mentoring and training at all levels.
Dan and George will be taking a deep dive into George’s newest book, “Software Estimation Without Guessing: Effective Planning in an Imperfect World,” which addresses both the technical and sociological aspects of estimation. In this episode, George takes listeners through several chapters of the book, key points and best practices, as well as myths and misconceptions, all to help your organization achieve its desired goals with less drama and more benefit.
Transcript [This transcript is auto-generated and not completely accurate in its depiction of the English language or rules of grammar]
Intro [00:03]: Welcome to Agile Coaches’ Corner by AgileThought, the podcast for practitioners and leaders seeking advice to refine the way they work and pave the path to better outcomes. Now here’s your host, coach and agile expert, Dan Neumann.
Dan Neumann [00:16]: Welcome to the Agile Coaches’ Corner podcast. I’m Dan Neumann, and today I’m excited to be joined by George Dinwiddie. George is an independent software development consultant and coach, and he works with organizations large and small to make life better for people in software development. And the particular focus you and I are going to be talking about today, George, is your book “Software Estimation Without Guessing.” Thanks for joining.
George Dinwiddie [00:41]: Thank you Dan.
Dan Neumann [00:42]: So George, what took you along this journey of deciding to write a book about software estimation?
George Dinwiddie [00:49]: Oh, well, it’s not that I really wanted to dig deeply into estimation. You know, most programmers aren’t terribly interested in this topic. What I saw is that so many people were getting beat up from so many different directions over this topic, so I felt like it was one that needed to be addressed. That was the principal reason for digging into this.
Dan Neumann [01:19]: Yeah. And as we were talking, you said you want to make life better for people in software development. And boy, I could think of several beratings or disappointments or difficulties that came about because something went wrong in the estimation phase very early on.
George Dinwiddie [01:40]: Yeah. Or during the estimation. Or, you know, the estimation was fine; it’s just that the situation changed and nobody thought to look at the estimates again. And there are so many things that people do that they call estimation that aren’t really estimates. You know, if someone asks you, “Hey, can you estimate that lower?” Well, that’s not an estimate. That’s a negotiation.
Dan Neumann [02:06]: Yeah. Are there some other things that people call estimations that aren’t?
George Dinwiddie [02:10]: Oh, commitments. People get estimates and commitments confused. They get plans and estimates confused. Oftentimes they’ll say, “Oh, estimate this,” and then they’ll treat it like a plan rather than have it inform the plan. So there are lots of different things. I decided early on in the book not to try to enumerate all the ways that estimates can go wrong, because people are familiar enough with that, and there’s no point in giving people new ideas they hadn’t thought of. Instead, I wanted to write a book that can help people work with estimates, given a desire to have things come out well.
Dan Neumann [02:56]: Yeah. And you refer to estimation as a tool. There are misconceptions I think people have about agility. I know I’ve heard people say things like, “Well, in agile we don’t estimate,” or “we don’t tell you when it’s going to be done,” or some of these other misconceptions about what agility is and how expectations are set, or not set, with development teams and their stakeholders.
George Dinwiddie [03:23]: Right. And, you know, those things could be true in a given circumstance, but you’re not gonna know that they’re true just by assuming them. You have to have a conversation.
Dan Neumann [03:33]: One of the places you start in the book, and I liked this, is that you break it down into different scenarios, if you will. So: starting a new project, and the differences between how a small company might start a project and how a large, regulated company might start one. Can you talk about how those different characteristics influence the estimation approach?
George Dinwiddie [04:00]: Well, one of the examples I use for a small company is a couple of people who come up with an idea. One thing I did in the book is intentionally use a lot of gender-ambiguous names, so you can make up your own mind. But, you know, starting a project just on a whim and having it become a business, certainly that’s something that typically doesn’t have any estimate; it’s just done. On the other hand, in a large corporation, in order to get funding, you need to justify the funding. You need to justify the project: what’s the return going to be? And to know what the return on investment is, you’ve got to have some idea of what the investment is. So if you need to staff it with people, you need to know, well, what’s a reasonable staff for this? There are lots of questions that a large company needs to figure out, partly because the people making decisions are not close to the work itself. They’re somewhat removed from it and can’t see it directly.
Dan Neumann [05:18]: And the bigger the organization gets, the farther apart the people doing the delivery of the project often are from the people making decisions about whether the company should invest in those particular efforts. One of the things I liked in the book is that you talked about asking why, but not in the way a three- or four-year-old asks “Well, why? Why? Why?” and gets “Because I’m the big boss, that’s why I want the estimate.” You gave some rephrasings of the question, and I thought that was helpful.
George Dinwiddie [05:51]: Well, this is something I think I learned originally from Dale Emery. If you change a why question into a what question, then you get a lot more specific about what you’re asking, and it’s a lot less threatening. Why can seem like it’s challenging someone’s decision-making process, you know, the decision that they made, whereas what can be a lot more factual.
Dan Neumann [06:18]: I like that. That’s something I could probably try to include, that what type of language. Because, right, when you ask somebody “Well, why?” you get “Because I want it,” or, you know, “Don’t get all up in my business.” But: what decisions will this estimate allow you to make? Or, since “estimate it lower,” as you correctly pointed out, isn’t really estimating: what’s driving the interest in having a smaller investment? Those types of questions.
George Dinwiddie [06:48]: Right.
Dan Neumann [06:50]: Very cool. And then you don’t go on a long, deep dive on things like planning poker and the Fibonacci sequence, but you do have a nice set of examples of different comparison-based approaches for doing sizing. I think for some reason planning poker became the de facto way that agile teams are supposed to size, in some people’s minds. But you go through a whole number of different ways of doing comparison-based estimates.
George Dinwiddie [07:19]: Right. This book was a little hard for Pragmatic Programmers to deal with, because it’s not very recipe-driven. It’s more things to think about and options to consider. Their books tend to be a lot more technology-oriented, and they want to say, “Oh, here’s a set of steps you can follow to get started,” figuring that people can then make it on their own from that point. With something like estimation, you’re kind of in the deep end before you get started. And the reason everybody does planning poker is because Mike Cohn wrote a book on agile estimation and he gave one way of doing it, and so everybody thinks this is the agile way of estimating. It’s interesting to me that James Grenning doesn’t use planning poker. He used it once, to deal with the situation at hand, and he’s tried some other things since then. So planning poker is fun. It has some advantages, it has some disadvantages, but it’s not at the heart of what estimating is. Really, the purpose of planning poker was to get people to go ahead and say something when they were stuck and nobody wanted to say something first. So he had them all say it at the same time, reveal at the same time. That has some real advantages in avoiding anchoring and imprinting on the first thing that someone says, but it’s not about the estimation itself.
Dan Neumann [09:02]: Yeah. The planning poker, at least as I recall it (and Mike Cohn did my CSM class back in the day, and typically when people ask me what they should read, if Mike puts out something it’s usually of quality and worth reading), it enables the conversation. It’s not just the approach, and there are many other approaches that can be used as well. And I think it’s interesting that those types of practices can become things people follow blindly and think, well, if you’re not doing, let’s say, planning poker, then you’re not a good Scrum team, or you must not be agile. That is also a mistake people should avoid making. So I liked that it’s not at the heart of estimating. I’d be curious to see what James Grenning is up to. I always enjoy going to his conference talks, so hopefully I’ll have something interesting of his to go see at Agile 2020 here in a little bit. Expert estimators are still an approach that you refer to. And I think I read the book, not agile estimating, “Software Estimation: Demystifying the Black Art,” which you refer to. That one, for me, felt more formulaic about the different steps you go through, you know, the 120 tips or whatever in there. Yours is different from that. Yours is: here’s estimating the whole thing, here’s decomposing it, here’s how to estimate for unknowns.
George Dinwiddie [10:32]: Right. So rather than have people walk through a series of steps, I’d rather people think about what they’re trying to accomplish, and how what they’re doing is accomplishing that. Having a series of steps to estimate may be perfectly fine for you, or it may not fit your organization at all. Basically, that book holds out a model that you can use to drive your estimates, and it gets into model-based estimation. But that model may not be true for your organization. That model hides a lot of the subtleties of how things work, and so you have to test the model. My experience is people don’t do that. They say, “Oh, well, this model has been published. It must be right.”
Dan Neumann [11:47]: George, you had mentioned using models for estimating, and one of the challenges there is not recalibrating the model. I’m not terribly familiar with models from an estimating standpoint, and even if I’m familiar with them, I may have a different context than what you had in mind. Could you maybe elaborate on the use of models and how that works?
George Dinwiddie [12:06]: Well, it can be a lot of things. It can be something very simple. For example, I’ve seen people estimate an eCommerce site by the number of screens that are needed, user-input screens, and that’s a simple model to get some indication of the size of the work. Or, at the other end of things, there’s the COCOMO model, a large, expensive model based on lots of huge projects. I’m not convinced that you don’t pretty much need to do the project to get the input data for it. But I’ve never used it, so that’s just an outsider’s viewpoint. So there can be a lot of things. Basically, a model is just encoding how you compare this work to other experiences.
Dan Neumann [13:09]: Okay. Fair enough. So the counting of backlog items, or, oh, I dunno, function points. Function points is one of those that always seemed interesting to me, but I never quite figured out how to operationalize function points into an estimate, those types of things.
George Dinwiddie [13:24]: Yeah, and that’s part of COCOMO. It uses function points, but that’s only part of it.
Dan Neumann [13:29]: I see what you mean about it being, yeah, hard to do.
George Dinwiddie [13:34]: So, you know, I’ve seen people come up with models: “Well, we’ve got these different features. These are the complex ones, and these are the simple ones, so we’ll give this coefficient to the complex ones and this coefficient to the simple ones. Oh, and then these have to talk to these external systems, so we’ll put in a term for that.” A model can be very helpful, but if it doesn’t really track reality, then it’s going to lead you astray.
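[Editor’s note: the coefficient-based model George describes can be sketched in a few lines. Everything below, the feature categories, counts, and person-day coefficients, is invented for illustration; a real model would be calibrated against your own past projects and then checked against reality, as George cautions.]

```python
# A toy version of the coefficient-based model described above.
# Counts and coefficients are made up; calibrate against your own history.

def estimate_effort(features, coefficients):
    """Estimate total effort by weighting feature counts by complexity."""
    return sum(coefficients[kind] * count for kind, count in features.items())

# Hypothetical calibration: person-days per feature of each kind.
coefficients = {"simple": 2.0, "complex": 8.0, "external_integration": 15.0}

# Hypothetical feature inventory for the project being estimated.
features = {"simple": 12, "complex": 5, "external_integration": 2}

effort = estimate_effort(features, coefficients)
print(f"Estimated effort: {effort} person-days")  # 12*2 + 5*8 + 2*15 = 94
```

The model is only as good as its coefficients, which is exactly George’s point about testing the model against what actually happens.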
Dan Neumann [14:10]: Okay. Yeah, thanks for expanding on that a little bit. So one of the quotes that I heard, and I don’t know who to attribute it to, but I thought it was pretty interesting, was: estimates are wrong; if they were right, they’d be called measurements. And that I think is a nice place to go. They’re going to be wrong. What do we do when estimates are wrong?
George Dinwiddie [14:35]: Well, the first thing to consider is how wrong. You know, are they right enough for your needs? If so, don’t worry about it too much. If they’re going to be wrong, you might consider which direction you’d rather they be wrong. If you’re estimating something to give someone a firm fixed price, then you’d rather your estimate err on the high side than the low side. You’d rather not underbid the contract and then get into trouble. And you might also cover yourself by declaring exactly what the estimate includes.
Dan Neumann [15:19]: Yeah. You get into all the terms, the caveats, all the nuance. And that’s where I think a lot of times in traditional management we would just rigidly change-request that thing to death, because we knew we were exposed on the fixed-bid estimate.
George Dinwiddie [15:35]: Right. And there’s an old saying, you know: underbid and make it up on the change requests. Which is one way of doing business.
Dan Neumann [15:46]: That is, that is certainly a strategy that I think has been used in the past.
George Dinwiddie [15:52]: You know, for simple estimates, like how much work to take on for the next two weeks, you don’t need a lot of precision. You don’t need a lot of accuracy, and my opinion is you really should be off in each direction about equally. You should be shooting for the midpoint. Then you know about how much to take on, and you can do a little projection into the future from that. Those sorts of estimates can have a lot of leverage and can get you off track pretty quickly, but the consequences are also very low. If you don’t quite get everything done, well, you continue to work on it. And if you work so that you’re getting things completely done rather than getting everything almost done, it’s no big deal. If you run out of things to do, I’m sure you can find something to pick up and start working on. That’s never been a problem in my career.
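[Editor’s note: “shooting for the midpoint” amounts to planning around the middle of your recent throughput, so you overshoot and undershoot about equally often. A minimal sketch, with made-up sprint numbers:]

```python
import statistics

# Completed points per sprint over recent sprints (numbers invented).
recent_velocities = [21, 18, 25, 19, 23, 20]

# Shooting for the midpoint: plan around the median so the team comes in
# over and under about equally often, rather than padding or stretching.
plan_target = statistics.median(recent_velocities)

# "A little projection into the future": remaining work over typical throughput.
remaining_points = 130
sprints_remaining = remaining_points / statistics.mean(recent_velocities)

print(f"Plan about {plan_target} points next sprint")
print(f"Roughly {sprints_remaining:.1f} sprints of work remain")
```

The low-ceremony nature matches George’s point: the stakes per sprint are small, and the feedback loop corrects the numbers quickly.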
Dan Neumann [16:59]: No. It’s interesting, and I don’t know what drives the behavior with a lot of teams who want to do things like, the phrase “stretch goals” may get used, or a concern that they’re going to have some slack in their sprint somehow, for those that are using Scrum. My reality has been a lot like yours, which is there’s always something else. If we do what we forecast and build confidence with our stakeholders that way, there’s always something else. It’s not like we’re just going to pack up the tents and go home before the sprint’s over.
George Dinwiddie [17:30]: Yeah. First of all, that indicates that there’s insufficient communication between the parties, because if you run out of stuff, the best thing to do is ask: what’s the next thing to do now? Because that could have changed in two weeks. But if you can’t ask the person making that decision, maybe they’re out of touch, then rather than a stretch goal, you should just have at the top of your backlog: oh, this is the next thing to pick up and start working on. That’s no big deal. And to be honest, if you pick up the wrong thing and start working on it, and they come back and say, “Oh, you’ve spent a whole day on something that is no longer important,” okay, you spent a day.
Dan Neumann [18:20]: And then, you know, you’re talking about which way, and how wrong we’re okay with being, essentially, in the estimates. Whether it’s a fixed-bid contract, or something where human life is in danger, where you’d want to be even more conservative than on a fixed-bid contract, I would hope. And if you’re talking about sprints, maybe more along the 50/50: half the time we’re a little high, half the time we’re a little low, no big deal. And then we don’t know that we’re right or wrong unless we’re able to track some progress. You talk about some ways to track progress with your estimates as well, to create a feedback loop.
George Dinwiddie [19:00]: Right. I mean, one of the big purposes of estimates is for the business to be able to look ahead and make decisions, to anticipate the future. And we don’t want to anticipate the future at the beginning of the project and then believe that we know the future based on that estimate. We want to continue to anticipate the future as we go along. One of the easiest ways to do that is to track our progress and see: is it playing out the way we thought? I find burnup charts the easiest way to do that. You still have to deal with all the usual stuff: well, how big really is this, and where is our goal line? But you can change your mind on that. You can say, “Oh, well, we’re going to take some stuff out of scope because we’re not going to hit the date we want, and we need to hit the date. So let’s go with something smaller and cut some stories.” You can do that easily with a burnup, and take the stuff that you cut out, oh, that’s in release two, and see how you’re doing on release two. And probably some other stuff gets dumped into release two at the same time.
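[Editor’s note: as a rough illustration of what a burnup chart encodes, here is the arithmetic behind one, including the scope-cutting lever George mentions. All numbers are invented.]

```python
# A minimal burnup-style projection, assuming story points as the unit.

total_scope = 200                     # the goal line (can move as scope changes)
done_per_sprint = [14, 18, 16, 20]    # completed points, sprint by sprint

completed = sum(done_per_sprint)
velocity = completed / len(done_per_sprint)   # average throughput so far
remaining = total_scope - completed
sprints_to_finish = remaining / velocity

print(f"Completed {completed} of {total_scope} points")
print(f"At ~{velocity:.0f} points/sprint, about {sprints_to_finish:.1f} sprints remain")

# The scope lever: cut some stories (push them to release two) to hit a date.
reduced_scope = total_scope - 30
print(f"After the cut, about {(reduced_scope - completed) / velocity:.1f} sprints remain")
```

A real burnup plots the completed line against the (possibly moving) scope line each sprint; the projection above is just the extrapolation of where they cross.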
Dan Neumann [20:21]: So you’ve got the scope as the thing that can be levered back and forth, or the richness of different features, if you’re tracking ahead or behind. And what occurs to me here is the value of having a lot of transparency, trust, and psychological safety: the ability to say we maybe are not on track and we need to talk about things. Because I’ve been on projects in the past where things have gone completely sideways (note, this was not at AgileThought) where somebody would take a project that was very challenged, and somewhere between there and the most senior people interested in it, the status would go from red to yellow to green, until the date was missed and everybody was seemingly playing a game of CYA as the challenges went forward. So maybe you can talk about how important that is, and the need to learn from the situation we find ourselves in versus hiding it out back.
George Dinwiddie [21:27]: Right. Well, as Jerry Weinberg said in one of his books, when a project gets a slipped date announced two weeks before delivery, it didn’t just happen. It’s not a crisis; it’s merely the end of an illusion. People were maintaining an illusion that everything was okay until they couldn’t maintain it anymore. And that’s not the way the business really wants to work. It may be the way an immediate manager wants to work, because they’re worried about how they’re going to look, and that may be a bigger issue to them than how the business does, probably because they don’t have visibility into what the business goals really are. They’ve just been handed a piece of work. So again, it comes down to communication. But generally you’d rather know early, so that you can take action. You’d rather know.
Dan Neumann [22:27]: Sorry, I stepped on you there for a second. And then the ability to plan for it. I would expect having some contingency plans in place.
George Dinwiddie [22:35]: Well, and that’s one thing: an estimate is not the same thing as a plan. If it’s critical, then you certainly want to put in some contingency buffers. If you’ve got a loose environment where everybody’s willing to replan as they go, you may not need to do that, because you can do it when something comes up. But if you’ve got a critical date, then yeah, you need to allow in your plan some space for the unexpected. You can expect that there’ll be something unexpected; you just don’t know what it is.
Dan Neumann [23:09]: Yeah. And I forget, gosh, who was it? Was it Rumsfeld who talked about the unknown unknowns? He got so much grief for that, and as a software person I’m like, well, yeah, those are the things that are going to bite you in the butt. Politics aside, there is stuff we don’t know, and we don’t know what that is until it shows up.
George Dinwiddie [23:30]: Yeah, that saying predates Rumsfeld by a lot, for sure.
Dan Neumann [23:33]: Yeah. I just thought it was interesting how much people went, “Oh, what a fool. Unknown unknowns, huh?”
George Dinwiddie [23:42]: Yeah. Well they’re there.
Dan Neumann [23:44]: Yeah. Well, for fear of going into a political thing, we’ll step back away from the edge. Are there some tips for how to plan for what you know are going to be incorrect predictions? Like you said, estimating is not planning, and we know that estimates are not precise. We want them to be accurate, but precision is a real challenge there.
George Dinwiddie [24:07]: So, one thing: when you find out that your estimate is wrong, when you estimate something will be done by a certain date, and you get to that date and it’s not done, or you’re approaching it and you can say there’s no way we’re going to make this, what that means is some assumption that your estimate is based on is wrong. And so there’s a lot of value in examining: okay, what assumption is wrong? What did we assume that’s not true? And learn from it. This will take one of these unknown unknowns and turn it into a known unknown. You know: “Oh, we didn’t realize that this code base we were going to be building on was such a mess. Nobody on this project worked on it before. The people that worked on it have all left. I wonder why?” Something like that. Or: “Oh, this feature that sounded so simple when we were first talking about it has a lot of nuance that’s hard to do. Nobody’s ever done anything quite like this before.” So there are certainly some unknown unknowns. And since you can get a lot of value from them, I’ve found that you might want to set some traps to discover where your assumptions are incorrect earlier, to keep you out of danger. So you set some near-term estimates rather than just estimating the end of the project, which is way too late to do anything about. You say, “Oh, we expect this feature to be done by this time,” and it’s not. Okay. Well, first you want to take care of that problem: what do we need to do to get it done? Do we need to trim scope? Do we need to give up a different feature? Can we adjust the deadline? But the next thing is figuring out: okay, what did we not know?
So we can learn from it and then adjust. By setting earlier waypoints, we can become aware of some of our false assumptions earlier, when there’s still time to let people know about it and let people make a decision.
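[Editor’s note: George’s “traps,” the near-term estimates that surface wrong assumptions early, could be tracked with something as simple as comparing forecast dates to actuals. The features and dates below are hypothetical.]

```python
from datetime import date

# Hypothetical near-term waypoints: (feature, forecast date, actual date or None).
checkpoints = [
    ("login flow",     date(2020, 3, 6),  date(2020, 3, 5)),
    ("billing export", date(2020, 3, 20), date(2020, 3, 31)),
    ("audit trail",    date(2020, 4, 10), None),  # not done yet
]

def flag_misses(checkpoints, today):
    """Surface waypoints that ran late, so wrong assumptions get examined early."""
    late = []
    for name, forecast, actual in checkpoints:
        finished = actual or today  # unfinished work is at least "today" late
        if finished > forecast:
            late.append((name, (finished - forecast).days))
    return late

for name, days in flag_misses(checkpoints, today=date(2020, 4, 15)):
    print(f"'{name}' ran {days} days past its estimate. Which assumption was wrong?")
```

The point isn’t the tooling; it’s that each flagged miss becomes a prompt to ask what the estimate assumed that turned out not to be true.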
Dan Neumann [26:35]: That’s one of the things I like about the Scrum framework, and you alluded to it earlier: hopefully we’ve got stuff that’s done or not done, as opposed to lots and lots of stuff that’s partially done. I think that combination of what’s done and not done, as well as those guideposts along the way of where we think we’re going to be in terms of completed stuff, allows us to make decisions.
George Dinwiddie [27:00]: Right. Being clear about whether something is done, about how far along you are, is really crucial. Otherwise that’s another way you can fool yourself. When I first got into engineering (this was hardware engineering, and I was a technician), I was given a task, and the boss would come by and say, “How are you coming on that?” “Oh, I’m half done.” Then he comes back the next day: “How are you doing?” “Oh, I’m three quarters done.” “Oh, I’m 80% done.” It starts to become a series of smaller and smaller steps, because I realize: oh, I was a little optimistic. There’s more to this than I thought there was.
Dan Neumann [27:46]: Yeah. It’s one of those limit problems I remember from when I was studying math back in the day: the closer you get to the deadline, the more and more time it takes to close that gap.
George Dinwiddie [27:56]: So yeah, having a good measure of whether something is done or not. And I like to do that with test automation: when we finish this feature, these tests should pass. We might discover some other stuff along the way that we hadn’t thought about; you might do some exploratory testing. So again, we can’t know reality precisely. Even in the moment, we may not be able to know it precisely. But the better we can get a handle on reality, the better off we’re going to be.
Dan Neumann [28:31]: That getting a handle on reality, and estimating and forecasting the milestones. One of the last facets that you touch on is the people side of all that. So, you know, even if we’ve got good models, or models that are somewhat accurate but need adjusting, this is all happening in the context of people, and we don’t always get along with each other.
George Dinwiddie [28:54]: Yeah. And that’s really the major reason that tipped me off on writing this book. I see so much pain over the topic of estimation that’s really not about estimation at all. It’s really about the way people treat each other. Ultimately that’s a harder problem than estimation; it’s harder to know what reality is there. But there are things that you can do, and I’ve studied a lot and learned a lot over the past, oh, 20 years that I didn’t use to have much of a clue about, and that most programmers don’t know much about. You know, they got into a technical field so they could talk to a computer rather than to people. But it turns out you still have to talk to people, and having some skill in that is as important as having some skill in programming.
Dan Neumann [29:55]: Are there a couple of nuggets around the people side you want to point to, ways to smooth this out, as a way to button up our exploration of estimation?
George Dinwiddie [30:08]: Yeah. So this is the last chapter of the book. There are two main things I talk about in there, both from Virginia Satir, and both of which I first got from Jerry Weinberg. One is Satir’s concept of congruence. I’ve done some research on it: congruence originally came from Carl Rogers, and Satir kind of expanded it. The way I learned it, it includes a balance in concern for the needs of self, the needs of the other, and the needs of the context. The better you can do that, the better off things are going to be. Even if the other person is not doing a good job of it, at least you can react better. Virginia Satir said, you know, the problem isn’t the problem; it’s how you cope with the problem that’s the problem. And I guess by cope she means something other than avoiding the problem. When you neglect the needs of the other, that comes out as blaming: I can blame you for the problems, but that means I’m not paying any attention to your needs, to what your troubles are. I’m just putting it all on you. Or say you’re my manager and I just want to keep you happy no matter what, and I’ll say yes to anything you ask. That’s called placating, and that’s ignoring my own needs. And that leads to trouble too, because I’m not likely to keep you happy that way anyhow. These needs are real, and without meeting those needs, I can’t do the job that you want me to do. So there are a number of different off-balance coping stances there. And then it comes down to communication. Communicating between people can go astray really easily. We bring so much history, and we interpret what’s said through that history. We may have misheard in the first place, or we heard it but interpreted it differently than was intended. And then we’ve got all this history: “Oh, well, I’ve had trouble with this person before, so they must be thinking this about me.” And it goes downhill from there. This is what Virginia Satir calls the ingredients of an interaction. You can slice it up very finely and take a look at it and see where things go wrong. That can give you some insights into how to avoid some of these problems, and maybe consider: oh, this first meaning I took from what you said, maybe that’s not right. There are some other options, and I can check out which of these things you meant.
Dan Neumann [33:26]: That strikes me as approaching it with a lot of genuine curiosity. And to add to that context, I think it was in Crucial Conversations that they referred to it as the pool of meaning. When there’s something difficult, you want to add meaning to that, so that we’re getting more and more understanding of what each other are referring to.
George Dinwiddie [33:47]: Uh huh, yeah. Get into dialogue rather than discussion.
Dan Neumann [33:51]: Absolutely. So congruence and communication, I think, were those two main themes in that “when people clash” chapter. Fantastic. Well, I appreciate you contributing to the community through your book, Software Estimation Without Guessing, as well as taking some time from your other activities to share and explore with me and the folks listening to the podcast. So thanks a ton, George. Appreciate it.
George Dinwiddie [34:19]: Oh, thank you, Dan. It’s been my pleasure.
Dan Neumann [34:22]: One of the things we often ask folks at the end is about their continuous learning journey. Do you have something that you’re exploring now, whether software estimation related or independent of that?
George Dinwiddie [34:38]: Well, I was working with a large corporate client on how to turn their IT department into a learning organization. Unfortunately, that’s been put on hold because of budgetary concerns in the new year, but to me that’s a really interesting problem. You know, they were in the process of trying to improve the quality of the software being built in their organization, hiring a lot of outside coaches to come in and help train up their teams, some of which are employees and some of which are contractors. It’s a huge corporation, and I told them, you’re never going to be able to find enough coaches to make that work. So my immediate assumption was that there are things the corporation is doing that inhibit people from learning the things they want to learn. I mean, people learn anyway; they’re going to learn something. It’s just probably not what you want. But for improving their software development, there are probably things in the corporation that inhibit that. And sure enough, there are, and it was really getting interesting digging into that. One of the big things is time to learn. While they had a program that set aside time to learn every week, you talk to the developers and mostly they feel, oh yeah, but we don’t have time to do that. We can’t take that time; we need to use that time to catch up, or we need to do this. There’s so much schedule pressure: there’s pressure put on by the people who want the software they’re building, there’s pressure put on by the managers who want to meet a schedule, and there’s pressure they put on themselves. Oh, we expected to have this done; we want to meet our own expectations. So it’s a hard thing.
It was interesting to dig into it, and I hope to get back there to help work on solutions.
Dan Neumann [36:51]: Definitely. Do you have anything for folks who are maybe hearing about learning organizations and would want to pursue some reading on their own? Are there any resources that you have found valuable in that area?
George Dinwiddie [37:05]: Well, you know, Senge’s The Fifth Discipline is the one everybody immediately thinks of. That tends to be fairly high level, you know, senior-management level. I tend to look at the more immediate problems from a systems-thinking point of view: draw a diagram of effects, ask what is inhibiting this. And there’s a lot of anthropology here, watching what’s going on, talking to people, trying to figure out what’s really happening. That’s one of the things that makes it really exciting to me.
Dan Neumann [37:46]: That’s cool. Yeah, people are tricky creatures, and the way things affect each other is not always obvious. So yeah, a systems view is definitely very valuable. Well, thank you, George. I really appreciate you taking the time.
George Dinwiddie [38:00]: Okay, thank you, Dan.
Outro [38:02]: This has been the Agile Coaches’ Corner podcast brought to you by AgileThought. The views, opinions, and information expressed in this podcast are solely those of the hosts and the guests and do not necessarily represent those of AgileThought. Get the show notes and other helpful tips from this episode and other episodes at agilethought.com/podcast.
Contact us to share the challenges you’re facing and learn more about the solutions we offer to help you achieve your goals. Our job is to solve your problems with expertly crafted software solutions and real world training.