This week, Dan Neumann is joined by Quincy Jordan, Director of the Innovative Line of Service at AgileThought. In this episode, Dan and Quincy dive deep into the topic of metrics, focusing on the aspects that deserve special consideration when running an agile team. These agile experts go through different areas in which metrics can be applied, such as:
- Velocity and why leaders tend to be interested in it,
- Number of stories, testing, and how far ahead it is healthy for a team to plan,
- The value of tracking points per person in a team, and
- The Acknowledgement piece: giving credit should be a result of delivered value.
- Value, metrics, and how they relate to velocity
- There is a common struggle in putting a metric against work that is already seen as valuable
- What is the real intent of velocity? How can velocity help the team? Applying velocity in the spirit of Frederick Winslow Taylor's scientific management is a pervasive mistake
- Putting output over outcome is the opposite of what an organization needs to do. What matters most is whether what the team achieved is valuable to the organization and its customers
- Velocity and its interest in leadership
- Leaders tend to compare teams’ velocities. A team’s velocity depends on its composition and its expertise in the actual work that it is doing. A team may appear to go slower, but that might be because it has taken on the most complex work
- Instead of chasing the metric of velocity, it is more useful to check how consistently a team delivers
- Velocity can be used to anticipate when new capabilities can be expected
- Pay attention to how the stories are carried into a Sprint
- Comparing the number of stories that were tested in a Sprint versus the last Sprint can be tricky
- There is a problem in pushing things through without testing them
- Each Sprint is almost considered its own project
- Pay attention to an unusual number of stories that are completely refined and fully ready in the backlog
- Have you planned too far ahead? This can be a problem, since things might change in the meantime
- It can be frustrating to have a lot of work done that will never be used
- Can tracking points per person have a healthy value?
- If points are being tracked per person, that information shouldn’t be broadcast to the team. Do not share those metrics across the board; they belong in a one-on-one conversation
- Ask yourself: Is the team making increments of value at the product backlog item level? Is the product built and tested?
- People seek credit; acknowledgement is important.
- Team encouragement is crucial to remain as enthusiastic as possible, but getting credit for an outcome that hasn’t been properly achieved is dangerous
- Teams need to get credit for delivering what was intended to be delivered, not just for doing the work. (Users and customers don’t really care about the work, they only care about the value that is being delivered)
Mentioned in this Episode:
Measure What Matters
God Is My CEO
Transcript [This transcript is auto-generated and may not be completely accurate in its depiction of the English language or rules of grammar.]
Intro: [00:03] Welcome to the Agile Coaches’ Corner by AgileThought. The podcast for practitioners and leaders seeking advice to refine the way they work and pave the path to better outcomes. Now here’s your host, coach, and agile expert, Dan Neumann.
Dan Neumann: [00:16] Welcome to this episode of the Agile Coaches’ Corner podcast. I’m your host, Dan Neumann. And today we’re going to be exploring the topic of: what are you hearing about metrics on agile teams that kind of makes your ears perk up and go, Hmm? And my colleague in this journey is going to be Quincy Jordan, Director in our Innovate line of service and focusing on the coaching practice. Quincy, thanks for taking time and joining me on the podcast again today.
Quincy Jordan: [00:42] Hey, thanks Dan. Happy to be here. Always a pleasure to be on the Agile Coaches’ Corner. Looking forward to this conversation.
Dan Neumann: [00:50] Me too. So this is a topic actually you proposed. I know we as coaches end up a lot of times hearing about metrics and the way people intend to use metrics, and sometimes it does kind of make your ears perk up. Like, hey, what about that? So what do you hear in regards to this topic? What made you feel it was an important one?
Quincy Jordan: [01:12] Yeah, you know, so when you’re coaching with clients and you’re having conversations, even outside of that, just, you know, in the agile community, one of the things that I think the community in general struggles with really has been metrics. The agile space, in my opinion, over the years has been one of those places where people see that they’re getting value out of what’s being done, but then they struggle to put a metric against that value, even though they see it. And so when people struggle to do that, then they tend to fall into old habits and they start measuring things that they are accustomed to measuring. And some of those things may be things that are not really driving value. They’re not really helping to create the appropriate behaviors, or the desired behaviors, within, you know, an agile ecosystem. So yeah, from time to time, I hear things that just make me say, hmm, okay, tell me more about that. So you want to measure that particular thing, well, what do you think that’s going to help you solve by doing that? So, yeah, I think one of the most classic ones is probably velocity, in many ways, because it’s a number, you know, and it’s a number that changes. So there’s sometimes a desire, maybe not from the teams per se, but from the Product Owner, or a program-level person, who feels that there’s value in being able to share with stakeholders how the team is doing based off of velocity. And, as we know, that is not the intent of velocity. That’s not what it’s for. So when I hear that, that causes me to say, hey, okay, well, tell me, what do you think velocity is going to help you figure out by doing that?
Dan Neumann: [03:36] Yeah. And with M.C. Moore and Sam Falco on the podcast immediately before this, we talked about Frederick Winslow Taylor, the father of scientific management. And I think we can blame him for this too. So folks are interested in scientific management and measurement. I think this is a holdover of some of that thinking: how many tons of pig iron can a guy haul in a day? That’s really different than how an agile team making software is doing. And I think that thinking of, we should be able to measure tons of code moved per day, is really pervasive. And I like your question of, okay, so velocity, what do you hope to learn from doing that? So the team does 20, 60, 50, 3000. I don’t know. What’s the point?
Quincy Jordan: [04:25] Yeah, so let’s go down that rabbit hole of code. All right. So if we take code, for example, is it more valuable code if, as a metric, and I’m not saying anyone actually has this metric, but if someone says, oh, well, our developers are putting out, you know, 20,000 lines of code a week or something, whatever the case. So is volume what you’re going for with that, or are you going for quality of code that provides the functionality that you’re looking for? Because that’s what matters. I mean, if you really have really good code, it’s going to be clean and it’s going to be minimal. You don’t want to add more things, because let’s say you do, and there are defects or errors, you know, there are bugs in there, now they’re harder to find, because you quite frankly have more lines of code to go through. So that would be another one of those types of things. If I hear someone, especially some leaders, talking about number of lines of code, or the amount of code that teams are putting out, it’s again one of those things that causes me to want to look closer and ask them questions.
Dan Neumann: [05:56] And so the code would be an example of measuring output over outcome. How much stuff did people make as opposed to did they actually achieve something that’s valuable to the organization and its customers? It’s very internally focused.
Quincy Jordan: [06:13] Right? Yeah, yeah. Yeah. Cause if they make the wrong stuff, but they made a lot of it. That’s not very helpful, not very helpful at all.
Dan Neumann: [06:23] As a young developer, back in the day, we had this bit of software at the place I was, and there were literally copy-and-paste 300-line functions with four lines different in the middle, and, oh, have mercy, it was complicated. So yeah, let’s focus more on outcomes. Have we satisfied the customer? Have we delivered a business need? Have we reduced a risk? Instead of, have we been active in the application? So, kind of a fun little digression there. We started with velocity and kind of went chasing after lines of code, but you were talking about velocity and its interest to leadership.
Quincy Jordan: [07:01] Yeah. And so, you know, the question will sometimes come up: okay, well, why is this team’s velocity increasing, but this other team’s velocity isn’t? And there could be a myriad of reasons for that. There are any number of things. The team composition makes a difference, the expertise on the team, the actual work that they’re doing makes a difference. And that is one of the things that I think is very important for leaders to understand: oftentimes you may have a team that seems to be going slower, but that’s because that’s the team that’s taking on all the real hard stuff. They’re tackling the really difficult things. So it seems like they’re not doing as much, but in fact, in some ways they’re doing more, because the type of work that they’re doing, the type of problems they’re having to solve, are much tougher than the other team’s. To your point of what you just described from your development experiences back in the day around cutting and pasting: okay, so they’re cutting and pasting, they’re not creating new code, they’re not having to figure things out that are super difficult. That makes a really big difference. But that question does come up a lot around increasing velocity. And so let’s maybe not just say increase in velocity, but increase in output. There’s that constant desire from leaders for teams to increase what they’re producing. So whether that’s velocity, or number of stories, or just number of defects that are fixed, they just want it to be more and more and more. And for me, in my experience, that’s not really the metric that you want to go for. The metric that you really want to go for is: how consistent are they at doing those things?
So, yes, you do want it to increase to a point, but there’s always going to be a threshold. So it’s not that the team produces more indefinitely, that they do more stories indefinitely, that their velocity increases indefinitely. That’s not what you’re really looking for. What you really want to look for is: are they being consistent, roughly doing the same number of stories every sprint, and hitting that with, you know, small volatility in terms of what they’re doing, same with the velocity. And so, in my opinion, I think that is actually a lot more important to look at. But when I hear things that sound like there’s so much attention on increasing the output of everything, let’s go faster, the team moves faster, all that stuff, those metrics tend to cause me to ask more questions.
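Quincy’s point that consistency matters more than raw increase can be put in numbers. One common way to quantify it (my choice for illustration; it is not named in the episode) is the coefficient of variation: the standard deviation of recent sprint velocities divided by their mean. The velocity figures below are invented.

```python
from statistics import mean, stdev

def consistency(velocities):
    """Coefficient of variation of sprint velocities.

    Lower means a steadier, more predictable team, regardless of
    how high the raw velocity numbers are."""
    return stdev(velocities) / mean(velocities)

# A steady team is more predictable than a faster but erratic one:
steady = [20, 21, 19, 20]    # modest velocity, barely moves
erratic = [10, 45, 25, 40]   # higher average, wild swings

print(consistency(steady))   # small value: easy to forecast with
print(consistency(erratic))  # large value: forecasts are guesswork
```

The steady team here averages fewer points per sprint than the erratic one, yet it is the one a leader can actually plan around, which is exactly the trade Quincy is describing.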
Dan Neumann: [10:20] Yeah, certainly we want to be able to look at a system and improve the system so that items flow through it more effectively, with less friction. You know, things that can be automated, or automated tests that can pass, so that you get quick risk reduction through that validation. But yeah, the expectation that teams will infinitely continue to increase velocity, an interesting one for sure, kind of perks your ears up. From a stability standpoint, I tend to view it not completely differently, but maybe a little differently than what I think you’re describing, which is: as long as we can use a velocity plus or minus some deviation, and then look into the future in the backlog and use that as a forecast for when new capabilities might be expected, you know, fluctuation isn’t bad. But if it’s zero this sprint, a hundred the next, forty the one after that, and then 200, well, that variance is going to be so massive as to be largely useless for forecasting. And I’d really be curious about what might cause a team’s velocity to fluctuate that wildly, and then burrow in on what they might be able to do to impact that variance and bring it down.
Quincy Jordan: [11:40] Yeah, and I completely agree with that sentiment and that perspective. So when I say, as far as, you know, how consistent are they, it’s for that reason. Not consistent in terms of hitting it on the head every single time, but, you know, again, that small variance, as you put it, I usually refer to as volatility, but that small deviation that we’re referring to. Because if it’s way out of line, it’s a different ball game. But a really good value out of it is the forecasting that you can do, and helping teams to become more predictable. And quite frankly, all of those things also add up to how you can fund producing products, because if you can forecast better and things are more predictable, now you can say, okay, well, we should invest X amount, because we have a pretty good idea as to what that amount can now get us.
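The forecasting use Quincy describes can be sketched in a few lines. Assuming a points-sized remaining backlog (all numbers here are invented for illustration), average velocity plus or minus one standard deviation brackets how many sprints the work might take; and a team whose deviation swamps its average, like the zero-then-a-hundred example Dan gives, simply cannot be forecast this way.

```python
import math
from statistics import mean, stdev

def forecast_sprints(recent_velocities, remaining_points):
    """Bracket the sprints needed to burn down the remaining backlog,
    using average velocity +/- one standard deviation."""
    avg = mean(recent_velocities)
    dev = stdev(recent_velocities)
    if dev >= avg:
        return None  # variance this large makes any forecast useless
    best = math.ceil(remaining_points / (avg + dev))   # team runs hot
    worst = math.ceil(remaining_points / (avg - dev))  # team runs cold
    return best, worst

print(forecast_sprints([38, 41, 40, 39, 42], 200))  # (5, 6)
print(forecast_sprints([0, 100, 40, 200], 200))     # None
```

The `None` branch mirrors Dan’s point: before forecasting, a team with that much volatility should first dig into what is causing the swings.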
Dan Neumann: [12:50] Right. You know, I like the metaphor of taking a trip somewhere. Okay, if it’s two people, you can take the convertible. If it’s six, we better take the van. How much capacity in the vehicle do we have, and who are the most important folks to get in that vehicle to get to the destination? And so it can simplify the forecasting, the portfolio management, getting value delivered across teams, like you were describing, for sure.
Quincy Jordan: [13:16] Yep. Yeah. You know, one of the other things that I think about too, as far as the metrics people mention that cause my ears to perk up, is when I hear things around the number of stories or tasks tested this sprint versus the last sprint. And that causes my ears to perk up because now I want to know, are you putting stories through that have not been tested, or are there things that are getting to quote-unquote done, but they have not been tested? And not that it’s completely alarming, but it’s most definitely something to look at, for sure. Because if you’re pushing things through and you’re not testing, I think that’s a problem. And so to me, the indicator is there if there’s a retro happening or something like that, and I hear someone say, oh, well, you know, we weren’t able to test any of the stories this sprint, we’re going to test those stories next sprint. Okay, so you carried stories all the way through a sprint, completed that sprint, and now you’re going to test everything in the next sprint. And I know some teams like to do that. I personally don’t think it’s a good thing to do, because you don’t actually know that the increment that you produced for that sprint is really actually complete. You have no idea, because you haven’t tested anything.
Dan Neumann: [15:06] Yeah. There’s an interesting newer description in the Scrum Guide where it refers to each sprint basically as being its own project. And when you apply that mindset, you go, oh, so this project, we’re not testing anything in this project? Like, that doesn’t make any sense. The, well, you know, it’s just a carry over, and we split the story, and we’re going to test it later, I feel like people can convince themselves that that’s a helpful behavior. But when you start going, okay, this sprint is a project, and you’re not testing anything in your project? Okay. I think that’s a really powerful way of thinking about: is this behavior a helpful one for the system, or is it a harmful one for the system? And I think it falls back to, to pick on Frederick Winslow Taylor again: well, it’s more efficient if we have the testers only test and the developers build a hundred percent of their day, and that feeds into some of this. So, yeah. Hey, we built 10 and we tested five, would definitely make my ears perk up.
Quincy Jordan: [16:13] Yeah. And you mentioned a carry over story. So that’s another one. When I hear a team say, oh, well, you know, we had six carry over stories this sprint, okay, I think that’s a bit much. You know, if it’s a couple of carry overs, I think that’s okay. And conversely, if there are zero carry overs, I think that’s also something to pay attention to, because if there are zero carry overs, now what it causes me to wonder is, is the team actually challenging itself enough? Or are they giving themselves too much room in there? And then there are also instances where they say, hey, there were six carry over stories, but two of those have been carry over stories for the past four sprints. All right, so we’ve had carry over stories that have carried over for four sprints. Wow, we definitely need to talk about that one. So let’s have a conversation, let’s see what’s going on. So when I say that these are things that you hear, or metrics that agile teams mention, that make ears perk up, it’s not necessarily to say that something’s wrong, but it is to say, hey, that thing that was just said, that metric, is something that I should look closer at. As a team, we should look closer at it; as a coach, I should coach you to look closer at it. It’s not necessarily alarming, it could be, but let’s do some root cause analysis and see what’s going on. Not anything extensive, but we do need to look at it.
Dan Neumann: [18:05] Yeah. In the coding community, there’s the concept of code smells, which is kind of a diaper metaphor. Sometimes you smell something and you better take a look; something might need to change, maybe not. So I don’t know if that’ll resonate with parents out there, like, some smells, do we need to change it? I don’t know. And so you’re bringing the curiosity when there are some of these things talked about, for sure.
Quincy Jordan: [18:34] Yeah. Then, you know, one of the other things that I think about around metrics that cause my ears to perk up is when there’s an unusually high number of stories that are completely refined and pointed, everything’s done, like they’re fully ready and in the backlog. And it’s interesting, because the initial thought, especially for teams that are earlier in their agile journey, the initial thought is, hey, we have seven sprints of stories that are ready to go. They’re already in the backlog, we have them all mapped out, and so forth. And the initial thought for many is, that’s great. But the reason it causes my ears to perk up is to say, well, have you planned out too far? Because there are many things that are going to change. Now, sure, if you’re in an environment where you’re using SAFe and you have PI planning, and you need to look out eight to 12 weeks, whatever the case, for that program increment, right, that’s a little different. But you’re also knowing that there may be some changes going in. But let’s just say your standard, if there is such a thing, Scrum team: if that product backlog has 7, 8, 10 sprints of stories, that definitely causes me to say, hey, let’s have a conversation, because I think you’re spending too much time planning, like way too much time. And so much of that is probably going to change before we ever get to sprint five and six. Okay, you have a couple of sprints, that’s fine, maybe three. But if you go beyond that, I think it’s too much, in my opinion.
Dan Neumann: [20:49] Teams end up falling into the same trap, essentially, that we had back when we did really long projects with lots of planning upfront. You know, if you’re doing two-week sprints, that’s 16 weeks of a waterfall, basically, if you’re at eight sprints out. And from a lean perspective, you know, it creates all this inventory that needs care and feeding. And it’s very different than a progressive elaboration, where at the top of the list are the small, most valuable things to do now, and farther down, as things get farther away, they become less detailed, because something down there may get displaced by a new great idea. And so teams have refined it to the nth degree, ah, that’s a lot of time and energy taken away. And the Scrum value of focus: you’re not focusing on the current sprint if you’re refining a backlog item that’s three months out. So.
Quincy Jordan: [21:56] Definitely. And it can also become disheartening to the team as well, because there are many things that can frustrate teams. One of those things is doing a lot of work that never gets used; that can become very frustrating. Which is a reason why I’m saying that if those stories are fully refined and estimated, like they’re pointed and everything’s done, that implies that the entire team had some involvement with that. And then if those stories quite frankly just get trashed, that becomes frustrating. You know, it’s a lot of work to go into something only for it to never materialize into anything.
Dan Neumann: [22:48] Absolutely. And you can fall into that trap, potentially, of a sunk cost fallacy. We’ve put this much time and energy into it, we should just finish it, whether it is the right thing to do anymore or not. So lots of dysfunction comes out of having too much refined backlog, for sure. Yeah. Anything else around that that you would suggest, either being different, or a cause that influences this desire to have lots of refined backlog?
Quincy Jordan: [23:17] Not so much different; I would probably just kind of double down on saying, you know, the product owner oftentimes feels a certain level of accomplishment if their product backlog has a significant number of refined stories, and they shouldn’t feel that level of accomplishment for that, because again, it’s too much. So it’s about finding that good, healthy balance. One of the things that I oftentimes will institute with product owners, we may have like a product owner touch base, let’s say, and we’ll do this once a week, a couple of times a week at the most, something like that. And part of the questions that we’ll go through, and this is generally with a large program, is each product owner will step in and say whether or not they have 10 user stories that are ready to go, fully pointed and refined, 10 user stories that the team can pull from at any given time. Because I find that to be a pretty healthy number. You know, the team is already in the sprint, they’re already doing what they need to do. If it turns out that they have extra capacity, then there are some stories that are ready that they can pull from. But it’s not too many, where the product owner has spent too much time, along with the team, getting those stories ready. And it’s not so few that you’ll run into a situation where there isn’t enough work if the team happens to move much faster than expected for some reason.
Dan Neumann: [25:16] Gotcha. That totally makes sense. Let’s talk about tracking points per person. I know I’ve heard that one. Oh, how many points of work does Bob have? How many points of work does Sally have? Why is Joe doing more than Ralph? Right.
Quincy Jordan: [25:31] So the way I look at tracking points per person: on the surface, it seems like, okay, there’s not a lot of healthy value in doing that. But I do think that it can be, and this is where I think that it can be. So let’s say a Scrum Master is tracking that. Number one, it’s not something I would broadcast to the team. So it’s not something I would share across the board. It’s something that to me is more of a one-on-one type of conversation. And with that, I would say, okay, kind of going back to the concept of the volatility or the variance, that deviation that we talked about: all right, so this particular person is normally able to accomplish X number of points per sprint. So let’s say, at the task level, if they’re tracking points, not hours, let’s say that’s the case, then they could look at that particular person’s personal volatility. Like they know their normal personal velocity, they normally get X amount done per sprint, but for some reason, in this particular sprint, they’re taking on significantly more than what they would normally do. And if you recognize that, then you as a Scrum Master may step in and say, hey, you normally don’t take on quite as much. What do you think will be different this sprint, that you’ll be able to get so much more done than you have in previous sprints? And maybe there is a good reason, but there’s enough there to kind of spark that conversation. Now, I would not suggest calling them out in front of the team, like, what are you thinking? Why do you think you can get so much done? But I think it’s fine to bring it up in a safe way, one on one, for that conversation to happen.
Dan Neumann: [27:34] Yeah. So just to check, I think what you were describing was maybe at the task level, if people are using points as a mechanism for estimating task-level work. Where I often see the question coming up is actually at the user story or product backlog item level. And for me, it creates a concern: are we actually making increments of value, where the product backlog item is something that is built and tested, and, you know, there’s a data piece and a development piece, whatever, which typically takes multiple people to contribute to? And that’s where, when I hear about tracking points per person at the product backlog item level, it’s like, what, do we have backlog items that can literally be delivered by one person? And then, no offense, Quincy, I’m not helping you. If you’ve got a story and you’re getting credit for the points, and I’ve got stuff to do, I’m not going to be very inclined to help you, because now you’ve set up not a team, but a group of individuals. I’m not dishing the ball, because, you know, you’re going to get the points and I’m not.
Quincy Jordan: [28:35] And that is the thing that I would say: if someone is going to track points per person, you have to be very discreet with it. And you have to be very cautious to make sure that no one weaponizes that metric. I would keep that metric close to the chest. Like, I would not share that metric across the board. That’s just more of a one-on-one conversation, in my opinion, because it can kind of get ugly fast, and you don’t want it to. And then you do start getting into those conversations around, you know, credit and so forth. And if you’re going to have the conversation around credit, let that conversation be credit for the team, not credit for the individual. Which, we can also talk about credit for the team as well, because I think that’s an interesting one.
Dan Neumann: [29:34] Yeah, well, coming into kind of the late part here of the podcast, there are a couple of things that we hear about that I think are related. So story splitting, sizing remaining work, getting credit for items that are partially done. Like, I hear that kind of all getting lumped in together, and even sizing defects, right? Like, we want credit for activity. What do you think?
Quincy Jordan: [29:58] So I always find it interesting every time I hear a team, or someone, talk about the team getting credit, because it’s important to get acknowledgement that the team is doing good work. It’s important that the team is encouraged. It’s important that the team remains as enthusiastic as possible. But I think it’s kind of a false sense of security to focus on getting credit for activity if that activity does not yield the proper outcome. And that’s where I see that being rooted oftentimes: we want to make sure the team gets credit. I mean, I don’t want to sound harsh about it, but let’s make sure the team gets credit for delivering what was intended to be delivered, not credit for doing work. Because you can do a lot of work, but if you don’t deliver what was intended to be delivered, the user or the customer doesn’t really care about your work. And it may sound harsh, but it’s the truth. You know, to go outside of the tech space: when I pull up to a drive-through window at a fast food restaurant or something, I know there’s a lot of work going on behind there. I know there is. But guess what, if my fries are not in there, I don’t care about all that work. It doesn’t matter to me.
Dan Neumann: [31:48] That, I think, is a great metaphor for what we’re doing here. Like, the fries are only valuable if they’re hot, right? You know, you hear it: we worked really hard, but if there’s no increment of value being delivered, it’s a, well, thanks for trying. Yeah, thanks for trying, but we need, you know, show me the money. And by splitting and sizing and crediting, it can mask the root cause that’s in the way of the team delivering. They might be lacking a skill, they might need a dependency from somebody else. And, oh look, we have a velocity of 40, 39 to 41, every sprint, but it’s all work in process. You know, maybe the velocity needs to drop in one sprint so people go, whoa, what’s going on? And then you can talk about the hard truth.
Quincy Jordan: [32:43] And I’ll tell you one of the areas where I see this happen the most. And I know we don’t use this term that often these days in the agile space, but it is still a term that is relevant, which is gold-plating, right? So you’ll have a developer, they come up with some bright idea of how to make something so much better, but they don’t actually discuss it with the product owner. It’s not one of the tasks from when the story was decomposed into tasks, it’s not any of that. And they think it’s this ultra cool thing, and it might be, but they put so much work into it, and in the end, the story is not delivered. But they have these great, you know, features, or they’re using features differently, but it’s not what was supposed to be delivered. So it was a lot of work that happened without delivering.
Dan Neumann: [33:42] Yeah. A cautionary tale from way back in the day: we had a developer who decided to put, like, rolling movie credits as an Easter egg in the application. It blew the timeline so badly. Oh, that’s an extreme example, and that, I believe, was a career-limiting move for that particular individual. But you see that a lot: oh, well, you know, this thing needs to be faster. Well, if the page is already loading sub-second, it doesn’t really need to be faster. Like, yeah, maybe it’s okay the way it is. And not having those explicit conversations as a team, with the product owner, with the stakeholders, about where the effort should go next for the biggest bang of value. Yeah. Metrics are always fun. And, you know, I feel like there are so many bad things that can happen with them, it’s hard to make sure that we’re doing the valuable, simple things. So, Quincy, any closing thoughts here on metrics that kind of perk your ears up?
Quincy Jordan: [34:48] You know, I would just say, when looking at metrics, I know we had talked before about the book Measure What Matters, and I think metrics have to have meaning, and that meaning needs to be healthy. And I think in particular, for those of us who are coaches, Scrum Masters, or consultants in this space, you have to attune your ears to hear those things that warrant a question or more understanding. It doesn’t necessarily mean anything’s wrong, but it does mean a conversation probably should happen.
Dan Neumann: [35:38] Absolutely. Data for enabling decisions and having conversations is super valuable. So I want to thank you for taking time to explore this. And of course the show notes will be available at agilethought.com/podcast. So what’s on your continuous learning journey right now, Quincy? What’s got you either interested or reading or thinking?
Quincy Jordan: [36:01] So periodically there are certain books I like to go back to, almost like refreshers for me. And one of the ones that I pulled out, which I know is not necessarily popular these days, however this is the case for me, is actually called God Is My CEO, and this is from 2001. So I’ll just mention some of the chapters that, you know, help for me. “Clash of Two Worlds,” which deals with working in a bottom-line world and following those things that we believe we should do. Purpose, success, courage, patience, leadership by example, yielding control, tough decisions. Servant leadership, which I did not even realize was in this book, which was in 2001, which is well before a lot of, you know.
Dan Neumann: [37:12] It wasn’t super popular. It wasn’t a buzzword back in 2001. Yeah.
Quincy Jordan: [37:17] No. Then integration, priorities, and a message of hope. So this is one that I pull out every once in a while just to help keep me aligned.
Dan Neumann: [37:33] Oh, sorry, who’s the author?
Quincy Jordan: [37:34] This is by Larry Julian. Okay.
Dan Neumann: [37:38] Very good. We’ll be sure to put a link to that in the show notes there as well.
Quincy Jordan: [37:40] Yeah. He interviews a lot of top CEOs, and so they give a lot of reference points from them. And I’m a strong believer in not repeating mistakes I don’t have to. So if I can learn from someone else, then I don’t have to experience those mistakes myself. Absolutely. Yeah. And there are plenty; it’s a lot of top CEOs in there.
Dan Neumann: [38:05] Awesome. Get to learn from their mistakes and make new ones. I love it, Quincy. Well, thank you for sharing about agile metrics that make your ears perk up, and thank you for the book reference, God Is My CEO. Until next time.
Quincy Jordan: [38:20] Thanks Dan pleasure as always.
Outro: [38:25] This has been the Agile Coaches’ Corner podcast brought to you by AgileThought. The views, opinions and information expressed in this podcast are solely those of the hosts and the guests, and do not represent those of AgileThought. Get the show notes and other helpful tips for this episode and other episodes at agilethought.com/podcasts.