
Podcast Ep. 29: How to Combat Cognitive Biases for Effective Agile Teams



Episode Description:

In today’s Agile Coaches’ Corner podcast episode, host Dan Neumann is going solo to explore the topic of cognitive bias. It’s impossible for us to logically process every piece of information we receive, so our brain has come up with shortcuts, or simpler ways of processing information. Though these shortcuts can serve us really well, the biases our brain has developed to process information rapidly can also lead us astray.

Today, Dan covers many of the common cognitive biases, why they’re evolutionarily helpful, and how they can affect agile teams. He also provides examples and offers solutions for combating these biases on an agile team. Some of the biases he covers are the anchoring bias, optimism bias, availability bias, illusory superiority, attribution bias, and the Dunning-Kruger effect.

Tune in to discover the cognitive biases you may not even know you had and learn how to combat them.

Key Takeaways:

  • What is cognitive bias?
    • An error in the way our brain processes information
    • It’s impossible for us to logically process every piece of information we receive, so our brain has come up with shortcuts; simpler ways of processing information
    • Our brain has developed biases in order to process information rapidly
  • Common biases:
    • Information biases: errors in the way we process information (includes: anchoring, optimism, and availability biases)
    • Ego biases: errors in the way we see ourselves (includes illusory superiority)
    • Anchoring bias (where an individual relies too heavily on an initial piece of information offered to make subsequent decisions)
    • Optimism bias (where we generally believe ourselves to be less prone to negative events)
    • Availability bias (where we judge how important or how likely something is by how easy it is to think of an example)
    • Illusory superiority (also known as the “above average” effect, where we consider ourselves better than average)
    • Attribution bias (where we tend to attribute our own successes to our innate abilities and our failures to bad luck, and others’ success to circumstance and failures to incompetence)
    • Dunning-Kruger effect (where unskilled people tend to be overconfident and overestimate their abilities and highly skilled people underestimate their abilities and become frustrated with others)
  • How do cognitive biases affect agile teams?
    • Anchoring bias over-weights the first piece of information presented
    • Under pressure, anchoring can become even more pronounced
    • Anchoring in a retrospective can lead the meeting to revolve around the first topic mentioned
    • Optimism bias isn’t so helpful in a complex adaptive system, when building software, or when working in teams
    • Availability bias can affect those designing systems, because when we’re not clear about the end user we’re targeting, we tend to use our own experiences as references
  • How to combat and moderate cognitive biases:
    • Combat anchoring with silent writing, planning poker, and by simply sleeping on it to remove the pressure
    • Combat optimism bias by processing more complex information using the “beyond budgeting” approach: separate the estimate, the budget, and the expected outcome into three distinct numbers
    • Combat optimism bias by considering the opposite scenario, or any other plausible alternatives
    • Combat availability bias by really doing your research; be aware that you’re most prone to recall your most recent or most memorable events, and conduct experiments to prove or disprove your assumptions
    • Moderate ego bias by creating safety: model giving feedback, evaluate the work and not the individual, bring data, and broaden your perspective

Mentioned in this Episode:

  • Project Retrospectives: A Handbook for Team Reviews by Norman Kerth
  • The “beyond budgeting” approach
  • Previous Agile Coaches’ Corner episode with Joseph Carella on coaching and the SBI (situation, behavior, impact) feedback format

Like what you heard? Check out our podcast page for more episodes full of interesting discussions, agile insights, and helpful resources.


Transcript

Intro: [00:03] Welcome to Agile Coaches’ Corner by AgileThought, the podcast for practitioners and leaders seeking advice to refine the way they work and pave the path to better outcomes. Now here’s your host, coach and agile expert, Dan Neumann.

Dan Neumann: [00:17]  Welcome to this episode of the Agile Coaches’ Corner. I’m your host, Dan Neumann, and thank you for downloading and listening today. In this episode we’re going to explore cognitive bias, but before we do, just a reminder that the opinions you’re going to hear today are my own; they are not necessarily those of AgileThought, other companies, or other people. If you have a topic you’d like us to discuss in the future, tweet it to us with #AgileThoughtPodcast or email it to us at podcast@agilethought.com.

Dan Neumann: [00:53]  So for those of you who are not familiar with what a cognitive bias is, it’s essentially an error in the way our brain processes information. It would be impossible for us to logically process every bit of information we receive, and so our brain has come up with shortcuts, simpler ways of processing information. And these have served us really well. If we were a caveman walking through the bushes and we heard a rustling sound, we couldn’t sit there and logically process the information; we’d need to decide quickly: is the rustling in the bushes something that will come out and try to kill and eat me, or is it perhaps a food opportunity for me? Our brain has to process that information quickly and then take action. It can’t sit there and logically make sense of every little bit of information. And while that can serve us really well, unfortunately, to do things that rapidly, our brain has developed biases in the way that it weights the different information we receive.

Dan Neumann: [02:01]  One of the really common cognitive biases is that of anchoring. To take an example that’s not within software, you can typically see this in the retail space. We’ll see something that’s priced at $200, and so we think, wow, this thing is worth $200. Then, when it’s on sale for 40% off, we get really excited to pay $120 for it. What the retailers have done is anchored you at that $200 price, and when you see the lower price, you get excited about how much money you’re saving. We don’t stop to think: is this thing really worth $120? So, in very simple terms, that’s what anchoring is. That’s great, retail’s fine, but how might this affect an agile team, you might ask? Well, you can think of this in terms of sizing stories in a product backlog, sizing product backlog items. When you get the team together and you’re talking about a product backlog item, what can happen is that the first person who offers an opinion will anchor the group on that number, especially if it’s somebody with some positional power. But it doesn’t have to be; it really is whoever offers that first number. So if somebody says, oh, I think this is an easy story, it’s a two-point story, all of a sudden the future decisions or future contributions by the team about how big that story is are going to be weighted down more closely to two than if the first person to speak had said, gee, I think this is a big story, I think it’s 13 points.

Dan Neumann: [03:54]  So anchoring is overweighting that first piece of information, and then subsequent decisions are made based on that initial information. What can make this even more insidious is that when we’re under pressure, anchoring can become even more pronounced. So if you’re on a team, what do you do about it? You’re perhaps familiar with planning poker. That’s where, instead of having somebody say, “I think the story is a size X” (fill in the blank), planning poker allows everybody to make up their mind independently, and then everybody exposes their numbers at the same time. So there is no first piece of information; what we do is get everybody to make a decision independently and show their numbers.
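
As a concrete illustration of that simultaneous-reveal mechanic, here is a minimal Python sketch. The team members, the card deck, and the “wide spread” rule of thumb are hypothetical illustrations, not part of any formal planning poker definition:

```python
# Minimal sketch of planning poker's simultaneous reveal: every estimator
# commits to a number privately, and nothing is shown until everyone has
# committed, so no single estimate can anchor the rest of the team.

FIBONACCI_DECK = [1, 2, 3, 5, 8, 13, 21]

def reveal_planning_poker(estimates: dict[str, int]) -> None:
    """Reveal all privately committed estimates at once, then flag a wide spread."""
    for person, points in estimates.items():
        assert points in FIBONACCI_DECK, f"{person} played an invalid card"
    # Simultaneous reveal: print every estimate in one pass.
    for person, points in estimates.items():
        print(f"{person}: {points}")
    if max(estimates.values()) > 2 * min(estimates.values()):
        print("Wide spread -- discuss assumptions, then estimate again.")

# Hypothetical team and estimates for one product backlog item.
reveal_planning_poker({"Ana": 2, "Ben": 8, "Chris": 5})
```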

Dan Neumann: [04:42]  Another type of anchoring can happen in retrospectives. When you’re in a retrospective, the first topic to get brought up can often anchor the remainder of the meeting on that topic. So if somebody is talking about a challenge they have in a particular area of the code, or with a particular backlog item, or with somebody outside the group, all of a sudden they’ve created an anchor for the group to talk in terms of that topic, and it will preclude other topics from coming up. I really like combating that with the technique of silent writing. With silent writing, we create an opportunity for every individual to sit and write down the items that they would like to talk about. So if you’re doing a retrospective with the framework of liked, lacked, learned, and longed for, everybody gets a chance to sit and reflect to themselves on how the sprint went and create cards for each of those topics. And then the group gets to see all the topics that everybody came up with at the same time.

Dan Neumann: [06:01]  So by silent writing, we’re avoiding the anchor of the first talker or the loudest talker or the fastest talker. So planning poker is one way to combat anchoring. Silent writing is another way. And the third way, especially because we’re prone to anchoring when we’re under pressure, is to simply sleep on it. And I’ve seen teams do this to great effect when they are in sprint planning, especially if they’re doing sprint planning late in the day, they’ll sleep on it and they’ll come back the next day before settling in on their sprint commitment. It removes the pressure, it gives people a chance to start with a clean slate where maybe the prior day they’d been anchored by some goal they hope to achieve, some deadline that had been stated or some notion of how far through the product backlog the team might get. So you can do planning poker, silent writing, or just simply go home and sleep on it. And those are some ways to combat anchoring on your team.

Dan Neumann: [07:02]  The second cognitive bias I want to talk to you about is optimism bias. This is where we generally believe ourselves to be less prone to negative events. Let me give you an example, because I think this bias is a little bit tricky. Let’s say we have an individual who believes that their chance of getting cancer is 30%, for whatever reason, whether it’s lifestyle or environmental issues, whatever the case might be. When they learn, or are introduced to, new information that says their actual chance is much higher, they may update their belief, but it’s only going to move a little bit. So let’s say they move from 30% to 33%. Now, another representative group in this study learned that the actual chances were much lower, or at least that’s the information they were presented with. And what we see is that that group didn’t just move from, let’s say, 30% down to 27%; they moved much, much farther in their belief. So what’s happening is that, because of our optimism, when we hear information that is even more optimistic or supports more optimism, the needle moves much farther than when we hear information that would push us toward pessimism. And the pretty cool thing about this is that under functional MRI, what this study was able to see (I wasn’t involved in it) is that the brain actually lights up differently when it’s provided with information that is optimistic or supports that optimism. So you can imagine, if a caveman was walking around and was very pessimistic and down about everything, his get-up-and-go might not be so high. And so it’s evolutionarily helpful for us to have this type of bias.
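
To make that selective updating concrete, here is a toy sketch; the update rule and weights below are invented purely for illustration and are not the model from the study Dan describes:

```python
# Toy model of selective belief updating: optimistic news (lower risk)
# moves the belief much more than pessimistic news (higher risk).
def update_belief(prior_pct: float, evidence_pct: float) -> float:
    """Move a risk belief toward new evidence, weighting good news more."""
    good_news = evidence_pct < prior_pct  # lower risk = optimistic news
    weight = 0.7 if good_news else 0.1    # illustrative weights only
    return prior_pct + weight * (evidence_pct - prior_pct)

print(update_belief(30, 45))  # bad news: belief barely moves, 30 -> 31.5
print(update_belief(30, 10))  # good news: belief moves a lot, 30 -> 16.0
```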

Dan Neumann: [09:20]  So how do we combat this bias? Because while it might be helpful for survival of the fittest, it might not be helpful in a complex adaptive system, as we’re building software, or as we’re working in teams. We can combat it by really trying to process more complex information, and a technique I like for this is the beyond budgeting approach. One of the facets of beyond budgeting is that it takes a very complex problem and breaks it down into smaller pieces. Think of how many projects you’ve been on where you estimate the project, that estimate of time and money becomes the budget, and then that estimate of time and dollars also becomes your goal and the expected outcome. What you’ve really got is one number. You can make that a much more complex and interesting scenario if you separate it into three different pieces. Have your estimate, that’s fine; let’s say you estimate the project at $1 million. How then does that estimate compare to your budget? Hopefully you have a budget that is larger than your estimate, otherwise you’re in trouble right out of the gate. So let’s say your budget is $1.5 million, and you think you can do the project for a million. Then, as you go through the project, you also have an expected outcome, and you may trend to be under $1 million or over $1 million. You’ve really made it a much more complex scenario by having the estimate, the budget, and an expected outcome, all of which you can have much more interesting conversations about. And so by making the situation more complex, it kind of handicaps this optimism bias and the selective updating, making it much more difficult to game.
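
As a small sketch of keeping those three numbers separate, here is one way it could be represented in code. The class and method names are hypothetical illustrations using Dan’s dollar figures, not part of the beyond budgeting framework itself:

```python
from dataclasses import dataclass

@dataclass
class ProjectFinancials:
    estimate: float          # what we think the work will cost
    budget: float            # what we are allowed to spend
    expected_outcome: float  # the current forecast, updated as we learn

    def headroom(self) -> float:
        """Distance between forecast and budget (negative means trouble)."""
        return self.budget - self.expected_outcome

    def drift(self) -> float:
        """How far the forecast has moved from the original estimate."""
        return self.expected_outcome - self.estimate

# Dan's example: a $1M estimate against a $1.5M budget, with the
# forecast currently trending slightly over the estimate.
project = ProjectFinancials(estimate=1_000_000, budget=1_500_000,
                            expected_outcome=1_100_000)
print(f"Headroom vs. budget: ${project.headroom():,.0f}")  # $400,000
print(f"Drift vs. estimate: ${project.drift():,.0f}")      # $100,000
```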

Dan Neumann: [11:21]  There are a couple of other ways to de-bias. One of these is the “consider the opposite” technique. Instead of assuming that your project is coming in on time and on budget, what if the scenario were that we’re going to come in way over budget, or way under budget? That’s a technique where you can go through and consider the opposite. But there’s some pretty cool research that indicates you don’t actually have to consider the opposite; considering any plausible scenario besides the one you’re really driving towards can remove the bias in the way you think. And so, for the optimism bias, making the problem more complex and exploring lots of different alternatives, or at least a couple of them, can de-bias the way we think about it. I hope that’s clear. Bias is a tricky animal, and humans are complicated creatures, and so that’s where this field of cognitive bias becomes a really interesting one for me as we start to explore the way people think.

Dan Neumann: [12:29]  For those of you who are really into designing systems, one of the biases I hope you’re familiar with is called availability bias. This means we judge how important or how likely something is by how easy it is to think of an example, and this is more pronounced when that example is vivid, unusual, or emotionally charged. And so, as we are thinking about the likelihood of users taking particular actions or getting confused by certain parts of the application, what we’re really likely to do is use our own recollections, memories that are readily available to us, and put those into the reasoning we use as we talk about the persona or user we might have in mind. So when we’re not really clear on the persona and the end user we’re targeting, what we tend to do is use our own references.

Dan Neumann: [13:32]  Think about this as you think through examples, whether they’re interactions in your family life or interactions with travel, like the airline when you’ve had a really bad experience. The most available information to you is vivid or unusual: it’s when you had that really terrible flight, or a really amazing flight that was emotionally impactful. What you’re not doing is getting an even assessment of what travel is consistently like over time. So one way to combat availability bias is to really do your research and do your homework. Be aware that you are prone to recalling the most recent or the most memorable events, those things that are available to you. Do your research, have study groups, look at how users actually interact with your system, and don’t just go off of your own personal bias. The construction of personas can be a second way to combat availability bias, so that you more intentionally put yourself into the space of a user with a certain demographic, certain characteristics, and certain goals. And the third way to combat availability bias is to conduct experiments. If we believe that a certain facet of the application is going to be easy to interact with, if we make the button blue, or if we move the button to the right instead of the left, then we create a hypothesis and conduct an experiment to test and either prove or disprove that hypothesis. And so that’s a way to move away from our own bias. The three biases I’ve talked about so far, anchoring, optimism, and availability, all fall within the category of information biases. They’re errors in the way we process information.
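
For the experiments Dan mentions, one standard way to evaluate an A/B test like the button change is a two-proportion z-test. The sketch below uses made-up click counts purely for illustration:

```python
# Minimal A/B-test sketch: state a hypothesis ("the blue button converts
# better"), gather data, and test it instead of trusting available memories.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(success_a: int, n_a: int,
                          success_b: int, n_b: int) -> tuple[float, float]:
    """Return the z statistic and two-sided p-value for H0: pA == pB."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical click counts: 1,000 users saw each button variant.
z, p = two_proportion_z_test(success_a=120, n_a=1000,   # current button
                             success_b=150, n_b=1000)   # blue button
print(f"z = {z:.2f}, p = {p:.3f}")  # small p suggests the variants differ
```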

Dan Neumann: [15:31]  The second category I want to talk a little bit about is that of the ego biases. These are errors in the way we see ourselves. One very common one is called illusory superiority, or simply the above-average effect. Think of driving: most drivers believe that they are a better driver than other drivers, so they’re above average. The way that could be true is that some drivers may view themselves as really good drivers because they can, you know, navigate the highways at 20 miles over the speed limit, they haven’t had an accident, and they only get a few speeding tickets. So they’re good drivers because of the speed at which they’re able to navigate. Other drivers may view themselves as better than average because they are more prudent; they drive five miles under the speed limit with their hazard lights on, whatever the case might be. People tend to look at things from their own perspective in a way that supports them and makes them feel above average. So we’re going to touch on that briefly. Then there’s attribution bias, where we tend to attribute our own successes to our innate abilities or our hard work, and when we have failures, we tend to attribute those to bad luck. So I am succeeding because of my innate abilities, and when I fail it’s because I have bad luck. With attribution bias, however, when we look at other people’s successes, we go, oh well, that was because of their circumstances: they were born into the right family, they had the right connections, they were in the right place at the right time. And when they fail, of course it’s because they’re incompetent, lazy, they haven’t put the work in. So the attribution of our successes and others’ successes, our failures and others’ failures, tends to be skewed. One interesting side note for those of you listening in other countries: this is not true of all cultures, but it does tend to be a fairly western perspective. I know it’s true in the Americas, and research has been done in other parts of the world that don’t see this same attribution bias. In some countries, in some parts of the world, an individual’s success is quickly attributed to their family or to their circumstances, and there’s much more humility around individual success. So depending on what culture you’re listening from, your experience may vary.

Dan Neumann: [18:18]  And then there’s the third ego bias I want to talk about. We’ve covered illusory superiority, which is the above-average bias, and attribution bias, where we attribute the successes and failures of ourselves and others differently. The third one is the Dunning-Kruger effect, a fancy name. When we are unskilled, we tend to overestimate our abilities, and extremely highly skilled folks tend to underestimate their abilities. So what can happen with people who are unskilled and believe themselves much more skilled is that they tend to be overconfident; that’s a side effect. If I am an unskilled developer, I may overestimate my ability, and I may feel like I can develop something faster, or more robustly, or with higher quality, whatever the case might be, than is actually appropriate. Whereas if I’m a highly skilled developer, I might underestimate my ability, and the side effect of this can be frustration with others. So for those of you who are Scrum Masters looking at your teams, one of the things to be aware of is that you may have extremely highly skilled developers or testers or designers on your team, and because they tend to underestimate their own abilities, they may more easily become frustrated with those who don’t catch on to the development or the testing or the user experience work as quickly as that really highly skilled person does.

Dan Neumann: [20:00]  So what do you do with all this ego bias? Because let’s face it, you’re not going to get rid of it. You can’t. It’s simply the way humans are wired, and even when people have been made aware of cognitive bias, they still believe themselves to be less prone to it than others. One of the things I think is really important to do is create a lot of safety. For this, I like what’s called the retrospective prime directive, by Norm Kerth, from his book Project Retrospectives: A Handbook for Team Reviews. It says: regardless of what we discover, we understand and truly believe that everyone did the best they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand. And so you’ve created some safety now, where when things aren’t going well, it’s not because somebody wasn’t trying their best or didn’t have good intentions; instead, you start to look at the skills, the abilities, and the situation at the time, and you’ve created some safety for exploring the challenges on the team.

Dan Neumann: [21:10]  A couple of other ways to moderate ego bias: one is to really model giving feedback. Giving feedback can be tricky. In a previous episode we talked about feedback and the situation, behavior, and impact (SBI) format; you can go back and listen to that episode with Joseph Carella, where we were talking about coaching. So you’ve got the SBI format as one way to give feedback. Try to look at the work and not the individual: evaluate the work, don’t evaluate the individual. Look at the results that are coming out, and try to get better results as opposed to focusing on the individual’s activities. Bring the data. If we believe quality is decreasing due to some effect, bring information about the quality. If we believe that the system is getting slower, go find information about that. Look to bring data about behaviors as one way of moderating ego bias. And then, broaden your perspective. I remember as a developer, I thought I was pretty good until I really started hanging around and working with some people that I viewed as much more skilled than me. So when you have people who overestimate their abilities, one way to hopefully open their eyes and expose them to a different reality, where perhaps they realize where they are on the spectrum of development, is to expose them to other people and their practices, and then they’ll get to see how they fit into that. You know, if you’re a race car driver, or just a driver, you might think you’re really good until you compare yourself to people who are truly elite race car drivers. Or if you’re in a sporting event, you start playing against a different caliber of talent. Well, it’s the same thing with software development: get out to Meetup groups, go outside the four walls of your organization, get exposed to new practices and new perspectives, and see what that does for your ego bias. So I wanted to take this time in this episode to expose you to cognitive bias. I find it to be a really interesting topic. If you’re interested in hearing more about cognitive bias, let us know. Again, you can tweet us with #AgileThoughtPodcast or you can email us directly at podcast@agilethought.com, and we’d love to hear your experience with biases, other topics you might be interested in, or other biases you might like us to explore. The world is our oyster. So thank you again for listening, and we’ll look forward to some feedback from you.

Outro: [23:54] This has been the Agile Coaches’ Corner Podcast, brought to you by AgileThought. Get the show notes and other helpful tips from this episode and other episodes at agilethought.com/podcast.
