Agile Podcast: Agile Coaches' Corner

Podcast Ep. 41: Testing on an Agile Team—Tips from John Gravitte

Episode Description:

In today’s episode of the Agile Coaches’ Corner podcast, host Dan Neumann is joined by AgileThought colleague, John Gravitte, the director of quality. John has been in the quality field for about 26 years. He started out doing the testing himself but became a QA Manager in ’99 and began to manage teams. It was then that he realized he really had a knack for process improvement, streamlining standards, and efficiency. When he arrived at AgileThought, he was introduced to agile and it was a perfect fit for his quality-oriented mindset.

In this episode, they explore how testing is done on agile projects, taking a look at some of the challenges, good practices, and anti-patterns, and which standards and processes can really enhance agile. John also provides many key insights and tips on how to go about leveraging quality within agile teams.

Key Takeaways

How to address some of the challenges that come along with testing on agile projects:

  • Use shift-left testing (i.e., performing some testing in the development environment to give quick feedback to the developers).
  • Implement processes that support quick feedback.
  • Arrange meetings with developers to understand exactly what they need to know and get clarity on the acceptance criteria.
  • Have test case reviews and outline the test cases that need to pass prior to coding.

How standards and processes can enhance agility:

  • Having a process and a standard documented for a project helps individuals know what they are responsible for.
  • If the project grows and you have to bring on new talent, having standards comes in handy for newcomers—especially for younger talent.
  • Standards are not a replacement for thinking; they’re guide rails.
  • The processes can always be updated when better ways are discovered.

How to ensure quality roles keep up with the rapidly changing codebase and evolving feature sets:

  • Use test automation, which enables those in quality roles to keep up with the changes and rapid delivery.
  • QA leads should meet with test engineers (who work one sprint behind) to automate the most relevant pieces first and put them into a ‘smoke test,’ which saves time.

How does automation pay off?

  • Risk really builds up if you don’t use automation, because there’s no way you can manually verify everything, every time.
  • The amount of time and the coverage that you get from automation is worth the investment.
  • As environments get used, data gets cluttered up, so having automation do the data setup can really enhance quality.

Mentioned in this Episode

  • Swagger
  • Postman
  • Mob programming (Woody Zuill’s session at the Agile2019 conference)

John Gravitte’s Book Pick:

  • Start with Why: How Great Leaders Inspire Everyone to Take Action, by Simon Sinek

Transcript

Intro: [00:01] Welcome to Agile Coaches’ Corner by AgileThought, the podcast for practitioners and leaders seeking advice to refine the way they work, and pave the path to better outcomes. Now here’s your host, coach and agile expert, Dan Neumann.

Dan Neumann: [00:16]  Welcome to this episode of the Agile Coaches’ Corner podcast. I’m your host Dan Neumann, and I’m excited today to be joined by John Gravitte, one of my colleagues here at AgileThought who is the director of quality. And thanks for joining, John.

John Gravitte: [00:29] Absolutely. It’s good to be here.

Dan Neumann: [00:31]  Thanks, and as always, we need to do a little bit of a disclaimer that these are your opinions and mine and not necessarily those of AgileThought or other folks or other companies.

John Gravitte: [00:41] All right.

Dan Neumann: [00:42]  All right. So we’re going to be exploring kind of how testing is done on agile projects, maybe some challenges there are, and some good practices folks might learn from. And as the director of quality, I’m excited to have you participating. So what, um, what was your path like, maybe, from quality into agile projects? What, what’s a little bit of your background?

John Gravitte: [01:08]  Uh, so my background, uh, I’ve been in the quality field for about 26 years. Uh, you know, in the beginning I was actually, uh, um, doing the testing myself, um, doing manual testing, load and performance testing, moved into some test automation. A lot of the tools that I used are not even around anymore. Um, but, um, you know, once I moved to Florida in ’99, uh, I got into QA manager positions and then started managing teams. Uh, and that’s when I realized that I, I really had a knack for process improvement, um, streamlining, you know, standards and, and being really efficient at getting the testing done. Um, and you know, another big, big issue in the beginning was communication, right? So, um, you know, I’m a stickler for detail and you know, I’ve always kind of told folks that quality is in the details. So, um, that, you know, that’s kind of my path: doing the work and feeling the pain, then getting into management and, you know, really focusing on process improvement and efficiency, which then, when I came to AgileThought, you know, that was my introduction to agile. Um, and it was just a perfect fit because agile is, it’s lean, it’s quick, it’s flexible and, you know, quality needs to be, you know, all of those.

Dan Neumann: [02:42]  John, you mentioned lean, quick and flexible as some characteristics of agile when it’s done well. And that to me would create a lot of challenges for somebody who’s focused in on quality and making sure that the work that’s being churned out by the development team is actually still of quality, because we are, we’re moving fast, we’re changing things, we’re delivering business value and we want to make sure that we’re not just doing things that are breaking things regressively or half-baked. What are some ways that you’ve addressed some of those challenges?

John Gravitte: [03:16] Uh, I think the, one of the, one of the things we’ve done that has worked really well is, you know, that the buzzword is shift left testing. Um, but really what it means is doing some testing in the development environment. You know, so everyone knows the Dev environment is not stable. It’s constantly getting builds. Um, you know, the unit testing is taking place there, but what we’ve done is we’ll start testing in the dev environment so we can give quick feedback to the developers. Um, that’s not to say that we’re closing stories in Dev, it’s just a quick look at what they’re producing, give them the feedback. If it’s not working, they can fix it before a release goes to QA. Um, so you know, that that’s one thing that we’re doing on, on all the projects is doing some testing in Dev before it goes to the downstream environments. Um, we’ve also implemented some, some processes that for that quick feedback, right? So we’ve altered how we, we log our bugs and you know, the information that we put in those bugs, we make it short and sweet. Just what the developers need. It doesn’t need to be lengthy, wordy, take uh, you know, minutes for them to read and figure out. Uh, so we, we have these meetings with them just to just to say, Hey, what information do you need from us? Right. I just want to give you the bare minimum that you need. I don’t want to speculate. I don’t want to tell you what I think is wrong. I just want to give you what you need and let you go ahead and fix it.

Dan Neumann: [04:53]  As you were describing that, a couple things came to mind for me. I love the term shift left because I think of a, a workflow that starts on the left and you have value-added steps over to the right, and what you’re describing, I think, with that term is moving farther up the value chain, or earlier on into the process, by shifting left.

John Gravitte: [05:11] Yes, yes. And, and even before testing in Dev, uh, when we say shift left, we’re even performing quality checks on the acceptance criteria. So I have asked the quality assurance folks to attend the, the planning meetings and the refinement meetings. And, you know, challenge those acceptance criteria. Do we have, you know, three or four, is that enough? Are they missing one? Uh, get clarity on those acceptance criteria. Right. Perform that quality check there, because we have to write the test cases based off the acceptance criteria, and the stronger the AC is, the easier it is for us to write our test cases efficiently.

Dan Neumann: [05:54]  Yeah, definitely. And you know, on a Scrum team, and you don’t have to do Scrum to be agile, but on a Scrum team, I would definitely want all the folks on the Dev team, whether they’re specialists in development or specialists in quality, we would want them in that planning meeting to identify, like you said, the clarity on the acceptance criteria and have input into the tasks that are going to be part of delivering on that backlog item. One of the things that I have done in my past experience with coaching teams was we actually had quality folks sit down with the Dev specialist and typically the, the product owner, maybe a business analyst who’s helping the product owner, and before we start coding things, outlining what those test cases that ultimately need to pass would be. So taking the acceptance criteria and breaking those down into that next level of some test cases, is that something that you’ve also done as you shift left and get clarity on the acceptance criteria and test cases?

John Gravitte: [06:53]  Yes. Um, we actually have test case reviews. Um, and that is for product owners, business analysts and technical architects. Uh, those are the three individuals that we invite to review the test cases. Um, you know, we want the developers to be heads down writing their code. Um, we feel that we can get the feedback we need on these test cases by having those three individuals in, in the meeting. Um, and it’s proven to be, you know, very beneficial, especially when you’ve got your product owners who are more of the subject matter experts. They know, you know, maybe the application we’re building better. They know the industry, right? If it’s tax or audit or, uh, entertainment, you know, they, they look at it from a different perspective than we do. So there are a lot of times, you know, they’ll say, oh, these test cases are good but you’re missing a few, right? We also do these things in the application, or this is something that could happen. So it’s very beneficial to have those eyes in there and have that meeting so we can make sure we’ve got very robust test cases and that we’re covering everything.
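
To make this concrete, here is a minimal sketch of the practice being described: acceptance criteria captured as executable test stubs before any code is written. It assumes pytest, and the search feature and its criteria are hypothetical examples, not drawn from the episode’s projects.

    # Acceptance criteria written as pytest stubs before coding begins.
    # Each stub maps to one AC bullet from refinement; the search feature
    # here is a hypothetical illustration.
    import pytest

    # AC 1: searching by customer name returns matching records.
    def test_search_by_name_returns_matches():
        pytest.skip("written before coding starts; implement with the story")

    # AC 2: a search with no matches returns an empty list, not an error.
    def test_search_with_no_matches_returns_empty_list():
        pytest.skip("written before coding starts; implement with the story")

    # AC 3: results include only records the caller is allowed to see.
    def test_search_respects_access_levels():
        pytest.skip("written before coding starts; implement with the story")

When the developers pick up the story, each skip is replaced with real assertions, so marking the PBI done includes these tests passing.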

Dan Neumann: [08:03]  Okay. I’ve run into a challenge at times where, because we are delivering incrementally and iteratively, focusing the team, regardless of specialty, in on, you know, this is the product backlog item we’re delivering for now, and it’s not the be-all and end-all of that feature ultimately. But we’re doing a thin slice through maybe a bunch of tech layers, or we’re implementing the happy path and we’re going to make it more robust later, or more scalable later, or more whatever later. And really finding it difficult sometimes to keep people in a box just around those acceptance criteria without trying to over-architect or over-engineer or over-analyze. What kind of challenges or tips do you have, or have you encountered that same scenario? I suspect it’s not just me. So for instance, maybe we’re building a feature and there are several different steps to it. Or, you know, I was thinking of a scenario where we’re going to have the user log in. You know, the easiest thing to do is have them log in and get in successfully. Then later you can come back and handle misuse scenarios or abuse scenarios. The, you know, if they try three times, what do we do? Do we do multifactor later? Do we, et cetera. But really doing a happy path or a really thin slice into the capability and then building it out more robustly later.

John Gravitte: [09:27] Yeah. So that, that kind of comes into, um, you know, when we’re writing, when the business analyst is writing the stories, um, if it’s getting to be very lengthy, and, you know, we do see times where acceptance criteria grows to, you know, eight, nine, 10 different bullet points of acceptance criteria. Uh, at that point, that’s when we start having the discussion: maybe we need to break this down into smaller stories. Um, even though the development effort still may be small, you know, maybe a two or three. Um, but if the acceptance criteria is rather large, we would have that discussion to break it down, uh, and make it multiple stories to do exactly what you’re just saying. You know, let’s deliver a thin slice of this, the happy path, and then let’s come back and do the error handling, and then let’s come back and do the multi-factor authentication, then let’s come back. Rather than putting that all together and making that, you know, a, a medium Dev effort, but a large testing effort.

Dan Neumann: [10:34]  Yeah, definitely. And I, I think it’s just a discipline in some cases of really making sure the team stays focused on what the acceptance criteria are for the increment that they’re working on at that, at that time. I think that might be a good chance to segue into a little bit of the discipline around quality. So you know, your, your background is, we were talking about it, you, you really do have a mindset for putting processes in place, for putting in some standards and following a procedure that’s appropriately rigorous. And I think a misconception about agility is that it is loosey-goosey or undisciplined, or some of those things that are very fragile. And so maybe you can talk about standards and processes, and how you’ve put those in place to enhance agility.

John Gravitte: [11:21] Yeah, so I love putting standards and process in place, because for new people that come onto the project after it’s already started, it’s very easy for them to, to look at a document or look at a process and say, hey, here’s how we’ve been writing the test cases, here’s how we’re executing, here’s the metrics and reports. Here’s, here’s how we’re doing it, right? And, and with agile, you know, it’s not lengthy, meaning, uh, you know, eight, nine, 10 steps, right? I want to look at that process and say, okay, if that’s a 10-step process, how can I get that cut in half? I want to do that in five steps. I’m going to cut out the junk. Um, so by having a process and a standard documented for a project, it is very easy for that individual to know what they’re responsible for, right? How do I do x? Right? Well, it’s right there. It’s the standard of how we’re doing reporting or communication. Um, as the project grows and we have to bring on new, new talent, uh, it’s, it’s an easy, hey, look at this as part of your onboarding, and if you have any questions, then let’s, let’s talk. Um, so that’s, that’s where it’s really come in handy. Uh, especially for younger talent, right? It’s the, you know, the junior testers that we hire. Um, having a standard and a process really benefits them, versus a seasoned person who’s got 10, 15 years of quality testing under their belt. You know, they come into a situation and they’re like, yep, been there, done that. You know, I see your process, I get it, I’ve done it before, and they’re off and running. But you know, the junior talent that we’re hiring out of college, or with one, two years experience, it really helps them understand, hey, what am I responsible for and how do I do it?

Dan Neumann: [13:16]  Yeah. And you know, I’m so happy to hear you use the term talent instead of resources. I know as an agile coach, that’s one of the things we try and really shift in mindset. Like these, these are people, they’re talented, you know, they, they are thinkers. And by having those short descriptions of what the responsibilities are, what the standards are, what our team agreements are, it’s not a replacement for thinking. Which is where I think sometimes standards and processes can become dysfunctional: when somebody says, well, I followed the process, but they, they no longer care about the results. The adherence to the process becomes the goal, rather than having guide rails or safety rails that let the process continue to be valuable and flexible.

John Gravitte: [13:55] Yeah. You know, that, that’s a great point, because one of the things that, uh, I advocate on every project that we embark on is, you know, here is the standard and the process that we’re going to start with. But, um, we are flexible. If somebody comes up with a better way of doing it or the client wants to see it in a different format, you know, we, we don’t fight it. Uh, you know, we’re flexible and we can make those changes, and we’ll change the process, change the standards, um, as long as, right, it doesn’t increase the time and it doesn’t affect the quality. Right. I’m interested in delivering a quality product to our clients. How we get there, if we’re all on the same page and you want to tweak that standard or process a little bit, I am totally fine with that. Uh, you know, I admit in the beginning of every project, you know, I’ve been doing this a long, long time, I still don’t consider myself an expert, and I want feedback from all of you, you know. So if you see a better way to do it, let me know and let’s talk about it, and we can make that change.

Dan Neumann: [15:15]  That’s what I love about AgileThought, and the agile community in general: the willingness to continue to seek out better ways of doing things, not sticking with a certain process just because somebody else or a different team does. You know, we were talking mob programming. Eric Landes and I went to a session with a man named Woody Zuill at the Agile2019 conference a bit ago. And he was describing their experience with mobbing, which is basically a whole team working on one backlog item at a time, which is pretty awesome. And he made the case that mobbing emerged from trying to leverage good things that they were doing. So they got a good result, and so they iterated on how they were doing things, and they ended up with mobbing being the approach that they took. Another organization, another team might continue to improve and they’ll end up somewhere totally different. It doesn’t mean that it’s better or worse, it just means it’s maybe more appropriate for their environment. And so I’d really encourage people not to grab mobbing as the new bright, shiny object and go shove it into an organization or a team where it isn’t appropriate. And I think that’s part of that continuous learning journey that hopefully every team is on. So, John, we were talking about standards and processes, and then earlier in the podcast we were talking about the need to be lean, quick, flexible. And when we’re talking to developers, we really encourage them to do unit testing. So often that’s test-first: writing unit tests before you write the code. And I’m curious what experiences you’ve had with quality and how to have quality folks keep up with the rapidly changing code base and evolving feature sets.

John Gravitte: [17:01]  Um, so one thing that we’re doing is using test automation. Um, you know, you can automate at the API level and, and the web services level, uh, not at the UI level, and, and be able to keep up with the changes, right? The rapid delivery. Um, most of the changes that we see are UI changes. With the APIs, early on in projects, you know, sprints one, two, three, uh, you know, with, with our test engineers, we, we operate one sprint behind. So as the teams, uh, you know, deliver and demo their, their work at the end of sprint two, let’s, let’s say, um, the QA leads will meet with our test engineer and say, here are the PBIs that I had, here are the test cases, here are the good candidates, here are my, you know, my, my backend API, web service, uh, test cases, and we’ll automate those first. And we like to put that into what we call our smoke test. Uh, and you know, there, there will be no UI test cases in, in our smoke test. It’s all back end, and we actually try to work with our DevOps teams to get it into the build pipeline. So it’s beautiful, because the developers, they’re writing their code and, you know, doing their unit testing, uh, which is also sometimes automated, their unit tests. Um, you know, then the last part of that build process is the automated smoke test kickoff. And it basically validates that all the endpoints work and the integration is there, and we know it’s a solid build and now we can deploy that into the next environment, whether that’s an integration environment or the QA environment. You know, that saves us time, because that automated smoke test of the, of the integrated layer is already executed before that build is deployed. We don’t have to wait for the deployment time and then manually verify that. So you know, that, that is a huge saver of time right there. And you know, once, once all the APIs are developed and they’re solid, you know that they rarely change. They do change, but it’s rare. Uh, you know, at that point now we go after the, the UI test cases and we start what we call creating our regression scripts. Um, and again, we’ll run those as often as we can. If we can do continuous integration and get a build every day into our integration or QA environment, we’ll run those regression, those automated regression scripts, uh, against every build that goes there.
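
As a rough illustration of the automated smoke test John describes, here is a minimal API-level check that could run as the last step of a build pipeline. It assumes pytest and the requests library; the base URL and endpoint paths are hypothetical placeholders, not a real project’s API.

    # smoke_test.py: API-level smoke test meant to run at the end of the
    # build pipeline. No UI cases, just endpoints and integrations.
    import os

    import requests

    # Where the freshly built service is reachable (placeholder default).
    BASE_URL = os.environ.get("SMOKE_BASE_URL", "https://dev.example.com/api")

    # Hypothetical core endpoints the build must answer before promotion.
    SMOKE_ENDPOINTS = ["/health", "/customers", "/orders"]

    def test_endpoints_respond():
        for path in SMOKE_ENDPOINTS:
            response = requests.get(BASE_URL + path, timeout=10)
            assert response.status_code == 200, f"{path} failed the smoke test"

Wired in the way John describes, the pipeline would only promote the build to the integration or QA environment when this suite passes.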

Dan Neumann: [19:45]  I wanted to come back and just ask maybe for some clarification on something. So within the sprint, when the developers are building an API or some capabilities, I’m presuming that you’re doing some validation of that capability and then cycling back to add the automation later, so that we do have a robust definition of done in the sprint?

John Gravitte: [20:04] Yes. So all those APIs will be tested manually. Um, you know, a couple of tools we use are Swagger, uh, and Postman. Um, the difference there, you know: Swagger, you can only test manually; uh, Postman, you can automate it and run your APIs through Postman, uh, make that automated. Uh, but yes, it’s done manually during the sprint so we can sign off on that PBI and mark it as done, and then we automate that PBI.
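
For illustration, here is roughly what the same check looks like once it moves from a manual pass in Swagger to an automated script. (Postman collections can likewise be run unattended, for example with Postman’s Newman command-line runner.) The endpoint, payload, and URL below are hypothetical placeholders.

    # A manual Swagger/Postman check turned into a repeatable script:
    # create a record through the API and verify the response shape.
    import requests

    def verify_create_customer(base_url: str) -> None:
        payload = {"name": "Test Customer", "tier": "standard"}
        response = requests.post(f"{base_url}/customers", json=payload, timeout=10)

        # The same things a tester would eyeball during the sprint.
        assert response.status_code == 201
        body = response.json()
        assert body["name"] == payload["name"]
        assert "id" in body  # the server assigned an identifier

    if __name__ == "__main__":
        verify_create_customer("https://qa.example.com/api")  # placeholder URL
        print("create-customer check passed")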

Dan Neumann: [20:34]  That’s cool. Yeah, I thought the clarification would be helpful, because one of the challenges, or I dunno if it’s a challenge, one of the anti-patterns that sometimes we see as we go in to coach an organization and their delivery team is the desire to, let’s say in a two-week sprint, build for 10 days with no time for quality. And so the validation, they want to kind of do it later in a second sprint. Which creates these anti-patterns of: did we actually build anything that worked in the sprint? Well, we don’t know if we haven’t tested. So yeah, thank you for the clarification on that process. Is there a way that maybe you’ve, you can think of in the past where you’ve seen automation really pay off? Cause I think a lot of folks really get focused in on: automation takes a lot of effort, it’s quite expensive to maintain. Yet at the same time, risk builds up in the system if you’re not automating, because there’s no way you can manually verify everything every time, sprint after sprint. And so risk builds up in the system. I’m wondering if there’s ever been those instances where you’re like, sweet, automation just paid off huge.

John Gravitte: [21:37] Oh yes. Uh, we’ve had a couple projects where, um, it was a scaled agile project, so there were multiple work streams, and where it really paid off is there were multiple personas for this application. Right. You had three or four different levels of access for, you know, internal clients, and also two or three levels of access for external. Um, those are the type of test cases that are good candidates for automation, right? Go through, give me the security matrix for these different levels of access, and let’s automate that. Uh, it also came in very, very helpful because of cross-browser testing. The client wanted us to test with Chrome. They also wanted us to test with IE 11. So the scripts are able to run in multiple environments. They can run in your development environment, QA, uh, load test environment, staging environment, and it can run on a Chrome browser or IE. So the amount of time and the coverage that you, you get from automation, it definitely is worth the investment. Um, one other thing I want to mention that it’s very, very helpful for: you know, I had mentioned that staging environment, that’s where a lot of user acceptance testing would take place. Uh, again, different project, different client. There was a tremendous amount of data setup. So the way this, this application worked is every time we deployed a release to staging, uh, it wiped out the data. So we actually used the automated scripts, as we were running our smoke tests and regression tests, to prepopulate the data that needed to be there. So not only are we validating that everything is deployed correctly and it’s working, but when the scripts were finished, all the data was there that the end users needed to do their user acceptance testing. They did not have to create that manually. It was already there for them.
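
Here is a minimal sketch of the data-setup idea John closes with: after a release wipes the staging environment, the automation seeds the records that user acceptance testers need. It assumes the requests library; the endpoints and record shapes are hypothetical placeholders.

    # Seed staging data after a deployment wipes it, so UAT users find
    # everything in place once the smoke and regression runs finish.
    import requests

    SEED_USERS = [
        {"username": "uat_internal_admin", "role": "internal-admin"},
        {"username": "uat_external_viewer", "role": "external-viewer"},
    ]

    def seed_staging(base_url: str) -> None:
        for user in SEED_USERS:
            response = requests.post(f"{base_url}/users", json=user, timeout=10)
            response.raise_for_status()  # fail loudly if seeding breaks

    if __name__ == "__main__":
        seed_staging("https://staging.example.com/api")  # placeholder URL
        print("staging data ready for user acceptance testing")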

Dan Neumann: [23:44]  That’s a really nice example, because I know the devil’s in the details with the data, and you know, as environments get used, data gets cluttered up, and then there’s the question of: is what we’re seeing something that’s kind of bogus data that really isn’t right, or is it a true error in the application? So yeah, using your automation to do the data setup is a nice tip, maybe, for folks that are listening and kind of looking for ways to enhance their, their quality journey. Well, John, time has flown, as it always does for me with the podcasts anyway, and I want to thank you for giving folks some tips on how to go about leveraging quality and using that within agile teams. And I was hoping you could share something about where your continuous learning journey is taking you, if there’s a book or some, something you’re studying these days that has got you excited.

John Gravitte: [24:38] Yeah. So the book that I’m reading now is called Start with Why, by Simon Sinek. And I chose that book because, uh, you know, quality is not just the testing that we do and the metrics and reports that we deliver. Uh, it has to do with a quality experience for the client that we’re performing the work for, and Start with Why, it’s about communication, explaining the why behind, you know, the, the decisions and the choices that you make. You know, why are we changing this process? Why are we going to do it this way? Why are we not doing load and performance testing, right? It’s explaining, you know, everything behind the decision or the communication that you are delivering. And, you know, I want the clients, um, that, you know, hire AgileThought, um, I want them to understand the why behind every decision that we make. Um, it talks about, you know, how to communicate verbally, talks about how to communicate, um, you know, in writing. Um, so it’s a great book if you want to learn, uh, some leadership skills, if you want to learn how to communicate, improve your communication verbally and in writing. Uh, so I’m, I’m about 80% through the book and, um, I highly recommend it.

Dan Neumann: [26:06]  That’s excellent. I will admittedly say that I have not read that yet. It’s, it is one that comes up in a lot of conversations. It’s kind of a classic. It’s, it’s cool that you’re consuming it and you know, knowing the why behind a decision really allows a lot of appropriate behavior. So as we’re trying to delegate responsibility or allow teams to make decisions in their local context, we want them to understand the why so that they can make better choices. And I really love it. So I’m looking forward to hearing more about that in the future. Okay. Well. Thank you John for your time today. I appreciate you taking time out of your other responsibilities to share with folks on the podcast.

John Gravitte: [26:48] Thank you, Dan. It was a pleasure.

Outro: [26:52] This has been the Agile Coaches’ Corner podcast, brought to you by AgileThought. Get the show notes and other helpful tips from this episode and other episodes at agilethought.com/podcast.


Speakers

Dan Neumann

Principal Enterprise Coach

Dan Neumann is the Director of the US Transformation and Coaching practice in the Agility guild. He coaches organizations to transform the way they work to achieve their desired business outcomes.

With more than 25 years of experience, Dan Neumann is an experienced Agile Coach with a deep knowledge of Agility at the team and organizational levels. He focuses on achieving business outcomes by shifting both mindset and practices, resulting in a disciplined, yet practical approach to solving problems.
