
Webinar: Reducing Utility OpEx Costs with Drone Images, AI & Dashboards


Use your inspection images to extend asset lifespans, reduce OpEx costs and prevent costly failures. During this webinar, you’ll learn how to unlock predictive maintenance insights from your images using AI and business intelligence.

 
This webinar will explain: 

  • Why utilities use drone image capture for safer, faster and more cost-effective equipment inspections
  • How to ingest, process and visualize data from drone images to extract predictive maintenance insights
  • How to shift from a reactive to proactive maintenance approach

 

Transcript [This is an auto-generated transcript and may not be completely accurate in its depiction of the English language or rules of grammar.]

Dr. Jerry Smith: [00:02] Hi everyone. Thank you very much for joining our webinar today on Reducing Utility OpEx Costs with Drone Images, AI, and Dashboards. My name is Dr. Jerry Smith; I'm the managing director here at AgileThought for analytics and data sciences. Before we get into today's presentation, just a little bit of background on AgileThought. We are a custom software development and consulting company that is helping hundreds of customers in the Fortune 1000 transform, build and run their digital businesses. More specifically, we've worked with energy companies, including the Department of Energy, along with regulated utilities, on projects ranging from predictive analytics to data governance to building advanced visualizations in Power BI. I'm joined by a couple of presenters, whom you're going to meet in a few minutes: Arni, who's one of our principal data scientists, and Jose, who's our director of analytics here in North America for AgileThought. During the webinar, Arni and Jose are going to do the heavy lifting for our conversation. They're going to explain why utility companies are using drone analytics for faster, safer, more cost-effective inspections. We're also going to get into a small demo to show you exactly how we help energy companies build end-to-end prototypes to improve their predictive maintenance. But for now, you get an opportunity to hear a little bit more from me as I set the stage for why we're doing all this. You can go to the next slide, Arni.

Dr. Jerry Smith: [01:38] It's a complex world out there, right? I think most of you who are watching this right now can probably see an image or two in here that reflects your ecosystem, whether or not you've got drones out flying for you, or you have utility systems that have been impacted by economic or environmental situations. You've had failures. You're trying to deal with a very complex situation: a very diverse set of utility operations that aren't just local, but go across the entire nation. Next slide, please. I mentioned these folks; I'm the guy in the center. That's me back in the old days, and then Arni and Jose, whom you'll see in just a few minutes if we go to the next slide. So what's the situation today? The situation is that we have a large, complex electrical grid in the United States. It's composed of hundreds of thousands of miles of high-voltage transmission lines. That's no surprise to folks. What I think is a surprise is the millions of local lines that come after the high voltage. The net result is that this is a living, breathing entity, and that living, breathing entity is maintained by human beings. Next slide.

Dr. Jerry Smith: [03:01] So this leads to a problem. We can't scale servicing a living entity that has hundreds of thousands of miles of high-power transmission lines and millions of local lines on the backs of people, right? If you take a look at the work that has to be done in identifying bad poles, identifying bad beams, identifying striping that's missing under power lines, identifying failed insulators, identifying failed transformers, identifying worn or debris-laden lines, there's not enough of us to actually get out there and service those lines. And even if there were enough of us, the impact it could have on us, the risk it poses to the human being, is really high. The net result is that this model of using human beings to scale our servicing of a vital infrastructure component just isn't working, right? Next slide.

Dr. Jerry Smith: [03:56] So the next step in all this is that there's a measured impact, right? Equipment lifetime isn't being maximized. We have really no way of dealing with the atomic levels of monitored grid health; while we can monitor transmission, we can't tell how many actual insulators are on the verge of failing or which ones will fail next. We're going to get into a little bit of that. We have a hard time predicting where we're going to replace inventory parts in the field and who we should send out there. The health of the grid, the OpEx and CapEx implications of having human beings at the center of servicing our national infrastructure grid, has just reached the end of its useful lifetime. Next slide, and of course the next slide.

Dr. Jerry Smith: [04:42] So this brings up an opportunity for us, right? An opportunity that actually doesn't exist for human beings. Many of us can look at these images, these sensor data elements, and see things that we recognize. But a lot of these images are made with sensory information that isn't possessed by human beings. In the upper-left-hand corner, we have thermal information as it relates to an area of storage facilities. The one on the far right happens to be my house, where we are doing object identification and recognition. And of course, there in the center, we're doing 3D modeling. Why? Because you want to do volumetric research. But the one on the lower left is the one that always interests me, and that is when customers are asking things like, can you help me identify where a failed line is, or where there are leaks underground? There's no human being in the world who can look at a piece of ground and say, I think there's a leak there. Well, there is actually one person, and that's Superman, right? But he doesn't exist. There are ways to do that with thermal imaging, which we'll talk about in a little while, that allow us to take the data we have, do some sophisticated analytical and data science work on it, and then produce results out of all that. So the opportunity here is for us to take this micro data from this macro world and actually work smarter in environmental situations where people don't do a good job and where machines can do a much better job. Next slide, please.

Dr. Jerry Smith: [06:10] So this is where our story really starts, and where Arni and Jose will be kicking it off today. We're not going to talk about the far-left side, the assets, because for the folks on the call, I believe most of you are in the area of energy distribution and utility management, so you're in that top bar there. There are folks out there doing pipelines, and other folks doing construction, right? We're not going to talk a lot about the sensors either. I mean, most of us have seen pictures of all the things that you're looking for, but there are other ways to look at it, with thermal or lidar, right? Things like depth. And then chemical. Chemical is the unsung sensor hero in the world today, being able to pick up the chemistry that allows us to sniff out gas leaks, et cetera.

Dr. Jerry Smith: [06:54] We're not even going to talk about drones, whether it's quadcopters, long-range drones with over-the-horizon capabilities, or pilots, right? That isn't the purpose of today. We're going to talk about the blue box, and the blue box is important for us. Why? Because it's the thing that takes all that data, all the operational data, all the aircraft deployment data, all the drone app data, all your sensory data, and allows us to, number one, create insights: what's broken, what's not, what will break, what won't break, and where is it? And that allows us to say: take this service team to this geolocation, with this amount of equipment, to service this particular part at this particular time. Why? Because while it may not be broken today, it may be broken tomorrow, right? That's where we want to get to out of all this.

Dr. Jerry Smith: [07:44] So we're going to talk about AgileThought's perspective on drone analytics. Now, before we get into all this and you think, oh my goodness, they're going to show a completely baked system, there's no system on planet Earth that can solve this for you. We're going to show you some capabilities that are absolutely vital for what you have to do. But when it comes down to solving your problem in your company, with your assets and your team, this is going to be an aggregation of a lot of different systems and capabilities, some of which we've developed and others that you have. And we'll talk about that at the end of this presentation. Next slide, please. This is where I'd like to turn it over to Arni, to walk us through a demonstration of this work and lead us into some of the capabilities that we're going to talk about today. Arni.

Arni Steingrimsson: [08:32] Thank you, Dr. Jerry. So I put this slide here to kind of demonstrate the process that we go through first by

Dr. Jerry Smith: [08:39] And Arni, before you get going, I’d love to see your face.

Arni Steingrimsson: [08:44] Thank you, Dr. Jerry.

Dr. Jerry Smith: [08:46] Not a problem. Cause people would also like to see your face as well. There you go.

Arni Steingrimsson: [08:51] So yeah, I put this slide up here so we can kind of see where we are in the process. What I'm going to focus on is the processing of the images from the drones, and I'll talk about it in a little bit of detail. Once I've finished with a kind of live demo, I'll hand it over to Jose, who will show you what we can do with all those processed images. I wanted to first start by giving you a little bit of, maybe not theory, but a little bit of insight into the technology we use. There are several types of object detection algorithms out there, and it's important to list those here to showcase that there's more than one way to capture the information you need from images.

Arni Steingrimsson: [09:43] A typical one is just traditional image classification, where an image has some metadata and some classification. So this image here that you see in the top-left corner could have a tag of maybe some farmland; it could have a tag of a sheep herder, or it could have a tag of sheep, but it doesn't show you where it is in the image. Then you go into object localization, which finds a bounding box around the object you're interested in. Object recognition goes a step further: it finds and classifies multiple objects within an image. So for example, here we've located and classified the objects, both a human and sheep. Another type is semantic segmentation, where you highlight everything in the image that you've located; it puts a kind of overlay over all the objects.

Arni Steingrimsson: [10:50] Another similar object detection type is instance segmentation, where you're focused only on the area or the objects you're interested in, and then you highlight those at a pixel level. And the last one is key point detection, which can be useful for any kind of structure. In this example, we see a person where key points have been placed at every joint, and this can be really useful for tracking their posture or even movements. So how does this all fit together, and how is this useful in, for example, energy? If we take a look at this example here, I show first the classic localization with a bounding box around an object. This is getting quite common: finding an object and then classifying it.
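
For readers who want to see what these detection styles look like in practice, here is a minimal sketch using torchvision's pretrained COCO models. The frame tensor, the confidence threshold, and the model choices are illustrative assumptions; a real pipeline would fine-tune these models on labeled utility-asset images (poles, cross arms, insulators) rather than use them off the shelf.

    import torch
    from torchvision.models.detection import (
        fasterrcnn_resnet50_fpn,    # bounding boxes (localization / recognition)
        maskrcnn_resnet50_fpn,      # instance segmentation (pixel-level masks)
        keypointrcnn_resnet50_fpn,  # key point detection
    )

    # Stand-in for a drone frame; in practice you would load one with
    # torchvision.io.read_image and scale it to float values in [0, 1].
    image = torch.rand(3, 480, 640)

    models = {
        "boxes": fasterrcnn_resnet50_fpn(weights="DEFAULT"),
        "masks": maskrcnn_resnet50_fpn(weights="DEFAULT"),
        "keypoints": keypointrcnn_resnet50_fpn(weights="DEFAULT"),
    }

    for name, model in models.items():
        model.eval()
        with torch.no_grad():
            pred = model([image])[0]        # one prediction dict per input image
        keep = pred["scores"] > 0.7         # drop low-confidence detections
        print(name, pred["labels"][keep].tolist())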

Arni Steingrimsson: [11:51] So here we've located and classified this to be a 12-kilovolt tower. An example of the instance segmentation is maybe we find some damage, maybe we find some rust. What's interesting about the instance segmentation is that if you have some reference point, you can actually estimate the actual size of the object you're trying to find. For example, here we have a woodpecker hole in this wooden post; if we have a reference point or a scale, it's possible to then estimate the actual area of this damage. And the third piece is key point detection, which places certain key points on a structure. It can be useful for estimating any kind of anomaly: for example, we can have an estimated distance between the key points, and then if something is off on the structure, it will throw off those distances and can be highlighted as an anomaly.
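
A rough sketch of that area estimate, assuming the only inputs are a boolean damage mask from instance segmentation and a scale derived from a reference object of known size; the 0.15 cm-per-pixel figure and the mask itself are made-up values for illustration.

    import numpy as np

    def mask_area_cm2(mask, cm_per_pixel):
        # mask: boolean array from instance segmentation (True = damaged pixels)
        return int(mask.sum()) * (cm_per_pixel ** 2)

    # Example: a reference object of known size tells us one pixel covers 0.15 cm.
    damage_mask = np.zeros((480, 640), dtype=bool)
    damage_mask[200:230, 300:330] = True          # stand-in for a woodpecker hole
    print(f"estimated damage area: {mask_area_cm2(damage_mask, 0.15):.1f} cm^2")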

Arni Steingrimsson: [13:01] So how does this all come together into the database? Every picture or video is processed, an algorithm runs through every single image and classifies it, and then we add this as metadata in a database. Here's just a simple example of how that's done: you have the image name, you get the date and the GPS location from the drone, and then we've classified it to be a 12-kilovolt tower, and then we have some cross arms and insulators and so forth. Now I'm going to give you a brief demo of how the algorithm actually works.
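
As a concrete illustration of that metadata record, here is a minimal sketch that writes one processed frame's results to a SQLite table; the table layout, column names, and values are assumptions for the example, not the schema shown on the slide.

    import sqlite3

    conn = sqlite3.connect("inspections.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS image_detections (
            image_name  TEXT,
            captured_at TEXT,
            latitude    REAL,
            longitude   REAL,
            asset_class TEXT,
            components  TEXT
        )
    """)
    conn.execute(
        "INSERT INTO image_detections VALUES (?, ?, ?, ?, ?, ?)",
        ("drone_frame_0421.jpg",            # image name
         "2019-10-01T14:32:00Z",            # capture time from the drone
         27.9506, -82.4572,                 # GPS position from the drone
         "12kV tower",                      # top-level classification
         "cross_arm:2, insulator:6"),       # detected components
    )
    conn.commit()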

Arni Steingrimsson: [13:53] So here we have a quick demo, and I'm again going to demonstrate the three algorithms that I've been talking about: the more generic one with the bounding box, the instance segmentation, and then the key points. I'll start by loading up the models, and once they've finished loading, we can go in and select an image. Here we have some images from an actual drone recording, where all the frames have been pulled out. So if we take an image here, for example this one, and take a view of it, we can see that there's a pretty good top view of those towers. Then let's start detecting. Here we're using the bounding box detection, and we can see that it captures the post; this might not be as interesting as the other ones, so let's look at the segmentation. Here we can see that it captures the post but also the cross arms, and it highlights the segmentation. With better training and more sample data this can be really accurate, and we can actually get the actual area at a pixel level, like we talked about earlier. And then finally, let's take a look at the key points. Here we have the key points: bottom of the pole, top of the pole, and then we can put a left cross arm one and a right cross arm one, for example. So that's a quick demonstration of the object detection algorithms. Next I'll hand it over to Jose, who is going to show you how you take all this processed data and build an asset anomaly dashboard.

Jose Chinchilla: [16:27] Okay, thanks, Arni. That was a great introduction and demo on how you can apply machine learning and AI solutions to solve this particular problem, which is: how do you detect anomalies and identify objects in your transmission and distribution lines? What I'm going to show next is a really quick and simple dashboard that we put together. This dashboard will allow you to visualize and gather insights from the images that were processed and all the anomalies that were detected from them. In this case the dashboard is using Power BI, and it could consume data from a data lake, from a database, or pretty much any other data store. You can also augment this dataset with other types of datasets, like telemetry data from your equipment, from your data historians, from work schedulers, and from other relational and non-relational data sources.

Jose Chinchilla: [17:34] But most importantly, you can augment these data with your own images and your own videos, taken by drones, by helicopter, or any other way you capture images of your equipment and your assets, and then be able to put all this together in a Power BI model that lets you visualize the data. So with that, I'm going to share my screen and show you the quick dashboard that we put together. What I put together is a way for you to filter the main types of anomalies that were detected. As you can see, you can click on the different anomalies, and the data, or the images and assets affected by that type of anomaly, will also filter. You can also filter by date range, so if you're interested in anomalies detected within a period of time, you can select that, and you can navigate your assets by asset category or any other type of grouping. In this case we have, for example, 25-foot, 220-kilovolt towers, and we can quickly narrow down to what we're looking at. Here we're seeing a particular asset, in this case tower one, a 220-kilovolt tower that has some rust damage in one of its crossbeams. We can see where it is located; we can expand the map and zoom in using Bing Maps or Google Maps to see where that asset is. We're going to do a little bit more geographical analysis in the next report, but for now I just wanted to show really quickly in this dashboard that you can visualize your data and identify, for example, your hot spots.
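
The same slicing done here with the Power BI filters can be expressed directly against the underlying detections table. This is a hedged pandas sketch; the table name, column names, and filter values are all assumptions chosen to mirror the demo (rust, a date range, a 220 kV tower category).

    import sqlite3
    import pandas as pd

    conn = sqlite3.connect("inspections.db")
    anomalies = pd.read_sql("SELECT * FROM asset_anomalies", conn)  # hypothetical table

    mask = (
        (anomalies["anomaly_type"] == "rust")
        & (anomalies["detected_at"].between("2019-10-01", "2020-03-31"))
        & (anomalies["asset_category"] == "220kV tower")
    )
    hot_spots = (
        anomalies[mask]
        .groupby(["asset_id", "latitude", "longitude"])
        .size()
        .sort_values(ascending=False)       # assets with the most detections first
    )
    print(hot_spots.head(3))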

Jose Chinchilla: [19:40] In this case, it seems that there are three major hot spots in this sample data, and you can zoom in and do more insightful geographical analysis. In addition, as we select this particular tower, we can see the progression of the different detections over time. One of the first anomalies was discovered on October 1st, 2019, and then every month or so we've been detecting more and more anomalies. You can also see that, as time progresses, perhaps nobody has taken action on this asset or this anomaly, and the severity of those anomalies has increased over time to the point where they are considered high severity, perhaps the level of rust, perhaps some other issues that may warrant immediate action. So this allows you to take more proactive action on assets that have high-severity anomalies detected.

Jose Chinchilla: [20:47] In addition to that, we can do what's called a drill-through. If I right-click on this asset, I can drill through, let me select that again, and go to that particular asset's anomaly details. Once I click on that drill-through, I can see very quick statistics and historical information: for example, the acquisition cost, when this tower was installed, what anomaly type we're analyzing, in this case rust, and when the anomaly was detected. So you can see the particular anomaly we were interested in for the asset, as well as some of the other anomalies related to that asset. Not only was there a crossbeam with some rust; in later detections, in later images or videos taken, we discovered that the support beam has also become rusty.

Jose Chinchilla: [21:44] The insulators also have some surface rust and bolt rust; we also discovered that one of the spacers has a loose bolt, and the base is beginning to rust. As I mentioned, we can look at the geographical information on a map and do more in-depth analysis. You could actually do a causal analysis on this asset: why is this asset rusting so much? We can look at the images taken and get a picture of what is happening across these assets and all the components that surround this asset. If we click on this link, it will open up Bing Maps with a bird's-eye view of that asset. This works by gathering the latitude and longitude that Arni mentioned, part of the metadata that was captured when the image was taken.

Jose Chinchilla: [22:43] And we can see that asset in a bird's-eye view through Bing Maps, and we can do some causal analysis. For example, we can see that the asset is very close to a swamp, right? We also see that it is close to what seems to be a retention pond or a farm runoff pond. So we may do some causal analysis, or we may dispatch a technician to take a look at whether there are any chemicals or vapors being released into the atmosphere that are causing this particular tower and all these components to rust. These are the type of insights that you can gather very quickly from your image processing and early detection using ML and AI algorithms. So with that, I'll hand it back over to Dr. Jerry for some final thoughts and questions.

Dr. Jerry Smith: [23:43] Thank you very much. I'm always fascinated, even at this stage of the game, with everything I learn from seeing that presentation again and again, all the new stuff that's in there. Arni, can you bring up the presentation for us? Thank you, sir. So in closing out, the question often becomes: where do I start? Standing on the X of where you are today, whether you've just started this journey, you have a bunch of drones you're flying around, or you have a group of people that is experimenting, there's always a place to start, right? A place where you can begin to evaluate what you have and then achieve a better outcome. One of the first things we recommend is to get on a call with us. We have some pretty fun people here, and some smart people on top of that when it comes to this space, and we'd love to spend 30 to 45 minutes with you talking about your particular situation, what you have, and where you're looking to go, to see if there is value in us having a relationship, right?

Dr. Jerry Smith: [24:47] That's always the first spot to get to: get on a Zoom call, let's have a conversation. If there's something there, we'd love to do this drone analytics in a day, where we get together, virtually these days, maybe one day we'll get together in real time again, and we actually walk through your business plan. We do a high-level view of what kinds of assets you have, and we begin to think about the process of going from all that data, images, sensory stuff, into insights and actions. What are all the components you need to have? Out of that day with us, six hours or so split with a lunch, you'll have a pretty good affirmation of a program you can put in place to move this thing along. One of the next steps would be to do a five-day workshop. Now, this is hands-on. This is where we drill down into that business plan. We take a look at your service departments. We take a look at what kinds of assets you're having to manage. We actually collect real information: you fly drones, you get pictures of good and bad insulators, good and bad lines, power poles, whatever the assets are. We're collecting lots of information. We're looking at sensory information, lidar information, visual, ultraviolet. We take a look at chemical sensory analysis if it's appropriate. We bring that information back, and we begin to expose you to the process of what you can do with this in terms of the diagnostic, predictive, and prescriptive algorithms you'd have in there, and then, out of that, how you manage that and the assets Jose was talking about. Then we put together the plan that says: okay, now that we know your business, here are the assets you're going to need, here's the sensor information, here's how you're going to process it, here's how much you're going to have to store, here's the diagnostic, predictive, and prescriptive work you're going to have to do, and here's a plan for you to get onto it. That's about a five-day hands-on workshop that we can do with you. That's where you go, which is kind of the fun stuff. And that's it for the webinar. I understand that we have some questions that I'll bring up right now. So the first question, I think, is going to go to Jose. And by the way, why don't you guys pop back up? Thank you, sir. And we'll get Arni up as well, all of us up here answering the questions. We'll see if anybody actually wants to ask me a question.

Dr. Jerry Smith: [27:05] Once you have all the data collected, Jose, and you've brought it into a Power BI dashboard like you were showing us, what kind of business outcomes are we looking for?

Jose Chinchilla: [27:16] Yeah, that's a good question. One of the greatest outcomes you can get from a program like this is the ability to give the end users, the users that matter the most, access to the data, right? Those could be the subject matter experts, business analysts, your maintenance program managers, field technicians, being able to get to the data when they need it, and being able to do it in a scalable manner. That's what a reporting platform like Power BI and the entire Azure data platform ecosystem gives you. So you can not only bring in that data and show the images that were processed by your ML and AI algorithms, but you can also augment that with other datasets, and essentially just put the power of the data in the hands of the user, right?

Dr. Jerry Smith: [28:13] And one of those datasets, I think we've talked about in the past, is that causality piece, right? Now that you've collected this over time, you have all those images, you have insulator data, and a lot of environmental data, and you can begin to show what's causing these things. You can actually put that causal model up there, so people can not only see the problem, they can see the causes and take action. I think that's great. I have another question, for Arni: how would you propose starting to train models with hundreds of images? What level of user interaction is needed at first?

Arni Steingrimsson: [28:50] Yeah. Initially we would need some subject matter experts to help us identify, and kind of give us, the labels for the model to train on. But a lot of times when we speak to customers, they get kind of scared: oh, do we need to go through hundreds or thousands of images and label those? It's really time-consuming and takes away valuable resources. But there are several things we can do. For example, we can take several of the images that have been labeled and synthesize the background, and we can use active learning, where we set up a system that automatically sends out the images that need labels and makes it more user-friendly for people to actually label them.
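
One way the active-learning piece could look, sketched under the assumption that the current model has already scored a batch of unlabeled frames: only the frames it is least confident about get routed to the subject matter experts for labeling. The function and the example scores are illustrative, not part of the demo.

    def least_confident(scores_per_image, budget):
        # scores_per_image: image name -> list of detection confidences from
        # the current model; return the `budget` images whose strongest
        # detection is weakest, i.e. the frames most worth labeling next.
        uncertainty = {
            name: 1.0 - max(scores, default=0.0)
            for name, scores in scores_per_image.items()
        }
        ranked = sorted(uncertainty, key=uncertainty.get, reverse=True)
        return ranked[:budget]

    # After an inference pass over unlabeled frames:
    scores = {"frame_001.jpg": [0.98, 0.91], "frame_002.jpg": [0.42], "frame_003.jpg": []}
    print(least_confident(scores, budget=2))   # -> ['frame_003.jpg', 'frame_002.jpg']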

Dr. Jerry Smith: [29:44] Yeah. And your point is that it's like with children, right? When you're raising kids, there are interactions that parents have to have with those children in order to get them to recognize what an apple and an orange are. But over time, they take those things on their own, and they're able to do a lot. So up front we have some interactions, we don't need a lot, but you do have to train these classifiers to identify things. And the cool thing about today is we're getting much better and faster at this, with less and less human interaction. Excellent. Jose, the next one's for you: how can asset health dashboards expand beyond just data from the images? What can we do with that stuff?

Jose Chinchilla: [30:26] Oh, yes. I mean, the sky is the limit, but one area where we've seen a lot of value added to the image processing datasets is equipment telemetry data. A lot of organizations already own and have collected a lot of information in data historians, like the OSI PI software, and you can augment and correlate the telemetry of that equipment, stored in those data historians, with the anomalies that are detected in these images. You can also bring in other datasets that are not collected inside your organization, such as weather information, where you can see past and present weather patterns and the correlation of how those weather patterns affect certain equipment or anomalies in those assets. You can bring all this into perhaps a data lake; if you're not collecting these datasets, now is a good time to start, and to be able to augment and complement all these datasets with the drone images.
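
As a small illustration of that augmentation, here is a pandas sketch that joins anomaly detections with externally sourced weather observations by date; both tables, their columns, and the values are assumptions made up for the example.

    import pandas as pd

    anomalies = pd.DataFrame({
        "asset_id": ["T-001", "T-001", "T-017"],
        "detected_on": pd.to_datetime(["2019-10-01", "2019-11-02", "2019-11-02"]),
        "anomaly_type": ["rust", "rust", "loose_bolt"],
    })
    weather = pd.DataFrame({
        "date": pd.to_datetime(["2019-10-01", "2019-11-02"]),
        "avg_humidity_pct": [88, 73],
        "rainfall_mm": [12.5, 0.0],
    })

    # Join each detection to the weather observed on the day it was found,
    # so humidity and rainfall can sit next to anomaly type in the dashboard model.
    enriched = anomalies.merge(weather, left_on="detected_on", right_on="date", how="left")
    print(enriched[["asset_id", "anomaly_type", "avg_humidity_pct", "rainfall_mm"]])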

Dr. Jerry Smith: [31:32] Well, yeah, I think a lot of people underestimate the value of datasets because they think, oh my goodness, it's more data, how do I process it? Today, with the use of causality, we're able to read through thousands of variables and millions of rows and figure out what's causing what. My favorite external dataset for this area is space weather, right? Taking a look at solar activity and how it impacts electrical transmission lines. That is an undervalued resource, because people just don't have enough time or energy to process it. We can now solve that problem. That's pretty cool information.

Jose Chinchilla: [32:12] That's a good point, because if you have, for example, enough time, enough of a heads-up on some of these weather events, you can do proactive maintenance, right, and do some rerouting of your transmission.

Dr. Jerry Smith: [32:26] Yeah. Mother Nature is predictable if you give her enough time and if we take enough data from her. Arni, somebody actually has a follow-up question on the training, and that is: the process of identifying images seems simple; how long does it actually take to create an initial model?

Arni Steingrimsson: [32:47] Yeah, well, it's not really a question of time; it's more a question of how much you're willing to pay. We can throw as much processing power at it as we want and process it really fast, or, if you don't need it by the weekend or something like that and you don't want to spend a lot of money on it, you can take your time. With the algorithms that have been developed today and the computers we have, it's really not a matter of how long it takes; it's really just how much you're willing to spend.

Dr. Jerry Smith: [33:28] Yup. And one of the things we've learned is the impact that, and again I'll use the causality word, causality has on just the training process, right? In the old days, last year for those who are counting, we used to throw all sorts of data at a predictive model like a deep neural network and then try to force this thing to learn something. It had to sift through stuff that wasn't relevant and eventually weed it out. But with the causality step in here, with the ability to say, of all the data you have, what's causal to your interest, a failed insulator for example, you can shorten development times by just focusing on causal data, right? So you're right, it depends, but the processes we have today are streamlined such that you're able to reduce that time quite a bit. It's a great question. Follow-up: somebody actually loves me and has asked me a question as well. They said, if your organization doesn't yet have drones or an automated way of collecting data, what does a good starting point look like? I would say that's the best place to be, because you're not carrying around a lot of baggage, right? Companies that do have drones, that's awesome, because you're collecting information, you have a lot of experience, you're already down that path, you have an investment. The best place to start is to give us a call, and we can begin that process with you. I think it will quickly lead into one day with us, where we get together and walk through this process that we've set in place: is this an area for you to explore or not? And then with that business case, you're able to go back to your management and your executives and make an empowered request to do a full, detailed workshop over five days, where we're actually out there collecting data. So maybe you don't have this stuff to begin with, but you have the problem, and you have to have a problem: people out in the field doing work they shouldn't be doing, risking their lives, not enough time in the day, not enough people to do this. If you have that problem, and it's costing you money because you can't scale on the backs of people, it's a great place to start. Come and talk to us.
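
To make the "focus on causal data" point concrete, here is a minimal sketch in which training is restricted to a pre-selected subset of columns judged causal for insulator failure. The data, the column names, and the chosen subset are all invented for illustration; in practice that subset would come from a causal-discovery step or from domain experts rather than being hard-coded.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    data = pd.DataFrame({
        "avg_humidity_pct":  [88, 73, 91, 60, 85, 64],
        "salt_spray_index":  [0.9, 0.2, 0.8, 0.1, 0.7, 0.3],
        "age_years":         [22, 5, 30, 8, 25, 12],
        "paint_color_code":  [3, 1, 2, 1, 3, 2],     # example of a non-causal column
        "failed_within_90d": [1, 0, 1, 0, 1, 0],
    })
    causal_features = ["avg_humidity_pct", "salt_spray_index", "age_years"]  # assumed subset

    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(data[causal_features], data["failed_within_90d"])
    # Training on the three causal columns instead of every available column
    # is the time-saving step being described.
    print(model.predict(data[causal_features].head(2)))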

Dr. Jerry Smith: [35:28] Well, I think that's all the time we have today for the webinar. If you're interested in learning more about us, please feel free to give us a call. You can reach out to Christos, who runs our group here, or any of the folks on the slide that you see here, especially Christos Ruci, who's our energy lead practitioner. Send him an email; inundate his inbox with requests to tie up this team a little bit and talk to you. In the meantime, thank you, Jose, and thank you, Arni, for joining us today, and thank all of you for spending the time listening to us. I hope one of you out there who has a problem decides to get off that X, decides to give us a call, and gives us the chance to help you out. That's a step that's going to change the rest of your life. Thank you very much.
