This week’s Pipeliners Podcast episode features Sean Donegan and Allan Adams of Satelytics discussing the fundamentals of image analysis and the application to the pipeline industry.
In this episode, you will learn about the history and development of image analysis, the technical aspects and science behind image analysis, and the importance of using image analysis to support pipeline operations for leak detection, inspection, and corrosion mitigation.
Fundamentals of Image Analysis: Show Notes, Links, and Insider Terms
- Sean Donegan is the President and CEO of Satelytics. Connect with Sean on LinkedIn.
- Allan Adams is the Chief Scientist at Satelytics. Connect with Allan on LinkedIn.
- Satelytics is the foremost remote sensing leader with a full staff of Ph.D. level expertise. The company uses proven science, adept software, and powerful technology to meet the toughest business challenges.
- Image Analysis is the extraction of meaningful information from images, mainly from digital images by means of digital image processing techniques.
- Airbus is a European multinational aerospace corporation.
- Regression is a statistical technique for estimating the relationship between variables; here it is used to relate spectral measurements to physical quantities such as constituent concentrations.
- Multispectral imagery is captured in a small number of relatively wide spectral bands, generally 3 to 10.
- Hyperspectral imagery consists of much narrower bands (10-20 nm). A hyperspectral image could have hundreds or thousands of bands.
- Wavelets are mathematical functions that cut up data into different frequency components, and then study each component with a resolution matched to its scale.
- Chlorosis is the abnormal reduction or loss of the normal green coloration of leaves of plants, typically caused by iron deficiency in lime-rich soils, or by disease or lack of light.
- Phycocyanin is any of a group of blue photosynthetic pigments present in cyanobacteria.
- High-Density Image is an image with higher spatial resolution (smaller pixels), capturing finer detail.
- Low-Density Image is an image with lower spatial resolution (larger pixels), capturing less detail.
- Anadarko was a company engaged in hydrocarbon exploration. In 2019, the company was acquired by Occidental Petroleum.
- Ground-Truthing is used in various fields to refer to information provided by direct observation as opposed to information provided by inference.
- Spectroradiometer is a light measurement tool that is able to measure both the wavelength and amplitude of the light emitted from a light source.
- Broadband is a wide bandwidth data transmission that transports multiple signals and traffic types.
- Infrared is electromagnetic radiation with wavelengths longer than those of visible light.
Fundamentals of Image Analysis: Full Episode Transcript
Russel Treat: Welcome to the Pipeliners Podcast, episode 123, sponsored by Satelytics, a cloud-based geospatial analytics solution processing multi and hyperspectral imagery from satellites, aircraft, drones, and fixed cameras to lower the cost and improve the timeliness of identifying leaks, encroachment, ground movement, and other pipeliner concerns. To learn more about Satelytics, visit satelytics.com.
[background music]
Announcer: The Pipeliners Podcast, where professionals, Bubba geeks, and industry insiders share their knowledge and experience about technology, projects, and pipeline operations. Now your host, Russel Treat.
Russel: Thanks for listening to the Pipeliners Podcast. I appreciate you taking the time. To show that appreciation, we give away a customized YETI tumbler to one listener each episode. This week our winner is Lee Novak with Elevation Midstream. Congratulations, Lee, your YETI is on its way. To learn how you can win this signature prize pack, stick around until the end of the episode.
This week, Sean Donegan with Satelytics is returning. He’s bringing along Allan Adams, their chief scientist, to talk about fundamentals of image analysis. Get ready. This is going to get geeky.
Sean, Allan, welcome to the Pipeliners Podcast.
Sean Donegan: Hi, Russel. Great to be back with you again.
Russel: Sean, you’ve been with us recently because you participated in the iPIPE vendor perspective episode. Allan, you’re the new guy on the podcast. If you would, would you tell us a little bit about your background and how you got into pipelining?
Allan Adams: Yeah, hi. I am the chief scientist here at Satelytics. I oversee the data analytics portion, which is the algorithm builds, along with the data delivery, how we deliver the information to the pipeline group through our professional services.
Russel: Awesome. I asked you guys on to talk about fundamentals of image analysis. The moniker here is a new way of thinking about airborne inspection. I’m glad we have the scientist on because I know this stuff gets super technical. We’re all geeks here at the Pipeliners Podcast. We like that technical stuff.
I’ll just tee this up. I’ll ask, what is image analysis?
Sean: Russel, image analysis is the taking of raw data, which could be captured from plane, drone, satellite, or other data platforms that may yet be invented or created as time goes on. The folks at Airbus are working on something that flies above the stratosphere, almost like an oversized glider that can fly, not only orbit north-south, but also east-west.
There’s a new revelation that’s going to be a good data source, but the objective is pretty straightforward. We’re using a series of algorithms that Allan and his team developed to pull out some of the most challenging problems from that imagery using reflective light signatures.
The science finds the problem, and then we show that and render that on the image analysis. Allan will give you a real deep dive on how we create those algorithms and the technology behind it.
Russel: Allan, walk us through, if you would, how image analysis works, just the fundamentals of what’s going on there.
Allan: Image analysis, simply put, is taking data that’s collected via satellite, airborne, airplane, drone, as Sean mentioned, and providing data analytics on that, using machine learning and simplified regression techniques to separate the information into viable, useful data for the pipeline community to use and make decisions from.
Russel: I think maybe it’d be interesting to talk a little bit about the history of image analysis. I have some unique experience from many decades ago, when I was working with a company south of Houston that did image analysis using electron microscopes for chromosome karyotyping.
The thing that I didn’t understand about what they were doing is they had a pixel of data, and that pixel had a whole bunch of attributes. The guy who actually was the chief scientist in that company had started out working at NASA and developing their techniques for satellite imagery for space probes.
How to capture all that and relay it back to Earth so they could do analysis. Maybe break down a little bit for me what is a pixel, and what are the attributes that go with a pixel?
Allan: A pixel is just a stack of layered data. When the data comes down to us from outer space or from a plane, you’ll get what is called bands. Within those bands are the reflective energy categorized into digital numbers via the radiance received at the sensor set itself.
We’re receiving digital numbers that are relayed back, carrying that signature fingerprint in the reflectance value observed at the Earth’s surface, the source being the passive energy supplied by sunlight itself.
I’m walking through it to make sure that’s understood. What we do with that is run our algorithms on it: artificial intelligence algorithms, machine-learning algorithms, the data analytics.
These are methodologies deployed to pull out those spectral fingerprints, because there is a lot of overlap between pixels and signatures that can overlap other signatures. Teasing those signatures out to provide viable, crucial information is where data analytics has pushed the boundary in the pipeline world.
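To make the idea of separating overlapping signatures concrete, here is a minimal sketch of linear spectral unmixing, a standard textbook technique rather than Satelytics’ proprietary algorithm. The endmember spectra and pixel values are made up for illustration.

```python
# Minimal sketch of linear spectral unmixing, a standard way to separate
# overlapping spectral signatures. Endmember spectra and the pixel are
# hypothetical; this is not Satelytics' algorithm.
import numpy as np

# Hypothetical reflectance signatures (4 bands) for three "pure" materials.
endmembers = np.array([
    [0.05, 0.08, 0.30, 0.45],   # healthy vegetation
    [0.12, 0.15, 0.18, 0.22],   # bare soil
    [0.02, 0.03, 0.02, 0.01],   # water
]).T                            # shape: (bands, materials)

# Observed pixel: a mixture of the three materials plus a little sensor noise.
pixel = 0.6 * endmembers[:, 0] + 0.3 * endmembers[:, 1] + 0.1 * endmembers[:, 2]
pixel += np.random.default_rng(0).normal(0, 0.002, size=4)

# Solve for the fractional abundance of each material within the pixel.
fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
print("estimated fractions:", np.round(fractions, 2))   # roughly [0.6, 0.3, 0.1]
```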
Russel: Think of it this way: if you’re familiar with working with colors and printing, I’m going to see this array of colors. What I’m really doing is looking at a pixel and seeing how much red, green, and blue is in that pixel.
What I think you guys are doing is you’re actually taking and looking at it way beyond just visible red, blue, and green. Is that a fair analogy?
Allan: That is a fair analogy. I can elaborate on that.
Red, green, and blue are part of the spectra we do leverage. However, we also look into the near infrared. That being said, your eye sees generalized versions of each color. We actually take them into what’s called the digital number, or the pixel depth.
The pixel has a depth based on how many shades of gray each band (green, blue, red, near infrared, red edge, and multiple others) can record. Those bands can then be used to develop the analytics and to derive that fingerprint, or reflective signature, that’s unique to different targets.
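As a quick aside on “pixel depth”: the number of shades of gray a band can record grows with the sensor’s bit depth. The bit depths below are generic examples, not the specification of any particular sensor.

```python
# Radiometric resolution ("pixel depth"): each extra bit doubles the number of
# gray levels a band can record. These bit depths are generic examples.
for bits in (8, 11, 12, 16):
    print(f"{bits}-bit band -> {2 ** bits} shades of gray per pixel")
# 8-bit -> 256, 11-bit -> 2048, 12-bit -> 4096, 16-bit -> 65536
```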
Russel: You’re able to see subtle differences by doing math that we can’t see by just using our eyes.
Allan: That is a very good way to put it.
Russel: Cool. Let’s talk a little bit about capture because if I’m going to capture this data I’m doing air quotes with my fingers right now and I’m saying, “Camera.” If I want to capture this with a camera, does that work? Is what I’m using really a camera?
Allan: No, although the term may be generalized as a camera, what we’re actually using are sensors that are tuned to receive energy reflected from the Earth at a certain wavelength. That’s what we term bands. Each sensor’s set to collect a form of band. These sensors actually sit in a package, if you will. That’s the term multispectral versus hyperspectral.
When we get into multispectral, they are bands that are not necessarily consistent. However, they’re discrete bands with known wavelengths that we can pull out to say this is what we expect to be blue, or coastal, or green, or red, or red edge, or near infrared.
These are all wavelets. The sensors actually are tuned to receive that reflective energy in order to provide that digital number that we spoke of.
Sean: The goal is real simple, Russel, for us. Within that spectral signature, which Allan is talking to, we’re looking for a particular target. In the pipeliners’ world, that could be a leak along a line. It could be an encroachment. It could be some other form of business challenge that they’re looking for. Vegetation management, very good example.
With Satelytics, what we’re trying to do is understand that spectral signature: whether it is unique to the target and, if not, whether there is a surrogate that tells us the target is present. We use that surrogate as a means of looking below the grade, because we don’t see below the surface. In a body of water, we see down to a depth of about 12 inches.
Of course, the other benefit of using a surrogate is that it corroborates what the initial find from the algorithm is telling you.
Russel: I’m sorry. You’re using the term, surrogate. What does that mean in this context?
Sean: For example, let’s just say you’ve got a leak below the ground that’s not at the surface. Of course, we can’t see below the ground, but the leaks that drip, drip, drip and go on forever often become the very costly ones to remediate and put right. For those, vegetation can be a very good example of a surrogate.
It’s like our canary down the mine. The smallest drip is taken up by the vegetation, and the spectral signature given off is a different one, laden with bacteria, rather than what we’d expect of normal vegetation.
Russel: If I were to put this in layman’s terms, it’s somewhat like if I’m looking at a garden, I can see the health of the vegetation and from that infer the health of the soil.
Allan: Correct. What happens is that the volatile organics, as they come off the hydrocarbon byproduct, fill the pore space. That causes chlorosis in the plant, a localized reduction in the health of that vegetation.
It also causes a slight increase in the iron uptake. That reflective signature or that fingerprint is observable in all of our sensor sets.
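A standard vegetation index illustrates how that stress shows up in the bands. NDVI, sketched below with hypothetical reflectance values, is a textbook index rather than the specific surrogate algorithm described here: chlorotic vegetation reflects more red and less near infrared, so its NDVI drops.

```python
# Illustrative only: NDVI is a standard index contrasting red and near-infrared
# reflectance. Stressed (chlorotic) vegetation reflects more red and less
# near infrared, so its NDVI drops. Reflectance values are hypothetical.
def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

healthy = ndvi(nir=0.45, red=0.05)    # ~0.80
stressed = ndvi(nir=0.30, red=0.12)   # ~0.43
print(f"healthy NDVI: {healthy:.2f}, stressed NDVI: {stressed:.2f}")
```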
Russel: Interesting. This idea of image analysis is not just capturing the data, but it’s correlating between how the data’s captured and physically what you’re seeing, and then figuring out the math to make that correlation.
Again, is that right?
Allan: That is correct. We actually take it a step further on that end. In fact, we take it to the point of quantification of various constituents. What that means is that when we look at phosphorus in water, we calculate down to parts per million, and we ground-truth and validate what that signature response is and how it correlates to that quantified value.
We’ve done it for various other constituents. When we look at gas detection, we’ve done it for parts per million in methane detection to look at the gas over the top of various infrastructure for various pipeline companies.
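The general pattern behind that kind of quantification is to calibrate a spectral feature against ground-truth laboratory values and then apply the fitted model to new pixels. The sketch below uses synthetic numbers and a simple linear fit; production algorithms use many bands and more sophisticated machine-learning models.

```python
# Sketch of calibrating a spectral feature against ground-truth lab values.
# Numbers are synthetic; real algorithms use many bands and richer models.
import numpy as np

# Lab-measured concentrations and the matching spectral feature (for example,
# a band ratio) extracted from the pixels sampled at the same locations.
feature = np.array([0.10, 0.18, 0.25, 0.33, 0.40, 0.52])
lab_ppb = np.array([12.0, 21.0, 30.0, 41.0, 48.0, 63.0])

# Simple linear regression: concentration = a * feature + b.
a, b = np.polyfit(feature, lab_ppb, deg=1)

# Apply the calibrated model to a new pixel's feature value.
new_feature = 0.29
print(f"estimated concentration: {a * new_feature + b:.1f} ppb")
```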
Russel: That leads to the next question. One of the things that I heard at the API Pipeline Conference last year was a lot of talk about the size of a pixel.
If I take a photograph with an electron microscope, a pixel is a very, very, very small thing. If I’m taking a photograph, if you…I’m doing air quotes, “photograph” from a satellite, then what’s the size of a pixel when I’m taking the photograph from space?
Sean: That pixel, Russel, can vary. It depends on the sensor and the satellite platform that you’re using. From a Satelytics perspective, a very coarse resolution would be one of the Landsat family of satellites, where every pixel measures 30 meters by 30 meters.
If you’re looking at a body of water and you’re looking for algae blooms, phosphorus, phycocyanin, and chlorophyll A, which we measure in parts per billion, that’s more than adequate.
If you’re along a pipeline and you’re looking for the most granular movement in land, leaks, encroachments, exposed pipe, then you go to the other end of the spectrum where every pixel is measured in 30 centimeters by 30 centimeters. There are many hops in between.
The problem you’re trying to solve will often dictate the resolution, so called, of the satellite platform or, in fact, any of the other platforms for that matter. We’re talking specifically about satellites here. That will dictate which of those various pixel resolutions you’ll use.
Russel: Sean, that tees up another question, which is if I’m capturing an image from space, versus I’m capturing an image from an airplane, versus I’m capturing an image with a handheld camera. I’m really capturing the same data. How does altitude impact all of this?
Allan: The altitude variation matters because of the atmosphere we have to go through. We have a number of correction methodologies that we deploy based on how much atmosphere we actually pass through. There are a bunch of techniques that we can deploy in order to mitigate that.
We like to correlate our values to what we call top-of-atmosphere reflectance. I talked about that digital number that we actually receive in the data form. The pixel has a date and a number. Using some of our artificial intelligence processing techniques, we are able to mitigate that atmosphere component to get it into a reflectance value.
What you actually observe, and this is how we validate our ground-truthing, is we will take a backpack spectroradiometer that receives that light at the surface, in our hands. You look like “Ghostbusters.” You take the reading and you receive that value.
We calibrate our algorithms down to that value, the reflected value that you would see in scientific instrumentation, and how that correlates to what the satellite sees from outer space.
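For context, the digital-number-to-top-of-atmosphere-reflectance step is published for open sensors. The sketch below uses the Landsat 8 OLI scale factors as an example; other sensors use different coefficients, and full atmospheric correction to surface reflectance involves additional steps beyond this.

```python
# Sketch of converting a raw digital number to sun-angle-corrected
# top-of-atmosphere reflectance. The scale factors (0.00002, -0.1) are the
# published Landsat 8 OLI values; other sensors differ, and correcting all the
# way to surface reflectance takes more work.
import math

def toa_reflectance(dn: int, sun_elevation_deg: float,
                    mult: float = 2.0e-5, add: float = -0.1) -> float:
    """Convert a digital number to TOA reflectance, corrected for sun angle."""
    rho = mult * dn + add
    return rho / math.sin(math.radians(sun_elevation_deg))

print(round(toa_reflectance(dn=9500, sun_elevation_deg=55.0), 3))   # ~0.11
```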
Russel: That’s fascinating. Not to mention, it’s like both “Star Trek” and Ghostbusters all at the same time, which is also cool. [laughs] I’m exposing my inner nerd. I’m sorry.
Sean: Allan has been known to slime people here and there, Russel.
[laughter]
Russel: Said another way, what you’re doing is you’re capturing the high-density image and a lower density image and you’re correlating the two.
Allan: That’s a good way to look at it. We’re correlating that value: not only the reflective signature from the spaceborne sensor, but the one detected on the ground, and then validating those against quantified, laboratory-qualified values.
Sean: Russel, really, when we go to quantify, let’s just say quantify methane, which is a very good example. Before we put any algorithm into commercial use, it’s normally a three to four month process. Part of that process is validation through ground-truthing.
In the case of methane, some of our early customers and early adopters, Anadarko and BP were a couple, worked very closely with us on doing controlled releases of the target that we’re looking for, in this case, methane.
What that gave us was a number of known values at the wavelengths we were trying to work with, so that now, when we run our algorithms, we can calibrate back to those values. We know the measurements are accurate, we know what the variance of those measurements is, plus or minus, and, indeed, we know how robust our measurement is.
For example, one of our earlier algorithms was phosphorus in water. We know that today, given all of the ground-truthing we’ve done and all the data sets we’ve run because we’ve run it so many thousands of times, that our accuracy is plus or minus six parts per billion when we measure it.
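A figure like “plus or minus six parts per billion” typically comes from comparing algorithm estimates against ground-truth laboratory values over many runs and summarizing the error, roughly as sketched below with synthetic numbers.

```python
# Sketch of deriving a "plus or minus X ppb" accuracy figure: compare estimates
# from imagery against ground-truth lab values and summarize the error.
# These numbers are synthetic.
import numpy as np

predicted = np.array([18.0, 25.5, 31.0, 44.0, 52.5, 60.0])   # ppb from imagery
measured  = np.array([20.0, 24.0, 33.5, 41.0, 55.0, 58.0])   # ppb from the lab

errors = predicted - measured
rmse = np.sqrt(np.mean(errors ** 2))
print(f"mean error: {errors.mean():+.1f} ppb, RMSE: {rmse:.1f} ppb")
```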
Russel: You’re using this term ground-truthing. Can you unpack that for me a little bit?
Allan: Ground-truthing, ground validation literally, as I spoke…The spectroradiometer, we take that into the field. We align the satellite overpass. We know exactly where the satellite will collect.
We’ll go out and we’ll either take a soil scoop, a water scoop, or we’ll do a gas detection plume in order to calibrate or correlate that pixel, the satellite, and the spectroradiometer all at the same time in order to collect viable data samples to make sure that our algorithms are in line and are calibrated appropriately and we’re detecting the amount we say we’re detecting.
Russel: Interesting.
One of the things I want to talk to you guys about, and we talked about this a little bit off the microphone before we got on, this idea of multispectral versus hyperspectral. Let’s start with multispectral. We’ve talked about that a little bit. It’s basically you’re capturing spectrum beyond visual. You’re capturing a lot of different…
What was the term you used for that, Allan, different…?
Allan: Bands.
Russel: Yeah, different bands. What is hyperspectral?
Allan: Hyperspectral is the same collection of…You’re collecting bands, but these bands are consistent. They continuously go from one wavelength, to the next wavelength, to the next wavelength without a gap in the full spectrum collection.
Hyperspectral also tends to have many more bands. To give you an example, if we were to collect multispectral, you could have four bands. If you collect hyperspectral, you could have 363 bands of information.
What those are is discrete little bands of information that you’re collecting in order to resolve that fingerprint or target, or to identify multiple constituents.
Russel: If I’m capturing multispectral and I’m capturing four bands and 300 plus with hyperspectral, are the bands…
Knowing a little bit about communications and communication bands and bandwidth within radios and all that, a band is a range of frequencies. Broadband would be a broad range of frequencies. Does the same kind of thing apply here?
Allan: The same general concept can be applied. The band is the wavelength. Instead of having a frequency that you’re thinking about that way, it’s still just that wavelength in the visible, or the near infrared, or the shortwave infrared as you proceed out.
In hyperspectral, the bands tend to be fairly consistent per package. If it’s a five-nanometer wavelength for each band, that remains fairly consistent throughout until you get out into the near infrared or the shortwave, where the bands may step to wider widths.
It could be 5 nanometers in the visible, and then maybe 10 nanometers in the near infrared, and then 15 nanometers in the shortwave infrared. Again, continuous from that, let’s say, 400 nanometers to 2,500 nanometers.
You have 365 bands throughout that range, whereas with multispectral, the wavelengths are not always consistent. They’re more attuned to questions like, what do we call blue, and which wavelengths do we want? Within a given sensor, it remains consistent.
Typically, you see some variation from satellite to satellite, but if you launch one satellite type and then a secondary satellite type from the same company, they try to keep those band values consistent.
Company A versus Company B tends to have that blue or that red shifted slightly. The near infrared, as well. That’s the difference between the multispectral and the hyperspectral.
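The arithmetic behind those band counts is simple: contiguous bands of a known width tiled across the spectrum. The region boundaries and band widths below follow the illustrative “5 nm visible, 10 nm near infrared, 15 nm shortwave” idea from the conversation, not the band plan of any real instrument.

```python
# Counting contiguous hyperspectral bands: narrow bands tiled across the
# spectrum. Region limits and band widths are illustrative, not a real
# instrument's band plan.
regions = [
    ("visible",            400,  700,  5),   # (name, start nm, end nm, width nm)
    ("near infrared",      700, 1300, 10),
    ("shortwave infrared", 1300, 2500, 15),
]

total = 0
for name, start, end, width in regions:
    count = (end - start) // width
    total += count
    print(f"{name}: {count} bands of {width} nm")
print(f"total: {total} contiguous bands across 400-2500 nm")
```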
Russel: I guess the other question is you guys are really more focused on the data and the data analysis and not really the capture so much. Is that a fair statement?
Sean: We’re agnostic to the capture, Russel, by design. First of all, just being very straightforward about it, our pockets are not deep enough to spend billions of dollars on launching the next generation of satellites or the next data platform.
There are some far smarter people than we are up there. We simply want to purchase their data. That’s pretty straightforward.
Our bubba geek, our propellerheadness, if that’s such a word, is in that we develop the algorithms that identify some of these really troublesome challenges for the pipeliners. From one set of data, we want to be able to run as many of those algorithms as possible.
Therefore, if you’re a pipeliner and you’re spending $100 but you can solve five of the trickiest problems, then clearly that’s a much better financial return. We want to spend 100 percent of our R&D efforts with Allan’s team on developing new algorithms that solve some of the trickiest problems that pipeliners face.
We want to pour that effort into that piece of it, and detect them, and measure them, and alert our customers to the effect they’re having on their infrastructure in the right-of-way.
Russel: What are some of the tricky problems, and why are they tricky?
Allan: Some of the tricky problems that we see are in liquid leak detection. They want to know immediately where and when that leak started and how fast they can deploy to it. How urgently that image gets captured and delivered to us, and how quickly that data gets relayed back, is very important to our customers.
Other things are removal of overburden, or erosion. That’s relevant right now: as the snow melts or we get more precipitation, the farm fields haven’t fully developed and there’s no vegetation holding the soil. You’ll get more erosion. They want to see where their infrastructure is susceptible to that.
Russel: Why is that tricky? Why is erosion a tricky problem? I get why leak detection would be because of the speed, just the time to capture the data, process the data, and make the determination. That’s how fast can I move through the process.
With erosion, that’s less time critical, not that it’s not time critical. It’s just less time critical. What’s tricky about that from a data analytics standpoint or an image analysis standpoint?
Sean: Russel, customers want to see minute movements in land because it has a very big impact on unsettling infrastructure, particularly below the grade. We have an example of a customer that will remain nameless, but about two and a half years ago they had literally over 400 land movements and landslides.
In those days, they told us it was an average of about $90,000 to remediate. Any early warning that you could get to get out ahead of that curve would, number one, drop that cost and, number two, most importantly, lessen the impact on the underlying infrastructure.
Here’s another good example to your question: methane. There’s a lot of talk around greenhouse gasses. I think a lot of the companies in the pipeline world are feeling the pressure from NGOs [non-governmental organizations] and other bodies that are coming at them thick and fast and saying, “Look at these big, bad emitters.”
None of them have any specificity to back that up. You, as a pipeliner, still have to defend your position. Our methane algorithm took us about 18 months to develop because there are influences far beyond just measuring the methane escaping. You’ve got to understand wind velocity, wind direction, and relative humidity at the location.
Now, with Satelytics, if you’re a pipeliner that’s moving methane and having the public microscope shined on you, at least you have a way of measuring and monitoring what your greenhouse gas emissions are and how you’ve improved over time.
Believe me, there isn’t a pipeliner I’ve met yet that doesn’t want to do the right thing. That is, lower its emissions footprint, keep a very clean environment, and also operate in a very, very safe manner.
Russel: I actually want to ask a follow-up question about the methane, Sean. That is this — can you distinguish between methane that is coming from a facility and methane that might be otherwise naturally occurring?
Allan: Currently, the methane is just detected. That may come down to what we talked about earlier, the pixel size. The only data that we can use right now to process that is 3.75 meters by 3.75 meters. That’s not a restriction on our side, as we’re data agnostic.
Understanding that that is an impediment, the long and short of it is yes. We detect the methane, but to distinguish it between source A and source B, we could get very close to the infrastructure, but to get down to valve A versus valve B, no.
Russel: I was actually talking about…
Sean: Russel was talking about cow poop, right, Russel, and how we distinguish between naturally occurring…
Russel: Like cow flatulence versus production leaks.
Sean: We are able to distinguish those today. What Allan was speaking to, when you mentioned a facility, is how close we could get to the source of the leak.
The determination between what is the pipeliner’s expulsion of methane and the naturally occurring is something that we have already in our top hat. We’ve worked on that.
Ideally, we would like to get to a space where we could determine not just methane but distinguish between the different ’-thanes, doing the speciation. That’s the next generation of where we’re driving this, working alongside some of our customers.
Russel: I find that idea of being able to distinguish between produced methane and naturally occurring methane…I’d like to ask how that’s done. I don’t want you guys to give away any secrets, though. That, to me, is fascinating, the idea that you could even do that by just doing image analysis.
Allan: The imagery analysis is looking at what we’ll call new methane versus fossil methane and the reflected absorption because it’s different when you’re looking at gas as opposed to…
I talk about reflection and absorption. There are absorption features as that energy passes through the gas cloud, if you will, features that are only affected at certain wavelengths.
Fossil methane and new methane have effects at various wavelengths, and those effects are translated into that digital number, which is then extracted by the analytics.
What you’re trying to pull out is that the fossil methane has a different fingerprint, or signature, at different wavelengths that can be identified and then leveraged to say, “This is new methane versus this is fossil methane.”
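One standard way a measured signature can be matched against candidate reference signatures is the spectral angle mapper, sketched below. The “fossil” and “new” reference vectors are placeholders rather than real methane absorption spectra, and this is not a description of Satelytics’ proprietary approach.

```python
# Illustrative only: the spectral angle mapper (SAM) matches an observed
# signature to whichever reference signature it is closest to in angle.
# The reference vectors are placeholders, not real methane absorption spectra.
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle (radians) between two spectra; smaller means a closer match."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

fossil_ref = np.array([0.20, 0.35, 0.15, 0.40, 0.25])   # placeholder signature
new_ref    = np.array([0.22, 0.30, 0.28, 0.32, 0.30])   # placeholder signature
observed   = np.array([0.21, 0.34, 0.17, 0.39, 0.26])

scores = {"fossil methane": spectral_angle(observed, fossil_ref),
          "new methane":    spectral_angle(observed, new_ref)}
print(min(scores, key=scores.get), "is the closer match")
```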
Russel: This is how a space probe taking a picture of Neptune can determine what chemicals exist in the gas clouds.
Allan: Very similar techniques, yes.
Russel: It’s absolutely fascinating to me. It makes me wonder how far this kind of technology can go and what are all the applications. It’s a bit mind-boggling in my opinion.
Sean: Yeah, Russel. I think it is fascinating. What’s, for us, interesting is when we speak with some of the pipeliners, the level of interest is extraordinary. You’ve got your great followers of bubba geeks, but also at the very senior levels in some of these organizations, there’s a fascination about what we achieve.
I’m assuming when you mentioned Neptune, Russel, that we should get an extended battery pack if we’re going to be doing ground-truthing on Neptune. Is that what you were suggesting?
Russel: [laughs] Yeah, sure. Let’s do that. That’s awesome.
Guys, look. I think this is a great place to wrap this fundamentals conversation about image analysis up. I certainly have learned a bit. I think I’m going to try and do something I haven’t done in a while. I’m going to summarize this in three key takeaways.
One is it’s not a camera. It’s a sensor. Two is I don’t care about the sensor. I care about the data coming from the sensor.
Three is it doesn’t really matter a lot what the elevation of the sensor is. They all generate the same data. That’s what I’m taking away around fundamentals. You think I did a pretty good…I think I left some things out probably. How did I do?
Sean: No. A star, definitely. You would make the team. We’ve got an application ready here with your name on it.
[laughter]
Russel: What do you think, Allan? Do you agree with Sean’s assessment?
Allan: I agree. I think you did very well.
Russel: Cool. Guys, thanks so much for coming on. Just for the listeners, Sean and Allan are going to be joining us again. We’re going to talk a little bit about how this approach actually gets used in practice.
Sean: Thanks, Russel. Great fun.
Allan: Thank you very much for having us.
Russel: I hope you enjoyed this week’s episode of the Pipeliners Podcast and our conversation with Sean Donegan and Allan Adams.
Just a reminder before you go, you should register to win our customized Pipeliners Podcast YETI tumbler. Simply visit pipelinepodcastnetwork.com/win to enter yourself in the drawing.
If you would like to support the podcast, please leave us a review on Apple Podcast, Google Play, or whatever smart device podcast app you happen to use. You can find instructions at pipelinepodcastnetwork.com.
[background music]
Russel: If you have ideas, questions, or topics you’d be interested in, please let me know on the Contact Us page at pipelinepodcastnetwork.com or reach out to me on LinkedIn. Thanks for listening. I’ll talk to you next week.
[music]
Transcription by CastingWords