Spectral resolution is a way of characterizing or describing a sensor used for remote sensing, based on the amount of light that's recorded and how that light has been sliced up into smaller bands, or more images, that can be sensed. I know that may not make sense just yet, but hopefully it will by the time we're done with this section. So, let's have a look at spectral resolution and how we can think about it and explain it. Imagine we have the surface of the earth that we're going to do our remote sensing on, we have our light source, in this case the sun, and we have our sensor on board a satellite. So, light bounces off of the earth to the satellite sensor. Now, suppose we just measured the total amount of sunlight reflecting off of the surface of the earth for one cell, one small square of land on the ground. All we can say is either there's a lot of light being reflected or not a lot of light being reflected. We're treating sunlight as just this one thing, white light; we're not talking about different colors of light, we're not dividing things up yet. Think of it almost like a black and white photo: all you can say is that something is either dark or it's light, so you'd be able to give each cell a number, but that's all you have to work with. This would be considered a low spectral resolution, because you just have one thing you can measure in terms of the amount of light. So, if we have a square on the ground, and let's say this is an eight-bit image, we have a range of possible values from 0-255, and for this particular location, the amount of light being reflected turns into a number, say 64. That's all we would have to work with for that cell, that piece of land on the ground: we could say it has 64 as its amount of reflected light.
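To make that concrete, here's a minimal sketch of what a single-band, eight-bit image looks like as data. The values are made up for illustration; the point is that each cell holds exactly one number from 0 to 255.

```python
import numpy as np

# A hypothetical 3x3 patch of a single-band, 8-bit image.
# Each cell holds one number, 0-255: total reflected light, nothing more.
patch = np.array([
    [64, 70, 66],
    [60, 64, 72],
    [58, 61, 64],
], dtype=np.uint8)

print(patch[0, 0])             # the one value we have for that cell: 64
print(patch.min(), patch.max())
```

That single number per cell is all a low-spectral-resolution sensor gives you to work with.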
Now, what if we do this again, but we split the light coming from the ground before it hits the sensor? The technical way this is done, the technology, is way beyond our conversation here, and I don't pretend to understand completely how all of it works, but just imagine that the light coming from the earth goes through a prism and is divided up, or split, into blue light, green light, and red light, and that on the satellite there are three sensors, not just one. So it's not just measuring white light, bright or not bright; it's measuring how much blue light is coming from that one square on the ground, how much green light is coming from that same square, and how much red light is coming from it, and it's collecting those three different amounts at exactly the same time, simultaneously. So, now we have three different portions of the visible spectrum, blue light, green light, and red light, and we can get three different values to work with based on the amount of light being reflected. For this particular cell, just as an example, we could say that there's not a lot of blue light being reflected, so it has a low value of 24; there's a fair amount of green light, so it's a higher value of 85; and there's an in-between amount of red light, which has a value of 37. What's interesting, though, is that we can also do this with light that's not visible to us, in this case infrared light. So now we can actually have, as we have here, six different bands of light: blue, green, and red, but also near-infrared, mid-infrared, and far-infrared.
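The splitting described above means one ground cell now carries several numbers instead of one. A small sketch, using the 24 / 85 / 37 example values from the text (band order is an arbitrary choice here):

```python
import numpy as np

# One ground cell after the light is split into three visible bands.
# Values are the hypothetical example from the text: blue 24, green 85, red 37.
pixel = {"blue": 24, "green": 85, "red": 37}

# Stored as an array in blue, green, red order, all sensed simultaneously:
bands = np.array([pixel["blue"], pixel["green"], pixel["red"]], dtype=np.uint8)
print(bands)  # three separate measurements for the same square of ground
```

With infrared added, the same cell would simply carry six values instead of three.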
Just imagine it works exactly the same way: even though we can't see it, the sensor can, and it can sense it and record numbers for it, and so now we can have a value for that same square on the ground for the near-infrared, the mid-infrared, and the far-infrared. The point of this is that we now have six possible numbers that we can use in combination to try and identify something on the ground. What we're hoping for, the whole premise of remote sensing really, is that different types of objects will reflect light differently, and that we can discover patterns in the way those reflections work. If the same type of material always reflects light in the same combinations, then we can build up a library, in our minds or on paper or in a digital file, that says when we see this combination of these different bands, if we have a lot of blue light and not a lot of red and this much mid-infrared, then maybe that means it's this type of material. Maybe it's pavement, or sand, or a crop of corn. That's really the importance of spectral resolution: the more of these wavelengths you have to work with, the more likely it is that you'll find a unique combination that's useful for identifying different types of objects. I should mention at this point that blue, green, red, and so on are referred to as bands. The more formal term, I think, is wavebands, but often we just call them bands of light, or bands of the spectrum; they're sort of sections of the electromagnetic spectrum. This slide, I hope, kind of illustrates what I was just talking about. If we have our source of light, we have incident radiation coming in and hitting an object; some of that is going to be reflected, some of it is going to be absorbed, and some of it is going to be transmitted. Those are our radiation-target interactions.
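The "library" idea above can be sketched as a nearest-match lookup: compare an unknown pixel's six band values against stored signatures and pick the closest one. The signature numbers and cover types here are invented purely for illustration; real spectral libraries are far more careful than this.

```python
import numpy as np

# A toy spectral library: hypothetical 8-bit values in six bands
# (blue, green, red, near-IR, mid-IR, far-IR) for a few cover types.
# These numbers are made up for illustration, not real measurements.
library = {
    "water":      np.array([ 60,  45,  30,  10,   5,   5]),
    "vegetation": np.array([ 24,  85,  37, 200,  90,  40]),
    "pavement":   np.array([120, 125, 130, 110, 100,  95]),
}

def classify(pixel):
    """Label a six-band pixel with the library entry whose signature is
    closest (smallest Euclidean distance across all six bands)."""
    dists = {name: np.linalg.norm(pixel.astype(float) - sig.astype(float))
             for name, sig in library.items()}
    return min(dists, key=dists.get)

unknown = np.array([30, 80, 40, 190, 85, 45])  # bright in near-infrared
print(classify(unknown))  # → vegetation
```

The more bands each signature has, the less likely two different materials are to match on all of them at once, which is exactly the argument for higher spectral resolution.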
So, objects can reflect, transmit, and absorb invisible light just as much as they can visible light, and the idea here is that we're trying to find these different combinations of reflection. Really what this comes down to, of course, is that there's a certain amount that's reflected and another amount that's absorbed or transmitted, but we're only able to record the reflected amount, and that's what we're able to work with. And so, the more possible combinations we have, the more likely it is that we'll be able to find a unique combination that we can use to identify something. Let's have a look at this part of the spectrum a little more closely. Here's our visible and infrared. These are the bands that are typically sensed. This is kind of a generic version of it; it really depends a lot on the sensor, and there are lots of different sensors out there designed for different purposes. If you're curious or you're wondering, I'm basically basing this on Landsat 7, but just think of it as a generic example. So, we have our blue, green, red, near-infrared, mid-infrared, and far-infrared. With Landsat 7, these bands are actually assigned numbers. It's useful for you to know this because when you get your own satellite data, that's how it's going to be referred to: which bands do you want to work with, and they have numbers, and you just have to know what those numbers represent. Landsat 8 has a different numbering system because they added extra bands, and other satellites have different numbering systems again. As you get to work with them and become more familiar with them, these numbering systems will become second nature to you; it's just a fact of remote sensing that that's how things are referred to. So, think about that prism splitting up our beam of light into separate wavelengths: blue, green, red, and so on.
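As a quick reference for the numbering idea, here's a small lookup table for Landsat 7. Bands 1 through 4 (blue, green, red, near-infrared) are standard; how the lecture's "mid-" and "far-infrared" map onto the remaining bands is an assumption on my part, so treat the labels for bands 5 through 8 as a sketch rather than gospel.

```python
# Band-number lookup for Landsat 7 ETM+. The names for bands 5-8 are a
# hedged assumption: 5 and 7 are shortwave (mid) infrared, 6 is thermal,
# and 8 is the panchromatic band.
LANDSAT7_BANDS = {
    1: "blue",
    2: "green",
    3: "red",
    4: "near-infrared",
    5: "mid-infrared (shortwave)",
    6: "thermal infrared",
    7: "mid-infrared (shortwave)",
    8: "panchromatic",
}

def band_name(number):
    """Translate a band number into a human-readable name."""
    return LANDSAT7_BANDS.get(number, "unknown band")

print(band_name(4))  # near-infrared
```

Other sensors, like Landsat 8, use different numbering, so in practice you'd keep one table like this per sensor you work with.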
Remember, what's happening is that there's actually a sensor for each one of those, and those sensors are capturing the same square on the ground at exactly the same time. So, we have six versions of the same location: blue light captured as a blue image, green light captured as a green image, red light captured as a red image, and the same thing for near-infrared, mid-infrared, and far-infrared. One of the things I really want you to get out of this is that these are taken simultaneously, so there are six versions of the same location at the same time. Now, what might be confusing, if you're not used to thinking about things this way, is that I just finished saying this is an image of blue light and this is an image of green light. Well, then why the heck does that image not look blue to us? Why isn't it that this one looks blue and this one looks red? The first thing to remember is that these are just cells that have values, numbers, and the convention is that a cell with a value of zero gets a gray shade of black, the top of the range, in this case 255 for an eight-bit image, gets white, and all the numbers in between get shades of gray between black and white. That's how it's done, so you end up with a black and white image, no matter what type of light is being sensed. The whole thing I want you to get, in terms of interpreting these, is that if you see an area that's bright, that looks white, on this image, that means it has a high number. For this particular image, the blue light image, if it looks white, that means it has a high number, which means a lot of blue light. If it has a low value, it will look dark on that image, which means not a lot of blue light.
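The display convention just described can be written down in a couple of lines. This is a sketch of the scaling only; real software applies the same idea, whatever the band's wavelength.

```python
def to_gray_level(value, bit_depth=8):
    """Scale an integer pixel value to a display gray level:
    0.0 is black, 1.0 is white, values in between are shades of gray."""
    top = 2 ** bit_depth - 1   # 255 for an 8-bit image
    return value / top

print(to_gray_level(0))    # 0.0, rendered black
print(to_gray_level(255))  # 1.0, rendered white
print(to_gray_level(64))   # about 0.25, a fairly dark gray
```

So a "blue band" image is just numbers rendered this way, which is why it looks black and white rather than blue.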
So, that's a way for us to interpret how much blue light was reflected at a particular location. Same thing with the green: if you see a bright area on the green image, that's where there's a lot of green light being reflected. Same thing with the red: a bright area means a lot of red light being reflected. What you may start to notice with this, and I'll show you bigger versions of these in a second, is that the same type of land cover, say trees in a ravine, or water, will look different depending on which band image you're looking at, because it reflects different types of light differently. So, here's a better look at the blue image. You'll see, for example, that this is Lake Ontario here, one of the Great Lakes; we have a ravine here, an area with fairly dense vegetation in Toronto; this is a large park, High Park; and this is the downtown area here. For this image of blue light, you're seeing that the ravine area is fairly dark, which means there's not a lot of blue light being reflected from the trees in that ravine. The downtown area is fairly bright, which means that the buildings downtown, think of them in terms of materials like pavement, rooftops, that kind of thing, are reflecting more blue light than the trees are. For now, I just want you to be able to interpret those things. If we look at the same location, taken at the same time, but in green light, we're seeing a slightly different version of this. We still have the ravines looking fairly dark, but not quite as dark as they did in the blue. Downtown is still looking fairly bright, which means a lot of green light is being reflected.
Now, this brings me to something else that comes up often. Someone might look at this and say, "Well, the downtown is bright, that means it's reflecting a lot of green light, but if I was hovering over downtown, the buildings wouldn't look green." Right, but remember: if they're reflecting a lot of blue light, a lot of green light, and a lot of red light, then to your eye all of those are mixed together, and it ends up looking like light gray or white or something like that. So, that's where it can get tricky; this comes back to the idea of what color is and how we perceive things. If an object looks blue, like, let's say, water, that means more blue light is being reflected than the other wavelengths, so it will look bright in the blue wavelengths. If I go back to the blue image and say, "Well, here's Lake Ontario, why isn't that super bright?", well, it actually is relatively bright, but even though water looks blue, there are other things that reflect even more blue light. I always think of the Rogers Centre, which is a big domed stadium downtown with a white roof. Because it's white and smooth, this really artificial material, it reflects strongly across a lot of wavelengths, and it ends up looking white because it's reflecting a lot of blue and a lot of green and a lot of red. Whereas something like water is reflecting blue light, but not necessarily as much as some other types of materials. Here we have the red band. You're seeing that the vegetation is fairly dark, and that kind of makes sense because vegetation tends to absorb red light. The areas downtown are looking fairly bright. So, these are all visible wavelengths: blue, green, and red. Look what happens when we look at the near-infrared image, though.
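The mixing argument above can be sketched in code: when all three visible bands carry high values, the displayed color heads toward white, while one dominant band gives a tinted color. The threshold and labels here are invented for illustration only.

```python
# A toy sketch of color perception from band values (0-255 each).
# High values in all three channels mix toward white; otherwise the
# strongest channel sets the tint. The 200 cutoff is arbitrary.
def describe_color(r, g, b):
    if min(r, g, b) > 200:
        return "near white"   # everything reflected strongly, like a white roof
    channels = {"red": r, "green": g, "blue": b}
    return max(channels, key=channels.get) + "-dominant"

print(describe_color(230, 235, 240))  # near white
print(describe_color(30, 60, 120))    # blue-dominant
```

That's why a bright downtown in the green band doesn't imply green-looking buildings: they're bright in the other visible bands too.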
If you look at the same ravine, the same vegetation, in this image, those areas actually look quite bright. That's because vegetation reflects a lot of near-infrared light; it's a really good band for mapping vegetation for that reason. Conversely, areas of water, like Lake Ontario, absorb near-infrared light very well, so this band is really good at detecting areas with water in them, because water will always have really low values in the near-infrared band. So, vegetation can have quite high values and water can have low values. Let's keep going. Here's the mid-infrared band, which looks quite different from the near-infrared band. Here's the far-infrared band; same thing again, now it looks different. The whole idea I'm trying to get you to see is that, because we have these different bands to work with, what we're trying to get at is: can we identify combinations of those bands for something like trees versus grass versus pavement versus water? Different land cover types.
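The near-infrared behavior just described, water very dark, vegetation very bright, is the basis of some simple rules of thumb. Here's a rough sketch using 8-bit near-infrared and red values; the thresholds are made up for illustration, though the NDVI formula itself is a standard vegetation index.

```python
# Rule-of-thumb labeling from near-infrared and red band values (0-255).
# Water absorbs near-infrared, so its NIR values are very low; vegetation
# reflects NIR strongly and absorbs red. Thresholds here are illustrative.
def rough_label(nir, red):
    if nir < 30:
        return "water"
    ndvi = (nir - red) / (nir + red)   # standard NDVI vegetation index
    return "vegetation" if ndvi > 0.4 else "other"

print(rough_label(nir=10, red=25))    # water
print(rough_label(nir=200, red=40))   # vegetation
print(rough_label(nir=110, red=100))  # other
```

Real classification uses combinations of many bands at once, but even this two-band sketch shows why having more than just visible light pays off.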