Minority Serving Institution Partnership Development Program

Remote Sensing: Imaging System

Your imaging system takes advantage of the properties of light. One camera has a red filter and one has a near-infrared (NIR) filter, each custom designed to filter out all other kinds of light. It is important to be aware of optical effects created by your materials, the objects in your image, and the atmosphere so you can minimize error and account for possible discrepancies in your data.

  • Light Filters
  • Parallax
  • Reflection and Refraction
  • Atmospheric Scattering
  • Calibration

Color Wheels

There are three primary colors of light: red, green, and blue. These can be added in various proportions to create all the other colors of light that you see. (TV screens only have these 3 colors and combine them to make all the colors in the picture that you see.) The color wheel represents how combinations create new colors. For example, red and blue combine to make magenta. White light is really the combination of all the colors. Colors that are opposite from each other on the color wheel cancel each other out. So, if you wanted to block green light, you would choose a magenta filter.
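To make additive mixing concrete, here is a minimal Python sketch; the RGB triples and the add_light helper are just an illustration, not part of your imaging software.

    # Additive color mixing with RGB triples. Each channel runs from
    # 0 (off) to 255 (full brightness), the same convention a TV screen uses.

    RED   = (255, 0, 0)
    GREEN = (0, 255, 0)
    BLUE  = (0, 0, 255)

    def add_light(*colors):
        """Combine colors additively, clamping each channel at 255."""
        return tuple(min(255, sum(channel)) for channel in zip(*colors))

    print(add_light(RED, BLUE))         # (255, 0, 255) -> magenta
    print(add_light(RED, GREEN))        # (255, 255, 0) -> yellow
    print(add_light(RED, GREEN, BLUE))  # (255, 255, 255) -> white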

Camera system

Light filters are designed from materials that allow only a narrow range of the electromagnetic spectrum through, blocking most other kinds of light. Photographers often use different light filters in their cameras to achieve some desired contrasting effect. Your imaging system (pictured at left, courtesy of Joe Shaw) consists of two cameras, each with a different circular filter. One camera has a red filter (bottom right of case) so it only lets red light through, and the other has a near-infrared, or NIR, filter (bottom left of case) that only lets NIR light through.
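As a rough illustration of what such a filter does, the toy model below transmits light only inside a stated pass band and blocks everything else. The band edges are assumed, typical values for red and NIR filters, not the measured specifications of your cameras.

    # An idealized bandpass filter: light inside the pass band is
    # transmitted, everything else is blocked. The band edges are
    # illustrative, not the real filter specs.

    def make_bandpass(low_nm, high_nm):
        def transmits(wavelength_nm):
            return low_nm <= wavelength_nm <= high_nm
        return transmits

    red_filter = make_bandpass(620, 700)  # assumed red pass band (nm)
    nir_filter = make_bandpass(750, 900)  # assumed NIR pass band (nm)

    for wl in (450, 650, 800):
        print(wl, "nm -> red:", red_filter(wl), " NIR:", nir_filter(wl))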

Parallax Example

Parallax is the apparent difference in the position of an object when viewed from different vantage points.

Hold your finger out in front of you with your arm extended. Close your left eye and compare the position of your finger with a distant background. Now close your right eye and do the same. Your finger appears in a different location compared to the background. But you haven't moved your finger - it is just an optical effect! This is parallax.

If you had placed your finger closer to your face, you would have seen a larger parallax (i.e., your finger would have appeared to move even further compared to the background when looking with your left eye and then your right eye). Your eyes are in different places on your head, so they get different views of things. In fact, your brain combines these different views, judges how strong the parallax is, and sends you a message of how far away the object must be. This gives you depth perception.

Left eye parallax
Right eye parallax

The Red and NIR cameras on your blimp payload are in slightly different positions, so they will get slightly different images, meaning that your NDVI measurements will initially have some error due to parallax.
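You can estimate the size of that error with the pinhole-camera relation: an object's image shifts on the sensor by roughly the focal length times the camera separation divided by the object's distance. All numbers in this sketch are hypothetical, not your payload's actual specifications.

    # Parallax shift between two side-by-side cameras, from the
    # pinhole-camera relation: shift = focal_length * baseline / distance.
    # All values below are hypothetical.

    focal_length_mm = 8.0    # assumed lens focal length
    baseline_mm     = 50.0   # assumed spacing between the two cameras
    pixel_pitch_mm  = 0.003  # assumed pixel size (3 micrometers)

    for distance_m in (5, 20, 100):
        shift_mm = focal_length_mm * baseline_mm / (distance_m * 1000)
        shift_px = shift_mm / pixel_pitch_mm
        print(f"object at {distance_m:>3} m -> shift of about {shift_px:.1f} pixels")

Notice that the shift shrinks as the object gets farther away, just as your finger's apparent jump shrank when you held it farther from your face.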

Light can bounce off of objects, transmit through objects, and bend differently in different substances.

Reflection of Mt. Hood
Angle of Reflection

Reflection:

Occurs when light bounces off of a substance, like when you see yourself in a mirror. On the left is a picture of Mt. Hood being reflected by a lake. When light reaches a reflecting surface at an angle, it will bounce off at the same angle, measured from the 'normal' to the surface (diagram at right, drawn by Johan Arvelius 2005-09-26). The normal is just a line perpendicular to the surface. So in the case of the picture on the left, we get two images of the mountain - one from light rays that left the mountain and came directly to the camera, and one from light rays that left the mountain, bounced off the water, and then came to the camera.

Refraction
Refraction in Soda with Straw

Refraction:

Occurs when light changes medium, such as from air to water. Light travels at different speeds in different materials, causing it to bend at the transitions. Different wavelengths bend by different amounts. For example, a glass prism slows violet light more than blue, blue more than green, and so on, so that the outgoing light has been separated and looks like a rainbow.
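The amount of bending is described by Snell's law, n1 * sin(theta1) = n2 * sin(theta2), where n is each material's refractive index. The sketch below applies it to the air-to-water case from the soda-straw photo; the refraction_angle helper is ours, and the refractive indices are standard approximate values.

    # Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    import math

    def refraction_angle(theta1_deg, n1, n2):
        """Angle of the refracted ray, measured from the normal.
        Assumes the ray actually refracts (no total internal reflection)."""
        sin_theta2 = n1 * math.sin(math.radians(theta1_deg)) / n2
        return math.degrees(math.asin(sin_theta2))

    # Air (n ~ 1.00) into water (n ~ 1.33): the ray bends toward the normal,
    # which is why a straw in a glass of soda looks bent at the surface.
    print(refraction_angle(30.0, 1.00, 1.33))  # about 22.1 degrees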

Prism

Convex Mirror

To complicate things even more, curved surfaces warp both reflected and refracted light, as we can see in the image on the right created by a convex (outwardly curving) mirror. Inwardly curved mirrors are called concave.

Since all of these effects warp light, they may impact the images you take. Note that your camera system is encased in a flat plastic shell. All the light that your cameras receive must first pass through this clear casing. The plastic has been selected to be lightweight, but also to minimize the refraction of light as it passes through. It is especially important that you take care to protect the casing, because scratches or dents will refract the light unpredictably, causing it to scatter and your picture quality to diminish. Compare these images taken with a smooth camera lens and then with a scratched lens. You can see why protecting your camera lens and casing is essential for taking high-quality images and, therefore, obtaining the best NDVI data.

Photo with smooth lens
Photo with scratched lens

There are several other external factors that influence reflection and refraction that you should be aware of, such as the Sun's angle in the sky, your camera's viewing angle, cloud coverage, and other atmospheric disturbances. Your software program will correct for the angle of the Sun in the sky based on the time of day your images were taken. However, the effects of the different angles of leaves on the plants you are imaging and the angle of your camera floating in the air are more difficult to quantify and are not auto-corrected by the software, adding some error to your data.

Sunset Schematic

The climate of the region where you take your data can influence your images. The particles in the atmosphere will vary in number and size depending on whether you are in a hot, humid climate or a dry one. Your blimp stays relatively low to the ground, so your imaging system will not have to peer through as much of the atmosphere as one on a high-altitude balloon or satellite, but it is still important to be aware of atmospheric effects.

Rayleigh Sky

Rayleigh Scattering:

Rayleigh scattering occurs when light encounters particles that are smaller than the wavelength of the incoming light. The amount of light that scatters (radiance) depends very strongly on wavelength. Short-wavelength blue light scatters much more than long-wavelength red light. This is why our sky appears blue - the atmosphere is mostly small particles, so scattered blue light comes to our eye from all angles. At sunset or sunrise, the light from the Sun must travel through more atmosphere to reach our eyes, so the longer wavelengths eventually get scattered as well, giving us the beautiful colors we see.
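The classic Rayleigh result is that scattering strength falls off roughly as 1 over the wavelength to the fourth power. A quick calculation shows just how lopsided that makes the sky's light; the rayleigh_relative helper is only for illustration.

    # Rayleigh scattering strength scales roughly as 1 / wavelength^4.

    def rayleigh_relative(wavelength_nm, reference_nm=400):
        return (reference_nm / wavelength_nm) ** 4

    blue, red = 400, 700
    ratio = rayleigh_relative(blue) / rayleigh_relative(red)
    print(f"blue ({blue} nm) scatters about {ratio:.1f}x more than red ({red} nm)")

That factor of roughly 9 is why the light reaching your eye from every direction of the daytime sky is dominated by blue.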

Mie Countryside

Mie Scattering:

Mie scattering occurs when light encounters particles that are larger than its wavelength, such as the water droplets that make up clouds. Unlike Rayleigh scattering, Mie scattering has very little wavelength dependence. All wavelengths of light are scattered almost equally, which is why clouds appear white. In very humid environments, the water droplets in the air also produce Mie scattering, so the blue of the sky is muted or pale.

Scattering Plot

The plot at right shows the difference in irradiance (strength of scattering) for Rayleigh (blue curve) and Mie (red line) scattering at visible wavelengths. You can see the sharp drop-off of Rayleigh scattering from 400 nm (blue light) to 700 nm (red light), as discussed above. However, Mie scattering remains relatively constant at all wavelengths, creating a white color, as seen in clouds.
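If you would like to reproduce a plot like this yourself, the sketch below draws a 1/wavelength^4 Rayleigh curve against a nearly flat Mie line; the vertical normalization is arbitrary and chosen only to show the shapes.

    # Sketch of Rayleigh (strong wavelength dependence) versus Mie
    # (nearly flat) scattering across the visible band. The vertical
    # scale is arbitrary.
    import numpy as np
    import matplotlib.pyplot as plt

    wavelengths = np.linspace(400, 700, 200)   # visible range, nm
    rayleigh = (400 / wavelengths) ** 4        # falls off as 1 / lambda^4
    mie = np.full_like(wavelengths, 0.5)       # roughly wavelength-independent

    plt.plot(wavelengths, rayleigh, "b-", label="Rayleigh")
    plt.plot(wavelengths, mie, "r-", label="Mie")
    plt.xlabel("Wavelength (nm)")
    plt.ylabel("Relative scattering strength")
    plt.legend()
    plt.show()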

Pixelation

Pixel Calibration:

A picture taken with a camera is really a grid of thousands (or even millions) of pixels that each respond to brightness. For example, the mountain scene at right is a very clear, crisp image, but zooming in (below) on one of the shadows shows the square pixels. Each pixel has only one color, and at this scale the picture is blurry and unrecognizable. More pixels give a higher "resolution", meaning that you can zoom in on the photo and still make out fine detail.

Flat Field

Different pixels respond to brightness in different ways. So if you took a "flat field" photo of a uniformly lit, constant-valued target, it would still show spots where different pixels either responded weakly to the light (darker) or strongly to the light (brighter). Ideally, you would want no variation in pixel response, but in the real world that never happens. So we must measure and correct for variations in pixel response. Your software does this for you.
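A common way to apply such a correction, and a reasonable guess at what your software does internally, is to divide each raw pixel by its measured flat-field response and then rescale by the average response. The tiny arrays below are made-up examples.

    # Flat-field correction: divide out each pixel's measured response,
    # then rescale by the mean response to preserve overall brightness.
    import numpy as np

    flat = np.array([[0.95, 1.02],    # each pixel's response to a
                     [1.05, 0.98]])   # uniform light source (flat field)
    raw  = np.array([[190.0, 204.0],
                     [210.0, 196.0]]) # raw image of a uniform target

    corrected = raw * flat.mean() / flat
    print(corrected)  # every pixel comes out to about 200 after correction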

Calibration Panel

Reflectance Calibration:

The black and white calibration panel you have been given is used to gauge the darkest and brightest areas your detector can measure when flying from the blimp. A perfect white surface will reflect 100% of the light it receives, so it has 100% "reflectance"; a perfectly diffuse reflector like this is called Lambertian. A perfect black surface has 0% reflectance - all the light is absorbed as heat (which explains why black asphalt can feel very hot during the summer). Your calibration panel is not perfect, and its reflectance has been measured in the lab.

NDVI calibration

The detector's (or camera's) response to the reflectance is measured by how many photons it receives, which is called "counts", "hits", or "Data Number (DN)". Surfaces with high reflectance cause more photons to hit the detector and, therefore, a higher DN. This number can change depending on what time of day you are imaging and how high your blimp is, so it needs to be re-calibrated every time you fly. Once you plot the DN for the black and white surfaces, you can interpolate by connecting a straight line between these points, then use the graph to read off the reflectance of other surfaces. For example, if your red camera received 720 counts, you can go to your graph and match the DN with the corresponding reflectance, which looks to be about 75% on this graph. This gives you the decimal value of 0.75 for your RED value in your NDVI equation. The same procedure applies to the NIR camera.
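Here is that two-point calibration written out as a short sketch, finishing with the standard NDVI formula, NDVI = (NIR - RED) / (NIR + RED). The panel reflectances and DN values are hypothetical, chosen so the red example lands near the 0.75 read off the graph.

    # Convert a Data Number (DN) to reflectance with a straight line
    # through the black-panel and white-panel calibration points,
    # then compute NDVI. All calibration values here are hypothetical.

    def dn_to_reflectance(dn, dn_black, dn_white,
                          refl_black=0.05, refl_white=0.95):
        """Linearly interpolate reflectance between the two panel points."""
        slope = (refl_white - refl_black) / (dn_white - dn_black)
        return refl_black + slope * (dn - dn_black)

    red = dn_to_reflectance(720, dn_black=80, dn_white=900)  # about 0.75
    nir = dn_to_reflectance(850, dn_black=75, dn_white=920)

    ndvi = (nir - red) / (nir + red)
    print(f"RED={red:.2f}  NIR={nir:.2f}  NDVI={ndvi:.2f}")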