Cover image: A close up view of AaSI, a Cubesat-sized hyperspectral imager from the VTT Technical Research Centre of Finland. Image credit VTT.
This is part 2 of our series on hyperspectral imaging. Check out Part 1 to learn about the basics of hyperspectral imaging (HSI) and how wavelength, or spectral data, can be combined with position in an image, or spatial data, to form information-rich datasets.
In this article we'll be taking a closer look at four examples of space-based hyperspectral imagers.
Wait, what about regular spectroscopy? Isn't that the same thing?
Not exactly. You may have heard of spectroscopy used for all sorts of things in astronomy. Traditional spectroscopy spreads incoming light into its component wavelengths, but it lacks the spatial component that is essential for HSI. Regular spectroscopy is used to interrogate the composition of atmospheres, measure the expansion of the universe, and more. Hyperspectral imagers are based on the same principles, but they perform this kind of spectroscopy for every pixel in a 2D image.
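To make that concrete, here's a minimal sketch (in Python with NumPy; the scene and band dimensions are made up for illustration) of the "data cube" a hyperspectral imager produces, and how a single pixel's spectrum or a single wavelength's image falls out of it:

```python
import numpy as np

# A hyperspectral "data cube": two spatial axes plus one spectral axis.
# Dimensions here are invented for illustration: a 512 x 512 pixel scene
# sampled in 100 contiguous wavelength bands.
rows, cols, bands = 512, 512, 100
cube = np.zeros((rows, cols, bands), dtype=np.float32)

# Wavelength centers for each band, e.g. 400-900 nm in ~5 nm steps.
wavelengths = np.linspace(400.0, 900.0, bands)

# "Regular" spectroscopy gives you one of these: a single spectrum.
spectrum_at_pixel = cube[256, 256, :]   # shape (100,)

# A conventional camera gives you one of these: a single 2D band image.
image_at_wavelength = cube[:, :, 50]    # shape (512, 512)
```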
Hyperspectral Infrared Imager (HyspIRI)
The first mission on our list is the Hyperspectral Infrared Imager (HyspIRI). HyspIRI will be composed of two instruments aboard a satellite in low Earth orbit. One is an imaging spectrometer measuring visible through short-wave infrared, also called the VSWIR band, from 380 nm to 2500 nm in 10 nm contiguous bands. Those of you who listened to our episode on RIT SPEX's recent high altitude balloon may recognize the utility of this range with regard to imaging vegetation. One of HyspIRI's objectives is to identify types and health of vegetation. Plant health can be assessed by comparing the visible and near-infrared light a plant reflects: healthy plants absorb visible light for photosynthesis and reflect strongly in the near-infrared, so a weakening of that contrast signals stress. HyspIRI will also be equipped with a multispectral (i.e. not contiguous coverage) imager sensitive to light in the mid and thermal infrared, or TIR, bands from 3 microns to 12 microns in wavelength. Both the VSWIR and TIR instruments have a spatial resolution of 60 meters (1 image pixel is 60 meters wide), but the VSWIR instrument images a narrower swath of ground at a time.
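A common way to quantify that visible-versus-near-infrared contrast is the Normalized Difference Vegetation Index (NDVI). HyspIRI's actual data products aren't covered here; this is just a minimal sketch with hypothetical reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Healthy vegetation absorbs red light for photosynthesis but reflects
    strongly in the near-infrared, so NDVI approaches +1; stressed or
    sparse vegetation drives it toward 0.
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red)

# Hypothetical reflectance values (fractions of incident light):
print(ndvi(nir=0.50, red=0.08))  # healthy canopy  -> ~0.72
print(ndvi(nir=0.30, red=0.15))  # stressed plants -> ~0.33
```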
From the mission website, "The objectives of the Hyperspectral Infrared Imager (HyspIRI) mission are to 1) study the processes that indicate volcanic eruption; 2) analyze the nutrients and water status of vegetation; 3) examine soil type and health; 4) use spectra to identify locations of natural sources; 5) study deforestation and changes in vegetation type; 6) provide early warning of droughts; 7) improve exploration for natural resources; and 8) forecast likelihood of volcanic eruptions and landslides."[1]
The mission was recommended in the 2007 National Research Council Decadal Survey and has seen steady development through collaborative workshops between NASA and academia, with key technology being tested and developed through the HyTES project. HyTES is essentially a technology demonstrator for HyspIRI's instruments, which are flown on aircraft like the ER-2. As of this writing, HyspIRI is still in the study stage, but you can follow along with its progress through the mission website.
Hyperspectral Imager for the Coastal Ocean (HREP-HICO)
Next up is the HICO and RAIDS Experiment Payload - Hyperspectral Imager for the Coastal Ocean (HREP-HICO). In addition to its extremely long name, HREP-HICO featured a fairly straightforward system based on another ocean hyperspectral imager, PHILLS. HREP-HICO was mainly a technology demonstrator, exploring ways to reduce cost and schedule for this type of instrument. Construction of the imager began in November 2007 and was completed less than a year later, in August 2008; the payload was installed on the International Space Station in 2009.
HREP-HICO was just an instrument, not a whole spacecraft. It was mounted on the Japanese Experiment Module Exposed Facility, where it imaged the Earth at visible to near-infrared wavelengths (400 nm to 900 nm at 5.7 nm resolution) with a ground spatial resolution of 95 meters (each pixel covers a 95 m x 95 m patch of ground).
Data from HREP-HICO was used to research organic material in the ocean, detect emulsified oil from oil spills, and develop new ways to deal with cloud cover in space imagery. The mission outlived its one-year demonstration life by another four years until it was killed by an X-class solar storm in 2014.
AaSI - HSI for CubeSats
Hold up, NASA doesn't have all the fun! In 2017, the VTT Technical Research Centre of Finland launched an HSI technology demonstrator as a payload on Aalto-1, a student-built CubeSat from Aalto University in Finland.
At a form factor of only 0.5U (5 cm x 10 cm x 10 cm), AaSI punches above its weight class. The heart of the instrument is its spectrometer, which is based on a tunable Fabry-Pérot interferometer. The interferometer is a piezo-actuated MEMS device consisting of two highly reflective surfaces. By tuning the air gap between them, different bands can be isolated. AaSI uses an RGB imaging sensor, so by exploiting multiple interference orders, up to three separate bands can be imaged at once. (If you remember back to the previous article, this imager uses the "area scanning" method, capturing 2D spatial data and sweeping through the different wavelengths).
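To see how one gap can pass several bands at once, here's a minimal sketch of the ideal Fabry-Pérot transmission condition; the gap value and spectral limits below are illustrative, not AaSI's actual design parameters:

```python
def passbands(gap_nm, lambda_min=500.0, lambda_max=900.0):
    """Wavelengths transmitted by an ideal air-gap Fabry-Perot etalon.

    At normal incidence the etalon transmits where an integer number m of
    half-wavelengths fits in the gap: m * wavelength = 2 * gap, so
    wavelength = 2 * gap / m. Returns the (order, wavelength) pairs that
    land inside the instrument's spectral range.
    """
    bands = []
    m = 1
    while True:
        wavelength = 2.0 * gap_nm / m
        if wavelength < lambda_min:
            break
        if wavelength <= lambda_max:
            bands.append((m, wavelength))
        m += 1
    return bands

# Illustrative 1300 nm gap: three interference orders fall inside a
# 500-900 nm range at once, roughly one per RGB color channel.
print(passbands(1300.0))  # [(3, ~866.7), (4, 650.0), (5, 520.0)]
```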
AaSI has a spectral range in the visible and near-infrared between 500 nm and 900 nm at 10-30 nm spectral resolution, and produces 2048 x 2048 pixel images at 240 meters per pixel. You may have noticed a second camera in the AaSI photo above; that's a secondary RGB visible camera used for georeferencing AaSI data.[2]
Deep Learning with AVIRIS Data
Our last example of hyperspectral imaging takes place on the ground, in the image processing side of a space mission. In a research article in the Journal of Sensors, two Chinese researchers used HSI data from instruments like NASA's AVIRIS platform, an airborne hyperspectral imager discussed in last week's article, to develop a deep convolutional neural network that automatically classifies ground features based on their spectral content.
I won't get into how deep convolutional neural network (DCNN) classifiers work in this article, but here's the gist. Massive amounts of data are fed into this algorithm, which updates many layers of connected parameters that allow it to detect features within the data. This is the same concept that has enabled advanced object detection and speech recognition in modern tech. Here the DCNN looks at the spectral response of different materials. Remember how plants reflect very strongly in a few parts of the spectrum and not so much in others? Extrapolate that idea to all materials, and even different types of plants, all with spectral "signatures" that are unique enough to tell a building from a tree, or even alfalfa from wheat.
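The paper's architecture isn't reproduced here, but the shape of the idea fits in a few lines. Below is a minimal sketch, assuming PyTorch and a 1D convolution over each pixel's spectrum; the band count matches AVIRIS's 224 channels, while the class count and layer sizes are invented:

```python
import torch
import torch.nn as nn

class SpectralCNN(nn.Module):
    """Toy 1D convolutional classifier over a single pixel's spectrum.

    Input: a batch of spectra shaped (batch, 1, n_bands), e.g. 224 AVIRIS
    bands. Output: logits over land-cover classes (sizes are illustrative,
    not the architecture from the cited paper).
    """
    def __init__(self, n_bands=224, n_classes=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3),  # local spectral features
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                     # collapse spectral axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        x = self.features(x)            # (batch, 64, 1)
        return self.classifier(x.squeeze(-1))

# One fake batch of 8 pixel spectra -> per-class scores for each pixel.
model = SpectralCNN()
logits = model(torch.randn(8, 1, 224))
print(logits.shape)  # torch.Size([8, 16])
```

In practice a network like this would be trained on labeled pixels so that each material's spectral signature maps to its class, which is the core of what the paper demonstrates with AVIRIS scenes.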
Hyperspectral imaging is not a new idea but it is growing as hardware and software technologies develop. Scientists and engineers continue to push the limits of imaging techniques, and I look forward to seeing what new applications will come out of looking at the world one wavelength at a time.
[1] HyspIRI Mission Concept Team, "HyspIRI Comprehensive Development Report," Jet Propulsion Laboratory, 2015.