By Duncan Steel 12/02/2019


The ongoing fires in the Nelson-Tasman region have quite rightly provoked much alarm. The response of Fire and Emergency New Zealand, the NZ Police, the NZ Defence Force, and many private individuals has been magnificent. However, the use of satellite imagery for assessing such fires, and then for planning the response, is deficient in NZ compared to much of the rest of the developed world, where such information is used as a matter of course. In the post below I illustrate what can be done… and my own mistakes.


I remember once reading a letter to the editor of The Times in which the correspondent lamented that Latin was no longer taught at most schools. He told the tale of having once erred whilst driving, almost colliding with another car which had the right of way, and he wound down his window to apologise, saying “Mea culpa” to the other driver. The reply he got was “Mr Jones. You should be more careful, Mr Cooper.”

Anyhow, mea culpa, mea culpa, mea maxima culpa. And no, I’m not religious. In my preceding post concerning satellite imagery pertaining to the Nelson-Tasman bush fire I missed something that should have been obvious to me. So here is my Hail Mary, and supplication for forgiveness.

Last night I woke at about 03:00, having realised that I had committed a grievous sin as a physicist, and so I rose and quickly started downloading more data from the Landsat-8 satellite. The set of files in question amounted to somewhat more than a gigabyte, and had to come all the way from Sioux Falls, South Dakota, and so I went back to a fitful sleep.

When I rose at around 07:00 the files were ready for me to inspect, but I had to tackle a few other work things first. I started looking at them around noon, and it happens that 30 minutes later I received an email from Ben Knight at the Cawthron Institute, saying “take a look at this image I’ve composed from the Landsat-8 data from last Friday.” Great minds think alike, and so on.

All of the data that I discuss and the images I present herein come from an overpass by Landsat-8 that occurred shortly after 11am on Friday, 8 February (NZDT). There are other satellite data sources, and more-recent passes, and I will get to them perhaps in the next few days; but for now I want to concentrate on lessons learnt solely from this set of Landsat-8 data.

Many Earth observation satellites have multispectral sensors, collecting images in various distinct wavelength bands, including many we cannot see directly with our eyes. The human visual spectrum stretches from about 0.4 microns (400 nm) to 0.7 microns (700 nm), from the violet to the red end of the rainbow. Observations at shorter wavelengths (in the ultraviolet) are limited by both atmospheric absorption (thank goodness, else we would all get skin cancer) and atmospheric scattering (the sky is blue due to Rayleigh scattering, which varies as the fourth power of the inverse-wavelength), and so few satellites intended to survey the reflectivities of the land and sea below have detectors for UV wavelengths. Beyond the red end of the spectrum, though, things are different: we may not be able to see electromagnetic radiation at wavelengths longer than 0.7 microns, but lots is going on there of scientific interest. For example, vegetation is highly-reflective in the near-infrared (NIR), especially at a wavelength in the range 0.8-0.9 microns, and so there are satellite sensors tuned to such wavelengths, enabling us to determine how the grass is growing, to some extent.
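For readers who like to see the arithmetic, the standard way of exploiting that NIR reflectivity of vegetation is the Normalised Difference Vegetation Index (NDVI), which can be sketched in a few lines of Python. This is a toy illustration with made-up reflectance values, not part of any processing chain used for the images in this post:

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index: exploits the jump in
    reflectance that healthy vegetation shows between the red band
    (strongly absorbed by chlorophyll) and the NIR (strongly reflected)."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # Guard against division by zero over water/shadow pixels.
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Toy reflectance values: healthy grass vs bare soil.
red = np.array([0.05, 0.25])   # vegetation absorbs red; soil reflects more
nir = np.array([0.50, 0.30])   # vegetation reflects NIR strongly
print(ndvi(red, nir))          # vegetation ~0.82, soil ~0.09
```

High values (approaching 1) indicate vigorously growing vegetation; values near zero indicate bare ground.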

By convention the NIR stretches out to about 1.6 microns, and from there to around 3 microns wavelength the spectral band is termed short-wave infrared (SWIR). There is no reason that orbiting instruments cannot look down and detect how much sunlight is being reflected back upwards in the SWIR, although the atmosphere does have some deleterious effects providing challenges to those trying to interpret the data.

Going up to longer wavelengths, things initially become impossible. The atmosphere is a strong absorber in the mid-infrared, and that is what greenhouse warming is all about: by increasing the mixing ratios of certain gases (carbon dioxide, methane and water vapour in particular), more of the infrared radiation emitted by our planet in response to heating by sunlight (mostly across the visual spectrum, 0.4-0.7 microns) is trapped, and so the global temperature rises so as to balance incoming and outgoing energies.

Eventually, though, one reaches wavelengths at which the atmosphere is transparent again. This means we can see through it, using suitable detectors, looking either upwards (i.e. astronomical observations) or downwards (satellite scanning of the planet from far above). In essence, the atmosphere has various spectral ‘windows’ through which we can peer.

Wavelengths above about 8 microns we term the thermal infrared (TIR). The reason for this is as follows (physics alert!). The Wien displacement law says that, subject to various assumptions (such as the object in question acting as a Black Body; no such thing actually exists), a body at temperature T will emit a continuous spectrum of electromagnetic radiation that peaks, in terms of the flux per unit wavelength, at a wavelength λmax where λmax T = 2.9 mm K (millimetre-kelvins).

That may look bewildering, but it is actually very simple. Suppose that the average temperature of the Earth were 17ºC. The temperature on the Absolute or Kelvin scale is just the temperature in degrees Celsius plus 273.15. At minus 273.15ºC (or 0 K) everything stops in terms of atomic/molecular vibrations; we call that ‘Absolute Zero’. So, 17ºC is essentially 290 K. Plug that into the Wien displacement law and one finds a wavelength for peak emission of 10 microns. That is, our planet emits a spectrum of electromagnetic radiation with a peak near 10 microns, and a distribution that broadly follows Planck’s law.
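The arithmetic is easily checked. Here is a quick sketch in Python using the value of the Wien constant quoted above:

```python
# Wien displacement law: lambda_max * T = 2.9 mm K (more precisely 2.898e-3 m K).
WIEN_CONSTANT_M_K = 2.898e-3  # metre-kelvins

def peak_wavelength_microns(temp_celsius=None, temp_kelvin=None):
    """Wavelength (in microns) of peak black-body emission, per unit wavelength."""
    if temp_kelvin is None:
        temp_kelvin = temp_celsius + 273.15
    return WIEN_CONSTANT_M_K / temp_kelvin * 1e6  # metres -> microns

print(peak_wavelength_microns(temp_celsius=17))   # Earth's surface: ~10 microns
print(peak_wavelength_microns(temp_kelvin=5800))  # solar photosphere: ~0.5 microns
```

The same function, fed a fire temperature of 1000-2000 K, gives peak wavelengths of roughly 1.4 to 2.9 microns, a point that becomes important later in this post.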

At this juncture I wave a flag of warning, concerning something that I already had in mind to discuss in a future blog post. Whilst one can derive as described above a peak wavelength using that form of the Wien displacement law, that does not directly tell you the frequency at which the peak emission occurs. For example: the solar photosphere has a temperature of about 5,800 K, indicating a peak wavelength (rather conveniently) of 0.5 microns. That is at the transition between blue and green. But the Sun appears yellow. As I wrote, I will deal with this in a future post.

Back on track now, concerning the Nelson bush fire and satellite imagery. In my previous post I presented satellite images of the part of Nelson-Tasman where the bush fires have been raging, endangering people and property. Those images were not wrong; it is just that I missed something of some importance, if one’s aim is to assist with assessing such fires and how they are spreading.

The two images I showed (well, there was also a cloud-covered picture) were obtained with the Landsat-8 satellite late on the morning of Friday 8th February (NZDT). One image was what the kind people at the U.S. Geological Survey – the operators of Landsat-8 – term a LandsatLook Natural Color Image. Such images are not natural colour at all, in fact. The table below shows the spectral bands which the Landsat-8 detectors cover.

Landsat-8 spectral coverage. OLI is the Operational Land Imager; TIRS is the Thermal InfraRed Sensor. GSD means Ground-Sampling Distance (i.e. the pixel size). 

Now, the LandsatLook Natural Color Images actually employ bands 6, 5, and 4 from the above table in a mix for red, green and blue (RGB) in that order. Therefore, for the picture you see on your screen, the red colour actually represents reflected sunlight from the SWIR part of the spectrum (band 6), with wavelengths more than twice as long as the limit of sensitivity of the human eye; the green represents a NIR band (band 5) that our eyes cannot see but at which plants are strongly reflective; and the blue represents the flux detected in the red part of the spectrum (band 4).

In practice, this choice of bands makes good sense. The two images that follow cover the bush fire region in question, and some land eastwards, across the Waimea River. The first makes use of bands 6, 5 and 4 in an RGB mix, as described above. The second makes use of the actual RGB bands 4, 3 and 2 (see the table above), and therefore shows something like the view that your eye might see if you were an astronaut in orbit.


An RGB mix using Landsat-8 OLI bands 6, 5 and 4. 


An RGB mix using Landsat-8 OLI bands 4, 3 and 2. 

Now one can see why bands 6, 5 and 4 are generally used in RGB colour mixes to eyeball Landsat-8 images on a computer screen. In the 654 mix, wispy clouds and smoke from the fire can be seen; but in the 432 mix much of the region of interest (i.e. the area of the bush fire) is obscured.

Why? I mentioned the fundamental reason earlier. For particles much smaller than the wavelength of the radiation in question, the scattering efficiency varies as the inverse fourth power of the wavelength: Rayleigh scattering. In the NIR and SWIR we see through the clouds and smoke, whereas in the red part of the spectrum (band 4, wavelengths around 0.65 microns) the scattering is more efficient, and therefore in the colour-coding used the smoke looks bluish. Down in the blue and green parts of the spectrum (OLI bands 2 and 3) the scattering is even more effective, and so the 432 image has the area of major interest dominated by cloud. That fourth power makes a big difference: a wavelength four times shorter (say, blue at 0.5 microns compared to SWIR at 2 microns) is scattered 256 (four to the fourth power) times more effectively.
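That final bit of arithmetic is trivial to check in Python:

```python
def rayleigh_scattering_ratio(lambda_short_um, lambda_long_um):
    """How many times more efficiently the shorter wavelength is scattered,
    given the inverse-fourth-power dependence of Rayleigh scattering."""
    return (lambda_long_um / lambda_short_um) ** 4

# Blue light at 0.5 microns versus SWIR at 2 microns:
print(rayleigh_scattering_ratio(0.5, 2.0))  # 256.0

# Red (OLI band 4, ~0.65 microns) versus SWIR band 6 (~1.6 microns):
print(rayleigh_scattering_ratio(0.65, 1.6))
```

Even between the red band and SWIR band 6 the difference is a factor of several tens, which is why the smoke that obscures band 4 is nearly transparent in band 6.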

Lesson learnt here: though we often decry the fact that Aotearoa – the Land of the Long White Cloud – is often decked with obscuring cirrus and cumulus when satellites pass overhead, and so only orbiting radars can deliver useful data on many days, in fact there are ‘optical’ satellites that collect imagery in the NIR and (particularly) the SWIR that can, to a large extent, penetrate the clouds and so enable us to see the land beneath.

Failing to point that out, however, is not my most egregious error. The thing I thought of during the night, and which Ben Knight kindly made clear to me with an example, is that although we term the wavelengths around 10 microns the ‘thermal infrared’, that pertains to emission at low temperatures compared to a fire. The two bands in the TIRS instrument (see table above) are set up for observations of Earth’s land and ocean surface, implying temperatures mostly in the range 250-310 K, and so wavelengths around 11 and 12 microns are appropriate. But a fire implies a temperature of 1000 K or more (notwithstanding Ray Bradbury’s Fahrenheit 451), and so peak emission at a rather shorter wavelength. If we refer back to the table, we see that bands 6 and 7 in the SWIR lie close to the peak emission wavelengths for materials at temperatures of 1000-2000 K; and extrapolating from the apparent translucence of the clouds/smoke in band 6 we might expect to see to the ground, with burning/hot areas appearing bright, in bands 6, 7 and in the TIR. So here is a suitable image composite in which I have used the ‘thermal’ IR bands 10 and 11 as red; SWIR band 7 as green; and SWIR band 6 as blue.


A composite image covering the area of the main Nelson-Tasman bush fire using Landsat-8 TIRS intensity as red in the colour-coding, OLI band 7 as green, and OLI band 6 as blue. The real hotspots (active fires) appear bright yellow, and the burnt (still smouldering) area is reddish. The forested area to the west is cool and appears dark blue; Pigeon Valley itself (at lower left) is light blue; and the fields of the Waimea Plains to the east are pink, reflecting their moderate (Sun-exposed) temperature. Why does the Waimea River appear red, and therefore hot? The best guess of the author writing after midnight is that this is because the pebbles/shingle along its banks have high absorptivities in the visual spectrum and low emissivities in the thermal infrared spectrum, and so they reach elevated temperatures under direct sunlight; stepping barefoot on a few dry stones similarly exposed to the Sun beside a river tends to confirm this conjecture. Another graphic, using (R,G,B) = bands (7,7,6) appears at the head of this post, with the bright yellow fires perhaps being even clearer. 
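For the curious, the band-stacking behind a composite like the one above can be sketched with numpy. This is a toy illustration using made-up pixel values; the percentile stretch and the tiny 2×2 ‘scene’ are my own assumptions for display purposes, not the exact processing used to produce the image in this post:

```python
import numpy as np

def stretch(band):
    """Linearly stretch a band to 0-1 for display, clipping at the
    2nd and 98th percentiles to suppress outliers."""
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

def fire_composite(tir, swir7, swir6):
    """Stack (R, G, B) = (thermal IR, SWIR band 7, SWIR band 6).
    Active fires, hot in the TIR and bright in both SWIR bands,
    come out yellow-to-white; smouldering burnt ground reddish."""
    return np.dstack([stretch(tir), stretch(swir7), stretch(swir6)])

# Toy 2x2 scene: the top-left pixel is 'hot' in all three bands.
tir   = np.array([[300.0, 280.0], [281.0, 279.0]])   # brightness temps, K
swir7 = np.array([[0.9, 0.1], [0.12, 0.08]])         # reflectance/radiance
swir6 = np.array([[0.8, 0.1], [0.11, 0.09]])
rgb = fire_composite(tir, swir7, swir6)
print(rgb.shape)  # (2, 2, 3)
```

The resulting array can be handed straight to any image display routine (e.g. matplotlib’s imshow) to produce a picture of the kind shown above.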

I think that the preceding image is tremendously important. It shows how satellite imagery could be really useful when fighting forest fires and the like. One could argue that it is now several days later, to which I would respond: live and learn. Other detractors might say that there are few satellite overpasses producing such useful information, to which I reply: you’re wrong. There are several satellites that can deliver such data, and the number is increasing year-on-year. Plus, there are other satellites that I have yet to be able to find the time to access. For example, the National Oceanic and Atmospheric Administration (NOAA) in the United States operates a satellite called Suomi NPP, and it has a modality used for monitoring wildfires (thanks Robert Schafer).

I cannot conclude this post without returning to something that I mentioned earlier, and that is Band 1 in the OLI instrument on Landsat-8. This covers the violet end of the visual spectrum (see, again, the table presented above for its wavelength limits). Whilst it was designed specifically for monitoring oceanic aerosols, obviously enough it can render information about aerosols over land: things like smoke and clouds associated with fires.

The image at left below shows a colour-coded map of the aerosols across the top of the South Island, and it’s obvious where they are predominant, just as it has been obvious to any resident of Nelson (like me) that there has been a lot of smoke about over the past few days. At right (below) is an expanded version of the part of the image around Nelson. The obvious ‘slash’ at an angle down the left/west of each graphic is due to the fact that for this overpass by Landsat-8 the region was at the edge of the coverage.


Zooming in on the bush fire area to the north-east of Pigeon Valley, one can see that there is a distinction between the swirls above the fires, some being smoke, and some water vapour.


Can one derive useful information about bush fires from satellite data collection? Obviously enough, I’d argue that you can, and others have demonstrated this to be the case. I could write more, but for now I am out of both time and space.

Addendum, 12th February at 21:40 NZDT: The NASA Earth Observatory webpage has a post concerning the same overpass by Landsat-8 and its observations of the Nelson-Tasman fires (imagery timestamped 7th February UTC corresponds to 11:07am on 8th February NZDT); however, that site has yet to show the same sort of images as those I posted above, indicating specific fire locations.