Ops-EO Extension
Extension to GIS for Operations training with a focus on satellite data.
Remote sensing is the process of detecting and monitoring the physical characteristics of an object by measuring them at a distance. The measured quantities are usually reflected and emitted radiation; for optical imagery, this means detecting colour and brightness.
The simplest example of remote sensing is taking photos of the Earth's surface from an aircraft or a satellite (with a huge camera).
Probably the most popular use of satellite imagery is the Satellite view in Google Maps.
The most common classification divides sensors into passive sensors, which capture signals emitted or reflected by an object (the way the human eye or a phone camera works), and active sensors, which register the backscatter of their own emitted signal (like bats or dolphins). Among the first group, optical sensors are the most popular; in the second you may encounter Radar, or Synthetic Aperture Radar (SAR), LiDAR (laser), laser altimetry, and other technologies.
When choosing data for your project, besides the sensor type, you should consider three resolutions: spectral, spatial, and temporal.
In terms of spectral resolution, optical sensors are usually divided into multispectral sensors, which can have up to tens of broad spectral bands (tens to hundreds of nanometres wide), and hyperspectral sensors, which can have hundreds of very narrow bands (2-10 nanometres). Radar satellites also operate on different bands (frequencies); a given SAR sensor uses a single frequency but can combine various polarisations.
Another way to classify sensors is by the spatial resolution of their images (regardless of whether the image was formed by an active or a passive sensor). The borders between the following classes are not fixed and tend to shift as new sensors are developed (a rough classifier is sketched after this list):
Kilometres – low resolution
Tens of metres – medium resolution
Up to 10 m – high resolution
Less than 1 m – very high resolution (VHR)
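As a rough illustration of these thresholds, here is a minimal sketch in Python (the class boundaries are indicative, as noted above, and the function name is hypothetical):

```python
# Map a pixel size in metres to the resolution classes above.
# The boundaries are indicative, not fixed.
def resolution_class(pixel_size_m: float) -> str:
    if pixel_size_m < 1:
        return "very high resolution (VHR)"
    if pixel_size_m <= 10:
        return "high resolution"
    if pixel_size_m < 1000:
        return "medium resolution"
    return "low resolution"

print(resolution_class(0.3))  # very high resolution (VHR)
print(resolution_class(300))  # medium resolution
```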
While high-resolution and VHR cameras enable the detection of very small objects on the Earth's surface and the creation of topographic maps, low-resolution sensors cover vast territories daily (or even every 10 minutes, as geostationary meteorological satellites do).
This leads us to temporal resolution: the frequency of image updates, or revisit time. This frequency depends on orbit geometry, sensor field of view, and local conditions (for example, no optical images in the visible part of the spectrum are taken during the polar night).
Optical sensors are designed to capture radiation in pre-defined spectral intervals, called bands or channels. The image below shows the distribution of spectral bands across the optical part of the spectrum for Sentinel-2 and Landsat 7 & 8. You may notice that spectral bands are designed to ensure 1) differentiation between various objects and/or the study of their properties, and 2) transparency of the Earth's atmosphere for outgoing radiation. The human eye sees the interval between 380 and 780 nm (from violet to red). Satellites usually have at least three bands in this interval: Blue, Green, and Red, which are combined to create a "true-colour" view.
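As an illustration, a true-colour view is assembled by stacking the Red, Green, and Blue bands. Below is a minimal sketch with rasterio and numpy, assuming Sentinel-2 single-band files; the file names and the display stretch value are assumptions:

```python
import numpy as np
import rasterio

def read_band(path):
    # Each Sentinel-2 band ships as a single-band raster.
    with rasterio.open(path) as src:
        return src.read(1).astype(np.float32)

blue = read_band("B02.jp2")   # Blue, ~490 nm
green = read_band("B03.jp2")  # Green, ~560 nm
red = read_band("B04.jp2")    # Red, ~665 nm

# Stack as RGB and stretch the digital numbers to 0-1 for display;
# 3000 is a commonly used stretch for Sentinel-2, not a standard.
rgb = np.clip(np.dstack([red, green, blue]) / 3000.0, 0, 1)
```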
Another group of imagery we usually work with is SAR (Synthetic Aperture Radar). SAR differs greatly from optical imagery in how images are formed, in wavelengths, and in viewing geometry. This makes it more challenging to use, but it offers some advantages.
While optical sensors measure sunlight reflected from the Earth's surface, a radar emits a pulse of microwave energy (wavelengths from centimetres to tens of centimetres) and detects its reflection from objects. This means that a) SAR sensors are not light-dependent, as they register their own emitted signal; and b) microwave signals can pass through clouds, as the wavelengths are longer than the size of water droplets.
Image formation
Radar registers three parameters:
- Amplitude – the strength of the reflected echo. Amplitude changes due to:
  - Object size and orientation relative to the signal
  - Surface roughness
  - Dielectric properties of the surface
  - Incidence angle
  - Polarisation
  - Frequency or wavelength
- Signal phase
- The time between the pulse being generated and the echo being received
The last parameter forces the sensor to look sideways: otherwise, two objects at the same distance from the sensor would return the signal at the same time, making it impossible to differentiate them and therefore to form an image.
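The distance itself follows from that travel time. As a standard relation (not specific to any single sensor), the slant range $R$ to an object is

$$R = \frac{c \, \Delta t}{2}$$

where $c$ is the speed of light and $\Delta t$ is the time between pulse transmission and echo reception; the factor of 2 accounts for the round trip.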
Sideways-looking geometry plus the physical properties of the radar wave result in geometric distortions known as Layover and Foreshortening.
Another feature of SAR images is Speckle: a specific noise-like pattern that results from the interference of the many scattering echoes within a resolution cell.
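Speckle is commonly reduced with spatial filtering. Here is a minimal sketch of one classic approach, a basic Lee filter; the window size and the input array are assumptions, and operational toolchains such as ESA's SNAP offer more refined filters:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img: np.ndarray, size: int = 7) -> np.ndarray:
    # Local mean and variance within the moving window.
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = sq_mean - mean * mean
    # Weight pixels by local texture: output stays near the local mean
    # in flat areas (suppressing speckle) and near the original value
    # at edges and point targets.
    noise_var = np.mean(var)  # crude estimate of the speckle variance
    weight = var / (var + noise_var)
    return mean + weight * (img - mean)
```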
Email magic@bas.ac.uk for any imagery request
View and download data
Polar View (https://www.polarview.aq/) – latest SAR (Sentinel-1 and RCM) data
Copernicus Browser (https://browser.dataspace.copernicus.eu/) – optical and SAR medium-resolution images
EO Browser (https://apps.sentinel-hub.com/eo-browser/) – optical and SAR medium-resolution images
USGS EarthExplorer (https://earthexplorer.usgs.gov/)
Any online maps (Google, Bing, ESRI) – may contain outdated images
For VHR optical or SAR images, email magic@bas.ac.uk
| Sensor | Spectral bands (number / interval, nm) | Spatial resolution, m | Revisit time (in December over South Rothera) |
|---|---|---|---|
| Sentinel-3 (OLCI) | 21 / 400-1020 | 300 | 3 times a day |
| MODIS (Terra & Aqua) | 36 / 405-965, 3000-15000 | 250, 500, 1000 | 2 times a day |
| Sentinel-2 | 13 / 443-2190 | 10, 20, 60 | once in 3 days |
| Landsat 8 & 9 | 11 / 430-1380, 10600-12510 | 15, 30, 100 | once in 3 days |
| Maxar WorldView-3 | 16 / 400-1040, 1195-2365 | 0.3, 1.24, 3.70 | |
| Airbus Pleiades | 6 / 400-880 | 0.3, 1.2 | |