Ops-EO Extension

Extension to GIS for Operations training, with a focus on satellite data.

What is Remote Sensing?

Remote sensing is the process of detecting and monitoring the physical characteristics of an object by measuring it at a distance. The measurements are usually of reflected and emitted radiation. For optical imagery, this amounts to detecting colour and brightness.

The simplest example of remote sensing is taking photos of the Earth's surface from an aircraft or a satellite (with a huge camera).

The most familiar use of satellite imagery is probably the Satellite view in Google Maps.

What types of imagery do we usually use?

The most common approach is to divide sensors into passive sensors, which capture signals emitted or reflected by an object (the way the human eye or a phone camera works), and active sensors, which register the backscattering of their own emitted signal (the way bats or dolphins do). Among the first group, optical sensors are the most popular; among the second, you may encounter radar, in particular Synthetic Aperture Radar (SAR), LiDAR, laser altimetry, and other technologies.

When choosing data for your project, besides the sensor type, you should consider three resolutions: spectral, spatial, and temporal.

In terms of spectral resolution, among optical sensors we usually differentiate multispectral sensors, which may have tens of broad (tens to hundreds of nanometres wide) spectral bands, and hyperspectral sensors, which can have hundreds of very narrow (2-10 nanometres) spectral bands. Radar satellites likewise operate on different bands (frequencies). A given SAR instrument uses a single frequency, but can acquire various combinations of polarisations.
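For a feel of what "broad" multispectral bands look like in practice, here is a rough Python sketch of a few Sentinel-2 visible and near-infrared bands. The band names are real Sentinel-2 designations, but the centre and width values below are approximate; check the mission handbook for exact figures.

```python
# Approximate Sentinel-2 MSI band centres and widths (nanometres).
# Values are illustrative; consult the official mission documentation
# for exact per-satellite figures.
SENTINEL2_BANDS = {
    "B02 (Blue)":  {"centre_nm": 490, "width_nm": 65},
    "B03 (Green)": {"centre_nm": 560, "width_nm": 35},
    "B04 (Red)":   {"centre_nm": 665, "width_nm": 30},
    "B08 (NIR)":   {"centre_nm": 842, "width_nm": 115},
}

for band, spec in SENTINEL2_BANDS.items():
    print(f"{band}: {spec['centre_nm']} nm centre, ~{spec['width_nm']} nm wide")
```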

Another way to classify sensors is by the spatial resolution of their images (regardless of whether the image was formed by an active or a passive sensor). The boundaries of the following classes are not fixed and tend to shift over time as new sensors are developed:

  • Kilometres – low resolution

  • Tens of metres – medium resolution

  • Up to 10 m – high resolution

  • Less than 1 m – very high resolution (VHR)

While high-resolution and VHR cameras allow the detection of very small objects on the Earth's surface and the creation of topographic maps, low-resolution sensors survey vast territories daily (or even every 10 minutes, as geostationary meteorological satellites do).
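The rough thresholds from the list above can be encoded as a small helper. This is a hypothetical classify_resolution function for illustration only; as noted, the class boundaries are indicative, not standardised.

```python
def classify_resolution(gsd_m: float) -> str:
    """Map a ground sample distance (metres per pixel) to the rough
    resolution classes listed above. Boundaries are indicative only."""
    if gsd_m >= 1000:
        return "low resolution"
    if gsd_m > 10:
        return "medium resolution"
    if gsd_m >= 1:
        return "high resolution"
    return "very high resolution (VHR)"

print(classify_resolution(10))   # "high resolution" (e.g. Sentinel-2 10 m bands)
print(classify_resolution(0.3))  # "very high resolution (VHR)"
```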

This leads us to temporal resolution: the frequency of image updates, or revisit time. This frequency depends on orbit geometry, sensor field of view, and local conditions (for example, no optical images in the visible part of the spectrum are taken during the polar night).
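If you have a list of acquisition dates over an area of interest, the mean revisit time can be estimated directly. A minimal sketch with made-up dates:

```python
from datetime import date

# Hypothetical acquisition dates over one area of interest.
acquisitions = [date(2024, 1, 1), date(2024, 1, 6),
                date(2024, 1, 11), date(2024, 1, 16)]

# Day gaps between consecutive acquisitions.
gaps = [(b - a).days for a, b in zip(acquisitions, acquisitions[1:])]
print(f"Mean revisit time: {sum(gaps) / len(gaps):.1f} days")  # 5.0 days
```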

Optical Remote Sensing and Passive Sensors

Optical sensors are designed to capture radiation in pre-defined spectral intervals, called bands or channels. The image below shows the distribution of spectral bands across the optical part of the spectrum for Sentinel-2 and Landsat 7 & 8. You may notice that the spectral bands are designed to ensure 1) differentiation between various objects and/or study of their properties, and 2) transparency of the Earth's atmosphere for outgoing radiation. The human eye sees the interval between 380 and 780 nm (from violet to red). Satellites usually have at least three bands in this interval (Blue, Green, and Red), which are combined to create a "true-colour" view.
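As a sketch of how a true-colour view is assembled, the snippet below stacks the red, green, and blue bands of a Sentinel-2 scene into one RGB array using rasterio. The band names B04/B03/B02 are real Sentinel-2 designations; the file paths are placeholders.

```python
import numpy as np
import rasterio

# Hypothetical single-band GeoTIFFs for the Red, Green, and Blue bands.
paths = ["B04_red.tif", "B03_green.tif", "B02_blue.tif"]

bands = []
for path in paths:
    with rasterio.open(path) as src:
        bands.append(src.read(1).astype(np.float32))

# Stack into an (rows, cols, 3) RGB array.
rgb = np.dstack(bands)

# Contrast stretch: rescale the 2nd-98th percentile range to 0-1.
p2, p98 = np.percentile(rgb, (2, 98))
rgb = np.clip((rgb - p2) / (p98 - p2), 0, 1)
```

The percentile stretch is a common display trick: raw reflectance values are too dark to view directly, so they are rescaled to fill the displayable range.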

Optical sensor features

Radar Remote sensing and SAR

Another group of imagery we usually work with is SAR (Synthetic Aperture Radar). SAR differs greatly from optical imaging in how images are formed, in the wavelengths used, and in the viewing geometry. This makes it more challenging to use, but it also brings some advantages.

Active sensors

While optical sensors measure solar radiation reflected from the Earth's surface, a radar emits a pulse of microwave energy (wavelengths from centimetres to tens of centimetres) and detects its reflection from objects. This means that a) SAR sensors are not light-dependent, as they register their own emitted signal; and b) microwave signals pass through clouds, as the wavelengths are longer than the size of water droplets.

Wavelengths
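Radar bands (X, C, L, and so on) are defined by their carrier frequency, and the corresponding wavelength follows from λ = c / f. A minimal sketch, using Sentinel-1's published C-band centre frequency of about 5.405 GHz:

```python
C = 3.0e8  # speed of light, m/s

def wavelength_cm(frequency_hz: float) -> float:
    """Convert a radar carrier frequency to wavelength in centimetres."""
    return C / frequency_hz * 100

# Sentinel-1 operates in C-band at roughly 5.405 GHz.
print(f"{wavelength_cm(5.405e9):.2f} cm")  # ~5.55 cm
```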

Image formation

Radar registers three parameters:

  • Amplitude – the strength of the reflected echo. Amplitude changes due to:

    • Object size and orientation relative to the signal

    • Surface roughness

    • Dielectric properties of the surface

    • Incidence angle

    • Polarization

    • Frequency or Wavelength

  • Signal phase

  • The time between pulse transmission and echo reception

The last parameter is why the sensor must look sideways. Otherwise, two objects at the same distance from the sensor, one on either side of the ground track, would send their signals back at the same time. This would make it impossible to differentiate them, and therefore to form an image.
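The range measurement itself is simple arithmetic: the pulse travels to the target and back at the speed of light, so the one-way (slant) distance is c·Δt / 2. A minimal sketch with a made-up echo delay:

```python
C = 3.0e8  # speed of light, m/s

def slant_range_m(echo_delay_s: float) -> float:
    """Sensor-to-target distance from the round-trip echo delay.
    The pulse travels out and back, hence the division by two."""
    return C * echo_delay_s / 2

# A hypothetical echo received 5.6 ms after transmission:
print(f"{slant_range_m(5.6e-3) / 1000:.0f} km")  # 840 km
```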

The side-looking geometry, combined with the physical properties of the radar wave, results in geometric distortions known as layover and foreshortening.

Another feature of SAR images is speckle: a noise-like pattern that results from interference between the many scattering echoes within a resolution cell.
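A standard first remedy for speckle is multi-look (boxcar) averaging, which trades spatial resolution for reduced noise variance. The sketch below applies it to a simulated intensity image; the exponential noise is a common single-look speckle model, and the whole example is illustrative rather than a physical SAR simulation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Simulated single-look intensity image: a uniform scene corrupted by
# multiplicative, exponentially distributed speckle (illustrative only).
rng = np.random.default_rng(0)
clean = np.ones((256, 256))
speckled = clean * rng.exponential(scale=1.0, size=clean.shape)

# Boxcar (multi-look) averaging: each output pixel is the mean of a
# 5x5 window, smoothing speckle at the cost of spatial resolution.
filtered = uniform_filter(speckled, size=5)

print(speckled.std(), filtered.std())  # standard deviation drops sharply
```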

Where do I get a satellite image?

Email magic@bas.ac.uk for any imagery request
