Ops-EO Extension

Extension to GIS for Operations training with a focus on satellite data.


Last updated 9 months ago

What is Remote Sensing?

Remote sensing is the process of detecting and monitoring the physical characteristics of an object by measuring them at a distance. The measured quantities are usually reflected and emitted radiation. For optical imagery, this means detecting colour and brightness.

The simplest example of remote sensing is taking photos of the Earth's surface from an aircraft or a satellite (with a huge camera).

An example of the most popular use case of satellite imagery would be Google Maps Satellite view.

What types of imagery do we usually use?

When choosing data for your project, besides the sensor type, you should consider three resolutions: spectral, spatial, and temporal.

The most common way is to divide sensors into passive ones, which capture signals emitted or reflected from an object (the way the human eye or a phone camera works), and active ones, which register the backscattering of their own emitted signal (like bats or dolphins). Among the first group, optical sensors are the most popular; among the second, you may encounter Radar, or Synthetic Aperture Radar (SAR), LiDAR (laser), Laser Altimetry, and other technologies.

In terms of spectral resolution, among optical sensors we usually differentiate multispectral sensors, which can have tens of broad (tens to hundreds of nanometres wide) spectral bands, and hyperspectral sensors, which can have hundreds of very narrow (2-10 nanometres) spectral bands. Radar satellites also operate on different bands (frequencies). A SAR has only one frequency, but various combinations of polarisations.

Another way to classify sensors is by the spatial resolution of their images (regardless of whether the image was formed by an active or a passive sensor). The boundaries between the following classes are not fixed and shift over time as new sensors are developed:

  • Kilometers – low resolution

  • Tens of meters – medium resolution

  • Up to 10 m – high resolution

  • Less than 1 m – very high resolution (VHR)

While high-resolution and VHR cameras allow the detection of very small objects on the Earth's surface and the creation of topographic maps, low-resolution sensors overview vast territories daily (or even every 10 minutes, as geostationary meteorological satellites do).

This leads us to temporal resolution - the frequency of image updates, or revisit time. This frequency depends on orbit geometry, sensor field of view, and local conditions (for example, no optical images in the visible part of the spectrum are taken during the polar night).

Optical Remote Sensing and Passive Sensors

Optical sensors are designed to capture radiation in pre-defined spectral intervals, called bands or channels. The image below shows the distribution of spectral bands across the optical part of the spectrum for Sentinel-2 and Landsat 7 & 8. You may notice that the spectral bands are designed to ensure 1) differentiation between various objects and/or the study of their properties, and 2) transparency of the Earth's atmosphere for outgoing radiation. The human eye sees the interval between 380 and 780 nm (from violet to red). Satellites usually have at least three bands in this interval - Blue, Green, and Red - to create a "true-colour" view.
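
As a sketch of how a true-colour view is assembled, the three bands can be stacked into one RGB array. This assumes you already have the red, green, and blue bands as NumPy arrays of surface reflectance in [0, 1]; the stretch value is an illustrative choice, not a fixed standard:

```python
import numpy as np

def true_colour(red, green, blue, reflectance_max=0.3):
    """Stack red, green and blue band arrays into an 8-bit RGB image.

    Values are stretched so that `reflectance_max` maps to full
    brightness, which brightens typically dark land surfaces.
    """
    rgb = np.stack([red, green, blue], axis=-1)
    rgb = np.clip(rgb / reflectance_max, 0.0, 1.0)
    return (rgb * 255).astype(np.uint8)

# Tiny synthetic 2x2 bands standing in for the Red, Green and Blue channels.
red = np.array([[0.05, 0.30], [0.15, 0.60]])
green = np.array([[0.04, 0.25], [0.12, 0.55]])
blue = np.array([[0.03, 0.20], [0.10, 0.50]])

img = true_colour(red, green, blue)
print(img.shape)  # (2, 2, 3)
```

In practice the bands would come from files (e.g. via a raster library) rather than hand-typed arrays, but the stacking and stretching steps are the same.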

Optical sensor features

| Sensor | Spectral bands (number / spectral interval, nm) | Spatial resolution, m | Revisit time (in December over South Rothera) |
| --- | --- | --- | --- |
| Sentinel-3 (OLCI) | 21 / 400-1020 | 300 | 3 times a day |
| MODIS (Terra & Aqua) | 36 / 405-965, 3000-15000 | 250, 500, 1000 | 2 times a day |
| Sentinel-2 | 13 / 443-2190 | 10, 20, 60 | once in 3 days |
| Landsat 8 & 9 | 11 / 430-1380, 10600-12510 | 15, 30, 100 | once in 3 days |
| Maxar WorldView-3 | 16 / 400-1040, 1195-2365 | 0.3, 1.24, 3.70 | |
| Airbus Pleiades | 6 / 400-880 | 0.3, 1.2 | |
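
As a rough illustration, the finest resolutions from the table can be turned into a helper that shortlists sensors for a required level of detail. The names and values are transcribed from the table above; the threshold logic is a simplification, since band-dependent resolutions, coverage, and tasking are ignored:

```python
# Finest spatial resolution (m) per sensor, transcribed from the table above.
sensors = {
    "Sentinel-3 (OLCI)": 300,
    "MODIS (Terra & Aqua)": 250,
    "Sentinel-2": 10,
    "Landsat 8&9": 15,
    "Maxar WorldView-3": 0.3,
    "Airbus Pleiades": 0.3,
}

def usable_for(detail_m):
    """Sensors whose finest band resolves features of roughly `detail_m` metres."""
    return sorted(name for name, res in sensors.items() if res <= detail_m)

print(usable_for(10))  # candidates when <= 10 m pixels are required
```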

Radar Remote Sensing and SAR

Another group of imagery we usually work with is SAR (Synthetic Aperture Radar). SAR differs greatly from optical imagery in how images are formed, in wavelengths, and in viewing geometry. This makes it more challenging to use, but it has some important advantages.

Active sensors

While optical sensors measure sunlight reflected from the Earth's surface, a radar emits a pulse of microwave energy (wavelengths from centimetres to tens of centimetres) and detects its reflection from objects. This means that a) SAR sensors are not light-dependent, as they register their own emitted signal; and b) microwave signals pass through clouds, as their wavelengths are longer than the size of water droplets.

Wavelengths
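
Radar bands are described interchangeably by wavelength or frequency, related by f = c / λ. A small sketch with illustrative centre wavelengths for common SAR bands (the exact values vary by mission, so treat these numbers as representative only):

```python
# Convert representative SAR wavelengths to frequencies via f = c / lambda.
C = 299_792_458.0  # speed of light, m/s

# Illustrative centre wavelengths for common SAR bands (metres).
bands = {"X": 0.031, "C": 0.055, "S": 0.094, "L": 0.23}

for name, wavelength in bands.items():
    freq_ghz = C / wavelength / 1e9
    print(f"{name}-band: {wavelength * 100:.1f} cm -> {freq_ghz:.2f} GHz")
```

Longer wavelengths (L-band) penetrate vegetation and dry snow more deeply, while shorter ones (X-band) give finer surface detail.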

Image formation

A radar registers three parameters:

  • Amplitude - the strength of the reflected echo. Amplitude changes due to:

    • Object size and orientation to signal

    • Surface roughness

    • Dielectric properties of the surface

    • Incidence angle

    • Polarization

    • Frequency or Wavelength

  • Signal phase

  • The time between the pulse being emitted and the echo being received

The latter parameter is why the sensor has to look sideways: for a straight-down-looking radar, two objects at the same distance from the sensor, one on each side of the ground track, would return the signal at the same time. This would make it impossible to differentiate them and, therefore, to form an image.
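
The echo delay translates directly into distance: the pulse travels out and back, so the slant range is R = c * t / 2. A minimal sketch (the delay value is hypothetical, chosen to resemble a typical satellite geometry):

```python
# Slant range from echo delay: the pulse travels to the target and back,
# so the one-way distance is R = c * t / 2.
C = 299_792_458.0  # speed of light, m/s

def slant_range(echo_delay_s):
    """One-way distance to the target for a given round-trip delay."""
    return C * echo_delay_s / 2.0

# Hypothetical delay of ~5.6 ms, giving a range of roughly 840 km.
delay = 5.6e-3
print(f"{slant_range(delay) / 1000:.0f} km")
```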

The sideways-looking geometry, combined with the physical properties of the radar wave, results in geometric distortions known as Layover and Foreshortening.

Another feature of SAR images is Speckle - a specific noise-like pattern that results from interference between the many scattered echoes within a resolution cell.
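
Speckle is commonly reduced by averaging neighbouring pixels (multilooking), trading spatial resolution for radiometric stability. A minimal boxcar-filter sketch on a synthetic amplitude image; the gamma-distributed scene is only a stand-in for real SAR data:

```python
import numpy as np

def boxcar_filter(amplitude, size=3):
    """Average each pixel with its (size x size) neighbourhood.

    A boxcar (moving-average) filter is the simplest speckle reduction:
    averaging independent looks lowers the noise variance at the cost of
    spatial resolution. Edges are handled by padding with the edge value.
    """
    pad = size // 2
    padded = np.pad(amplitude, pad, mode="edge")
    out = np.empty_like(amplitude, dtype=float)
    rows, cols = amplitude.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = padded[i:i + size, j:j + size].mean()
    return out

# Noisy constant scene: multiplicative speckle-like noise around a mean of 1.
rng = np.random.default_rng(0)
scene = rng.gamma(shape=4.0, scale=0.25, size=(64, 64))
smoothed = boxcar_filter(scene)
print(scene.std(), smoothed.std())  # the filtered image varies less
```

Real SAR processing chains use more sophisticated filters (e.g. Lee or Frost), which preserve edges better than a plain boxcar.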

Where do I get a satellite image?

Email magic@bas.ac.uk for any imagery request

  • View and download data:

    • Polar View (https://www.polarview.aq/) - latest SAR (Sentinel-1 and RCM) data

    • Copernicus Browser (https://browser.dataspace.copernicus.eu/) - optical and SAR medium-resolution images

    • EO-browser (https://apps.sentinel-hub.com/eo-browser/) - optical and SAR medium-resolution images

    • USGS viewer (https://earthexplorer.usgs.gov/)

  • Any online maps (Google, Bing, ESRI) - may contain outdated images

  • For VHR optical or SAR images, email magic@bas.ac.uk


[Image: Digital Globe's WorldView-4 at Lockheed Martin Space Systems' Sunnyvale, California, facility. Credit: Lockheed Martin]
[Image: Satellite view of BAS, Cambridge site. Credit: Google Maps]
[Image: View from the geostationary meteorological Himawari satellite. Credit: NICT, Japan]
[Image: Electromagnetic spectrum. Credit: NASA ARSET]
[Image: Down-Looking vs. Side-Looking Radar. Credit: NASA ARSET]
[Image: SAR image, representing Foreshortening, Layover and Shadow effects]
[Image: Optical image]
[Image: Spectral bands of Sentinel-2 and Landsat 7 & 8. Spectral signatures of vegetation, soil, and water are depicted in green, red, and blue respectively. Credit: DOI 10.13140/RG.2.2.20077.54245]
[Image credit: NASA EarthData]