
# The Resolutions – Dimensions of EO Data

## Spatial Resolution

### Learning objectives of this topic

• Definition of spatial resolution
• Impact factors of spatial resolution
• Comparison of resolutions
• The mixed pixel problem

First, we will introduce the most commonly discussed resolution, which describes a sensor's ability to monitor spatial detail: the spatial resolution. We will also look at which surface characteristics are combined in the signal return of an individual pixel.

#### What is spatial resolution?

Generally, when we talk about spatial resolution, we describe the smallest angular (horizontal) separation between two objects that a sensor can distinguish. In remote sensing data, the surface is represented by pixels or cells, usually square or rectangular in shape, whose size is given in meters. However, spatial resolution must not be confused with pixel resolution; these are two terms that cannot be used interchangeably.
A remote sensing instrument with a pixel size of 10 m (such as Sentinel-2) will consequently collect information in cells that cover an area of 10 by 10 m on the ground, but its actual spatial resolution may differ from that value.

#### Spatial resolution does not equal pixel resolution

The terms ‘spatial resolution’ and ‘pixel resolution’ are often used interchangeably. However, this is not correct. Keep in mind that the pixel size defines the spatial extent covered by a single pixel of the given sensor. The value stored in that pixel, on the other hand, may also represent statistics of neighboring cells and local phenomena, so the effective spatial resolution can be coarser than the pixel size.

#### Dependencies of spatial resolution

How much detail a sensor can provide in terms of spatial extent is dependent on different parameters. Generally, if a platform is at greater distance (higher altitude) from the Earth’s surface, it is able to view larger areas (swath width) while the spatial resolution decreases. For example, ESA’s Sentinel-2 monitors our planet at max. 10 m resolution with a swath width of 290 km. In contrast, NASA’s MODIS instrument reaches cell sizes of max. 250 m with a much greater swath width of 2,330 km.

The Instantaneous Field of View (IFOV) is the key variable determining the spatial resolution of passive sensors. It is often described as a cone characterized by the viewing angle of the sensor, which monitors a certain portion of the surface. This angle multiplied by the distance between ground and sensor yields the size of the resolution cell.
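This small-angle relation can be sketched in a few lines of Python. The IFOV and altitude values below are illustrative, not the specification of any real sensor:

```python
def resolution_cell_size(ifov_mrad: float, altitude_km: float) -> float:
    """Approximate ground resolution cell size (in metres) for a passive
    sensor with the given IFOV (in milliradians) at the given altitude (km).

    For small angles, cell size ≈ IFOV (in radians) × distance to the ground.
    """
    ifov_rad = ifov_mrad / 1000.0
    altitude_m = altitude_km * 1000.0
    return ifov_rad * altitude_m

# Illustrative numbers: a 0.1 mrad IFOV at 800 km altitude
# yields a resolution cell of roughly 80 m.
print(resolution_cell_size(0.1, 800))
```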

In some cases, the ground area represented by a single pixel in a remote sensing data set relates closely to the spatial resolution of the sensing instrument. However, the two parameters can also differ from each other. A possible reason for this is manipulation applied to the data between the image’s acquisition and its delivery, such as resampling procedures.
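To see why resampling changes the pixel size without adding spatial detail, consider this minimal sketch (pure Python, toy values):

```python
def upsample_nearest(grid, factor):
    """Nearest-neighbour resampling: each original cell is repeated
    factor × factor times. The pixel size shrinks, but every new pixel
    carries a value copied from the original grid, so no new spatial
    detail is created."""
    return [
        [grid[i // factor][j // factor]
         for j in range(len(grid[0]) * factor)]
        for i in range(len(grid) * factor)
    ]

# A 2 × 2 grid of "10 m" pixels resampled to a 4 × 4 grid of "5 m" pixels:
coarse = [[1, 2],
          [3, 4]]
fine = upsample_nearest(coarse, 2)
for row in fine:
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

The resampled product has four times as many pixels, yet its spatial resolution is still that of the original 2 × 2 grid.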

#### Comparison of spatial resolutions

In the interactive element below, you can slide through the variations of (most) commonly used sensors. They visualize a part of the Skukuza rest camp, which is located in the Kruger National Park, South Africa. Explore how details get mixed or disappear completely.

In recent years, more high-resolution data became freely available and this trend will prevail in the upcoming decades. While low-resolution (coarse) data sets have dominated the past, high-resolution (fine) data will be the go-to input information for state-of-the-art remote sensing science, also taking into account the steadily improving processing possibilities on personal computers and cloud servers.

#### Mixed pixel problem

Mixed pixels represent a classic problem in remote sensing applications, as their occurrence heavily depends on the pixel size we are looking at. With greater pixel size (coarser resolution), reflections from more surrounding objects are combined into one resolution cell. Hence, the cell we are looking at will exhibit image statistics mixed from a variety of neighboring surface objects. Have a look at the image slider below and see how lowering the spatial resolution leads to mixed pixels containing less detail. Note how some of the small buildings get mixed up with the surrounding trees in the 30 m imagery.
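The mixing effect can be simulated by aggregating a fine-resolution grid into coarser cells. The reflectance values below are hypothetical:

```python
def block_average(grid, factor):
    """Aggregate a fine-resolution grid into coarser cells by averaging
    factor × factor blocks: a simple sketch of how reflectance from
    several surface objects is mixed into one resolution cell."""
    size = len(grid) // factor
    coarse = []
    for bi in range(size):
        row = []
        for bj in range(size):
            block = [grid[bi * factor + i][bj * factor + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

# Hypothetical reflectances: a bright building (0.8) surrounded by trees (0.2).
scene = [[0.2, 0.2, 0.2, 0.2],
         [0.2, 0.8, 0.8, 0.2],
         [0.2, 0.8, 0.8, 0.2],
         [0.2, 0.2, 0.2, 0.2]]
# At coarse resolution the whole scene collapses into one mixed pixel
# whose value (about 0.35) belongs to neither the building nor the trees.
print(block_average(scene, 4))
```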

If objects are larger than, or at least equal to, the spatial resolution of the respective sensor, they will be mapped in our data sets. There is also the chance that the brightness of an object smaller than the cell size (e.g., a building) is high enough to impact the cell statistics. If this is the case, analysis could be carried out on a sub-pixel domain to enable the identification of parts of a resolution cell.

Earth Lab (2020). Introduction to Spatial and Spectral Resolution: Multispectral Imagery. <https://www.earthdatascience.org/courses/earth-analytics/multispectral-remote-sensing-data/introduction-multispectral-imagery-r/>

Elachi, C. & van Zyl, J. (2015²). Introduction to the Physics and Techniques of Remote Sensing. Hoboken, USA: John Wiley & Sons, Inc.

Jensen, J.R. (2007²). Remote Sensing of the Environment. An Earth Resource Perspective. Upper Saddle River, USA: Pearson Prentice Hall.

Rees, W.G. (2010²). Physical Principles of Remote Sensing. Cambridge, USA: Cambridge University Press.

Schowengerdt, R.A. (2007³). Remote Sensing. Models and Methods for Image Processing. San Diego, USA: Academic Press.

## Temporal Resolution

### Learning objectives of this topic

• Definition of temporal resolution
• The importance of high temporal repetition
• Satellite constellations and their impact on the temporal resolution

After you have learned about the meaning of spatial resolution and pixel size, this topic is dedicated to the temporal domain of remote sensing data acquisition. The temporal resolution is a factor that can make up for a lower spatial resolution in some cases (depending on the application), since the temporal signature of time series has a lot to offer. Let’s find out why.

#### What is temporal resolution?

The temporal resolution describes the time interval between two overpasses of a given point. Often it is also referred to as the ‘revisit time’ or ‘repetition rate’. The duration of this time span can vary between multiple acquisitions per hour (EUMETSAT) up to a single acquisition every two (Landsat) or even four weeks (SPOT). A major difference to the spatial domain is that the temporal resolution is not solely dependent on the sensor, but on the satellite platform that the sensor is mounted on.

The repeating coverage of the identical location on the Earth’s surface is an extremely valuable source of information. Through regular visits, it is possible to identify temporal signatures that make every target unique.

The figure above displays the backscatter values of a Sentinel-1 radar image over time. You can see that the three different land cover/use types vary significantly with regards to the measured backscatter intensity throughout the year. Here, the big advantage of multitemporal information can be demonstrated. When not solely looking at a single time step, but the dynamic for a class or a pixel, we are able to distinguish land cover/use types, which appear extremely similar for certain times of the year.

#### Temporal dynamics

In the image on the lower left, you can see the significant changes occurring in Ouargla, Algeria. Water levels change drastically as evaporation leads to water loss at the ground. While rains bring the water back to this oasis during the recurring wet seasons, water availability is a crucial parameter for survival in this area. Satellites with high temporal resolution (Sentinel-2, in this case) can help to warn locals of impending water shortages. Near the Equator, Sentinel-2 reaches revisit rates of up to 5 days, allowing the observation of rapidly occurring changes. Satellites with lower temporal resolution (optical or SAR) are, therefore, less suited for disaster monitoring.

Another great application of high-resolution (spatial and temporal) Sentinel-2 data can be seen in the lower right. Pivot irrigation systems are a very effective way to cultivate areas that are naturally not suited for intensive agriculture. The pivots have a strong intra- and interannual range in vegetation cover. Located in the Saudi Arabian desert, these installations can be seen from space!

#### How do satellite constellations impact the revisit rate?

The temporal resolution of a satellite depends on a number of factors. Polar-orbiting satellites (orbital period ≠ Earth rotation) take more time to revisit the same area than geostationary satellites, whose orbital period equals the Earth’s rotation and which therefore view the same region continuously. Most Earth observation satellites follow quasi-polar orbits, whereas geostationary acquisition is used by weather satellites such as EUMETSAT’s Meteosat and NOAA’s GOES. Satellites at higher flight altitudes view wider swaths and can therefore cover a given point more frequently; the trade-off is a coarser spatial resolution. Newer generations of satellites, such as the Sentinel fleet, operate in constellations, increasing the temporal resolution by using several satellites of identical design. For example, the launch of Sentinel-1B in April 2016 halved the revisit rate of Sentinel-1 from 12 to 6 days at the Equator. This trend is currently drastically increasing the number of satellites in space.
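The effect of an evenly phased constellation on the revisit rate can be sketched as a simple division. This is a simplification: real revisit times also depend on latitude and swath overlap.

```python
def constellation_revisit(single_satellite_repeat_days: float,
                          n_satellites: int) -> float:
    """Revisit time for n identical, evenly phased satellites sharing
    one orbit: the single-satellite repeat cycle divided by n.
    (Simplified: latitude and swath overlap are ignored.)"""
    return single_satellite_repeat_days / n_satellites

# Sentinel-1: one satellite repeats every 12 days at the Equator;
# adding Sentinel-1B halved that to 6 days.
print(constellation_revisit(12, 1))  # 12.0
print(constellation_revisit(12, 2))  # 6.0
```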

Dostálová, A., Lang, M., Ivanovs, J., Waser, L.T. & Wagner, W. (2021). European Wide Forest Classification Based on Sentinel-1 Data. Remote Sensing 13, 337. <https://doi.org/10.3390/rs13030337>.

Elachi, C. & van Zyl, J. (2015²). Introduction to the Physics and Techniques of Remote Sensing. Hoboken, USA: John Wiley & Sons, Inc.

Jensen, J.R. (2007²). Remote Sensing of the Environment. An Earth Resource Perspective. Upper Saddle River, USA: Pearson Prentice Hall.

Künzer, C., Dech, S. & Wagner, W. (2015). Remote Sensing Time Series. Revealing Land Surface Dynamics. Berlin, Germany: Springer.

Planet Labs (2021). Our Approach. <https://www.planet.com/company/approach/>.

Rees, W.G. (2010²). Physical Principles of Remote Sensing. Cambridge, USA: Cambridge University Press.

Schowengerdt, R.A. (2007³). Remote Sensing. Models and Methods for Image Processing. San Diego, USA: Academic Press.

## Spectral Resolution

### Learning objectives of this topic

• Definition of the spectral domain
• Multispectral vs. hyperspectral remote sensing
• Meaning of spectral signatures
• Importance of atmospheric windows

Moving on in the world of remote sensing domains, we will introduce another important variable, which is used to characterize a sensor. The spectral resolution can be seen as the ability of the sensor’s eye – the instrument’s measuring capabilities. Let’s start with a short video giving you an overview of spectral resolution and sensors that cover this domain in different ways.

#### What is spectral resolution?

The spectral resolution of a given remote sensing sensor describes its ability to monitor the Earth’s surface at specific wavelengths. A sensor with a finer spectral resolution provides narrower spectral bands (less wavelength range covered per individual band). The range of spectral resolution varies between panchromatic (a single, quite wide band) and hyperspectral (hundreds to thousands of very narrow bands) sensors. In between these, multispectral instruments can be found. Take a look at the image below to understand how the number of bands influences the way a given area is monitored and the signatures that are created.

With higher spectral resolution, a sensor can store more bands, containing gray values representing a greater number of wavelength parts.
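The relation between many narrow hyperspectral channels and a few broad multispectral bands can be sketched by averaging a finely sampled spectrum into wider bands. The spectrum and band edges below are hypothetical:

```python
def aggregate_bands(wavelengths_nm, reflectance, band_edges_nm):
    """Average a finely sampled spectrum into broader bands, mimicking
    how a multispectral sensor summarizes what a hyperspectral sensor
    records as many narrow channels."""
    bands = []
    for lo, hi in band_edges_nm:
        values = [r for w, r in zip(wavelengths_nm, reflectance)
                  if lo <= w < hi]
        bands.append(sum(values) / len(values))
    return bands

# Hypothetical spectrum sampled every 10 nm between 500 and 600 nm:
wl = list(range(500, 600, 10))      # 10 narrow "hyperspectral" samples
refl = [0.1, 0.1, 0.2, 0.2, 0.3, 0.3, 0.4, 0.4, 0.5, 0.5]

# Collapsed into two broad "multispectral" bands, the fine gradient
# within each band is lost; only two averages remain.
print(aggregate_bands(wl, refl, [(500, 550), (550, 600)]))
```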

#### Influencing factors of spectral suitability

In order to determine which spectral bands are appropriate for certain land applications, and consequently for a sensor, two important factors should be considered:
1) the spectral signature of objects that ought to be observed and
2) atmospheric windows.

##### Atmospheric windows

What spectral sensors (at shorter wavelengths) see depends on the atmosphere and its structure, that is, on whether the wavelengths can pass through the atmosphere or not. Wherever gases or small particles like water droplets or dust dominate the atmospheric layer, more parts of the EM spectrum will be absorbed or scattered, resulting in weaker reflectance reaching the remote sensing instrument. However, some portions of the spectrum are able to travel through the atmosphere more easily; these parts are called atmospheric windows. Consequently, the atmospheric opacity determines which wavelengths can travel without losing too much intensity.
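As a toy illustration of the idea (the transmittance values below are invented, not measurements), picking out the windows amounts to thresholding the atmospheric transmittance per spectral region:

```python
def atmospheric_windows(transmittance, threshold=0.8):
    """Pick out 'windows': spectral regions whose (hypothetical)
    atmospheric transmittance exceeds a threshold, i.e. regions where
    radiation reaches the sensor with little loss."""
    return [region for region, t in transmittance.items() if t >= threshold]

# Illustrative transmittance values per spectral region (not measured data):
t = {"visible": 0.9, "near-infrared": 0.85,
     "mid-infrared": 0.4, "thermal-infrared": 0.8, "far-infrared": 0.1}
print(atmospheric_windows(t))  # ['visible', 'near-infrared', 'thermal-infrared']
```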

##### Spectral signatures

The so-called ‘fingerprint’ of specific land cover/use objects is an important tool to monitor changes over time. But while the advantage of time series information is obvious and has received more interest in recent years, the ‘true’ value of data with great spectral resolution lies in the identification of features that differ only subtly in their spectral response from similar ‘targets’. These differences become visible when we look at different land cover/use types in more detail. In the figure below, you can easily compare the signatures of grasses and water surfaces as seen by a hyperspectral sensor (in this case DESIS from DLR).

Since bottom-of-atmosphere (BOA) reflectance varies significantly between different land cover/use types, very distinctive signatures can be found. These can be used to distinguish even the finest sub-classes of land cover/use types. Understanding the spectral fingerprint of objects is important to be able to evaluate changes observed in remotely sensed data.

Deutsches Zentrum für Luft- und Raumfahrt (DLR, 2019). DLR Earth Sensing Imaging Spectrometer (DESIS). <https://www.dlr.de/os/en/desktopdefault.aspx/tabid-12923/>.

Edmund Optics (2020). Hyperspectral and Multispectral Imaging. <https://www.edmundoptics.eu/knowledge-center/application-notes/imaging/hyperspectral-and-multispectral-imaging/>.

Elachi, C. & van Zyl, J. (2015²). Introduction to the Physics and Techniques of Remote Sensing. Hoboken, USA: John Wiley & Sons, Inc.

Jensen, J.R. (2007²). Remote Sensing of the Environment. An Earth Resource Perspective. Upper Saddle River, USA: Pearson Prentice Hall.

Rees, W.G. (2010²). Physical Principles of Remote Sensing. Cambridge, USA: Cambridge University Press.

Schowengerdt, R.A. (2007³). Remote Sensing. Models and Methods for Image Processing. San Diego, USA: Academic Press.

## Radiometric Resolution

### Learning objectives of this topic

• Interdependencies between remote sensing resolutions

Last but not least, we will introduce you to the radiometric resolution. While the spectral resolution decides how much of the EM spectrum a sensor ‘sees’, the radiometric domain defines the number of gradients that are measured.

The radiometric domain is also referred to as ‘color depth’ and is defined as the sensitivity to the magnitude of the EM energy. Thus, it characterizes how finely a given sensor can receive and divide the radiance within its bands. Explore the animation on the right to investigate the difference between lower and higher radiometric resolution at comparable spatial resolution. A greater radiometric resolution increases the range of intensities that a sensor can distinguish.

Comparison of different radiometric resolutions, left: lower, right: higher. Source: ESA 2009

### How the radiometric domain is represented

Typically, the radiometric resolution is expressed in bits. The bit count characterizes numbers in the binary data format, and each bit corresponds to a power of 2 (e.g., 4 bit = 2⁴ = 16 grey values, 0–15), which gives the number of grey levels that can be recorded. This value is given for each band. Older remote sensing data sets were (mostly) recorded in 8-bit format. The latest optical sensors, such as Landsat 8 and Sentinel-2 (L2A), come with 12-bit data products. An image channel with 12-bit data records 2¹² (4,096) grey values or shades of the given wavelength. Consequently, the level of detail in remotely sensed data strongly depends on the bit depth in which the data was recorded.
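The grey-level count follows directly from the bit depth:

```python
def grey_levels(bit_depth: int) -> int:
    """Number of distinct grey values a band with the given radiometric
    resolution (bit depth) can record: 2 ** bit_depth."""
    return 2 ** bit_depth

print(grey_levels(8))   # 256   (older 8-bit data sets)
print(grey_levels(12))  # 4096  (e.g., Landsat 8, Sentinel-2 L2A products)
```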

To finalize this lesson on the different resolutions in remote sensing, we want to take a quick look at the relations between the different domains and how they interact with each other. Each instrument uses a specific combination of the resolutions described above, which makes every sensor and its acquired data unique. Given this knowledge, planning new sensors always has to be done with an exact idea of what the instruments should sense on our land surfaces.

To achieve high radiometric resolution, a greater instantaneous field of view (IFOV) is beneficial, as it collects more energy per cell; this, in turn, reduces the spatial resolution. If we want a finer spatial resolution, we need to engineer the sensor with a smaller IFOV. The spectral resolution, on the other side, can only be maximized when we favor narrower bands (each covering fewer wavelengths) over an increase in spatial resolution. The temporal resolution also depends on the design of the sensor and the orbit characteristics of the satellite, which in turn alter the spatial resolution. Hence, you can see that designing the right sensor tailored to your specific application is crucial but complex.

A classic example of a real-life situation where this interdependence can become problematic is the observation of tree species. While it is very important to use data with high spatial resolution, the trade-off is that such sensors often lack the spectral resolution essential to discriminate between similar species (e.g., types of deciduous trees).

Keep this relation in mind in order to choose the correct sensor for your specific remote sensing application.

Elachi, C. & van Zyl, J. (2015²). Introduction to the Physics and Techniques of Remote Sensing. Hoboken, USA: John Wiley & Sons, Inc.

European Space Agency (ESA, 2009). Remote Sensing Principles. <https://www.esa.int/Education/Eduspace>.

Jensen, J.R. (2007²). Remote Sensing of the Environment. An Earth Resource Perspective. Upper Saddle River, USA: Pearson Prentice Hall.

Rees, W.G. (2010²). Physical Principles of Remote Sensing. Cambridge, USA: Cambridge University Press.

Schowengerdt, R.A. (2007³). Remote Sensing. Models and Methods for Image Processing. San Diego, USA: Academic Press.