
A technical deep-dive into Satellite Imaging, Multispectral, SAR and GAN

AiDash

In recent years, discussions around the use of satellite imagery as an emerging technology for remote monitoring of infrastructure and vegetation have picked up pace. From tracking vegetation coverage and change detection to disaster monitoring, satellite imagery has proved to be a revolutionary technological advancement with a wide range of use cases. While it may look simple from the outside, the science behind capturing and analyzing satellite imagery is complex and intriguing. Let’s understand how satellite imagery works and how it helps solve some of the world’s most complex problems.

Understanding satellite imaging: How it works

Satellites have been capturing geospatial information for over 60 years, and satellite data now supports an ever-expanding range of applications such as weather forecasting, mapping, environmental research, military intelligence and more. Let’s look at the technicalities of how satellite imagery is collected, analyzed and processed:

  1. Sensors: Satellite imaging is a form of remote sensing in which satellites scan the Earth with different kinds of sensors that collect the electromagnetic radiation reflected from it. These sensors are mostly of two types: passive and active. Passive sensors collect radiation that the Sun emits and the Earth reflects, and don’t require their own energy source. Active sensors, on the other hand, provide their own source of energy to illuminate the objects they observe.

    Satellites extract information from the way this energy interacts with the Earth’s surface: remote sensing instruments measure electromagnetic radiation that is reflected, emitted, or emitted and then reflected. This radiation behaves as a wave described by its wavelength and frequency, and the full range of wavelengths makes up the electromagnetic spectrum. Remote sensing data provided by Synthetic Aperture Radar (SAR) is making waves in this field. According to Earthdata, SAR is a type of active data collection where a sensor produces its own energy and then records the amount of that energy reflected back after interacting with the Earth. Keep reading this article to learn more about SAR.
     

  2. Electromagnetic spectrum: Humans can perceive only a small portion of the electromagnetic spectrum (visible light), but satellite sensors can use other types of radiation, like infrared light, ultraviolet light or even microwaves. When satellite images are made, these invisible types of light are assigned visible colors. The majority of active sensors operate in the microwave portion of the electromagnetic spectrum, which makes them able to penetrate the atmosphere under most conditions.
     (Image: Spectrum of electromagnetic radiation)
  3. Resolution: The next aspect to consider is resolution. Resolution is key in determining how data from a sensor can be used, and it varies with the satellite’s orbit and sensor design. There are four types of resolution to consider for any dataset:

    Radiometric resolution is the amount of information in each pixel, i.e. the number of bits representing the energy recorded. 
    Spatial resolution is defined by the size of each pixel within a digital image and the area on the Earth’s surface represented by that pixel.
    Spectral resolution is the ability of a sensor to discern finer wavelengths, that is, having more and narrower bands.
    Temporal resolution is the time it takes for a satellite to complete an orbit and revisit the same observation area.
    Today we have satellites with 30 cm spatial resolution in multispectral bands and 25 cm spatial resolution in SAR bands.
     

  4. Bands: Sensors with roughly 3 to 10 discrete, relatively broad bands are considered multispectral, while sensors with hundreds or even thousands of much narrower bands (10-20 nm wide) are considered hyperspectral. The narrower the range of wavelengths for a given band, the finer the spectral resolution. The sketch just after this list shows how individual bands are combined in practice.
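To make items 2 and 4 concrete, here is a minimal Python sketch of how individual multispectral bands are combined: a false-color composite that maps the invisible near-infrared (NIR) band onto the visible red channel, and the NDVI vegetation index. This is an illustration, not any particular provider’s pipeline; the band arrays, value ranges and percentile stretch are assumptions, and in practice the bands would be read from a delivered product with a library such as rasterio.

```python
import numpy as np

def normalize(band: np.ndarray) -> np.ndarray:
    """Stretch a band to the 0-1 range for display, clipping extreme pixels."""
    band = band.astype(np.float32)
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo + 1e-6), 0.0, 1.0)

def false_color(nir: np.ndarray, red: np.ndarray, green: np.ndarray) -> np.ndarray:
    """Classic 'color infrared' composite: the invisible NIR band is rendered
    as red, so healthy vegetation shows up bright red in the image."""
    return np.dstack([normalize(nir), normalize(red), normalize(green)])

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from the NIR and red bands."""
    nir, red = nir.astype(np.float32), red.astype(np.float32)
    return (nir - red) / (nir + red + 1e-6)   # values fall in [-1, 1]

# Toy usage with random data standing in for real reflectance bands.
rng = np.random.default_rng(0)
nir, red, green = (rng.uniform(0, 4000, (256, 256)) for _ in range(3))
print(false_color(nir, red, green).shape)   # (256, 256, 3)
print(round(float(ndvi(nir, red).mean()), 3))
```

The band math stays this simple for real scenes; what changes is only how the rasters are read, calibrated and georeferenced.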

Why is Synthetic Aperture Radar a valuable tool?

Over the past decade, Synthetic Aperture Radar (SAR) has turned out to be one of the most useful emerging technologies in remote sensing. The USP of this technology is that it can synthetically produce high-resolution images in any weather condition, day or night.

SAR can “see” through darkness, clouds and rain, detecting changes in levels of water and moisture, habitat effects of natural or human disturbance and changes in the Earth’s surface after natural disasters like earthquakes and sinkhole openings.

According to a research article by NASA’s Earthdata, the spatial resolution of radar data is directly related to the ratio of the sensor wavelength to the length of the sensor’s antenna. For a given wavelength, the longer the antenna, the higher the spatial resolution. From orbit, achieving fine resolution would require an impractically long physical antenna. Hence, scientists and engineers came up with the synthetic aperture: a sequence of acquisitions from a shorter antenna is combined to simulate a much larger antenna, providing higher-resolution data.
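A rough back-of-the-envelope calculation shows why this matters. The wavelength, slant range and antenna length below are representative C-band values chosen for illustration, not the parameters of any specific mission; the formulas are the standard real-aperture relation (resolution is roughly range times wavelength divided by antenna length) and the classic result that a synthetic aperture improves azimuth resolution to roughly half the physical antenna length.

```python
# Representative (assumed) values for a C-band radar in low Earth orbit.
wavelength_m = 0.056      # ~5.6 cm radar wavelength
slant_range_m = 700_000   # ~700 km from the sensor to the ground
antenna_m = 12.0          # physical antenna length

# Real-aperture azimuth resolution: range * wavelength / antenna length.
real_aperture_res = slant_range_m * wavelength_m / antenna_m
print(f"Real aperture: ~{real_aperture_res:,.0f} m per pixel")          # ~3,267 m

# Antenna length needed to reach 5 m resolution with a real aperture.
needed_antenna_m = slant_range_m * wavelength_m / 5.0
print(f"Antenna needed for 5 m: ~{needed_antenna_m / 1000:.1f} km")     # ~7.8 km

# Synthetic aperture: azimuth resolution of roughly antenna_length / 2,
# independent of range and wavelength.
print(f"Synthetic aperture: ~{antenna_m / 2:.0f} m per pixel")          # ~6 m
```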

SAR systems utilize simple geometry, precise GPS location estimation and accurate image processing algorithms to produce high-resolution images. They can be lightweight, low-power, highly scalable systems used in a variety of applications.
(Image: Generative Adversarial Network)

Processing SAR data is the most complex part. Depending on the type of analysis to be performed, the steps can include applying the orbit file, radiometric calibration, de-bursting, multilooking, speckle filtering and terrain correction. Thanks to advancements in Artificial Intelligence and Machine Learning algorithms, it is now possible to make far better use of SAR data. One such advance is the Generative Adversarial Network (GAN), which is making SAR data processing and analysis easier.
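Before turning to GANs, here is a minimal numpy/scipy sketch of two of the classical steps listed above, multilooking and Lee speckle filtering. It assumes the calibrated backscatter intensity is already available as a 2-D array; the window size, number of looks and assumed speckle level are illustrative, and production toolchains such as ESA’s SNAP implement these steps far more rigorously.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def multilook(intensity: np.ndarray, looks: int = 4) -> np.ndarray:
    """Average looks x looks blocks of pixels, trading spatial resolution
    for reduced speckle (a simple spatial multilooking scheme)."""
    h = (intensity.shape[0] // looks) * looks
    w = (intensity.shape[1] // looks) * looks
    blocks = intensity[:h, :w].reshape(h // looks, looks, w // looks, looks)
    return blocks.mean(axis=(1, 3))

def lee_filter(intensity: np.ndarray, window: int = 7, noise_cv2: float = 0.25) -> np.ndarray:
    """Classic Lee filter: pull each pixel toward its local mean, trusting
    the original value more where local variance exceeds the expected
    speckle noise (noise_cv2 is the assumed squared coefficient of variation)."""
    intensity = intensity.astype(np.float64)
    mean = uniform_filter(intensity, window)
    mean_sq = uniform_filter(intensity ** 2, window)
    var = np.maximum(mean_sq - mean ** 2, 0.0)
    weight = var / (var + noise_cv2 * mean ** 2 + 1e-12)
    return mean + weight * (intensity - mean)

# Toy usage: a speckled, constant-backscatter scene.
rng = np.random.default_rng(0)
speckled = rng.gamma(shape=1.0, scale=1.0, size=(512, 512))   # single-look speckle
smoothed = lee_filter(multilook(speckled, looks=2))
print(speckled.std(), smoothed.std())   # the speckle variance drops sharply
```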

A Generative Adversarial Network (GAN) is an artificial neural network architecture based on unsupervised learning. Thanks to its powerful representation capabilities, the GAN has been introduced to synthesize synthetic aperture radar (SAR) image data.
 
This AI-based image generation from radar data is ideal for regions with cloud and snow cover: it cleans and de-noises the affected satellite imagery so that objects can be detected accurately. (Refer to the image.)
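For readers who want to see the adversarial idea in code, below is a minimal PyTorch sketch of a generic GAN for single-channel image patches. It illustrates the technique only and is not AiDash’s model or a SAR-specific architecture: the 64 x 64 patch size, layer widths, latent dimension and one-step training loop are arbitrary assumptions made for brevity.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a random latent vector to a 64 x 64 single-channel image patch."""
    def __init__(self, latent_dim: int = 100):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 128, 4, 1, 0), nn.BatchNorm2d(128), nn.ReLU(True),  # 4x4
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),           # 8x8
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),            # 16x16
            nn.ConvTranspose2d(32, 16, 4, 2, 1), nn.BatchNorm2d(16), nn.ReLU(True),            # 32x32
            nn.ConvTranspose2d(16, 1, 4, 2, 1), nn.Tanh(),                                     # 64x64
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores a patch as real (from the sensor) or fake (from the generator)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 4, 2, 1), nn.LeakyReLU(0.2, True),   # 32x32
            nn.Conv2d(16, 32, 4, 2, 1), nn.LeakyReLU(0.2, True),  # 16x16
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),  # 8x8
            nn.Conv2d(64, 1, 8), nn.Flatten(),                    # one logit per patch
        )

    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, real, opt_g, opt_d, latent_dim: int = 100):
    """One adversarial update on a batch of real patches scaled to [-1, 1]."""
    bce = nn.BCEWithLogitsLoss()
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)
    fake = gen(torch.randn(real.size(0), latent_dim, 1, 1))

    # Discriminator: label real patches 1 and generated patches 0.
    opt_d.zero_grad()
    d_loss = bce(disc(real), ones) + bce(disc(fake.detach()), zeros)
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator call its patches real.
    opt_g.zero_grad()
    g_loss = bce(disc(fake), ones)
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage with random "patches" standing in for a real training set.
gen, disc = Generator(), Discriminator()
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4, betas=(0.5, 0.999))
print(train_step(gen, disc, torch.rand(8, 1, 64, 64) * 2 - 1, opt_g, opt_d))
```

The generator learns to produce patches that resemble the training data while the discriminator learns to tell them apart; conditional variants of the same setup, where the generator is fed a noisy or cloud-affected input instead of pure random noise, are what enable de-speckling and SAR-to-clean-image translation.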

Satellite imaging using SAR provides high-resolution, day-and-night, weather-independent images for a multitude of applications: geoscience and climate change research, environmental and Earth system monitoring, 2-D and 3-D mapping, change detection and much more. Several critical industries are realizing the potential of using remote sensing for monitoring assets, vegetation management and even disaster management.

AiDash is a leading AI-first SaaS company enabling satellite-powered operations & maintenance for utility, energy and other core industries. Our novel SaaS platform — Remote Monitoring and Survey System (RMSS) — uses high-resolution multispectral and SAR satellite imagery powered with AI to monitor and survey vegetation hazards, RoW encroachments, wildfire risks and weather-related damage remotely via a web dashboard and mobile app. Intrigued? Feel free to mail us at info@aidash.com.