
Remote sensing: How to collect a world of data

AiDash

As relentless innovation makes devices smaller and smarter, many are put in the sky to increase our understanding of the Earth below. They record a dizzying array of geographic data that is used by our phones and cars to know where they are and what’s nearby. This geospatial information is also used by many industries to understand, maintain, and manage their ground-based, geographically distributed assets.

Global positioning systems (GPS) and geographic information systems (GIS) are important geospatial technologies. GPS pinpoints precise locations on the Earth’s surface, and GIS is a mapping technology for organizing and analyzing geospatial information. As you might imagine, these technologies, and the remote sensing systems that feed them data, are as complex as they are essential to our modern world.

In this article, we will explore how remote sensing technologies gather geographic data and what the common benefits of that data are.


The basics of remote sensing technology

Remote sensing is the process of obtaining information about objects, areas, or phenomena from a distance, typically from aircraft or satellites. It includes the use of satellite or aircraft-based sensor technologies to detect and classify objects on the Earth’s surface and in the atmosphere and oceans.

Most of the remotely sensed data used for mapping and spatial analysis is collected as reflected electromagnetic radiation (visible light is one example), which is processed into a digital image that can be overlaid with other spatial data.

Let’s explore the many options for collecting remote sensing data:


Satellite remote sensing

Satellites have been used to capture geospatial information for over 60 years. The data they collect serves an ever-expanding range of applications, such as weather forecasting, mapping, environmental research, military intelligence, and more.

So, how much detail does the satellite actually see? Satellites carry sensors, sometimes many of them, that read varying amounts of energy reflected from the Earth. For instance, a weather satellite carries a special instrument for recording multispectral data.

The smallest patch of ground a satellite sensor can resolve is recorded as a single pixel, a squarish area that is, for example, 30 centimeters (12 inches) on each side. The pixel size varies depending on the capability of the satellite sensor. We expect sensors will soon be able to capture images at a pixel size of 10 centimeters (4 inches), which will enable another dramatic expansion of applications for satellite imagery.
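
To put those numbers in perspective, here is a small back-of-the-envelope sketch (assuming a hypothetical one-square-kilometer area of interest) that counts how many pixels are needed to cover it at 30-centimeter versus 10-centimeter resolution. The roughly nine-fold jump hints at why finer pixels open up so many new applications.

```python
def pixels_per_sq_km(pixel_size_m: float) -> int:
    """Number of square pixels needed to cover one square kilometer."""
    pixels_per_side = 1_000.0 / pixel_size_m  # 1 km = 1,000 m
    return round(pixels_per_side ** 2)

# Hypothetical comparison: today's 30 cm pixels vs. tomorrow's 10 cm pixels
for size_m in (0.30, 0.10):
    print(f"{size_m * 100:.0f} cm pixels: {pixels_per_sq_km(size_m):,} per square km")
```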

 


A common misconception about satellite images is that they are photographs. However, they are quite different. Satellites capture digital information composed of thousands of pixels. While this data is sometimes displayed as a picture, it is often used in its raw, digital state by the many applications that analyze the data.

Because different objects absorb and reflect different wavelengths in the spectrum of visible and invisible light, multispectral remote sensing can discern many Earth-based features. For example, healthy, green vegetation reflects infrared wavelengths quite well and can be differentiated from diseased or dead vegetation to help arborists understand and manage their forests better. Similar distinctions are possible in analyzing the locations and sizes of rivers, lakes, ice-covered or snow-covered areas, and other surface features.
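
As a rough illustration of how those spectral differences are exploited in practice, the sketch below computes the widely used Normalized Difference Vegetation Index (NDVI) from red and near-infrared pixel values. The reflectance values and the 0.4 "healthy" threshold are hypothetical examples, not readings from any particular sensor.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects near-infrared strongly and absorbs red light,
    pushing the index toward +1; stressed or dead vegetation scores much lower.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Hypothetical reflectance values for a tiny 2 x 2 patch of pixels
red_band = np.array([[0.08, 0.10], [0.30, 0.25]])
nir_band = np.array([[0.50, 0.45], [0.32, 0.28]])

index = ndvi(red_band, nir_band)
print(index)        # higher values in the top row suggest healthy canopy
print(index > 0.4)  # illustrative threshold for "healthy" vegetation
```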


Aerial photography

Aerial photography is one of the earliest forms of remote sensing and remains one of the most widely used. The advent of drones and other unmanned aerial vehicles has made aerial photography easier for commercial and noncommercial purposes alike.


Geographers have photographed the Earth from above since the 1860s, decades before the Wright brothers first flew their plane, using balloons and kites to capture wide areas. With the introduction of airplanes, aerial photography could capture images from much higher vantage points.

Today, the altitude of aerial photography can exceed 18,000 meters (about 60,000 feet). Lower-altitude photography captures more detail, while higher altitudes allow far wider coverage and the ability to discern relationships between features.

Aerial photography can be conducted at a variety of scales and in a range of formats, such as color, black and white, and infrared, and it has become popular for vegetation and ocean mapping. Small-scale, radio-controlled (RC) model aircraft and helicopters carrying 35 mm SLR and video cameras have been used to acquire panchromatic, color, color infrared (CIR), and multispectral aerial photography for a wide range of environmental applications.

These small-scale aerial platforms were initially considered unsuitable for serious aerial photography. That perception has disappeared with developments in miniaturized sensors, camera and battery technology, data storage, small multirotor and fixed-wing aerial platforms, and other unmanned aerial vehicles (UAVs). Today, such small platforms and sensors offer low-cost acquisition of a wide range of aerial data and imagery.

ScienceDirect reports that “many of the smaller UAVs are now capable of utilizing a number of different sensors to collect photographic data, video footage, and multispectral, thermal, and hyperspectral imagery as well as LiDAR.”


LiDAR remote sensing

LiDAR is a technology for capturing geospatial data that uses laser scanning to create three-dimensional point clouds of geographic features. It is an active remote sensing system, which means that the system itself generates energy — in this case, laser light — to measure structures on the ground. LiDAR sensors can be mounted on UAVs, airplanes, and satellites.


GISGeography.com explains that “LiDAR is fundamentally a distance technology. From an airplane or helicopter, LiDAR systems send light to the ground. This pulse hits the ground and returns to the sensor. Then, it measures how long it takes for the light to return to the sensor. By recording the return time, this is how LiDAR measures distance. In fact, this is also how LiDAR got its name — Light Detection and Ranging.”
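
Expressed in code, the ranging principle in that quote reduces to a single time-of-flight calculation. This is a minimal sketch with a made-up return time, not the processing pipeline of any real LiDAR system.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_range(return_time_s: float) -> float:
    """Distance from a LiDAR pulse's round-trip travel time.

    The pulse travels to the target and back, so the one-way distance
    is half the total distance the light covers.
    """
    return SPEED_OF_LIGHT * return_time_s / 2.0

# A pulse that returns after about 6.67 microseconds traveled to a target
# roughly 1,000 meters away.
print(f"{lidar_range(6.67e-6):.1f} m")
```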

LiDAR systems allow scientists and mapping professionals to examine both natural and human-made environments with accuracy, precision, and flexibility. LiDAR uses ultraviolet, visible, or near-infrared light to image objects and can capture a wide range of targets, including nonmetallic objects, trees, rocks, rain, clouds, and even single molecules. Its laser beam can map physical features with very high resolutions. For example, an aircraft can map terrain at 30-centimeter (12-inch) resolution.

There is a wide variety of applications for LiDAR, including agriculture and vegetation mapping, plant species classification, atmospheric research, biology and conservation, geology and soil science, law enforcement, military, obstacle detection, road environment recognition, object detection for transportation systems, mining, and more.


Synthetic aperture radar (SAR)

Satellites equipped with remote sensing technology have gained immense popularity in recent decades. That popularity has spurred recent advancements in one of the most exciting remote sensing technologies: synthetic aperture radar, also known as SAR.

Synthetic aperture radar has become one of the most important remote sensing tools and has been in wide use for over three decades. This imaging radar is mounted on a moving platform, most often a satellite. SAR-equipped satellites operate differently than traditional optical satellites and offer many advantages.



Unlike traditional optical imaging, which relies on reflected sunlight, SAR-equipped satellites emit their own energy in the form of radio waves, which reflect off the Earth’s surface and are recorded. To create a SAR image, successive pulses irradiate the targeted scene, and the echo of each pulse is captured.

The most prominent advantage SAR technology has over optical imaging is its ability to capture a high-resolution image, day or night, in any weather. Unlike optical technology, synthetic aperture radar can “see” through darkness, clouds, and rain, detecting changes in habitat, water and moisture levels, the effects of natural or human disturbance, and changes in the Earth’s surface after events such as earthquakes or sinkhole openings. Typical optical imaging uses visible light and can see only what we see, but SAR remote sensing provides very high-resolution images regardless of weather conditions.
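
One reason SAR can deliver such sharp images is that its resolution along the radar's line of sight depends on the bandwidth of the transmitted pulse rather than on the physical size of the antenna. The sketch below computes that theoretical slant-range resolution, c / (2B); the bandwidth values are hypothetical examples, not the specifications of any particular satellite.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def slant_range_resolution_m(bandwidth_hz: float) -> float:
    """Theoretical slant-range resolution of a pulsed radar: c / (2 * B).

    A wider pulse bandwidth lets the radar separate echoes from targets
    that sit closer together along its line of sight.
    """
    return SPEED_OF_LIGHT / (2.0 * bandwidth_hz)

# Hypothetical chirp bandwidths
for bandwidth_hz in (30e6, 300e6):
    print(f"{bandwidth_hz / 1e6:.0f} MHz bandwidth -> "
          f"{slant_range_resolution_m(bandwidth_hz):.2f} m resolution")
```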

SAR has become an extremely important Earth observation tool that fills many of the data gaps left by traditional image-based sensors. Because SAR-equipped satellites offer far superior surveillance quality and can see through clouds, haze, and darkness, you can now analyze changes on the ground on your own schedule.

 

The importance of remote sensing and its benefits

The increasing capabilities of computers and communication technology have supported the development of many remote sensing applications. Here are some of the advantages of using remote sensing technology:

  1. Systematic collection of data: Remote sensing allows easy collection of data over a variety of scales and resolutions. Data acquisition can be performed systematically and analyzed very quickly with machine learning and artificial intelligence.
  2. One image, multiple applications: A single image captured via remote sensing can be analyzed for many different applications and purposes, facilitating research and study in several fields at the same time; a great deal of information can be extracted from a single image.
  3. Detection of natural calamities: Remote sensing can detect natural calamities such as forest fires and volcanic eruptions and inspect the surrounding areas. This is a huge advantage because it helps stakeholders respond immediately and locate the exact areas that need assistance.
  4. Unobtrusive: Remote sensing does not disturb the object or area of interest, especially when it passively records an area’s electromagnetic radiation.
  5. Relatively inexpensive: Capturing data by remote sensing is cheaper and faster than direct-observation methods of data collection and mapping, and the larger the area covered, the more economical it becomes.
  6. Large area coverage: It is possible to cover the entire globe and efficiently collect a very large amount of data with the help of remote sensing imagery. In addition, inaccessible areas such as deep valleys and even disaster zones are easily mapped with remote sensing.
  7. Unbiased image processing: The data is digital and can be readily processed by machines without subjective interpretation.
  8. Repetitive coverage: Repetitive coverage allows monitoring of dynamic conditions in vegetation, agriculture, extreme weather, and more.

 

Get more information about how remote sensing technology powered by satellites and AI can help you manage vegetation, encroachments, sustainability, and disasters.

 

Talk to an expert to learn more.