
Remote Sensing – an overview

Remote Sensing (RS) is the science and art of acquiring information (spectral, spatial, temporal) about material objects, areas, or phenomena without coming into physical contact with the objects, areas, or phenomena under investigation. Because there is no direct contact, some means of transferring information through space must be utilised. In remote sensing, information transfer is accomplished by use of electromagnetic radiation (EMR). EMR is a form of energy that reveals its presence by the observable effects it produces when it strikes matter. EMR is considered to span the spectrum of wavelengths from 10⁻¹⁰ mm (cosmic rays) up to 10¹⁰ mm (broadcast radio wavelengths); for most remote sensing purposes, however, the useful optical wavelengths extend from 0.30 to 15 μm.

Types of Remote Sensing
With respect to the type of Energy Source

  • Passive Remote Sensing: Makes use of sensors that detect the reflected or emitted electro-magnetic radiation from natural sources.
  • Active Remote Sensing: Makes use of sensors that detect reflected responses from objects that are irradiated from artificially-generated energy sources, such as radar.

With respect to Wavelength Regions
Remote sensing is classified into three types with respect to the wavelength regions used:

  • Visible and Reflective Infrared Remote Sensing
  • Thermal Infrared Remote Sensing
  • Microwave Remote Sensing

Bands Used in Remote Sensing
Emission of EMR (electro-magnetic radiation) from gases is due to the atoms and molecules in the gas. Atoms consist of a positively charged nucleus surrounded by orbiting electrons, which have discrete energy states. Transition of electrons from one energy state to another leads to emission of radiation at discrete wavelengths; the resulting spectrum is called a line spectrum. Molecules possess rotational and vibrational energy states, transitions between which lead to emission of radiation in a band spectrum. The wavelengths emitted by atoms and molecules are also the ones absorbed by them. Emission from solids and liquids occurs when they are heated and results in a continuous spectrum. This is called thermal emission, and it is an important source of EMR from the viewpoint of remote sensing.

The electro-magnetic radiation (EMR) reflected or emitted from an object is the usual source of remote sensing data. However, other fields, such as gravity or magnetic fields, can also be used in remote sensing.

Remote sensing technology makes use of a wide range of the electro-magnetic spectrum (EMS), from very short-wavelength gamma rays to very long radio waves.

Wavelength regions of electro-magnetic radiation have different names, ranging from gamma rays, X-rays, ultraviolet (UV), visible light and infrared (IR) to radio waves, in order of increasing wavelength.

The optical wavelength region, an important region for remote sensing applications, is further subdivided as follows:

Name                              Wavelength (μm)
Optical wavelengths               0.30-15.0
  Reflective portion              0.38-3.00
    (i) Visible                   0.38-0.72
    (ii) Near IR                  0.72-1.30
    (iii) Middle IR               1.30-3.00
  Far IR (Thermal, Emissive)      7.00-15.0

The microwave region (1 mm to 1 m) is another portion of the EM spectrum that is frequently used to gather valuable remote sensing information.

Spectral Characteristics vis-à-vis different systems
The transmission of sunlight through the atmosphere is affected by absorption and scattering by atmospheric molecules and aerosols. This reduction of the sunlight's intensity is called extinction.

One cannot select the sensors to be used in any given remote-sensing task arbitrarily; one must instead consider

  • the available spectral sensitivity of the sensors,
  • the presence or absence of atmospheric windows in the spectral range(s) in which one wishes to sense, and
  • the source, magnitude, and spectral composition of the energy available in these ranges.

Ultimately, however, the choice of spectral range of the sensor must be based on the manner in which the energy interacts with the features under investigation.

Energy Interactions, Spectral Reflectance and Colour Readability in Satellite Imagery
All matter is composed of atoms and molecules with particular compositions. Matter therefore emits or absorbs electro-magnetic radiation at particular wavelengths, depending on its internal state, and every material reflects, absorbs, transmits and emits electro-magnetic radiation in a unique way. Electro-magnetic radiation travelling through the atmosphere to and from matter on the earth's surface is reflected, scattered, diffracted, refracted, absorbed, transmitted and dispersed. For example, a leaf looks green because chlorophyll absorbs the blue and red portions of the spectrum and reflects the green. These unique characteristics of matter are called its spectral characteristics.

Energy Interactions
When electro-magnetic energy is incident on any given earth surface feature, three fundamental energy interactions with the feature are possible. See Fig. 2

Fig 2: Basic interactions between electromagnetic energy and an earth surface feature
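When electro-magnetic energy is incident on a surface feature, the incident energy is partitioned among these three interactions. In the conventional notation (the symbols below are standard textbook notation rather than labels taken from the figure), this balance is written as

E_I(λ) = E_R(λ) + E_A(λ) + E_T(λ)

where E_I is the incident energy and E_R, E_A and E_T are the reflected, absorbed and transmitted components respectively, each a function of wavelength λ.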

Spectral Reflectance & Colour Readability
Two points about the above relationship should be noted.

  • The proportions of energy reflected, absorbed, and transmitted will vary for different earth features, depending upon their material type and conditions. These differences permit us to distinguish different features on an image.
  • The wavelength dependency means that, even within a given feature type, the proportion of reflected, absorbed, and transmitted energy will vary at different wavelengths.

Thus, two features may be indistinguishable in one spectral range and yet be very different in another wavelength band. Within the visible portion of the spectrum, these spectral variations result in the visual effect called COLOUR. For example, we call objects 'blue' when they reflect highly in the blue spectral region, 'green' when they reflect highly in the green spectral region, and so on. Thus the eye uses spectral variations in the magnitude of reflected energy to discriminate between various objects.

A graph of the spectral reflectance of an object as a function of wavelength is called a spectral reflectance curve.
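Using the notation of the energy-balance expression above (again a standard formulation rather than one quoted from this article), spectral reflectance is the ratio of reflected to incident energy at each wavelength, usually expressed as a percentage:

ρ(λ) = [E_R(λ) / E_I(λ)] × 100

and it is this quantity, plotted against wavelength, that forms a spectral reflectance curve.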

The curves in Fig. 3 represent average reflectance curves compiled by measuring a large number of sample features. It should be noted how distinctive the curves are for each feature. In general, the configuration of these curves is an indicator of the type and condition of the features to which they apply. Although the reflectance of individual features will vary considerably above and below the average, these curves demonstrate some fundamental points concerning spectral reflectance.

Fig 3: Spectral reflectance curves of common objects

Band Wavelength (μm) Principal applications
1 0.45-0.52 Sensitive to sedimentation, deciduous/ coniferous forest cover discrimination, soil vegetation differentiation
2 0.52-0.59 Green reflectance by healthy vegetation, vegetation vigour, rock-soil discrimination, turbidity and bathymetry in shallow waters
3 0.62-0.68 Sensitive to chlorophyll absorption: plant species discrimination, differentiation of soil and geological boundary
4 0.77-0.86 Sensitive to green biomass and moisture in vegetation, land and water contrast, landform/ geomorphic studies.

Colour discrimination based on wavelengths of spectral reflectance (IRS-1A/1B LISS-I and LISS-II)
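As an illustration of how such bands are used in practice (a hedged sketch, not part of the IRS documentation: the array values are hypothetical, and the normalised difference vegetation index, NDVI, is simply one common band combination), the red and near-infrared bands (bands 3 and 4 above) can be combined to separate vigorous vegetation from soil and water:

    import numpy as np

    # Hypothetical digital numbers (DNs) for band 3 (red, 0.62-0.68 um)
    # and band 4 (near IR, 0.77-0.86 um) over a small 2 x 2 patch.
    red = np.array([[30.0, 80.0],
                    [25.0, 90.0]])
    nir = np.array([[120.0, 85.0],
                    [110.0, 95.0]])

    # NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation reflects strongly
    # in the near IR and absorbs red light (chlorophyll absorption), so it
    # yields values near +1, while bare soil and water stay near or below zero.
    ndvi = (nir - red) / (nir + red)
    print(ndvi)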

Platforms
The vehicles or carriers for remote sensors are called platforms. Typical platforms are satellites and aircraft, but they can also include radio-controlled aeroplanes, balloons and kites for low-altitude remote sensing, as well as ladder trucks or 'cherry pickers' for ground investigations. The key factor in selecting a platform is its altitude, which, together with the instantaneous field of view (IFOV) of the sensor on board, determines the ground resolution.
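A rough sketch of the altitude/IFOV relationship mentioned above (the numbers are purely illustrative, not taken from any particular sensor): for near-nadir viewing, the ground footprint of a single detector element is approximately the platform altitude multiplied by the IFOV expressed in radians.

    # Approximate ground footprint of one detector element for nadir viewing.
    def ground_resolution(altitude_m: float, ifov_rad: float) -> float:
        """Ground footprint (m) ~ altitude (m) x IFOV (rad)."""
        return altitude_m * ifov_rad

    # Illustrative example: a satellite at ~700 km with a 0.1 milliradian IFOV
    print(ground_resolution(700_000.0, 0.1e-3))  # ~70 m footprint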

Sensors

  • Active Sensors: Detect reflected responses from objects that are irradiated by artificially generated energy sources, such as radar.
  • Passive Sensors: Detect the reflected or emitted electromagnetic radiation from natural sources.

Resolution
In general, resolution is defined as the ability of an entire remote-sensing system to render a sharply defined image.

  • Spectral Resolution: The spectral resolution of a remote-sensing instrument (sensor) is determined by the bandwidths of the electro-magnetic radiation channels in which it records data.
  • Radiometric Resolution: Determined by the number of discrete levels into which a signal may be divided (see the short sketch after this list).
  • Spatial Resolution: Determined by the geometric properties of the imaging system.
  • Temporal Resolution: Related to the repetitive coverage of the ground by the remote-sensing system.
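A minimal sketch of the radiometric resolution idea (the bit depths below are common examples chosen for illustration, not values stated in this article): the number of discrete grey levels a sensor can record is 2 raised to the number of quantisation bits.

    # Number of discrete signal levels for a given quantisation bit depth.
    for bits in (6, 7, 8, 10):
        print(f"{bits}-bit quantisation -> {2 ** bits} grey levels")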

Remote Sensing Satellites
A satellite with remote sensors to observe the earth is called a remote-sensing satellite, or earth observation satellite. Remote-sensing satellites are characterised by their altitude, orbit and sensors.

IRS (Indian Remote Sensing Satellite)
India has launched several satellites, including IRS-1A, IRS-1B, IRS-1C, IRS-1D, IRS-P2, IRS-P3 and IRS-P4, for different applications.

Landsat
It is placed at an altitude of about 700 km in a polar orbit and is used mainly for land area observation.

Other remote-sensing satellite series in operation include SPOT, MOS, JERS, ERS, RADARSAT, IKONOS, etc.

Basic Concept of LiDAR Mapping
The accuracy and functionality of many GIS projects rely to a large extent on the accuracy of topographic data and the speed with which it can be collected. The recently emerged technique of airborne altimetric LiDAR has gained considerable acceptance in both scientific and commercial communities as a tool for topographic measurement.

The LiDAR instrument transmits laser pulses while scanning a part of the terrain, usually centred on, and co-linear with, the flight path of the aircraft in which the instrument is mounted. The round-trip travel times of the laser pulses from the aircraft to the ground are measured with a precise interval timer. The time intervals are converted into range measurements, i.e. the distance of the LiDAR instrument from the ground point struck by the laser pulse, using the velocity of light. The position of the aircraft at the instant of firing each pulse is determined by Differential Global Positioning System (DGPS). During flight the aircraft undergoes changes in attitude and lateral movements, but these are measured and corrected for, so that the instrument yields accurate coordinates of points on the terrain surface. Laser mappers acquire digital elevation data with accuracies equivalent to those of GPS, but thousands of times faster.
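A minimal sketch of the range computation described above (the pulse time is hypothetical, and the DGPS positioning and scan geometry that turn a range into ground coordinates are omitted):

    # Convert a LiDAR round-trip travel time into a range using the speed of light.
    C = 299_792_458.0  # speed of light in m/s

    def pulse_range(round_trip_time_s: float) -> float:
        """Instrument-to-ground distance: half the round trip at light speed."""
        return C * round_trip_time_s / 2.0

    # Example: a pulse returning after about 6.67 microseconds -> roughly 1000 m range.
    print(pulse_range(6.67e-6))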

Basics of Digital Image Processing
Remote sensing images are recorded in digital form and then processed by computers to produce images for interpretation purposes.

Images are available in two forms: photographic film form and digital form. Variations in scene characteristics are represented as variations in brightness on photographic film. A part of the scene that reflects more energy will appear bright, while another part of the same scene that reflects less energy will appear dark.

A digital image consists of discrete picture elements called pixels. Associated with each pixel is a number, the digital number (DN), that depicts the average radiance of a relatively small area within the scene. The size of this area affects the reproduction of detail within the scene: as the pixel size is reduced, more scene detail is preserved in the digital representation.
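The effect of pixel size on detail can be illustrated by block-averaging a small image (a hedged sketch with a hypothetical DN array): each coarser pixel stores the mean radiance of a larger ground area, so sharp boundaries are smeared out.

    import numpy as np

    # Hypothetical 4 x 4 image of digital numbers (DNs) with a sharp boundary.
    dn = np.array([[ 10,  12, 200, 210],
                   [ 11,  13, 205, 215],
                   [ 90,  92,  40,  42],
                   [ 91,  93,  41,  43]], dtype=float)

    # Doubling the pixel size: each 2 x 2 block is replaced by its mean DN,
    # so fine detail within the block is lost.
    coarse = dn.reshape(2, 2, 2, 2).mean(axis=(1, 3))
    print(coarse)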

Digital image processing is a collection of techniques for the manipulation of digital images by computer. It encompasses operations such as noise removal, geometric and radiometric correction, image enhancement, information extraction, and image data manipulation and management.

Image Processing Methods
Image processing methods may be grouped into three functional categories:

Geometric and Radiometric Corrections
These involve the correction of errors, noise and geometric distortions introduced during scanning, recording and playback operations. However, the data supplied by NRSA, Hyderabad is already corrected for these errors; hence the discussion here is restricted to enhancement techniques and information extraction.

Image Enhancement

  • Linear Contrast Enhancement: Very few scenes have a brightness range that utilises the full sensitivity range of the detectors. To produce an image with an optimum contrast ratio, the entire brightness range of the display medium should be utilised. In a linear contrast stretch, the lowest DN is assigned 0 (zero), the highest is assigned 1 (one), and the values in between are stretched linearly. The linear stretch improves the contrast of most of the original brightness values.
  • Spatial Filtering: Spatial filtering is a pixel-by-pixel transformation of an image that depends on the grey level of the pixel concerned as well as the grey levels of the neighbouring pixels. It is a procedure in which the grey level of a pixel is altered according to its relationship with the grey levels of its neighbours (a short sketch of both operations follows this list).
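The following sketch illustrates both operations (the DN array is hypothetical, the stretch maps to the 0-1 range described above, and a simple 3 x 3 mean filter stands in for the many possible spatial filters):

    import numpy as np
    from scipy.ndimage import uniform_filter

    # Hypothetical band of digital numbers occupying only part of the available range.
    dn = np.array([[60.0, 62.0, 90.0],
                   [61.0, 80.0, 95.0],
                   [70.0, 85.0, 100.0]])

    # Linear contrast stretch: assign the lowest DN to 0, the highest to 1,
    # and stretch the values in between linearly.
    stretched = (dn - dn.min()) / (dn.max() - dn.min())

    # Spatial filtering: replace each pixel by the mean grey level of its
    # 3 x 3 neighbourhood (a low-pass filter that smooths local variations).
    smoothed = uniform_filter(dn, size=3)

    print(stretched)
    print(smoothed)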

Information Extraction
In information extraction processes, the computer makes decisions to identify and extract specific pieces of information.