
Sensor Integration and Image Georeferencing in Support of Airborne Remote Sensing Applications

Dr. Naser El-Sheimy
Professor and Canada Research Chair
Department of Geomatics Engineering, The University of Calgary
Alberta, Canada
Email: [email protected]

Dr. Sameh Nassar
Post-Doctoral Fellow, Department of Geomatics Engineering, The University of Calgary,
Alberta, Canada
Email: [email protected]

Airborne remote sensing considerably extends the capabilities of satellite remote sensing in terms of resolution and operational planning. Where satellite remote sensing at its best achieves accuracies at the meter level, airborne remote sensing has the potential of achieving the decimeter level in positional accuracy. In addition, the usefulness of satellite remote sensing is often restricted by the images available for a certain area and the extent of the intervening cloud coverage; such limitations do not exist in airborne remote sensing. It is therefore possible to optimize the required results by adapting the operational conditions to the task at hand. Research in airborne remote sensing and mobile mapping at The University of Calgary (U of C) has, as one of its goals, the development of a precise positioning and attitude determination system that can be used with a variety of airborne sensors and ultimately eliminate the need for ground control. In this paper, accuracy requirements for such a system are discussed, different sensor configurations are described and the results of the U of C prototype development are analyzed. Airborne remote sensing for mobile mapping applications can be subdivided into three major groups: those where precise positioning is the major requirement, as for instance photogrammetric applications; those where both position and attitude are required with high accuracy, as for instance pushbroom imaging applications; and those where accurate real-time position and attitude estimation is also needed, as for instance forest fire fighting. Hence, sensor configurations for different applications will be discussed in the paper and the first results of real-time airborne tests will be briefly reviewed.

1. Introduction
Since the early seventies, collecting remotely sensed data by satellite remote sensing has been widely used. At that time, the satellite remote sensing resolution was in the order of 80 m-100 m (Stadelmann, 1990). However, even though considerable improvements in accuracy have occurred since then, the best current satellite remote sensing resolution is still at the meter level. Due to the high demand for better accuracy in many mapping applications, a major trend was directed towards the development of digital airborne remote sensing systems. In addition, satellite images are usually not localized, i.e. images are not available for all specific areas of interest. Of course, this is not the case in airborne missions since they are planned to cover the required local areas. Finally, due to the large distance between the satellites and the Earth's surface, satellite image visibility is always limited by the cloud coverage and all images need lengthy processing for image enhancement and quality improvement.

Airborne sensing sensors, which are usually called imaging sensors, can be a frame-based (analog) aerial camera, a Charge Coupled Device (CCD) digital camera, a thermal camera, a Light Detection and Ranging (LIDAR) laser scanner, a pushbroom scanner or a Synthetic Aperture Radar (SAR). Regardless of the imaging sensor used, all airborne imaging measurements are performed in what is called the measurement frame, which is defined in general by the body frame of the moving aircraft. On the other hand, all image information needed for mapping is required in what is called a mapping frame. The transformation process from the measurement frame to the mapping frame is called georeferencing (El-Sheimy, 2000). Therefore, image georeferencing is simply the process of determining the Exterior Orientation Parameters (EOPs) of the imaging sensor for each exposure. However, this also requires the determination of the imaging sensor's Interior Orientation Parameters (IOPs) through calibration.

For many years, georeferencing in airborne photogrammetric remote sensing applications was performed using image block adjustment together with sufficient and well-distributed sets of Ground Control Points (GCPs) through the well-known process of Aerial Triangulation (AT). This procedure is known as indirect georeferencing. In this case, to obtain the EOPs and to control error propagation, each block of images must have established GCPs. However, the establishment of the required GCPs constitutes the major part of the AT process budget. Moreover, for many remote areas such as deserts, coastlines, forests, steep mountains and snow-covered grounds, the establishment of GCPs is extremely difficult and, if performed, adds a large amount of additional cost.

In addition, the image evaluation in the AT is very time consuming and requires highly skilled personnel. This issue not only affects the budget in commercial applications but is also an important factor in emergency and rescue applications that rely on airborne remote sensing, such as forest fire fighting. In these cases, a very quick response is required and hence, image processing is needed in real-time, which is obviously not available through the traditional AT process (Ip, 2004). Finally, in some airborne mapping applications that involve single-strip mapping, such as pipelines, power lines, highways and coastlines, establishing GCPs is a major problem and also impractical in principle. In other airborne remote sensing applications, such as pushbroom linear scanners (which have very weak geometry since each scan line has a different set of orientation parameters), the EOPs are required for each scan line. Hence, a block adjustment process in this case would require a very large number of GCPs.

2. Image Direct Georeferencing
From the discussion in the previous section, it is clear that the indirect georeferencing of airborne remote sensing images has some operational as well as practical problems. Therefore, there has been a major requirement by the mobile mapping community to eliminate, or at least reduce, the required number of GCPs. This can be achieved by installing navigation sensors, beside the imaging sensors, on board the aircraft to determine the EOPs of the imaging sensor directly. With the advent of the Global Positioning System (GPS), the idea of using a GPS receiver to determine the position of the imaging sensor Perspective Center (PC) at each exposure was implemented and used for some time. However, this could not eliminate the need for GCPs but only reduced their required number, see Hofmann-Wellenhof et al. (1998) for more details.

As part of the intensive research on multi-sensor integration at the University of Calgary, a major effort was directed towards the development of an integrated navigation system that combines a Differential GPS (DGPS) and an Inertial Navigation System (INS) for accurate position and attitude determination. In such a system, the GPS is used to provide positions while the INS is used to provide orientations. By installing this INS/GPS integrated system with the imaging sensor, the EOPs can be determined without any GCPs. This approach was first introduced by University of Calgary researchers and is named direct georeferencing, see for example: Schwarz et al. (1984), Schwarz et al. (1993) and Schwarz (1995).

In INS/GPS applications, the initial trajectory (velocity, position and attitude) is obtained by integrating the output of the inertial sensors (accelerometer specific forces and gyro angular rates). This is performed through the INS mechanization (navigation) equations, which are, in fact, a set of non-linear differential equations. The solution of these equations is compared to the provided GPS solution and the differences are used to estimate and compensate for the INS errors through a Kalman Filter (KF), see Figure 1, where r, v, att and d represent the position, velocity, attitude and estimated errors, respectively.

Fig.1 INS/GPS Navigation Architecture in Direct Georeferencing Applications
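This loosely coupled integration can be illustrated with a minimal error-state Kalman filter sketch. The two-state error dynamics, noise levels and numerical values below are illustrative assumptions only and do not represent the actual U of C filter design:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Propagate the INS error state and its covariance between GPS updates."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Update the error state with the INS-minus-GPS position difference z."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: position error and velocity error along one axis.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # assumed error dynamics
Q = np.diag([0.01, 0.01])                # assumed process noise
H = np.array([[1.0, 0.0]])               # GPS observes the position error only
R = np.array([[0.05]])                   # assumed GPS measurement noise

x = np.zeros(2)
P = np.eye(2)
x, P = kf_predict(x, P, F, Q)
x, P = kf_update(x, P, z=np.array([0.3]), H=H, R=R)
```

In a full implementation, the state vector would contain position, velocity and attitude errors (plus sensor biases), and the estimated errors d would be fed back to correct the INS mechanization output, as in Figure 1.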
In direct georeferencing by INS/DGPS integration, three coordinate frames are dealt with: the b-frame with its center at the Inertial Measuring Unit (IMU) center, the imaging sensor frame (c-frame) that has its center at the PC of the imaging sensor and the mapping frame (m-frame) where the final object coordinates are required and is usually considered as the local-level frame (l-frame). The direct georeferencing model is given in El-Sheimy (1996) as:

r_j^m = r_GPS^m(t) + R_b^m(t) [ s_j R_c^b r_j^c + a_c^b - a_GPS^b ]     (1)

where r_j^m is the position vector of an object j in the mapping frame; r_GPS^m is the position vector of the GPS antenna interpolated to the time of exposure t; s_j is a scale factor for an object per image that relates image coordinates to object coordinates, and is usually obtained implicitly using image stereopair processing techniques, a laser scanner or a Digital Terrain Model (DTM), El-Sheimy (1996); R_c^b is the rotation matrix between the c-frame and the b-frame (assumed to be constant for the same installation and determined by calibration before or during the mission); r_j^c is the vector of image coordinates given in the c-frame; a_c^b and a_GPS^b are the constant offset vectors between the IMU center and, respectively, the imaging sensor PC and the GPS antenna center, given in the b-frame (determined during the calibration process prior to the mission by traditional surveying, i.e. total station). In the above model, R_b^m is obtained from the solution of the INS mechanization equations. The graphical representation of the direct georeferencing model is illustrated in Figure 2.

Fig.2 Airborne Remote Sensing Image Georeferencing Components
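As an illustration, the direct georeferencing model can be evaluated numerically as a short sketch. All rotations, lever arms and coordinates below are hypothetical values chosen for the example, not real calibration data:

```python
import numpy as np

# Sketch of the direct georeferencing model (Equation 1):
#   r_j^m = r_GPS^m(t) + R_b^m(t) [ s_j R_c^b r_j^c + a_c^b - a_GPS^b ]

def georeference(r_gps_m, R_b_m, s_j, R_c_b, r_j_c, a_c_b, a_gps_b):
    """Map image coordinates r_j_c of object j into the mapping frame."""
    return r_gps_m + R_b_m @ (s_j * (R_c_b @ r_j_c) + a_c_b - a_gps_b)

# Hypothetical inputs: identity rotations and small lever arms (metres).
r_gps_m = np.array([500000.0, 5650000.0, 1200.0])  # GPS antenna at exposure t
R_b_m   = np.eye(3)                                # from the INS mechanization
R_c_b   = np.eye(3)                                # boresight (from calibration)
a_c_b   = np.array([0.10, 0.05, -0.20])            # IMU-to-camera lever arm
a_gps_b = np.array([0.00, 0.30, 1.50])             # IMU-to-antenna lever arm
s_j     = 1000.0                                   # scale (e.g. from a DTM)
r_j_c   = np.array([0.012, -0.008, -0.153])        # image coords and focal length

r_j_m = georeference(r_gps_m, R_b_m, s_j, R_c_b, r_j_c, a_c_b, a_gps_b)
```

In practice R_b_m and R_c_b are full rotation matrices and r_GPS^m must be interpolated to the exact exposure time; the identity rotations above merely keep the arithmetic transparent.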
Following the introduction of the direct georeferencing approach, it has since been widely used in many airborne remote sensing applications, both for research and for commercial production. In the following paragraphs, results obtained using INS/GPS for direct georeferencing are summarized. The accuracies for the different systems are the Root Mean Square (RMS) values of the differences between the INS/GPS/imaging solution and the reference solution provided by well-established GCPs.
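The RMS accuracy measure used throughout this comparison can be sketched as follows; the coordinate values are invented purely for illustration:

```python
import numpy as np

def rms(solution, reference):
    """Per-axis RMS of coordinate differences (same units as the input)."""
    d = np.asarray(solution) - np.asarray(reference)
    return np.sqrt(np.mean(d**2, axis=0))

# Hypothetical georeferenced coordinates versus GCP reference coordinates (m).
georef = np.array([[100.12, 200.08, 50.25],
                   [150.09, 250.11, 60.18]])
gcps   = np.array([[100.00, 200.00, 50.00],
                   [150.00, 250.00, 60.00]])

rms_enu = rms(georef, gcps)   # [easting, northing, height] RMS values
```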

In airborne mapping applications, the obtained accuracy using an INS/GPS/imaging sensor configuration depends mainly on the scale of photography (i.e. the flying height). Using frame-based aerial cameras, the reported accuracies are 10-20 cm for easting and northing and 8-32 arcsec for the attitude angles (roll, pitch and azimuth). The corresponding height accuracy is 10-30 cm, for more details see Skaloud (1995); Abdullah (1997); Hutton et al. (1997); Reid and Lithopoulos (1998); Reid et al. (1998); Skaloud (1999); Cramer et al. (2000).

In the case of CCD digital cameras, the accuracy for airborne applications also depends on the camera resolution. The results given in Grejner-Brzezinska and Toth (1998) using a high-resolution 4k × 4k CCD camera showed positional accuracies of 19, 20 and 32 cm in the X, Y and Z directions, respectively. Using dual (nadir and oblique) CCD cameras, Mostafa and Schwarz (1999) reported accuracies of 54, 61 and 78 cm in the X, Y and Z coordinates using a single stereopair of nadir and oblique images. With the same dual-camera system, Mostafa (1999) showed that, using a 3 × 3 block of nadir and oblique images, corresponding accuracies of 22, 24 and 34 cm, respectively, can be achieved.

Laser scanners are mainly used for generating DTMs and Digital Elevation Models (DEMs) and for mapping forests, vegetation and urban areas. The reported accuracies are in the range of 20-60 cm, 20-60 cm and 10-25 cm for easting, northing and height, respectively. See for instance Kimura et al. (1999); Baltsavias (1999); Mohamed et al. (2001) and Maas (2003).

Pushbroom linear scanners are used in applications that require an accuracy of 2-10 m (Alamús and Talaya, 2000). This was confirmed by Cosandier (1999), who obtained accuracies of 2.5 m - 3.5 m for each channel with the Compact Airborne Spectrographic Imager (casi) system. The main usage of Interferometric SAR (IFSAR) systems is the determination of DEMs, especially in areas with heavy vegetation. Arbiol and González (2000) showed a planimetric accuracy of 8.7 m and a vertical accuracy of 5.7 m. Specifications given for the DEMs generated by the Intermap Technologies Ltd. STAR-3i airborne system indicate vertical accuracies in the order of 0.5-3 m (post spacing of 5 m) with a corresponding horizontal accuracy of 2.0 m on slopes less than 20° (Intermap Technologies, 2005).

3. Direct Georeferencing Accuracy
The final image direct georeferencing accuracy obtained from an INS/GPS navigation system (i.e. the navigation accuracy at the imaging sensor PC), regardless of the imaging sensor type, accuracy or quality, is a function of the complete INS and GPS processing chain (error control implementation and KF design), the IMU accuracy, the GPS receiver quality, the alignment between the imaging and navigation sensors (see Equation 1), as well as the airborne data collection circumstances and environment. Hence, it depends on both the measurement and processing stages. This involves the INS/GPS estimated position and orientation angles, the INS/GPS/imaging sensor alignment and time synchronization, the airborne operation circumstances and the sensors' noise characteristics. Therefore, to improve the image direct georeferencing accuracy obtained from the INS/GPS integrated system, a number of factors have to be considered:

  • The first one is to improve the quality of the obtained GPS data since the GPS is the main source of update information. This can be performed by using: multiple reference GPS stations (Cannon, 1991; Cramer, 2001), minimum banking angles, short master-rover baseline (Schwarz et al., 1994), better ionospheric and tropospheric correction models (Abdullah, 1997), improved clocks, and using the available GPS/GLONASS receivers for providing more satellite measurements (El-Sheimy, 1996; Mostafa, 1999).
  • A second factor is the utilization of high quality inertial sensor technologies, especially in applications that require high accuracy (Bruton, 2000).
  • The third one is to apply an optimal procedure for the overall system calibration and sensor placement and alignment (Skaloud, 1999; Ip, 2004; Ip et al., 2004). This includes optimal calibration of INS and GPS constant errors (accelerometer and gyro biases and scale factors, GPS systematic errors, etc.), optimal determination of the GPS and INS time synchronization, and optimal determination of the INS-imaging sensor relative orientation.
  • The fourth factor is to optimize the INS mathematical modeling for improved error compensation and data quality enhancements (Nassar, 2003; Nassar and El-Sheimy, 2004).

4. Airborne Remote Sensing Required Accuracy
In the previous Section, the direct georeferencing accuracy, i.e. the INS/GPS navigation accuracy at the imaging sensor PC, was discussed. However, the final remote sensing accuracy requirement is the overall accuracy obtained on the ground, i.e. when compared to accurate GCPs. Therefore, the overall accuracy is a function of the INS/GPS navigation solution, imaging sensor quality, imaging resolution, image scale, block geometry (in the case of photogrammetry), type of ground coverage and the weather conditions (excluding SAR). However, the accuracy requirements have a very wide range and depend on the different applications. These requirements, expressed as RMS values, are shown in Table 1. On the other hand, the general capabilities of current imaging sensors in terms of obtained accuracies, after georeferencing, are summarized in Table 2. It should be noted here that the figures in Table 2 are quite general and may vary based on the airborne application itself, flying height, pixel size, etc. In addition, the numbers listed in Tables 1 and 2 represent the overall 3-D positional accuracy (i.e. no distinction is made between vertical and horizontal accuracies).

By inspection of Tables 1 and 2, it can be seen that the georeferenced imaging accuracy obtained by current imaging sensors meets most of the mapping accuracy requirements. However, for large-scale engineering projects and cadastral mapping applications, only high-precision optical frame-based cameras can provide the required specifications (taking into account a low flying height and accurate DGPS positioning). Moreover, all figures are based on post-processing algorithms except the forest fire fighting application, which is based on real-time processing (Write et al., 2004). Using post-processed carrier phase DGPS, typical positioning accuracies of 5-20 cm are obtained for georeferencing (rover-master distance of 10-50 km). For most current georeferencing navigation system components, a high-end tactical-grade IMU is used (gyro drift of 1-3 deg/h) due to its reasonable cost (around $25,000 USD) and accurate post-mission attitude accuracy (0.005-0.01 deg). As mentioned earlier, intensive research has been performed at the U of C to develop accurate airborne remote sensing systems for image direct georeferencing. Early systems developed at the U of C implemented navigation-grade IMUs (gyro drift of 0.005-0.03 deg/h) and post-processing, see for example Skaloud (1995); Skaloud (1999); Mostafa (1999) and Cosandier (1999). Due to the considerable size and cost of navigation-grade IMUs (around $120,000 USD), the U of C development was then aimed at tactical-grade IMUs. In addition, a major research effort at the U of C was directed towards developing a real-time system for real-time applications. In the following Section, a brief description of the current U of C real-time airborne system for forest fire fighting is introduced and the first results of such a system are discussed and analyzed.
Table 1. Required Accuracy for Different Mapping Applications (Partially after Schwarz, 1995)

Table 2. Current Georeferenced Imaging Sensor Obtained Accuracies

5. Closing Remarks
Direct georeferencing through the use of integrated navigation systems has developed from a topic of academic interest to a commercially viable industry in many mapping applications. Imaging and navigation hardware currently available is of such quality that three-dimensional georeferencing can be achieved with an accuracy sufficient for many mapping tasks. The ongoing development of mathematical modeling and advanced post-mission estimation techniques will further increase the accuracy and robustness of the solutions. Considerable work is needed in the areas of real-time and post-mission quality control, automation of GPS/INS integration in case of frequent loss of lock, and the efficient and user-oriented manipulation of extremely large databases. The result of solving these problems will be an enormous extension not only of digital mapping but also of its fusion with other multi-sensor data.

References


    • Abdullah, Q. 1997. Evaluation of GPS-Inertial Navigation System for Airborne Photogrammetry. Proceedings of the ASPRS/MAPPS Softcopy Conference, Arlington, Virginia, USA, July 27-30.
    • Alamús, R. and J. Talaya. 2000. Airborne Sensor Integration and Direct Orientation of the casi System. Proceedings of the XIX Congress of the International Society for Photogrammetry and Remote Sensing (ISPRS), Amsterdam, The Netherlands, July 16-23.
    • Arbiol, R. and G. González. 2000. Map Production in Venezuela Using Airborne InSAR. Proceedings of the XIX Congress of the International Society for Photogrammetry and Remote Sensing (ISPRS), Amsterdam, The Netherlands, July 16-23.
    • Baltsavias, E. P. 1999. Airborne Laser Scanning: Basic Relations and Formulas. ISPRS Journal of Photogrammetry & Remote Sensing, Vol. 54.
    • Bruton, A. 2000. Improving the Accuracy and Resolution of SINS/DGPS Airborne Gravimetry. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20145.
    • Cannon, M. E. 1991. Airborne GPS/INS with An Application to Aerotriangulation. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20040.
    • Cosandier, D. 1999. Generating A Digital Elevation Model and Orthomosaic from Pushbroom Imagery. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20133.
    • Cramer, M.; D. Stallmann and N. Haala. 2000. Direct Georeferencing Using GPS/Inertial Exterior Orientation for Photogrammetric Applications. Proceedings of the XIX Congress of the International Society for Photogrammetry and Remote Sensing (ISPRS), Amsterdam, The Netherlands, July 16-23.
    • El-Sheimy, N. 1996. The Development of VISAT-A Mobile Survey System for GIS Applications. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20101.
    • El-Sheimy, N. (2000): “Mobile Multi-sensor Systems: The New Trend in Mapping and GIS Applications”, IAG Journal of Geodesy, Volume 121 “Geodesy Beyond 2000: The Challenges of the First Decade”, Springer Verlag Berlin Heidelberg, 2000, pp. 319 – 324
    • Grejner-Brzezinska, D. A. and C. K. Toth. 1998. Airborne Remote Sensing Multi-Sensor System: Development, Testing and Applications. Proceedings of the GPS and Forestry Conference, Kelowna, British Columbia, Canada.
    • Hofmann-Wellenhof, B.; H. Lichtenegger and J. Collins. 1998. Global Positioning System, Theory and Practice. Springer-Verlag, Wien, New York, USA.
    • Hutton, J. J.; T. Savina and E. Lithopoulos. 1997. Photogrammetric Applications of Applanix’s Position and Orientation System (POS). Proceedings of the ASPRS/MAPPS Softcopy Conference, Arlington, Virginia, USA, July 27-30.
    • Intermap Technologies. 2005. https://www.intermap.com/. Intermap Technologies Ltd. Web Site, Calgary, Alberta, Canada.
    • Ip, A. W. L. 2004. Analysis of Integrated Sensor Orientation for Airborne Mapping. M.Sc. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20204.
    • Ip, A., El-Sheimy, N. and Hutton, J. (2004): “Performance Analysis of Integrated Sensor Orientation”. Proceeding of The XX Congress, ISPRS Comm. V, Vol. XXXV, Part B5, Istanbul, Turkey, pp. 797-802, July 12-23, 2004.
    • Kimura, K.; T. Fujiwara and Y. Akiyama. 1999. Estimation of Accuracy of Airborne Laser Profiling. Proceedings of the International Workshop on Mobile Mapping Technology, Bangkok, Thailand, April 21-23.
    • Maas, H. G. 2003. Planimetric and Height Accuracy of Airborne Laserscanner Data: User Requirements and System Performance. Proceedings of the Photogrammetric Week '03, Editor: D. Fritsch, Wichmann Verlag, Germany.
    • Mohamed, A.; R. Price; D. McNabb; J. Green and P. Spence. 2001. The Development of DORIS: An Overview. Proceedings of the 3rd International Symposium on Mobile Mapping Technology, Cairo, Egypt, January 3-5.
    • Mostafa, M. M. R. and K. P. Schwarz. 1999. An Autonomous System for Aerial Image Acquisition and Georeferencing. Proceedings of the ASPRS Annual Convention, Portland, Oregon, May 17-21.
    • Mostafa, M. M. R. 1999. Georeferencing Airborne Images from A Multiple Digital Camera System by INS/GPS. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20131.
    • Nassar, S. 2003. Improving the Inertial navigation System (INS) Error Model for INS and INS/DGPS Applications. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20183.
    • Nassar, S. and El-Sheimy, N. (2004). “Wavelet Analysis For Improving INS and INS/DGPS Navigation Accuracy”, The Journal of Navigation (2004), V58, 1–16. The Royal Institute of Navigation, DOI: 10.1017/S0373463304003005 Printed in the United Kingdom
    • Reid, D. B and E. Lithopoulos. 1998. High Precision Pointing System for Airborne Sensors. Proceedings of The Institute of Electrical and Electronics Engineers (IEEE) Plans’98 – Position, Location and Navigation Symposium, Palm Spring, California, USA, April 20-23.
    • Reid, D. B; E. Lithopoulos and J. Hutton. 1998. Position and Orientation System for Direct Georeferencing (POS/DG). Proceedings of the Institute of Navigation (ION) 54th Annual Meeting, Denver, Colorado, USA, June 1-3.
    • Schwarz, K. P.; C. S. Fraser and P. C. Gustafson. 1984. Aerotriangulation without Ground Control. International Archives of Photogrammetry and Remote Sensing, V.25, Part A1, Rio de Janeiro, Brazil.
    • Schwarz, K. P.; M. Chapman; M. E. Cannon and P. Gong. 1993. An Integrated INS/GPS Approach to The Georeferencing of Remotely Sensed Data. Photogrammetric Engineering & Remote Sensing (PE & RS), V.59, N.11.
    • Schwarz, K. P.; M. Chapman; M. Cannon; P. Gong and D. Cosandier. 1994. A Precise Positioning/Attitude System in Support of Airborne Remote Sensing. Proceedings of The GIS/ISPRS Conference, Ottawa, Canada, June 6-10.
    • Schwarz, K. P. (1995). Integrated Airborne Navigation Systems for Photogrammetry. Proceeding of the Photogrammetric Week ‘95’, Editors: D. Fritsch & D. Hobbie, Wichmann Verlag, Heidelberg, Germany.
    • Skaloud, J. 1995. Strapdown INS Orientation Accuracy with GPS Aiding. M.Sc. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20079.
    • Skaloud, J. 1999. Optimizing Georeferencing of Airborne Survey Systems by INS/GPS. Ph.D. Thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada, UCGE Report No. 20126.
    • Stadelmann, M. 1990. A Knowledge-Based System for the Automated Interpretation of Landsat TM Imagery. M.Sc. Thesis, Department of Surveying Engineering, University of Calgary, Calgary, Alberta, Canada, UCSE Report No. 20036.
    • Write, D. B.; T. Yotsumata and N. El-Sheimy. 2004. Real Time Identification and Location of Forest Fire Hotspots from Geo-referenced Thermal Images. Proceedings of the XX Congress, ISPRS, Comm. II, Vol. XXXV, Part B2, Istanbul, Turkey, pp. 13-18, July 12-23.