
Geo-referencing of multi-sensor range data for Vehicle-borne Laser Mapping System (VLMS)

 

Dinesh Manandhar, Ryosuke Shibasaki
Centre for Spatial Information Science, The University of Tokyo
4-6-1, Komaba, Meguro-ku, Tokyo 153-8505, JAPAN
Tel / Fax No: 81-3-5452-6417
E-mail:[email protected], [email protected]
 
Keywords: Mobile Mapping, Range Data, Sensor Calibration, INS / GPS

Abstract:
Laser mapping has become quite popular in recent years due to its capability of providing information directly in three dimensions. However, present laser mapping systems are either airborne or ground-based on a static platform. An airborne system cannot capture detailed information, though it is well suited to applications such as DEM generation, while ground-based static systems are not suitable for mapping larger areas.

The vehicle-borne laser mapping system (VLMS) uses laser scanners for three-dimensional data acquisition, CCD cameras for texture information, and GPS, INS and an odometer for positioning. The data obtained by this system could be a good resource for developing urban 3-D databases, which have numerous applications in fields such as virtual reality, car navigation, computer games, and urban planning and management.

In this paper, we discuss the system architecture of the VLMS, the calibration of its sensors, and the integration of these sensors and positioning devices for direct geo-referencing.

1 Introduction
Mobile mapping technology has been under development since the late 1980s; its development became possible with the availability of the GPS signal to the civilian community. Such systems observe objects at close range and therefore capture greater detail. To our knowledge, the vehicle-based mobile mapping systems developed so far use CCD cameras (in some cases combined with video cameras) for data acquisition, with a combination of GPS and either an INS or gyros for navigation. For descriptions of some of these systems, refer to GPS-Van™ (Bossler et al., 1991), VISAT-Van (Schwarz et al., 1993; Li et al., 1994), TruckMap™ (Pottle, 1995), KiSS (Hock et al., 1995), GPS Vision (He, 1996) and GeoMaster (Tamura et al., 1998).

In our system, we use laser scanners as the main data acquisition devices. The system is supplemented by CCD cameras for texture information, and the usual combination of GPS, INS and an odometer is used for position and attitude information.

2 System Configuration
The mobile mapping system is shown in figure 1. It consists of four CCD cameras, four laser scanners, and the Hybrid Inertial Survey System (HISS), which combines DGPS, an INS and an electronic odometer to provide position and orientation data.

Figure 1. Mobile Mapping Survey Vehicle and the Sensors on top of the vehicle
2.1 HISS (The GPS/INS Integration)
The Hybrid Inertial Survey System (HISS) is the integration of GPS, INS and an odometer. The integrated system has an accuracy of about one meter in the horizontal plane. Once initialized, the system can work in either DGPS/INS mode or odometer/INS mode. Switching between the two modes is automatic, based on PDOP (if PDOP > 4.0) and the number of visible satellites (if fewer than four satellites are visible); preference is always given to DGPS/INS mode. We have conducted tests to examine the behavior of DGPS alone and of the DGPS/INS solution in stationary mode (while the vehicle is not moving). For details on HISS and its field test results, refer to (Tamura et al., 1998).
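For illustration only, the switching rule described above can be sketched in a few lines of Python (the function and variable names below simply restate the rule; they are not taken from the HISS software itself):

    def select_navigation_mode(pdop, num_satellites):
        """Illustrative sketch of the HISS mode-switching rule.

        DGPS/INS is preferred; the system falls back to odometer/INS when the
        GPS solution degrades (PDOP > 4.0) or fewer than 4 satellites are visible.
        """
        if pdop > 4.0 or num_satellites < 4:
            return "ODOMETER/INS"
        return "DGPS/INS"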

2.2 Laser Scanner
Four laser scanners are fixed on top of the vehicle, pointing in four different directions as shown in figure 1. The purpose of using four units is to capture data from different viewing angles, so that as much data as possible can be acquired even when a target is occluded by pedestrians or moving vehicles. The laser scanning head rotates about its axis (shown in figure 2) and provides the distance from the laser head to the target; this distance is later converted into 3-D coordinates. The configuration of the laser scanner is given in Table 1.

Figure 2. CCD and Laser sensor units
 

 

Measured distance range        50 m (at 20% reflectance)
Accuracy of range measurement  3 cm (1 σ)
Scanning range                 300°
Angular step                   0.25°
Scanning frequency             10 Hz

 

Table 1. Laser Scanner configuration
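As noted above, each range reading is later converted into 3-D coordinates. A minimal sketch of that conversion in the scanner's own frame is given below, assuming the scan plane is the scanner's local x-y plane and an arbitrary start angle for the 300° field of view (the actual axis conventions are fixed by the sensor calibration, not by this sketch):

    import numpy as np

    ANGULAR_STEP_DEG = 0.25      # from Table 1
    START_ANGLE_DEG = -150.0     # assumed placement of the 300 degree field of view

    def scan_to_xyz(ranges_m):
        """Convert a 1-D array of range readings (metres) to Nx3 scanner-frame points."""
        angles = np.deg2rad(START_ANGLE_DEG + ANGULAR_STEP_DEG * np.arange(len(ranges_m)))
        x = ranges_m * np.cos(angles)
        y = ranges_m * np.sin(angles)
        z = np.zeros_like(x)     # the scan plane is taken as the local x-y plane
        return np.column_stack([x, y, z])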
2.3 CCD Camera
Four CCD cameras are used with parabolic reflecting mirrors. Each CCD camera and laser scanner pair is housed in a single frame. Table 2 shows the configuration of the CCD camera.
 

 

Image sensor     1/2-inch color CCD
Resolution       659 × 494
Output           RGB
Trigger shutter  1/60 to 1/10000 s
Lens mount       C mount

 

Table 2. Configuration of CCD Camera

 

3 Geo-Referencing

Figure 3. Illustration of Geo-referencing
In mobile mapping, the vehicle is continuously moving, i.e. its position changes with time. In addition, every sensor and device has its own local coordinate system: the GPS output is based on the WGS84 coordinate system, the laser data are based on a local coordinate system whose origin lies at the laser scanning head, and so on for the other sensors and devices. The main problem is to determine, in a common coordinate system, the spatial position of the objects scanned by the laser at any time while the vehicle is moving; this is what we call geo-referencing. It involves integrating all the sensors and devices into a common coordinate system, the (local) mapping coordinate system. The integration mainly involves computing the fixed rotation matrices and shift vectors between the INS body and the sensors. Since the GPS and the INS are physically located at two different places, we also need the shift vector between the GPS and the INS. Refer to (Manandhar and Shibasaki, 2000) for details on the calibration of the individual sensors.

The general mathematical model (Cramer et al., 1998) for direct geo-referencing when the GPS and INS are physically offset is given in equation 1.

$X^m(t) = X^m_{GPS}(t) + R^m_b(t)\,\left(s \cdot R^b_c\,x^c + a^b_c - a^b_{GPS}\right)$      (Equation 1)

where,

$X^m(t)$ : any object point vector at time t in the mapping coordinate system (Equation 2)

$X^m_{GPS}(t)$ : GPS-measured point vector at time t in the mapping coordinate system (Equation 3)

$R^m_b(t)$ : variable 3 × 3 rotation matrix at time t from the INS body (INS coordinate system) to the mapping frame. It is observed directly by the INS at time t; the HISS is calibrated so as to give this output in the WGS84 coordinate system.

$x^c$ : image vector in the image coordinate system (Equation 4)

$s$ : scale factor between the image and mapping coordinate systems

$a^b_c$ : offset from the INS to the CCD camera in the body frame, obtained by direct measurement (Equation 5)

$a^b_{GPS}$ : offset from the INS to the GPS antenna in the body frame, obtained by direct measurement (Equation 6)

$R^b_c$ : fixed 3 × 3 rotation matrix from the CCD camera frame to the INS body frame. The computation of this rotation matrix is given below:

We obtain the rotation matrix and shift vector between the CCD camera and a local coordinate system from outdoor camera calibration. During the calibration we record the GPS/INS position while acquiring images of the calibration targets. A local coordinate system was defined for measuring the calibration targets with a total station. When defining this local coordinate system, the x-, y- and z-axes were aligned with the map coordinate system, so that the rotation between the two systems is the 3 × 3 identity matrix. We can therefore approximate the rotation between the CCD camera and the local coordinate system as the rotation between the CCD camera and the mapping coordinate system. The transformation from the local coordinate system to the mapping coordinate system is given by equation 7; it is a two-dimensional affine transformation.

$\begin{pmatrix} X_m \\ Y_m \end{pmatrix} = \begin{pmatrix} a_1 & a_2 \\ a_3 & a_4 \end{pmatrix} \begin{pmatrix} x_l \\ y_l \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}$      (Equation 7)

where $(x_l, y_l)$ are the horizontal coordinates of a point in the local coordinate system and $(X_m, Y_m)$ its coordinates in the mapping coordinate system.
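The paper does not spell out how the affine parameters of equation 7 are estimated; one straightforward possibility, sketched here purely as an illustration, is a least-squares fit to the surveyed control points:

    import numpy as np

    def fit_affine_2d(local_xy, map_xy):
        """Least-squares estimate of the 2-D affine transform of equation 7.

        local_xy, map_xy : (N, 2) arrays of corresponding points in the local
        (total-station) and mapping coordinate systems, N >= 3.
        Returns A (2x2) and t (2,) such that map ~= A @ local + t.
        """
        n = local_xy.shape[0]
        G = np.hstack([local_xy, np.ones((n, 1))])            # design matrix, (N, 3)
        params, *_ = np.linalg.lstsq(G, map_xy, rcond=None)   # (3, 2) solution
        A = params[:2].T                                      # 2x2 affine part
        t = params[2]                                         # translation
        return A, t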
The initialization process of the INS levels the horizontal plane and finds the north direction, and the output is calibrated to give the rotation with respect to the map coordinate system. Thus the rotation output by the HISS (GPS/INS) is the rotation from the INS body to the mapping coordinate system.

We now have the rotations of both the CCD camera and the INS body with respect to the mapping coordinate system. From these two pieces of information, we can compute the fixed rotation between the CCD camera and the INS with respect to the mapping coordinate system by using the three equations illustrated in figure 4.

Figure 4. Illustration of computation of Fixed Rotation between the Sensor and INS
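In compact matrix form, the relation underlying figure 4 can be written as

$R^m_c = R^m_b\,R^b_c \;\;\Rightarrow\;\; R^b_c = \left(R^m_b\right)^{T} R^m_c$

where $R^m_c$ is the camera-to-mapping rotation obtained from the outdoor calibration, $R^m_b$ is the INS-to-mapping rotation output by the HISS at the same epoch, and the transpose equals the inverse because the rotation matrices are orthonormal.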
Since we need to geo-reference the laser coordinates, we have to know the relation between the laser coordinate system and the mapping coordinate system. However, one major problem in an outdoor experiment is to identify the specific laser return from the object point (or calibration target): the laser data we obtain is just a cloud of points, and it is very difficult to know which particular point or points were reflected by the target. To overcome this problem, we based our integrated calibration on the CCD calibration. We converted the laser coordinate system to the CCD (image) coordinate system under some assumptions, using equations 8 and 9.

$x^c(t) = R^c_l\,x^l(t) + a^c_l$      (Equation 8)

where $x^l(t)$ is the laser point vector at time t in the laser coordinate system, $a^c_l$ is the offset from the laser head to the CCD camera, and $R^c_l$ is the fixed rotation from the laser coordinate system to the CCD (image) coordinate system,

$R^c_l = R_Z(-90°)\,R_Y(0°)\,R_X(0°) = \begin{pmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$      (Equation 9)
In this system, the laser scanner and the CCD camera are housed in the same frame, so the two sensors rotate together with respect to the INS and the other devices. We assumed that the imaging planes of the CCD and the laser are orthogonal to each other; from the coordinate systems defined for the laser and the CCD, the rotation from laser to CCD can then be written as [0, 0, -90] degrees about the X, Y and Z axes. The shift between the laser and the CCD is measured physically.
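Under this assumption, the laser-to-CCD rotation used in equations 8 and 9 can be written out numerically, for example (an illustrative NumPy sketch, not the system's actual code):

    import numpy as np

    def rot_z(deg):
        """Rotation matrix for a rotation of `deg` degrees about the Z axis."""
        c, s = np.cos(np.deg2rad(deg)), np.sin(np.deg2rad(deg))
        return np.array([[c,  -s,  0.0],
                         [s,   c,  0.0],
                         [0.0, 0.0, 1.0]])

    # Laser -> CCD rotation: 0, 0, -90 degrees about X, Y, Z; only the Z rotation
    # is non-trivial, giving approximately [[0, 1, 0], [-1, 0, 0], [0, 0, 1]].
    R_l_c = rot_z(-90.0)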

Thus we modify equation 1 into equation 10, which we use to compute the coordinates of every object point measured by the laser with respect to the map coordinate system.

$X^m(t) = X^m_{GPS}(t) + R^m_b(t)\,\left( R^b_c \left( R^c_l\,x^l(t) + a^c_l \right) + a^b_c - a^b_{GPS} \right)$      (Equation 10)
Using the above mathematical model for geo-referencing, we have integrated the laser range data from three of the sensors. The fourth scanner, which is mounted pointing vertically upward on top of the vehicle, is used for deriving the horizontal profile of the vehicle.
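Putting equations 8 and 10 together, the per-point geo-referencing of the laser data can be sketched as follows (again an illustrative NumPy formulation under the assumptions above, not the actual VLMS implementation; all arguments are 3-vectors or 3 × 3 arrays):

    import numpy as np

    def georeference_laser_point(x_l, R_l_c, a_l_c, R_c_b, a_c_b, a_gps_b, R_b_m, X_gps_m):
        """Map one laser point into the mapping coordinate system (equations 8 and 10).

        x_l     : point in the laser coordinate system
        R_l_c   : fixed rotation laser -> CCD frame (e.g. the matrix built above)
        a_l_c   : measured offset from the laser head to the CCD camera
        R_c_b   : fixed rotation CCD camera -> INS body (from the integrated calibration)
        a_c_b   : measured offset from the INS to the CCD camera (body frame)
        a_gps_b : measured offset from the INS to the GPS antenna (body frame)
        R_b_m   : rotation INS body -> mapping frame at the scan time (HISS output)
        X_gps_m : GPS-measured position in the mapping frame at the scan time
        """
        x_c = R_l_c @ x_l + a_l_c                                   # equation 8: laser -> CCD frame
        return X_gps_m + R_b_m @ (R_c_b @ x_c + a_c_b - a_gps_b)    # equation 10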
 


4 Results and Discussions
Table 3 shows the results of the integrated calibration of the sensors and the positioning devices. The range data taken by each of the laser scanners are shown in figure 5. The scanners are oriented at approximately 45 degrees to each other. Scanner 2 is the middle scanner, which captured most of the data. Some regions in the range data taken by scanner 2 are occluded by trees; the occluded regions are more prominent in the scanner 1 and scanner 3 data sets. However, when we combine all three data sets, we can recover part of the occluded region.
 

 

Sensor to INS   Roll (°)   Pitch (°)   Yaw (°)    Tx (m)    Ty (m)   Tz (m)
Sensor 1        -0.476     -0.982       44.659    -0.437    -0.965   0.110
Sensor 2         0.100     -0.474       90.727    -0.562    -0.236   0.150
Sensor 3        -0.125     -1.037      134.389    -0.423     0.480   0.100

 

Table 3: Computation Results of Fixed Rotation and Shift Vectors from Sensors to INS body
 
5 Conclusion and future work
We have developed a Vehicle-borne Laser Mapping System (VLMS), which is capable of acquiring data in three dimensions. These data can be further processed for use in an urban 3-D GIS. Future work includes segmentation of the laser data so that objects can be classified into buildings, trees (vegetation), road surfaces and other classes.

Figure 5. Plot of laser range data from each of the scanners

Figure 6. Integration of laser range data from three different sensors
References

Bossler, J.D., Goad, C.C., Johnson, P.C., and Novak, K., 1991, GPS and GIS map the nation's highways, GeoInfo Systems, March 1991, pp. 27-37

Haala, N., Stallmann, D., Cramer, M., Calibration of Directly Measured Position and Attitude by Aerotriangulation of Three Line Airborne Imagery

He, G., 1996, Design of a mobile mapping system for GIS data collection, IAPRS, Vol. XXXI, part B2, pp.154-159

Hock, C., Caspary, W., Heister, H., Klemm, J., and Sternberg, H., 1995, Architecture and design of the kinematic survey system KiSS, Proc. of the 3rd Int. Workshop on High Precision Navigation, Stuttgart, April 1995, pp. 569-576

Li, R. Mobile Mapping – An emerging technology for spatial data acquisition, . edu/ron/teaching/894a/paper1.htm

Manandhar, D., Shibasaki, R., 2000, Prototype Development for Vehicle based Laser Mapping System (VLMS), IAPRS, 16-23 July, Amsterdam, Vol XXXIII, part B2, pp. 359-366

Pottle, D., 1995, New mobile mapping system speeds data acquisition, GIM, Vol. 9, No. 9, pp. 51-53

Schwarz, K.P., Martell, H.E., El-Sheimy, N., Li, R., Chapman, M.A., and Cosandier, D., 1993, VISAT – A mobile highway survey system of high accuracy, Proc. of the IEEE Vehicle Navigation and Information Systems Conference, October 12-15, Ottawa, pp. 476-481

Tamura, T., Kitagawa, T., Tsuji, K., Uchida, O., Shimogaki, Y., 1998, The GPS/INS integration and kinematic photogrammetry for mobile mapping system, International Archives of Photogrammetry and Remote Sensing, Hakodate, Vol. XXXII, Part 5, pp. 824-829