Automatic registration of iPhone images to laser point clouds of urban structures using shape features

Article, English, Open Access
B. Sirmacek ; R. C. Lindenbergh ; M. Menenti (2013)
  • Publisher: Copernicus Publications
  • Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (issn: 2194-9042, eissn: 2194-9050)
  • Related identifiers: doi: 10.5194/isprsannals-II-5-W2-265-2013
  • Subject: TA1-2040 | T | TA1501-1820 | Applied optics. Photonics | Engineering (General). Civil engineering (General) | Technology
    acm: ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION

Fusion of 3D airborne laser (LIDAR) data and terrestrial optical imagery can be applied to 3D urban modeling and model updating. The most challenging aspect of the fusion procedure is registering the terrestrial optical images to the LIDAR point clouds. In this article, we propose an approach for registering these two data sets, which come from different sensor sources: iPhone camera images taken by the application user in front of the urban structure of interest, and high-resolution LIDAR point clouds acquired by an airborne laser sensor. After retrieving the photo capture position and orientation from the iPhone photograph's metadata, we automatically select the area of interest in the point cloud and transform it into a range image whose grayscale intensity levels encode the distance from the image acquisition position. We then register the iPhone image to the generated range image using local features; the registration process is based on local feature extraction and graph matching. Finally, the registration result is used to map facade textures onto the 3D building surface mesh generated from the LIDAR point cloud. Our experimental results indicate that the proposed algorithm framework can be used for updating and enhancing 3D urban maps.
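The range-image step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the point cloud is an N×3 NumPy array and that the camera position and viewing direction recovered from the photo metadata are already available; all function and parameter names are hypothetical.

```python
import numpy as np

def point_cloud_to_range_image(points, cam_pos, cam_dir,
                               width=320, height=240, fov_deg=60.0):
    """Project a 3D point cloud onto a grayscale range image.

    Each pixel stores the distance of the nearest point along the view
    direction, scaled to 0-255 (near = bright), in the spirit of the
    range images generated from the airborne LIDAR data.  Illustrative
    sketch only; the paper does not specify its implementation.
    """
    # Build an orthonormal camera frame from the viewing direction.
    forward = cam_dir / np.linalg.norm(cam_dir)
    up_hint = np.array([0.0, 0.0, 1.0])
    if abs(forward @ up_hint) > 0.9:          # view nearly vertical
        up_hint = np.array([0.0, 1.0, 0.0])
    right = np.cross(forward, up_hint)
    right /= np.linalg.norm(right)
    up = np.cross(right, forward)

    rel = points - cam_pos                    # points relative to camera
    z = rel @ forward                         # depth along view axis
    in_front = z > 0.1                        # drop points behind camera
    rel, z = rel[in_front], z[in_front]

    # Pinhole projection with a focal length derived from the FOV.
    f = 0.5 * width / np.tan(np.radians(fov_deg) / 2.0)
    u = (f * (rel @ right) / z + width / 2).astype(int)
    v = (f * (rel @ up) / z + height / 2).astype(int)

    valid = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z = u[valid], v[valid], z[valid]

    # Z-buffer: keep the nearest point per pixel.
    depth = np.full((height, width), np.inf)
    np.minimum.at(depth, (v, u), z)

    # Map depth to 8-bit gray levels; empty pixels stay 0.
    img = np.zeros((height, width), dtype=np.uint8)
    hit = np.isfinite(depth)
    if hit.any():
        d = depth[hit]
        span = max(d.max() - d.min(), 1e-9)
        img[hit] = (255 * (1.0 - (d - d.min()) / span)).astype(np.uint8)
    return img
```

The resulting grayscale image can then be fed, together with the iPhone photograph, to a local-feature detector and the graph-matching stage.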