Preprint, 2018

Complex Urban LiDAR Data Set

Jeong, Jinyong; Cho, Younggun; Shin, Young-Sik; Roh, Hyunchul; Kim, Ayoung
Open Access, English
  • Published: 16 Mar 2018
Abstract
This paper presents a Light Detection and Ranging (LiDAR) data set that targets complex urban environments. Urban environments with high-rise buildings and congested traffic pose a significant challenge for many robotics applications. The presented data set is unique in that it captures the genuine features of an urban environment (e.g., metropolitan areas, large building complexes, and underground parking lots). The data set provides measurements from both two-dimensional (2D) and three-dimensional (3D) LiDARs, the two typical types of LiDAR sensor. The two 16-ray 3D LiDARs are tilted on both sides for maximal coverage. One 2D LiDAR faces backward while ...
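The abstract notes that the two 16-ray 3D LiDARs are mounted tilted on either side of the vehicle, so their scans must be rotated and translated into a common vehicle frame before use. The data set's actual calibration values are not given in this excerpt; the sketch below is a minimal illustration of that extrinsic transform, where the rotation axis, the 45-degree tilt angle, and the mounting offsets are all illustrative assumptions, not the data set's calibration.

```python
import numpy as np

def tilt_rotation(angle_deg: float) -> np.ndarray:
    """Rotation about the sensor x-axis by the given tilt angle (degrees)."""
    a = np.deg2rad(angle_deg)
    return np.array([
        [1.0, 0.0, 0.0],
        [0.0, np.cos(a), -np.sin(a)],
        [0.0, np.sin(a),  np.cos(a)],
    ])

def to_base_frame(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map an Nx3 array of LiDAR points into the vehicle base frame:
    p_base = R @ p_sensor + t for each point."""
    return points @ R.T + t

# Hypothetical mounting for the left LiDAR: tilted 45 degrees,
# offset 0.5 m to the left and 1.9 m above the base frame origin.
R_left = tilt_rotation(45.0)
t_left = np.array([0.0, 0.5, 1.9])

pts = np.array([[0.0, 0.0, 1.0]])  # one point 1 m along the sensor z-axis
print(to_base_frame(pts, R_left, t_left))
```

In practice each sensor would carry its own calibrated rotation and translation, and the same transform would be applied to every scan before merging the two tilted point clouds.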
Subjects: Computer Science - Robotics