DIGITAL.CSIC
Doctoral thesis · 2017 · Peer-reviewed
Data sources: DIGITAL.CSIC

Perception and interpretation of dynamic scenarios using lidar data and images

Authors: Ortega, Agustín


Abstract

[ES, translated]: This thesis addresses the problem of fusing data from different sensory modalities, laser range sensors and cameras, for scene interpretation. The two devices complement each other: the first provides information about the distance at which objects are located, while the second provides information about their appearance. We provide solutions that show how the first can be used to calibrate the second and, where applicable, how the uncertainty in the readings of the first propagates to the estimation of the parameters of the second. Furthermore, to achieve a low-level integration of the two, we develop solutions that relate them spatially and achieve their proper temporal synchronization. The thesis also addresses the use of both sensors to identify events in a moving scene. Once the sensors have been geometrically calibrated, we can associate the low-level features computed from each of them, and we exploit this feature association to annotate the events occurring in the scene and to segment out the moving elements (people). While searching for a suitable technique for fusing the information from the two sensors, we found it necessary to address a problem that is rarely studied rigorously: their synchronization. This thesis provides two different solutions for synchronizing a camera with two types of laser sensors: one method to synchronize the camera with a laser sensor of low acquisition rate but high resolution, and a second method to synchronize it with a laser sensor of high acquisition rate but low spatial resolution. Besides proposing these synchronization alternatives, the thesis presents data-fusion results using Gaussian mixtures for the detection of dynamic events.
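The abstract notes that uncertainty in the range sensor's readings propagates to the estimated camera parameters. A standard first-order treatment pushes a covariance through a function via its Jacobian; the sketch below is purely illustrative (it is not the thesis's derivation, and the pinhole intrinsics and noise values are made-up numbers), propagating the covariance of a 3D lidar point through a pinhole camera projection.

```python
import numpy as np

def project(point, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point (camera frame) to pixel coordinates."""
    x, y, z = point
    return np.array([fx * x / z + cx, fy * y / z + cy])

def projection_jacobian(point, fx=500.0, fy=500.0):
    """Jacobian of the pinhole projection with respect to the 3D point."""
    x, y, z = point
    return np.array([
        [fx / z, 0.0,    -fx * x / z**2],
        [0.0,    fy / z, -fy * y / z**2],
    ])

def propagate_covariance(point, cov_point):
    """First-order (linearized) propagation: cov_pix = J @ cov_3d @ J.T."""
    J = projection_jacobian(point)
    return J @ cov_point @ J.T

p = np.array([0.5, -0.2, 4.0])        # lidar point in the camera frame (m)
cov_p = np.diag([0.01, 0.01, 0.04])   # anisotropic range-sensor noise (m^2)
cov_uv = propagate_covariance(p, cov_p)
print(project(p))   # pixel location of the projected point
print(cov_uv)       # 2x2 covariance of that pixel location
```

The same Jacobian-based recipe applies to any differentiable estimate computed from noisy range data; only the function and its Jacobian change.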

[EN]: This thesis reports research on the fusion of data coming from laser range scanners and cameras for scene interpretation. These devices are complementary: the first provides information about the distance at which objects are located, whereas the second provides information about their appearance. We provide solutions that show how one can be used to help calibrate the other and, in that case, how the noise of the first propagates to the estimates computed by the second. Moreover, to achieve a tight integration of the two, we develop solutions not only for their accurate geometric calibration but also for their correct synchronization. We also study how the combination of the two sensors can be exploited to identify dynamic scene events. Once the two sensors are geometrically calibrated, we can reliably associate low-level features extracted from each of them. We exploit this tight correspondence for the accurate annotation of dynamic events occurring in the scene, and we are able to segment out moving elements (people) from an otherwise static scene by combining the data from laser range finders and cameras. In the quest for an adequate data fusion algorithm we encountered another often overlooked sensor calibration problem: sensor synchronization. We provide solutions for synchronizing a camera with a low-rate, high-density laser range scanner, and also with a high-rate, low-density range scanner. Finally, we provide alternatives both for the synchronization and for the data fusion between camera images and each of the two range sensors, using Gaussian mixture models as the core fusion methodology.
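The thesis's actual fusion pipeline is not reproduced here, but the core idea of using a Gaussian mixture to separate moving elements from a static scene can be sketched with scikit-learn's `GaussianMixture` on synthetic data. Everything below is an assumption for illustration: the two-dimensional "fused feature" (per-point range change paired with image-patch intensity change), the cluster locations, and the noise levels are all invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical fused features per lidar point: [range change (m),
# image-patch intensity change]. The static background clusters near
# zero; moving people show larger changes in both modalities.
static = rng.normal(loc=[0.0, 0.0], scale=[0.02, 2.0], size=(400, 2))
moving = rng.normal(loc=[0.6, 30.0], scale=[0.15, 8.0], size=(100, 2))
X = np.vstack([static, moving])

# Two-component mixture: one mode for the static scene, one for movers.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)

# Take the component with the larger mean range change as "moving".
moving_comp = int(np.argmax(gmm.means_[:, 0]))
n_moving = int(np.sum(labels == moving_comp))
print(f"points flagged as moving: {n_moving} / {len(X)}")
```

A real system would build such features from calibrated, synchronized sensor streams; the mixture model then gives a soft, probabilistic assignment of each point to the static or dynamic part of the scene.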

I would like to acknowledge all sources of financial support that in one way or another contributed to the development of this thesis. These include a PhD scholarship from the Mexican Council of Science and Technology (CONACYT), scholarship number 181412; a Research Stay Grant from the Agència de Gestió d'Ajuts Universitaris i de Recerca of the Generalitat de Catalunya (CTP 2013); the Consolidated Research Group VIS (SGR2009-2013); a Technology Transfer Contract with the Asociación de la Industria Navarra; the National Research Projects PAU and PAU+ (DPI2008-06022 and DPI2011-2751), funded by the Spanish Ministry of Economy and Competitiveness; and the EU URUS project (IST-FP6-STREP-045062).

Universitat Politècnica de Catalunya · BarcelonaTech (UPC). PhD program: Automatic Control, Robotics and Computer Vision. The work presented in this thesis has been carried out at the Institut de Robòtica i Informàtica Industrial, CSIC-UPC.

Peer Reviewed

Country
Spain
Related Organizations
Impact indicators (BIP!):
  • Selected citations: 0. Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
  • Popularity: Average. Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
  • Influence: Average. Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
  • Impulse: Average. Reflects the initial momentum of an article directly after its publication, based on the underlying citation network.
Open Access route: Green