
Real-time Visual Object Tracking

Authors: Navarro Comes, Albert


Abstract

The presence of robots in humans' daily life is a growing phenomenon, and autonomous navigation is part of this progression. This study investigates the robust, simultaneous tracking of multiple objects. Software capable of tracking different objects, mainly humans and robots, is built using a Kinect camera mounted on a moving platform, with computer vision used as a tool to improve human-robot interaction. By choosing a sensor such as the recently released Kinect, colour and depth images are combined for a differential gain in the tracking implementation. The sensor is mounted on a mobile robot that can report its position at any moment from the sensors on its wheels; this information is essential for transforming data captured from different frames and positions into a common reference system. The Kinect has mainly been used to track one or two humans from a fixed position; this work shows that it can also track humans and much smaller objects simultaneously when the camera is mounted on a mobile robot. The Point Cloud Library (PCL) is the framework used to process the sensor data: filtering, segmentation and clustering yield object detections, and each detection's descriptor is then compared with the descriptors obtained in previous frames to establish correspondence. A training process for the recognition of humans and robots is performed beforehand. The descriptor used is the Viewpoint Feature Histogram (VFH), which is invariant to distance and size. The main constraint imposed is that the solution must be integrated into the Mobotware framework, so it has been created as a plug-in running directly within that framework. The Kinect introduces limitations such as a small field of view and the impossibility of outdoor use. The application is validated with tests and the results obtained are presented.
Recognition of the trained objects has proved successful, and the computation time, though not ideal, is sufficient to follow how the scene varies over time and to track humans and SMRs moving at normal speed. In all cases the system detects and tracks the objects and keeps learning from the detections over time. However, when the camera itself is moving, tracking becomes less effective, mainly due to the appearance of false detections.
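The core matching step described above (comparing a new detection's histogram descriptor against those stored from previous frames) can be illustrated with a minimal pure-Python sketch. This is an assumption-laden toy, not the thesis implementation: a real VFH descriptor has 308 bins and PCL provides its own nearest-neighbour search; the chi-square distance, the function names, and the 8-bin data below are hypothetical illustrations of the general technique.

```python
def chi_square_distance(h1, h2):
    """Chi-square distance between two equal-length histograms.

    Smaller values mean more similar descriptors; 0.0 means identical.
    """
    assert len(h1) == len(h2)
    total = 0.0
    for a, b in zip(h1, h2):
        if a + b > 0:  # skip empty bin pairs to avoid division by zero
            total += (a - b) ** 2 / (a + b)
    return 0.5 * total

def best_match(query, stored):
    """Index of the stored descriptor closest to `query`."""
    return min(range(len(stored)),
               key=lambda i: chi_square_distance(query, stored[i]))

# Toy 8-bin "descriptors" standing in for 308-bin VFH histograms.
robot = [0.40, 0.30, 0.10, 0.10, 0.05, 0.05, 0.00, 0.00]
human = [0.05, 0.05, 0.10, 0.10, 0.20, 0.20, 0.20, 0.10]
query = [0.38, 0.32, 0.10, 0.10, 0.05, 0.05, 0.00, 0.00]

print(best_match(query, [robot, human]))  # query is closest to "robot" -> 0
```

In a tracking loop, a match below some distance threshold would associate the new detection with an existing track, while an unmatched descriptor would start a new track (and, as in the abstract, could be added to the learned set over time).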


Country
Spain
Keywords

Remote sensing, Robotics, Computer vision, Artificial intelligence, Pattern recognition, Signal processing, Human-computer interaction, Mobile robots, Sensors and actuators
