Supplementary material for: [https://doi.org/10.1016/j.jinorgbio.2017.06.011] Related to published version: [http://cherry.chem.bg.ac.rs/handle/123456789/2495] Related to accepted version: [http://cherry.chem.bg.ac.rs/handle/123456789/3258]
In this contribution we present a framework for an embodied robotic system capable of appearance-based self-localization. Specifically, we concentrate on the robustness, flexibility, and scalability of the system. The framework is based on a panoramic eigenspace model of the environment. Its main feature is that it allows for simultaneous localization and map building using an incremental learning algorithm. Furthermore, both the learning and the training processes are designed to achieve robustness and adaptability to changes in the environment.
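To illustrate the general idea of appearance-based localization in an eigenspace of panoramic images, the following is a minimal, hypothetical sketch (not the authors' implementation): panoramic views are flattened into vectors, a low-dimensional eigenbasis is obtained from the top singular vectors of the centered image matrix, and a query view is localized by nearest-neighbor matching of its eigenspace coordinates. The class name, labels, and dimensions are illustrative assumptions.

```python
import numpy as np

class EigenspaceLocalizer:
    """Illustrative sketch of appearance-based self-localization
    in a low-dimensional eigenspace of panoramic images."""

    def __init__(self, n_components=3):
        self.k = n_components
        self.images = []   # flattened panoramic views
        self.labels = []   # location labels for each view

    def _refit(self):
        # Eigenspace = top-k right singular vectors of the centered data.
        X = np.array(self.images, dtype=float)
        self.mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - self.mean, full_matrices=False)
        self.basis = Vt[: self.k]
        # Store eigenspace coordinates of all known views.
        self.coords = (X - self.mean) @ self.basis.T

    def add_view(self, image, label):
        # A true incremental eigenspace method updates the basis without
        # retraining on all stored images; we refit here for clarity only.
        self.images.append(np.ravel(image))
        self.labels.append(label)
        self._refit()

    def localize(self, image):
        # Project the query view and return the label of the nearest
        # stored view in eigenspace coordinates.
        q = (np.ravel(image) - self.mean) @ self.basis.T
        d = np.linalg.norm(self.coords - q, axis=1)
        return self.labels[int(np.argmin(d))]
```

In this toy version, calling `add_view` as the robot explores realizes the "simultaneous localization and map building" idea in miniature: the map (the eigenspace and stored coordinates) grows with each new view, while `localize` can be queried at any time.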