Recognition of Vehicle Light Signals for Smart Traffic Lights (Распознавание сигналов световых приборов автомобилей для умных светофоров)

Abstract

This paper explores the application of machine learning methods for recognizing automobile light signals to enhance smart traffic light systems. For vehicle detection in video footage, the Keras library was employed along with the RetinaNet neural network architecture [1]. The YOLOv8 architecture was used for identifying the status of vehicle headlights and taillights. Data collection, annotation, and model training were conducted using the Roboflow platform. The research resulted in trained model weights capable of recognizing the state of front and rear lights on various vehicle types under different weather conditions. The paper proposes an adaptation of the YOLOv8-based neural network model for recognizing traffic light signals, which can be used both for static recognition in photographs and for real-time or video applications.
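As a rough sketch of the YOLOv8 portion of the workflow described above (dataset exported from Roboflow, model fine-tuning, then static and streaming inference), the snippet below uses the public roboflow and ultralytics Python packages. The API key, workspace and project names, weight file names, and class labels are placeholders, not the authors' actual artifacts; the paper's trained weights and dataset are not reproduced here.

```python
# Hedged sketch of the training/inference workflow described in the abstract.
# Assumptions (not from the paper): Roboflow workspace/project names, API key,
# input file names, and class labels are placeholders.
from roboflow import Roboflow
from ultralytics import YOLO

# 1. Download the annotated dataset from Roboflow in YOLOv8 format.
rf = Roboflow(api_key="YOUR_API_KEY")
project = rf.workspace("your-workspace").project("vehicle-lights")  # hypothetical names
dataset = project.version(1).download("yolov8")

# 2. Fine-tune a YOLOv8 model starting from a pretrained checkpoint.
model = YOLO("yolov8n.pt")
model.train(data=f"{dataset.location}/data.yaml", epochs=100, imgsz=640)

# 3. Static recognition on a single photograph.
results = model("intersection.jpg")
for box in results[0].boxes:
    print(results[0].names[int(box.cls)], float(box.conf))

# 4. Real-time / video recognition: stream=True yields results frame by frame.
for frame_result in model.predict(source="traffic.mp4", stream=True):
    pass  # consume per-frame headlight/taillight detections here
```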

Keywords

Image processing, Recognition task, Model training, Smart traffic lights, Neural networks
