Powered by OpenAIRE graph
ZENODO
Dataset · 2024
License: CC BY
Data sources: ZENODO, Datacite
3 versions

Smartbay Marine Types Object Detection Training dataset

Authors: Cullen, Eva


Abstract

Data preprocessing
The following pre-processing was applied to each image on the Roboflow platform:
- Auto-orientation of pixel data (with EXIF-orientation stripping)
- Resize to 640x640 (Stretch)

The following augmentation was applied to create 3 versions of each source image:
- Random brightness adjustment of between -25 and +25 percent
- Random exposure adjustment of between -23 and +23 percent
- Random Gaussian blur of between 0 and 3 pixels
- Salt and pepper noise applied to 1.88 percent of pixels

Data splitting
The initial training dataset was split as follows: 86% training set, 10% validation set and 4% test set.

Data labelling
The classes used in the initial dataset are: Eel, Fish, Flat Fish, Jelly, Pipefish, Ray, Seahorse and Shark.

Parameters
NA

Data sources
The imagery used in this training dataset consists of image frame captures from the SmartBay video archive files, CC-BY imagery from the www.minka-sdg.org website (see the attached "minka_image_citations.txt" file for citations of the Minka-sdg.org imagery used) and images taken by Eva Cullen in the Galway Atlantaquaria aquarium in Galway, Ireland.

Data quality
Video frames were extracted from the SmartBay Observatory and Galway Atlantaquaria video recordings for annotation using CVAT and ffmpeg. CC-BY images of the target species were also downloaded from minka-sdg.org using the minka-downloader Python script (https://github.com/obsea-upc/minka-downloader). Useful images were manually selected and annotated by the bursary students and collated into training datasets in Roboflow.

Data resizing
The images in the initial dataset were resized to 640x640 when exported from Roboflow.

Spatial coverage
- SmartBay Observatory, Spiddal, Galway Bay, Ireland
- Galway Atlantaquaria, Seapoint Promenade, Galway, H91 T2FD, Ireland
- Minka-sdg.org sample imagery from Spanish and Portuguese waters

Contact information
Data.Requests@Marine.ie
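The preprocessing and augmentation steps above were performed on the Roboflow platform; as an illustration only, they can be sketched in plain Python/NumPy. The function names and the nearest-neighbour resize below are my own assumptions (Roboflow's internals may differ), and the exposure and Gaussian-blur steps are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def resize_stretch(img: np.ndarray, size: int = 640) -> np.ndarray:
    """Stretch-resize to size x size (no aspect-ratio preservation),
    using nearest-neighbour index mapping."""
    h, w = img.shape[:2]
    ys = (np.arange(size) * h // size).astype(int)
    xs = (np.arange(size) * w // size).astype(int)
    return img[ys][:, xs]

def random_brightness(img: np.ndarray, limit: float = 0.25) -> np.ndarray:
    """Scale pixel values by a random factor in [1 - limit, 1 + limit],
    i.e. the -25%..+25% brightness adjustment described above."""
    factor = 1.0 + rng.uniform(-limit, limit)
    return np.clip(img.astype(np.float32) * factor, 0, 255).astype(np.uint8)

def salt_and_pepper(img: np.ndarray, amount: float = 0.0188) -> np.ndarray:
    """Set roughly `amount` (1.88%) of pixels to pure white or black."""
    out = img.copy()
    h, w = img.shape[:2]
    n = int(amount * h * w)
    ys, xs = rng.integers(0, h, n), rng.integers(0, w, n)
    out[ys[: n // 2], xs[: n // 2]] = 255  # salt
    out[ys[n // 2 :], xs[n // 2 :]] = 0    # pepper
    return out

# Fake 720x480 "video frame" standing in for a SmartBay capture.
src = rng.integers(0, 256, (480, 720, 3), dtype=np.uint8)
aug = salt_and_pepper(random_brightness(resize_stretch(src)))
```

Because the resize uses "Stretch" rather than letterboxing, any bounding-box annotations drawn on the original frames must be rescaled by the same per-axis factors.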

Training Dataset
The SmartBay Observatory in Galway Bay is an important contribution by Ireland to the growing global network of real-time data capture systems deployed within the ocean: technology giving us new insights into the ocean that we have not had before. The observatory was installed on the seafloor 1.5 km off the coast of Spiddal, County Galway, Ireland. It uses cameras, probes and sensors to permit continuous, remote, live underwater monitoring, giving ocean researchers unique real-time access to ongoing changes in the marine environment.

Data relating to the marine environment at the site is transferred in real time from the SmartBay Observatory through a fibre-optic telecommunications cable to the Marine Institute headquarters and onwards to the internet. The data includes a live video stream, the depth of the observatory node, the sea temperature and salinity, and estimates of the chlorophyll and turbidity levels in the water, which give an indication of the volume of phytoplankton and other particles, such as sediment, in the water.

The Smartbay Marine Types Object Detection training dataset is an initial bounding-box-annotated image dataset used in an attempt to train a YOLOv8 object detection model to classify the marine fauna observed in the SmartBay Observatory video footage using broad "Marine Type" classes. The imagery used in this training dataset consists of image frame captures from the SmartBay video archive files, CC-BY imagery from the www.minka-sdg.org website and images taken by Eva Cullen in the Galway Atlantaquaria aquarium in Galway, Ireland. The imagery was annotated using CVAT, collated on Roboflow and exported in YOLOv8 training dataset format.
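The YOLOv8 export format stores one plain-text label file per image, with one line per bounding box: a class index followed by the box centre and size, normalised to [0, 1] by the image dimensions. A minimal sketch of that conversion is below; the helper name `to_yolo_line` is my own, and the class-index order is assumed to follow the class list given in the abstract (the authoritative mapping is the `data.yaml` in the actual Roboflow export):

```python
# Class list taken from the "Data labelling" section of the abstract;
# the index order here is an assumption, not the confirmed export mapping.
CLASSES = ["Eel", "Fish", "Flat Fish", "Jelly", "Pipefish", "Ray", "Seahorse", "Shark"]

def to_yolo_line(cls_name: str, x_min: float, y_min: float,
                 x_max: float, y_max: float,
                 img_w: int, img_h: int) -> str:
    """Convert a pixel-space bounding box to a YOLO label line:
    'class_id x_center y_center width height', all four coordinates
    normalised to [0, 1] by the image width/height."""
    cid = CLASSES.index(cls_name)
    xc = (x_min + x_max) / 2 / img_w
    yc = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{cid} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# A 160x160 px Seahorse box spanning (100, 200)-(260, 360) in a 640x640 frame.
line = to_yolo_line("Seahorse", 100, 200, 260, 360, 640, 640)
```

For the example above, `line` is `6 0.281250 0.437500 0.250000 0.250000`: class 6 (Seahorse), centre at (0.28, 0.44) of the frame, box a quarter of the frame wide and high.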

Keywords

object detection training dataset, underwater imagery, marine fauna

BIP! impact indicators: selected citations 0 · popularity Average · influence Average · impulse Average
Related to Research communities: EGI (advanced computing for research)