Data accompanying the paper: "Passive Acoustic Monitoring and Transfer Learning".

Please cite this dataset as: Dufourq, Emmanuel and Batist, Carly and Foquet, Ruben and Durbach, Ian. (2022). Passive Acoustic Monitoring and Transfer Learning. bioRxiv doi:

This dataset contains approximately 6 hours of audio containing calls of the pin-tailed whydah (Vidua macroura). The audio was collected in the Intaka Island Nature Reserve in Cape Town, South Africa, using one AudioMoth recorder. The sampling rate was set to 48,000 Hz and the recordings were obtained over four days in January 2021. A larger dataset exists.

The annotation files are in .svl format, which is compatible with Sonic Visualiser (https://www.sonicvisualiser.org/). Each audio file has a corresponding .svl file containing segments of audio that were manually annotated as either the presence class (pin-tailed whydah calls) or "noise" (absence class), so the dataset can be used to train a binary classification model. The audio files are provided in "Audio.zip" and the manually verified annotations in "Annotations.zip".
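A minimal sketch (not part of the dataset itself) of how the .svl annotations could be paired with the audio to build presence/absence training clips. It assumes the .svl files use Sonic Visualiser's region-style layout, i.e. <point frame="..." duration="..." label="..."/> elements under a <model> element that carries the sampleRate, and a hypothetical folder layout of "Audio/" and "Annotations/" after unzipping the two archives; the soundfile package is an assumed dependency.

```python
# Sketch: extract labelled segments from each (.wav, .svl) pair for a
# binary presence/absence classifier. Adjust paths and XML parsing to
# match the actual layer type used in the annotations.
import glob
import os
import xml.etree.ElementTree as ET

import soundfile as sf  # assumed dependency for reading the .wav files


def read_svl_segments(svl_path):
    """Return a list of (start_s, end_s, label) tuples from one .svl file."""
    root = ET.parse(svl_path).getroot()
    # The <model> element holds the sample rate used to convert frames to seconds.
    sample_rate = float(root.find(".//model").get("sampleRate"))
    segments = []
    for point in root.iter("point"):
        start = float(point.get("frame")) / sample_rate
        duration = float(point.get("duration", 0)) / sample_rate
        label = (point.get("label") or "").strip()
        segments.append((start, start + duration, label))
    return segments


def extract_clips(audio_path, svl_path):
    """Yield (clip_samples, sample_rate, target) for each annotated segment."""
    audio, sr = sf.read(audio_path)
    for start_s, end_s, label in read_svl_segments(svl_path):
        clip = audio[int(start_s * sr):int(end_s * sr)]
        # Binary target: 0 for the "noise" (absence) class, 1 otherwise.
        yield clip, sr, 0 if label == "noise" else 1


if __name__ == "__main__":
    # Hypothetical folder names after unzipping Audio.zip and Annotations.zip.
    for wav in sorted(glob.glob("Audio/*.wav")):
        svl = os.path.join("Annotations",
                           os.path.basename(wav).replace(".wav", ".svl"))
        if os.path.exists(svl):
            n = sum(1 for _ in extract_clips(wav, svl))
            print(f"{os.path.basename(wav)}: {n} annotated segments")
```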
ED is supported by a research chair from the African Institute for Mathematical Sciences South Africa. This work was carried out with the aid of a grant from the International Development Research Centre, Ottawa, Canada, www.idrc.ca, and with financial support from the Government of Canada, provided through Global Affairs Canada (GAC), www.international.gc.ca. This work was also supported by funding from Microsoft's AI for Earth program.
pin-tailed whydah, vocalisation classification, passive acoustic monitoring, bioacoustics, machine learning, convolutional neural networks