ZENODO
Other ORP type · 2021
License: CC BY
Data sources: Datacite; ZENODO
View all 2 versions

DCU Library's Captioning Best Practices and Subtitle Guidelines

Authors: O'Neill, Eilís


Abstract

This manual draws on several best practice guidelines for captioning and subtitling pre-recorded educational videos at university level. It helps the user apply best practice recommendations when creating captions, with a focus on line breaks and sentence layout, spelling, punctuation, capitalisation, numbering, and dates and times. It also offers guidance on captioning sound effects, music, silence and unintelligible audio, and describes how to identify multiple speakers. It gives instructions on editing automated YouTube subtitles to create highly readable, accurate captions that follow best practice guidelines. The aim of this manual is to increase the accessibility of online educational video content and to improve the learning experience of all viewers, whether they are deaf, hard of hearing or hearing.

Further guidelines for captioning live broadcasts, entertainment programmes or other scenarios not covered in the manual can be found in the list of references provided with this document.

References

BBC (2018) BBC Subtitle Guidelines, BBC. Available at: https://bbc.github.io/subtitle-guidelines/ (Accessed: 4 June 2021).
Datta, P. et al. (2020) 'Readability of Punctuation in Automatic Subtitles', in Miesenberger, K. et al. (eds) Computers Helping People with Special Needs. Cham: Springer International Publishing (Lecture Notes in Computer Science), pp. 195–201. doi: 10.1007/978-3-030-58805-2_23.
DCMP (2021) Captioning Key - Elements of Quality Captioning. Available at: https://dcmp.org/learn/599-captioning-key---elements-of-quality-captioning (Accessed: 16 June 2021).
EngageMedia (2021) 'Best Practices for Online Subtitling', EngageMedia. Available at: https://engagemedia.org/help/best-practices-for-online-subtitling/ (Accessed: 4 June 2021).
Gerber-Morón, O. and Szarkowska, A. (2018) 'Line breaks in subtitling: an eye tracking study on viewer preferences', Journal of Eye Movement Research, 11(3), pp. 1–22.
Minnesota State captioning committee (2017) 'A campus toolkit for course captioning'. Available at: https://ccaps.umn.edu/documents/CPE-Conferences/MnLC/MNStateCaptioningToolkit.pdf (Accessed: 10 June 2021).
National Disability Authority (2020) Subtitles for people who are deaf or hard of hearing. Available at: http://universaldesign.ie/technology-ict/archive-irish-national-it-accessibility-guidelines/digital-tv-equipment-and-services/guidelines-for-digital-tv-equipment-and-services/subtitles-for-people-who-are-deaf-or-hard-of-hearing/ (Accessed: 4 June 2021).
O'Donovan, K. (2016) 'How to Do Subtitles Well – Basics and Good Practices', Translation Journal. Available at: https://translationjournal.net/October-2016/how-to-do-subtitles-well-basics-and-good-practices.html (Accessed: 4 June 2021).
Office of Information Technology (2016) 'Captioning standards for quality checklist'. University of Colorado Boulder. Available at: https://oit.colorado.edu/sites/default/files/Captioning%20Standards%20Checklist_0.pdf (Accessed: 16 June 2021).
Souto-Rico, M. et al. (2020) 'A new system for automatic analysis and quality adjustment in audiovisual subtitled-based contents by means of genetic algorithms', Expert Systems, 37(6), p. e12512. doi: 10.1111/exsy.12512.
Szarkowska, A., Díaz Cintas, J. and Gerber-Morón, O. (2020) 'Quality is in the eye of the stakeholders: what do professional subtitlers and viewers think about subtitling?', Universal Access in the Information Society. doi: 10.1007/s10209-020-00739-2.
Trask, L. (1997) Capital Letters: Capital Letters and Abbreviations, University of Sussex. Available at: http://www.sussex.ac.uk/informatics/punctuation/capsandabbr/caps (Accessed: 21 June 2021).
W3C WAI and Henry, S. L. (2021) Transcribing Audio to Text, Web Accessibility Initiative (WAI). Available at: https://www.w3.org/WAI/media/av/transcribing/ (Accessed: 10 June 2021).

Keywords

captions, subtitles, best practice guidelines, university, online educational video, accessibility, YouTube, deaf community, Creative Commons OER
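Several of the rules the manual covers (line length, minimum on-screen time, speaker labels, bracketed sound effects) lend themselves to automated checking. The sketch below is illustrative only: the thresholds and the `check_cue` helper are assumptions in the spirit of common guidelines such as the BBC's, not figures taken from the manual itself.

```python
# Minimal caption-QA sketch. MAX_CHARS_PER_LINE and MIN_DURATION_S are
# assumed values loosely based on common subtitle guidelines, not DCU's.
MAX_CHARS_PER_LINE = 37   # roughly a BBC-style line-length limit (assumed)
MIN_DURATION_S = 1.0      # minimum on-screen time per cue (assumed)

def check_cue(start_s, end_s, lines):
    """Return a list of best-practice issues found in one caption cue."""
    issues = []
    if end_s - start_s < MIN_DURATION_S:
        issues.append("cue too short to read")
    for line in lines:
        if len(line) > MAX_CHARS_PER_LINE:
            issues.append(f"line exceeds {MAX_CHARS_PER_LINE} chars: {line!r}")
    return issues

# Speaker identification and sound effects follow common conventions:
# a name and colon for a speaker, square brackets for non-speech audio.
cues = [
    (0.0, 2.5, ["EILÍS: Welcome to this", "week's recorded lecture."]),
    (2.5, 3.0, ["[door slams]"]),
]
for start, end, lines in cues:
    print(check_cue(start, end, lines))
```

Running the sketch flags the second cue as too short, while the first cue passes both checks; a real workflow would apply such checks to every cue in an exported subtitle file before publishing.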

32 views · 12 downloads (OpenAIRE UsageCounts)