The interpretation of sounds can lead to different associations and mental models depending on a person's prior knowledge and experiences. Thus, cross-modal mappings such as the verbalization or visualization of sounds can vary from person to person. Sonification can complement visualization techniques, or serve as an alternative to them, to more effectively support the interpretation of abstract data. Since sonifications usually map data attributes directly to auditory parameters, these mappings may conflict with users' mental models. In this paper, we analyze various sketch-based associations of sounds to better understand users' mental models and to derive understandable cross-modal mappings for sonification. Based on the analysis of sketches from a previously conducted user study, we propose three semantic-auditory channels that can be used to encode abstract data.
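To illustrate the kind of direct attribute-to-parameter mapping referred to above, the following is a minimal sketch of a conventional parameter-mapping sonification in Python. The frequency range, tone duration, and sample rate are illustrative assumptions, not values from the study or the proposed semantic-auditory channels.

```python
# Minimal parameter-mapping sonification sketch: each data value is mapped
# linearly to a pitch (frequency) and rendered as a short sine tone.
# Frequency range, tone duration, and sample rate are illustrative assumptions.
import numpy as np

def map_to_frequency(values, f_min=220.0, f_max=880.0):
    """Linearly rescale data values into an audible frequency range (Hz)."""
    v = np.asarray(values, dtype=float)
    v_norm = (v - v.min()) / (v.max() - v.min() + 1e-12)
    return f_min + v_norm * (f_max - f_min)

def sonify(values, tone_duration=0.25, sample_rate=44100):
    """Render one sine tone per data value and concatenate them into a signal."""
    t = np.linspace(0.0, tone_duration, int(sample_rate * tone_duration), endpoint=False)
    tones = [np.sin(2.0 * np.pi * f * t) for f in map_to_frequency(values)]
    return np.concatenate(tones)

# Example: an increasing data series produces a rising sequence of pitches.
signal = sonify([1.0, 2.0, 3.5, 5.0, 8.0])
```

Whether such a direct mapping (e.g., larger value means higher pitch) matches a listener's expectations is exactly the question the sketch-based analysis addresses.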
sonification, mental model, sketching, auditory encoding, sound, visual encoding