In this paper we propose an empirical method for developing mapping strategies between a gesture-based interface (the Gloves) and physically based sound synthesis models. An experiment was performed to investigate which kinds of gestures listeners associate with synthesised sounds produced using physical models corresponding to three categories of sound: sustained, iterative, and impulsive. The results of the experiment show that listeners perform similar gestures when controlling sounds from the different categories. We used these gestures to create the mapping strategy between the Gloves and the physically based synthesis engine.