
We propose a system that retrieves background music (BGM) for game scenes. BGM plays an important role in creating the atmosphere of a game scene, and relations between game scenes and BGM have been studied before. However, no existing study has attempted to predict the audio features of BGM directly from a sequence of images representing a game scene. In our system, the user inputs a sequence of images from a game scene; a machine learning model trained on gameplay videos then predicts the audio features from the input, and the system retrieves the musical piece closest to the predicted features. Experimental results show both positive and negative tendencies: for fight scenes, the predicted audio features are closer to those of the BGM actually used in fight scenes than to those used in other scenes (positive), but the same musical piece was sometimes retrieved for different scenes (negative).
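The final retrieval step described above can be sketched as a nearest-neighbor search in audio-feature space. The sketch below is a minimal illustration, not the paper's implementation: the function name, the two-dimensional feature vectors, and the piece titles are all hypothetical, and the paper does not specify which distance metric is used (Euclidean distance is assumed here).

```python
import math

def retrieve_closest_bgm(predicted, library):
    """Return the title of the library piece whose audio-feature
    vector is nearest (Euclidean distance) to the predicted features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(library, key=lambda title: dist(predicted, library[title]))

# Toy library: each piece described by two hypothetical audio features
library = {
    "fight_theme": (0.9, 0.1),
    "town_theme":  (0.2, 0.8),
    "menu_theme":  (0.5, 0.5),
}
print(retrieve_closest_bgm((0.85, 0.2), library))  # fight_theme
```

In practice the feature vectors would be higher-dimensional (e.g. tempo, energy, and spectral descriptors predicted by the model), but the retrieval logic is the same.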
