User behaviour
From MIReS
Latest revision as of 17:58, 20 April 2013
Music is listened to, performed and created by people. It is therefore essential to treat the user as central to the creation of user scenarios, and hence to the development of technologies. Developing user applications involves analysing user needs with respect to novel scenarios and user behaviour with respect to existing ones, thus enabling a user-specification-development loop. Taking user needs into account applies to all stages of this loop; however, the analysis of user behaviour must be conducted carefully by a specialist. Gathering feedback from users is a research field in itself and should not be undertaken without carefully designed methods. Considering user needs through the analysis of user behaviour will greatly improve the usability of the MIR technologies being developed.
State of the art
Activities related to music can be roughly grouped into (i) listening (to recorded media or live performances; review/discussion of what was heard), (ii) performing (interpretation, improvisation, rehearsal, recording, live performance) and (iii) creating (composition, recording, studio production, improvisation). Other activities are concerned with researching, studying (education, musicology), sharing, worship and dance (see the part Other exploitation areas). Within each group, MIR research can relate either to the analysis of practices or to the proposal of tools to support the practice.
Listening
Among these categories, research presented at conferences such as ISMIR focuses mainly on the listening scenario, proposing tools to help people access (listen to) music, while little attention is paid to analysing user practices. As pointed out by [Weigl and Guastavino, 2011], a focus on the user has repeatedly been identified as a key requirement for future MIR research, yet empirical user studies have been relatively sparse in the literature, with the overwhelming research attention in MIR remaining systems-focused. [Lee and Cunningham, 2012] provide an overview of the user studies performed so far in the MIR field and propose explanations for why their impact on the field has been weak: a lack of findability, and the dominance of small-scale studies whose results are difficult to generalise.
Important questions related to the user are: What are their requirements and information needs? How do people organise their music? How would they like to see, access and search through digital libraries? What is the influence of the listening context? What is the role of social relations? Given that one of the grand challenges in MIR is the creation of a full-featured system [Downie et al., 2009], these questions must be answered in order to make such a system useful. This is especially true considering that the little research done on this topic has yielded unexpected results. For example, [Laplante and Downie, 2006] showed that some users seek new music without specific goals in mind, possibly just to update and expand their musical knowledge or for the sheer pleasure of searching; with this in mind, systems should support various browsing approaches. [Cunningham et al., 2004] highlight user needs for use tagging (scenarios in which a given piece of music might be relevant), a subject currently largely under-studied. [Laplante, 2010] identifies changes in musical taste according to social factors, and [Cunningham and Nichols, 2009] suggest support for collaborative play-list creation. [Uitdenbogerd and Yap, 2003] conclude that textual queries for melodic content are too difficult for ordinary users. The various possibilities for designing music recommendation systems that take the user into account are summarised in [Schedl and Flexer, 2012]. According to [Kolhoff et al., 2008], landscape representations or geographic views of music collections have certain disadvantages, and users seem to prefer simple and clean interfaces. A recent survey conducted within the Chorus+ EU project [Lidy and Linden, 2011] also highlights important points such as the prevalence of YouTube as the most-used music service among the survey participants.
It also highlights the fact that most people search by artist, composer, song title, album or genre, while the search possibilities enabled by new technologies (taste, mood or similarity) appear less prevalent.
Performing
While few papers relate to listener behaviour, this is not the case for performers and performances (in terms of music concerts, opera, theatre, dance) or interactions (interactive installations or instruments). A large community has been studying the subject of performance since the pioneering work of [Seashore, 1938], in which the performer is considered the essential mediator between composer and listener. These studies show the impact of the performer, the performance, the large-scale and micro-level structure, and the intended mood on the choice of tempo, timing, loudness, timbre and articulation [Rink, 1995], [Gabrielsson, 2003]. Early experiments were made using piano analysis (for ease of event recording) [Parncutt, 2003], but today they have been extended to the saxophone [Ramirez et al., 2007], cello [Chudy and Dixon, 2010] and singing voice. Understanding the process of performance has several goals: a better understanding of what makes a great interpretation (the Horowitz or Rachmaninov factors [Widmer et al., 2003]); music education; and automatic expressive performance (the KTH model of [Sundberg et al., 1983] and the Rendering Contest (Rencon)). Tools to visualise performance interpretation have also been proposed [Dixon et al., 2002]. According to [Delgado et al., 2011], different research strategies can be distinguished: (a) analysis-by-measurement (based on acoustic and statistical analysis of performances); (b) analysis-by-synthesis (based on interviewing expert musicians); and (c) inductive machine learning applied to large databases of performances. An example of the use of MIR research for inductive machine learning is given by [Chudy and Dixon, 2010]. Since performance is not limited to instrumentalists, the conductor is also studied [Luck et al., 2010], and research includes studies on interaction and gesture ([Jorda, 2003], [Bevilacqua et al., 2011]).
The large number of related contributions at conferences such as ISPS (International Symposium on Performance Science) shows that this domain is very active. As another example of the activity in this field, the current SIEMPRE EU project aims to develop new theoretical frameworks, computational methods and algorithms for the analysis of creative social behaviour, with a focus on ensemble musical performance.
Composing
While historical musicology studies compositions once published, and hence does not consider the practice of composing, research groups such as that of Barry Eaglestone [Nuhn et al., 2002] (the Information Systems and Music Informatics research groups), as well as new projects such as MuTec2, aim at following composers during their creative process (using sketches, drafts and composer interviews, and considering what the composers read). Related to this new field, the conference TCPM 2011, "Tracking the Creative Process in Music", was created.
References
- [Bevilacqua et al., 2011] F. Bevilacqua, N. Schnell and S. Alaoui. Gesture capture: Paradigms in interactive music/dance systems. In Transcript Verlag, editor, Emerging Bodies: The Performance of Worldmaking in Dance and Choreography, p. 183-193, 2011.
- [Chudy and Dixon, 2010] M. Chudy and S. Dixon. Towards music performer recognition using timbre. In Proceedings of the 3rd International Conference of Students of Systematic Musicology, pp. 45-50, Cambridge, UK, 2010.
- [Cunningham et al., 2004] S. Cunningham, M. Jones and S. Jones. Organizing digital music for use: an examination of personal music collections. In Proceedings of the 5th International Conference on Music Information Retrieval, pp. 447–454, Barcelona, Spain, 2004.
- [Cunningham and Nichols, 2009] S. Cunningham and D. Nichols. Exploring social music behaviour: An investigation of music selection at parties. In Proceedings of the 10th International Society for Music Information Retrieval Conference, pp. 26–30, Kobe, Japan, 2009.
- [Delgado et al., 2011] M. Delgado, W. Fajardo and M. Molina-Solana. A state of the art on computational music performance. Expert Systems with Applications, 38(1): 155-160, 2011.
- [Dixon et al., 2002] S. Dixon, W. Goebl, and G. Widmer. The Performance Worm: Real time visualisation of expression based on Langner’s tempo-loudness animation. In Proceedings of International Computer Music Conference, Göteborg, Sweden, 2002.
- [Downie et al., 2009] J.S. Downie, D. Byrd and T. Crawford. Ten Years of ISMIR: Reflections on Challenges and Opportunities. In Proceedings of the 10th International Society for Music Information Retrieval Conference, pp. 13-18, Kobe, Japan, 2009.
- [Gabrielsson, 2003] A. Gabrielsson. Music performance research at the millennium. Psychology of music, 31(3): 221–272, 2003.
- [Jorda, 2003] S. Jorda. Interactive music systems for everyone: exploring visual feedback as a way for creating more intuitive, efficient and learnable instruments. In Proceedings of the Stockholm Music Acoustics Conference, Stockholm, Sweden, 2003.
- [Kolhoff et al., 2008] P. Kolhoff, J. Preuß, and J. Loviscach. Content-based icons for music files. Computers & Graphics, 32(5): 550–560, 2008.
- [Laplante, 2010] A. Laplante. The role people play in adolescents music information acquisition. In Proceedings of Workshop on Music Recommendation and Discovery, Barcelona, Spain, 2010.
- [Laplante and Downie, 2006] A. Laplante and J. Downie. Everyday life music information-seeking behaviour of young adults. In Proceedings of the 7th International Conference on Music Information Retrieval, pp. 381–382, Victoria, Canada, 2006.
- [Lee and Cunningham, 2012] J. H. Lee and S. J. Cunningham. The impact (or non-impact) of user studies in music information retrieval. In Proceedings of the 13th International Society for Music Information Retrieval Conference, Porto, Portugal, 2012.
- [Lidy and Linden, 2011] T. Lidy and P. van der Linden. Think-tank on the future of music search, access and consumption. In European Community's Seventh Framework Programme (FP7/2007-2013), editor, MIDEM, Cannes, France, 2011.
- [Luck et al., 2010] G. Luck, P. Toiviainen, and M. Thompson. Perception of expression in conductors’ gestures: A continuous response study. Music Perception, 28(1): 47–57, 2010.
- [Nuhn et al., 2002] R. Nuhn, B. Eaglestone, N. Ford, A. Moore and G. Brown. A qualitative analysis of composers at work. In Proceedings of the International Computer Music Conference, pp. 597-599, Göteborg, Sweden, 2002.
- [Parncutt, 2003] R. Parncutt. Accents and expression in piano performance. Perspektiven und Methoden einer Systemischen Musikwissenschaft, pp. 163–185, 2003.
- [Ramirez et al., 2007] R. Ramirez, E. Maestre, and A. Pertusa. Identifying saxophonists from their playing styles. In Proceedings of the 30th AES Conference, Finland, 2007.
- [Rink, 1995] J. Rink. The Practice Of Performance: Studies in Musical Interpretation. Cambridge University Press, 1995.
- [Schedl and Flexer, 2012] M. Schedl and A. Flexer. Putting the user in the center of music information retrieval. In Proceedings of the 13th International Society for Music Information Retrieval Conference, Porto, Portugal, 2012.
- [Seashore, 1938] C. E. Seashore. Psychology of music. New York: McGraw-Hill, 1938.
- [Sundberg et al., 1983] J. Sundberg, L. Fryden, and A. Askenfelt. What tells you the player is musical? An analysis-by-synthesis study of music performance. Publication issued by the Royal Swedish Academy of Music, volume 39, pp. 61–75, Stockholm, Sweden, 1983.
- [Uitdenbogerd and Yap, 2003] A. Uitdenbogerd and Y. Yap. Was Parsons right? An experiment in usability of music representations for melody-based music retrieval. In Proceedings of the 4th International Conference on Music Information Retrieval, pp. 75-79, Baltimore, Maryland, USA, 2003.
- [Weigl and Guastavino, 2011] D. Weigl and C. Guastavino. User studies in the music information retrieval literature. In Proceedings of the 12th International Society for Music Information Retrieval Conference, Miami, USA, 2011.
- [Widmer et al., 2003] G. Widmer, S. Dixon, W. Goebl, E. Pampalk and A. Tobudic. In Search of the Horowitz Factor. AI Magazine, 24 (3): 111-130, 2003.
Challenges
- Analyse user needs and behaviour carefully. Gathering feedback from users is a research field in itself and should not be undertaken without carefully designed methods.
- Develop tools and technologies that take user needs and behaviour into account. Much work in MIR is technology-driven rather than user-, context- or application-driven. The challenge is thus to step into the shoes of users and understand their world-view in order to produce useful applications. User studies must be considered right from the beginning of a research project, and appropriate tools and technologies must be developed to cater for different types of users who perform the same task in different contexts.
- Identify and study new user roles related to music activities. The aforementioned user types (listener, performer and composer) are prototypes and are not orthogonal (playing Guitar Hero, for instance, involves both listening and performing). Moreover, users can have different levels of expertise in the same role (e.g. lay listeners vs. musicologists). The development of MIR tools will also create new user profiles that need to be identified and taken into account.
- Develop tools that automatically adapt to the user. Tools must be personalised according to the user's role, profile and context. Such profiles are therefore dynamic and multidimensional; they are also fuzzy, given the nature of the input provided by the user. An alternative to the personalisation of tools is the use of a companion (the "music guru").