User interaction


The grand challenge of user interaction is how to design MIR systems that put the user at the centre of the system. This applies to the whole interaction loop, including visualisation, input devices, manipulation metaphors, and system adaptation to user behaviour. The challenge is relevant because addressing it deepens both the user's and the researcher's (e.g. system designer's) understanding of the system's features and components, its overall purpose, and the contribution it can make to the user's activities. Users benefit from more productive workflows and systems which better serve their needs; researchers benefit from a feedback loop which enables them to fine-tune and develop systems with greater accuracy. Effective user-oriented research will have a major impact on the usability of MIR systems and their wider deployment.


Back to → Roadmap:User perspective



State of the art

In the last decade, Human Computer Interaction (HCI) research has witnessed a change in focus from conventional ways to control and communicate with computers (keyboards, joysticks, mice, knobs, levers, buttons, etc.) to more intuitive uses of non-conventional devices such as gloves, speech recognition, eye trackers, cameras, and tangible user interfaces. As a result of technological advances and the desire to surpass the limitations of WIMP (window, icon, menu, pointing device), interaction research has progressed beyond the desktop and the ubiquitous graphical user interface (GUI) into new physical and social contexts. Now that terms such as "multi-touch" and gestures like the "two-finger pinch and zoom" have become part of users' daily lives, novel research areas such as "tangible interaction" have finally entered the mainstream. However, aside from the ongoing research explicitly focused on real-time musical performance, which typically falls under the New Interfaces for Musical Expression ([http://www.nime.org NIME]) discipline, little of this research has yet been devoted to novel interface and interaction concepts in the field of MIR.

The use of HCI and related methodologies in MIR

The Association for Computing Machinery defines human-computer interaction (HCI) as "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them" (ACM SIGCHI Curricula for Human-Computer Interaction). HCI involves the study, planning, and design of the interaction between people (users) and computers. It is often regarded as the intersection of computer science, the behavioural sciences, design and several other fields of study. Interaction between users and computers occurs at the interface, which results from the particular affordances of a given combination of software and hardware. The basic and initial goal of HCI is therefore to improve the interactions between users and computers by making computers more usable and responsive to users' needs. For decades HCI mostly focused on making interaction more efficient, but more recently the emphasis has shifted to the user's Quality of Experience, highlighting the benefits of beauty and fun, and the intrinsic values of the experience and its outcomes [e.g. Norman, 2004; McCarthy and Wright, 2004]. The human component in HCI is therefore highly relevant from the cultural, psychological and physiological perspectives.

MIR could benefit from knowledge inherited from HCI and related disciplines such as User Experience (UX) and Interface and Interaction Design studies. These methodologies could bring benefits not only to the conception of MIR systems at the earlier design stages, but also to the evaluation and subsequent iterative refinement of these systems. While the evaluation of MIR systems has traditionally been conceived to provide categorically correct answers (e.g. finding or identifying a known target song), new evaluation challenges are presented by open systems which leave users room for interpretation [e.g. Sengers and Gaver, 2006], include more subjective aspects (e.g. the users' emotions, perceptions and internal states [e.g. Hekkert, 2006]), or encourage contextual engagement [e.g. Hassenzahl and Tractinsky, 2011]. (The current state of the art in the evaluation of MIR research results is covered in the section Evaluation methodologies.) Furthermore, beyond the evaluation of User Experience, another area that would directly benefit from HCI-related knowledge is research into open and holistic frameworks for the creation of MIR systems and tools.

Music Search Interaction

Over the past 12 years a few projects from the MIR community have contributed to the development of interfaces for music search and discovery. In the field of data visualisation, there is an extensive bibliography on the representation of auditory data. In the particular case of the visual organisation of musical data, solutions often consist of extracting feature descriptors from data files and creating a multidimensional feature space that is then projected onto a 2D surface using dimensionality reduction techniques such as the Self-Organizing Map (SOM) [Kohonen, 2001] (e.g. Islands of Music [Pampalk, 2003]; SOMeJB [Lidy and Rauber, 2003]; [Hlavac et al., 2004]). Beyond 2D views, the advantage of a topological metaphor has been exploited to facilitate users' exploration of large collections in nepTune, an interactively explorable 3D version of Islands of Music which supports spatialised sound playback [Knees et al., 2007], and the Globe of Music, which places a collection on a spherical surface to avoid any edges or discontinuities [Leitich and Topf, 2007]. More recently, MusicGalaxy [Stober and Nürnberger, 2010] implemented an adaptive zoomable interface for exploration that makes use of a complex non-linear multi-focal zoom lens and introduces the concept of facet distances representing different aspects of music similarity. Musicream [Goto and Goto, 2005] uses the "search by example" paradigm, representing songs as dynamic coloured circles which fall from the top of the screen; when selected, a circle reveals the song's title and can be used to 'fish' for similar ones.
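
The map-building pipeline behind such systems can be summarised in a few steps: extract a descriptor vector per track, normalise, train a SOM, and read each track's position off the trained grid. The sketch below is a minimal, illustrative version of that pipeline, assuming a folder of audio files; it uses librosa for feature extraction and the third-party MiniSom library for the projection, neither of which is claimed to be what the cited systems actually used.

```python
# Minimal sketch: per-track descriptors -> normalisation -> SOM -> 2D map.
import glob
import numpy as np
import librosa
from minisom import MiniSom

def describe(path):
    """Summarise a track as the per-coefficient mean and std of its MFCCs."""
    y, sr = librosa.load(path, mono=True, duration=30.0)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

paths = sorted(glob.glob("collection/*.mp3"))   # hypothetical collection
features = np.array([describe(p) for p in paths])
# Normalise each descriptor dimension so no single feature dominates.
features = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-9)

# Train a 10x10 self-organizing map and place each track on its winning cell.
som = MiniSom(10, 10, input_len=features.shape[1], sigma=1.5, learning_rate=0.5)
som.train_random(features, num_iteration=2000)
for path, vec in zip(paths, features):
    print(path, "->", som.winner(vec))   # (row, col) on the 2D map
```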

In terms of developing a user-oriented visual language for screen-based music search, the interactive aspect of most commercial music library applications has resorted to the metaphor of spreadsheets (e.g. iTunes) or has relied on searching for music by filling in a set of forms and radio buttons (e.g. SynchTank). Innovative approaches from the MIR community have suggested visually mapping sound clusters onto abstract "islands" [e.g. Pampalk, 2003]; collaborative mapping onto real geographical visual references (e.g. Freesound); and tangible tabletop abstract symbols (e.g. SongExplorer [Julià and Jordà, 2009]). Visual references have included control panels used in engineering (e.g. MusicBox [Lillie, 2008]); gaming platforms (Musicream [Goto and Goto, 2005]); lines of notation (e.g. Sonaris and mHashup [Magas et al., 2008]); and turntables (Songle [Goto et al., 2012]).

A few MIR-driven search interfaces have addressed different user contexts. Mediasquare [Dittenbach et al., 2007] addresses social interaction in a 3D virtual space where users are impersonated by avatars, enabling them to browse and experience multimedia content by literally walking through it. decibel 151 [Magas et al., 2009] enables multi-user social interaction in physical space by turning each user into a "walking playlist", creating search environments for social networking in real time. Special visual interfaces have addressed music that is poorly described or less familiar to the user (e.g. field recordings; ethnomusicological collections), aiming both to educate and to allow music discovery in an entertaining way (e.g. Songlines 2010 and [Magas and Proutskova, 2013]). User contexts, however, remain vastly under-researched and constitute a major challenge for the MIR community.

Some of the above interfaces have adopted HCI research methods which consider MIR-driven search systems holistically, not only as visual representations of data but with a focus on the user's Quality of Experience. This resulted from a coherent system design approach which creates a feedback loop for an iterative research and innovation process between the interactive front end and the data-processing back end of the application. Further research challenges are presented by a holistic approach to MIR user-oriented system design in the context of novel devices and modalities, real-time networks, collaborative platforms, open systems, physical experiences and tangible interfaces.

Tangible and Tabletop Interaction

Tangible User Interfaces (TUIs), which combine control and representation in a single physical device, emphasise tangibility and materiality, physical embodiment of data, bodily interaction and the embedding of systems in real spaces and contexts. Although several implementations predate the concept, the term Tangible User Interface was coined at the MIT Media Lab in 1997 [Ullmer and Ishii, 2001] to define interfaces which augment the real physical world by coupling digital information to everyday physical objects and environments. Such interfaces contribute to the user experience by fusing the representation and control of digital data with physical artefacts, thus allowing users to literally "grasp data" with their own hands.

Within the domain of Tangible Interaction, Tabletop Interaction constitutes a special research field which uses the paradigm of a horizontal surface meant to be touched and/or manipulated via the objects placed on it. In contrast to the mouse-and-keyboard interface model, which restricts the user's input to an ordered sequence of events (click, click, double click, etc.), this type of interface allows multiple input events to enter the system at the same time, enabling any action at any time or position, by one or several simultaneous users. The implicit ability of tabletop interfaces to support tracked physical objects with particular volume, shape and weight properties expands the bandwidth and richness of the interaction beyond the simple idea of multi-touch. Such objects can represent abstract concepts or real entities; they can relate to other objects on the surface; they can be moved and turned around on the table surface, and these spatial changes can affect their internal properties and their relationships with neighbouring objects. Open-source, cross-platform computer vision frameworks that combine the tracking of fiducial markers with multi-touch finger tracking (e.g. reacTIVision, which was developed for the Reactable project [Bencina et al., 2005]) have become widely used in the tabletop developer community, both academic and industrial, and have accelerated the development of tabletop applications for educational and creative use [e.g. Khandelwal and Mazalek, 2007; Gallardo et al., 2008].
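
As a concrete example of how applications consume such tracking data: reacTIVision publishes object and finger positions as TUIO messages over OSC (UDP port 3333 by default). The sketch below, using the third-party python-osc package, listens for the /tuio/2Dobj profile and decodes the "set" messages that carry a tagged object's position and rotation; a complete client would also process the "alive" and "fseq" messages that delimit update frames.

```python
# Minimal TUIO listener sketch, assuming reacTIVision's default OSC/UDP setup.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_object(address, *args):
    if args and args[0] == "set":
        # TUIO 1.1 2Dobj "set": session id, fiducial id, x, y, angle, ...
        session_id, fiducial_id, x, y, angle = args[1:6]
        print(f"fiducial {fiducial_id}: pos=({x:.2f}, {y:.2f}) angle={angle:.2f} rad")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_object)    # tangible tracked objects
dispatcher.map("/tuio/2Dcur", print)        # finger "cursors", raw dump

server = BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher)
server.serve_forever()
```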

There is a growing interest in applying tabletop interfaces to the music domain. From the Audiopad [Patten et al., 2002] to the Reactable [Jordà et al., 2007], music performance and creation have become the most popular and successful application field in the entire lifetime of this interaction paradigm. Tabletop interfaces developed using MIR have specifically focused on interacting with large music collections. Musictable [Stavness et al., 2005] takes a visualisation approach similar to the one chosen in Pampalk's Islands of Music, creating a two-dimensional map that, when projected on a table, is used to make collaborative decisions and generate playlists. Hitchner et al. [Hitchner et al., 2007] use a SOM to build the map, visually represented by a low-resolution mosaic, enabling users to redistribute the songs according to their preferences. Audioscapes is a framework enabling innovative ways of interacting with large audio collections using touch-based and gestural controllers [Ness and Tzanetakis, 2009]. The MTG's SongExplorer [Julià and Jordà, 2009] applies high-level song descriptors to N-dimensional navigation on a 2D plane, creating a coherent similarity-based 2D map with specially designed tangible pucks for more intuitive interaction with the tabletop visual interface. Tests comparing the system with a conventional GUI controlling the same music collection showed that the tabletop implementation was a much more efficient tool for discovering new music that users valued. Thus the specific affordances of tabletop interfaces (support of collaboration and sharing of control; continuous, real-time interaction with multidimensional data; support of complex, expressive and explorative interaction [Jordà, 2008]), together with more ubiquitous and easily available individual multi-touch devices such as tablets and smartphones, can bring novel approaches to the field of MIR, not only for music browsing but particularly for the more creative aspects related to music creation and performance.
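
A core operation underlying several of these interfaces, from Musicream's 'fishing' to SongExplorer's similarity map, is a nearest-neighbour query over per-song descriptor vectors. The sketch below illustrates the idea with cosine distance over a toy descriptor matrix; the descriptor names and values are hypothetical placeholders, not those of any cited system.

```python
# Illustrative nearest-neighbour query over high-level song descriptors.
import numpy as np
from scipy.spatial.distance import cdist

songs = ["song_a", "song_b", "song_c", "song_d"]
# One row per song; columns might be e.g. danceability, energy, valence,
# scaled tempo (all hypothetical values).
descriptors = np.array([
    [0.9, 0.8, 0.7, 0.62],
    [0.2, 0.3, 0.4, 0.35],
    [0.85, 0.75, 0.65, 0.60],
    [0.1, 0.9, 0.2, 0.80],
])

def similar_songs(query_index, k=2):
    """Return the k songs closest to the query in descriptor space."""
    dists = cdist(descriptors[query_index:query_index + 1], descriptors,
                  metric="cosine")[0]
    order = np.argsort(dists)
    return [(songs[i], dists[i]) for i in order if i != query_index][:k]

print(similar_songs(0))   # song_a's neighbours, led by the near-parallel song_c
```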

The physical embodiment of data, bodily interaction and the embedding of systems in real spaces and contexts are particularly present in recent research into gestural and spatial interaction. The Real-Time Musical Interactions team at IRCAM has been working with motion sensors embedded within everyday objects to explore concepts of physical and gestural interaction which integrate performance, gaming and musical experience. Their Interlude project combined interactivity, multimodal modelling, movement tracking and machine learning to explore new means for musical expression [Bevilacqua et al., 2011a; Bevilacqua et al., 2011b; Schnell et al., 2011]. The results included the Urban Musical Game, which breaks down some of the boundaries between audience and musician by producing a sound environment through the introduction of a musical ball; Mogees, which uses piezo sensors coupled with gesture recognition technology for music control, allowing users to easily transform any surface into a musical interface; and MOs (Modular Musical Objects), which represent one of the pioneering attempts to answer the challenges of tangible, behaviour-driven musical objects for music creation. This project has demonstrated the huge potential of research into physical and gestural interfaces for MIR within the context of future internet applications for the Internet of Things.
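
The Interlude tools relied on dedicated real-time gesture-following techniques; as a much simpler stand-in that conveys the core idea of template-based gesture recognition from motion-sensor streams, the sketch below matches an incoming 3-axis accelerometer sequence against recorded templates using dynamic time warping (DTW). All data is synthetic and for illustration only.

```python
# Toy gesture recogniser: match a motion-sensor stream to the closest template.
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW over sequences of 3-axis samples."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

rng = np.random.default_rng(0)
templates = {                              # one recorded template per gesture
    "shake": rng.normal(0, 1.0, (40, 3)),
    "tilt":  np.cumsum(rng.normal(0, 0.1, (40, 3)), axis=0),
}
incoming = templates["tilt"] + rng.normal(0, 0.05, (40, 3))  # noisy repetition
best = min(templates, key=lambda g: dtw_distance(incoming, templates[g]))
print("recognised gesture:", best)         # expected: "tilt"
```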


Back to → Roadmap:User perspective
