Text-only version of poster presented at NIME 2020.

P(l)aying Attention: Multi-modal, multi-temporal music control

Nicolas Gold, Chongyang Wang, Temitayo Olugbade, Nadia Bianchi-Berthouze, Amanda C. de C. Williams
University College London, UK

1. Introduction

Chronic Pain [1, 2]:
• Prevalent, disabling condition.
• Pain maintained by changes in the nervous system, not by ongoing tissue damage.
• Harmless everyday movement is challenging.

Physiotherapist:
• Interprets movement.
• Draws attention to important areas of the body.
• Explores movement in collaboration with the patient.
• Assists in management to improve the patient's physical capability [3].

Conductor:
• Interprets music through movement.
• Gestures cause the orchestra to behave in a certain way [4].
• Links movement, attention, interpretation, and music.

People with Chronic Pain:
• May benefit from participating in musical activity.
• But may have difficulty coordinating for ensemble participation.
• Need agency, and musical manipulations free of absolute time constraints.
• Need a sonic movement representation that minimises music-synchronous action, reveals interpretation of movement, permits exploration of movement, and maintains musical coherence.

2. Sonification Framework Design

Sonification needs to be:
• Informational [5], to reveal aspects of movement.
• Experiential, to increase self-efficacy and induce behaviour change [6, 7].
• For musically expressive applications, retention of the music is essential.

Solution:
• Model-based musical sonification.
• Three key design considerations: how to manipulate, when to manipulate, and how to obtain an interpretation of movement.

Temporal Scales and Contexts:
• Movement-synchronous: movement analysed in real time.
• Movement-asynchronous: movement analysed after it has happened (replayed forward, reversed, or temporally scaled).
• Music-synchronous: movement corresponds to musical features (e.g. beats).
• Music-asynchronous: movement does not correspond to musical features.
• Discursive-free: movement is analysed in free time and in any direction for the purpose of discussion.

3. Determining and Representing Attention

People with Chronic Pain:
• Aim to protect themselves by moving cautiously.
• Movement may be inefficient and can contribute to longer-term disability.
• Fear of and anxiety about pain lead to different strategies in functional activity.
• Body parts engaged in inefficient, bio-mechanically unnecessary ways.
• Observed as the use of particular body parts during stages of an activity.

Machine Learning Attention Interpretation:
• A machine learning model detects protective movement behaviour [8].
• It pays attention to salient body-configurational and temporal evidence.
• Input is 13 joint angles computed from 26 joints.
• Learns to give more weight to the body parts and movement stages most informative for discriminating protective from non-protective movement behaviour.
• Attention is realised as weights on the joint and time dimensions of the movement data.
• Weights are normalised into 0-1 for use as gain values.

4. Implementation

Proof-of-Concept Implementation:
• Stems for thirteen parts.
• Two styles: Afro-Cuban percussion (drawing on Uribe [9]) and Pachelbel's Canon (selections from the original score [10]), both augmented by one author (Gold).
• Purpose: to investigate the observability of movement interpretation in music.

User Interface:
• A column of buttons on the left-hand side represents the active data channels.
• Each channel is mapped to an audio loop, panned hard left or right for separation.
• Data are loaded using buttons at the top left.
• Music playback is controlled at the top right.
• Data exploration is controlled using the panel at the bottom.
• A representation figure shows coloured line weights corresponding to attention scores between joints: a second modality of data display.

5. Future Work

Includes:
• Extending the implementation to allow smoothing, aggregation, and relative mapping.
• User interface enhancements.
• New modalities (dyadic representation, and real-time data).
• Empirical studies to determine applicability in a range of scenarios.
• Generative music directly derived from body movement to 'personalise' audio.
• Exploring longer musical forms and timbral control.

References

[1] IASP Task Force on Taxonomy, editor. Classification of Chronic Pain. IASP Press, Seattle, 2nd edition, 1994.
[2] I. Tracey and M. C. Bushnell. How neuroimaging studies have challenged us to rethink: is chronic pain a disease? The Journal of Pain, 10(11):1113-1120, 2009.
[3] A. Singh, A. Klapper, J. Jia, A. Fidalgo, A. Tajadura-Jiménez, N. Kanakam, N. Bianchi-Berthouze, and A. C. de C. Williams. Motivating people with chronic pain to do physical activity: Opportunities for technology design. In Proc. SIGCHI Conf. on Human Factors in Computing Systems, CHI '14, pages 2803-2812, New York, NY, USA, 2014. ACM.
[4] W. Siegel. Conducting sound in space. In Proc. Int. Conf. on New Interfaces for Musical Expression, pages 376-380, Copenhagen, Denmark, 2017. Aalborg University Copenhagen.
[5] A. Singh, S. Piana, D. Pollarolo, G. Volpe, G. Varni, A. Tajadura-Jiménez, A. C. de C. Williams, A. Camurri, and N. Bianchi-Berthouze. Go-with-the-flow: Tracking, analysis and sonification of movement and breathing to build confidence in activity despite chronic pain. Human-Computer Interaction, 31(3-4):335-383, 2016.
[6] A. Singh, N. Bianchi-Berthouze, and A. C. de C. Williams. Supporting everyday function in chronic pain using wearable technology. In Proc. CHI Conf. on Human Factors in Computing Systems, CHI '17, pages 3903-3915, New York, NY, USA, 2017. ACM.
[7] A. Tajadura-Jiménez, M. Basia, O. Deroy, M. Fairhurst, N. Marquardt, and N. Bianchi-Berthouze. As light as your footsteps: Altering walking sounds to change perceived body weight, emotional state and gait. In Proc. 33rd Ann. ACM Conf. on Human Factors in Computing Systems, CHI '15, pages 2943-2952, New York, NY, USA, 2015. ACM.
[8] C. Wang, M. Peng, T. A. Olugbade, N. D. Lane, A. C. de C. Williams, and N. Bianchi-Berthouze. Learning temporal and bodily attention in protective movement behavior detection. In Proc. 8th Int. Conf. on Affective Computing and Intelligent Interaction: Workshops and Demos (ACIIW'19), pages 324-330, Cambridge, UK, 2019. IEEE.
[9] E. Uribe. The Essence of Afro-Cuban Percussion and Drum Set. Warner Bros. Pub., Miami, FL, 1996.
[10] J. Pachelbel. Canon and Gigue; v (3), cemb; D-Dur; P 37, 1838. Owning institution: Staatsbibliothek zu Berlin - PK, http://resolver.staatsbibliothek-berlin.de/SBB0000A73000000000

This work is funded by: Horizon 2020 European Union Funding for Research & Innovation, EU H2020 FET PROACTIVE EnTimeMent project no. 824160, Call H2020-FETPROACT-2018-2020, Topic: FETPROACT-01-2018, Subtopic: b. Time.

Ethical Statement: The authors declare that they have no conflicts of interest (financial or otherwise) that would affect the work reported here. Data used in this work was gathered with informed consent under various approvals relating to the EmoPain project: UCL 5625/001, UCLIC/1516/012/Staff Berthouze/Singh; NRES Committee - 11/0514, and used in accordance with UCL Research Ethics Committee approval number 5095/001 (EnTimeMent project).

Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). Copyright remains with the author(s).

NIME'20, July 21-25, 2020, Royal Birmingham Conservatoire, Birmingham City University, Birmingham, United Kingdom.

Contact: n.gold@ucl.ac.uk
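
Appendix: Section 3 states that the learned attention weights are "normalised into 0-1 for use as gain values", one per joint/time channel. As an illustration only, the sketch below implements that step with min-max normalisation; the poster does not specify which normalisation the authors used, and the function name and the degenerate-case behaviour (uniform attention maps to full gain) are assumptions.

```python
def attention_to_gains(weights):
    """Min-max normalise attention weights into [0, 1] so each value
    can drive the gain of one audio channel (illustrative sketch; the
    authors' exact normalisation is not specified in the poster)."""
    lo, hi = min(weights), max(weights)
    if hi == lo:
        # Uniform attention: map every channel to full gain (assumption).
        return [1.0] * len(weights)
    return [(w - lo) / (hi - lo) for w in weights]

# E.g. three of the 13 joint-angle channels, each mapped to an audio loop:
gains = attention_to_gains([0.2, 0.9, 0.5])
```

The least-attended joint is then silent (gain 0.0) and the most-attended plays at full level (gain 1.0), so the listener's ear is drawn to the body parts the model weighted most heavily.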