Inexpensive Colour Tracking to Overcome Performer ID Loss

Bob Pritchard, UBC School of Music, bob@mail.ubc.ca
Ian Lavery, Algorhythmic Software, ian_lavery@hotmail.com

Overview
The UBC Kinect Controlled Artistic Sensing System (KiCASS) uses a Kinect-for-Windows© camera, a high-end Windows laptop, a router, and NuiTrack-based custom-written software, allowing users to optically generate media control data by selecting up to 22 tracking points, as well as four hand positions, on up to six performers.

Problems
1. Tracking Loss
- A performer can be occluded by another performer or object, or may exit and re-enter the tracking area. When this occurs the system loses track of the performer.
- The system creates a new tracking ID when the performer is reacquired.
- The original tracking points on the performer are lost, so no performance data is generated.

2. ID Swapping
- The system will occasionally swap the IDs of onscreen performers.
- Performers lose their assigned target points and are therefore unable to control media.

The Solution
To overcome these issues, we developed costume colour tracking to override system-assigned performer IDs and generate consistent performer IDs.

Colour Registration of Performers
- Analyse 6 frames of each dancer’s torso to find their dominant hue.
- The torso region is defined by the NuiTrack user mask and the hip-shoulder quadrilateral of tracking points.
- Convert the image to HSV (Hue Saturation Value) using Emgu CV, a .NET wrapper for OpenCV.
- Create a histogram of all hue values in the torso region; the most frequent hue is selected as the dominant hue (an illustrative sketch follows the main text).
- During registration we also select the target points to be tracked on each dancer.

Data Check and Optimization
- Each incoming colour is compared with the registered colours.
- Confidence Value (CV) = 100 – distance-from-match (a sketch of this matching step follows the main text).
- Reacquisition: the software determines the dominant incoming colour and finds the closest distance-from-match.
- If the CV falls below a set threshold, the system runs an optimization pass.
- The system can swap performer IDs if a better match is found.

Performance
- Data (X, Y, Z coordinates of tracked points and the CV) are sent over Wi-Fi in OSC format (a sketch of the message format follows the main text).
- Clients receive the data and use Max or Pd to control audio/video, processing, and lighting.
- The tracking parallelogram has a depth of 8 metres and a far width of 6 metres.

Performance video: https://youtu.be/EpLqE7oQSpA

Outcomes
- Successful tracking of four dancers controlling the amplitude of individual audio tracks, based on distance from the camera.
- Occlusion/exit and re-entry/reacquisition caused the associated audio track to fade out/fade in.
- Issues with light reflecting off spandex leotards, and with performer shadows.
- Performers and the choreographer enjoyed being able to control audio levels through stage positioning.

Future Work
- Pre-performance colour registration will include performers moving through the performance space to account for lighting differences.
- Addition of limb tracking and target points to increase control and artistic possibilities.
- Addition of e-textile RUBS sensors to increase control opportunities.

Acknowledgements
We acknowledge that the University of British Columbia is located on the traditional, ancestral, and unceded territory of the hən̓q̓əmin̓əm̓-speaking xwməθkwəy̓əm (Musqueam) people.
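Dominant-hue registration (illustrative sketch). The following Python/OpenCV code is for illustration only; the actual KiCASS software uses the .NET wrapper Emgu CV, the torso mask comes from the NuiTrack user mask and hip-shoulder quadrilateral, and the function names and per-frame voting scheme here are our assumptions.

# Illustrative sketch in Python/OpenCV; the real system uses Emgu CV (.NET).
# "torso_mask" stands in for the NuiTrack user mask clipped to the hip-shoulder
# quadrilateral; all names here are hypothetical.
import cv2
import numpy as np

def dominant_hue(bgr_frame, torso_mask):
    # Convert to HSV and histogram the hue channel over the torso pixels only
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0]
    hist = cv2.calcHist([hue], [0], torso_mask, [180], [0, 180])
    # Most frequent hue value (OpenCV hue range is 0-179)
    return int(np.argmax(hist))

def register_performer(frames, masks):
    # Take the mode of the per-frame dominant hues over the 6 registration frames
    hues = [dominant_hue(f, m) for f, m in zip(frames, masks)]
    values, counts = np.unique(hues, return_counts=True)
    return int(values[np.argmax(counts)])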
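Confidence value and reacquisition matching (illustrative sketch). The poster only states CV = 100 – distance-from-match; the circular hue distance and the best-match search below are assumptions about how that distance might be computed.

def hue_distance(h1, h2, hue_range=180):
    # Circular distance between two hue values (assumption: hue-only matching)
    d = abs(h1 - h2)
    return min(d, hue_range - d)

def confidence(incoming_hue, registered_hue):
    # CV = 100 - distance-from-match, as defined in the poster
    return 100 - hue_distance(incoming_hue, registered_hue)

def best_match(incoming_hue, registered_hues):
    # registered_hues: dict mapping performer ID -> registered dominant hue.
    # Returns the closest registered performer and the confidence of that match;
    # if the CV falls below a threshold, the system would run its optimization
    # pass and may swap performer IDs.
    pid = min(registered_hues, key=lambda p: hue_distance(incoming_hue, registered_hues[p]))
    return pid, confidence(incoming_hue, registered_hues[pid])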
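OSC output (illustrative sketch). The real KiCASS server is custom Windows software; the python-osc library, the /kicass/... address pattern, and the port number below are hypothetical stand-ins showing the kind of message a Max or Pd client would receive.

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.255", 9000)  # example address/port on the router

def send_tracking_point(performer_id, point_name, x, y, z, cv):
    # One message per tracked point: coordinates in metres plus the confidence value
    client.send_message(f"/kicass/{performer_id}/{point_name}", [x, y, z, cv])

# e.g. a Max or Pd client could map z (distance from camera, up to 8 m in the
# tracking area) to the amplitude of that dancer's audio track.
send_tracking_point(1, "torso", 0.42, 1.10, 3.75, 93)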