joe.wright
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

joe.wright
2020-07-17 23:28
@joe.wright set the channel purpose: Paper Session 3: Sensors, Actuators and Haptics

niccolo.granieri
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

hassan.hussain5
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

overdriverecording
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

lamberto.coccioli
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

jonathan.pearce
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

richard.j.c
2020-07-17 23:28
has joined #papers03-sensors-actuators-haptics

joe.wright
2020-07-18 12:13
@joe.wright has renamed the channel from "papers3-sensors-actuators-haptics" to "papers03-sensors-actuators-haptics"

eskimotion
2020-07-20 09:25
has joined #papers03-sensors-actuators-haptics

edmund.hunt
2020-07-20 09:25
has joined #papers03-sensors-actuators-haptics

acamci
2020-07-20 17:01
has joined #papers03-sensors-actuators-haptics

aaresty
2020-07-20 17:21
has joined #papers03-sensors-actuators-haptics

10068197
2020-07-20 17:21
has joined #papers03-sensors-actuators-haptics

a.nonnis
2020-07-20 17:22
has joined #papers03-sensors-actuators-haptics

a.macdonald
2020-07-20 17:23
has joined #papers03-sensors-actuators-haptics

andreas
2020-07-20 17:24
has joined #papers03-sensors-actuators-haptics

dianneverdonk
2020-07-20 17:25
has joined #papers03-sensors-actuators-haptics

likelian
2020-07-20 17:25
has joined #papers03-sensors-actuators-haptics

ko.chantelle
2020-07-20 17:25
has joined #papers03-sensors-actuators-haptics

anika.fuloria
2020-07-20 17:26
has joined #papers03-sensors-actuators-haptics

clemens.wegener
2020-07-20 17:26
has joined #papers03-sensors-actuators-haptics

10068197
2020-07-22 12:57
How do I sign up for this session? Thanks.
niccolo.granieri
2020-07-22 12:58
Hi @10068197, no signup required. Just join the webinar at the scheduled time

10068197
2020-07-22 12:58
thanks so much. Magic!!!

niccolo.granieri
2020-07-22 12:58
You can find all the links on the Timetable and Zoom Links pages, available from the conference hub (http://nime2020.bcu.ac.uk/conference-hub/)

info041
2020-07-22 13:31
is it possible to know which papers will be presented and in which order? Also is there anything like this online for all the papers in the conference?

marije
2020-07-22 13:32
on that same conference hub page you'll find the program

niccolo.granieri
2020-07-22 13:40
We will start with paper session 3 - in 5 mins, see you there!
niccolo.granieri
2020-07-22 13:42
Just in case we have issues with Zoom Captions, this is the link for the external captions: https://www.streamtext.net/player?event=NIME220720

niccolo.granieri
2020-07-22 13:44
we're live now!

info041
2020-07-22 13:47
Hi Marije, I don't see this just timetable and the general programme with no individual papers listed, can you post a link if you know where this is?

a.mcpherson
2020-07-22 13:47
First paper *The Daïs: A Haptically Enabled New Interface for Musical Expression for Controlling Physical Models for Sound Synthesis* is paper 119 in the proceedings

marije
2020-07-22 13:48
there is a menu item, above each day to show all details

max
2020-07-22 13:49
Link to the pdf papers would be great on that page too

marije
2020-07-22 13:50
paper 119

niccolo.granieri
2020-07-22 13:51
Hi Max! Thanks for asking: the 7th item on the conference hub is exactly a link to the pdf papers.

info041
2020-07-22 13:51
Ha, it's not very obvious, thanks!

a.mcpherson
2020-07-22 13:52
*The Daïs: A Haptically Enabled New Interface for Musical Expression for Controlling Physical Models for Sound Synthesis* Paper 119 in proceedings
*Force dynamics as a design framework for mid-air musical interfaces* Paper 70 in proceedings
*Knotting the memory//Encoding the Khipu_: Reuse of an ancient Andean device as a NIME* Paper 94 in proceedings
*Surface Electromyography for Direct Vocal Control* Paper 88 in proceedings, or https://qmro.qmul.ac.uk/xmlui/bitstream/handle/123456789/65364/Reed%20Surface%20Electromyography%20for%202020%20Published.pdf?sequence=2
*Touch Responsive Augmented Violin Interface System II: Integrating Sensors into a 3D Printed Fingerboard* Paper 32 in proceedings
Proceedings at: https://nime2020.bcu.ac.uk/paper-pre-proceedings/

tragtenberg
2020-07-22 13:57
@pjch17 Could you please tell us why you have discarded the hall effect sensor?

max
2020-07-22 13:59
I meant here:

marije
2020-07-22 14:00
The force feedback from the elastics is also already a haptic mode of feedback

dianneverdonk
2020-07-22 14:00
Nice project and talk, @pjch17. What kind of backgrounds did the participants have? Musical, technical, etc.? Or did I miss this info while you mentioned it? Thanks!

a.r.jensenius
2020-07-22 14:00
@pjch17 Great project. Very interesting finding about participants wanting to play it as a drum. Did they explain why? Because of the visual appearance? What did you tell them before they started testing? Any priming effect in the instruction?

fcac
2020-07-22 14:00
Do you think that the rounded shape and the dimensions that resemble a percussion instrument communicated a different instrumental inheritance from the actual mapping?

g.moro
2020-07-22 14:02
@pjch17 about sensors, what IR sensors did you use? Did you try with different currents through the IR LED to compensate for the larger distance ?

weixler
2020-07-22 14:02
@eskimotion where are you now? Looks nice.

travis.west
2020-07-22 14:02
could you describe the elastic suspension in a bit more detail?

jmalloch
2020-07-22 14:03
Thanks @pjch17 - considering your findings, do you have plans to either recreate the instrument with a different shape/physical appearance or try a percussive mapping?

marije
2020-07-22 14:03
and mentioned in the paper

a.mcpherson
2020-07-22 14:03
Great project @pjch17! It's nice to see passive mechanical elements (elastic bands) used as part of the sensing process. The lo-fi prototyping was also a nice step. Finally, glad to see that design plans are available [https://github.com/PelleJuul/dais].

niccolo.granieri
2020-07-22 14:04
Oh, I get what you mean, thanks for the tip: since the proceedings are not final, we decided not to link them directly in the timetable. Unfortunately, as our team is only three people, we won't be able to make such a major edit on the website during the conference. We do appreciate the feedback though, so please keep the comments coming! (P.S. Giulio Moro provided us with a version of the proceedings with all the pdfs named according to the paper. The proceedings are much easier to navigate now.)

g.moro
2020-07-22 14:04
Also, did the IR sensors not get affected by the angle of the reflective surface as much as the Hall sensors?

dianneverdonk
2020-07-22 14:06
Thanks Andrew for collecting and sharing these links in here, very useful!

marije
2020-07-22 14:06
possibly also unwanted interaction between the magnet of the haptic vibration feedback and the hall sensor (magnetic field)?

pjch17
2020-07-22 14:08
The IR sensors used are APDS-9900, and I think we're driving them at max strength. IR sensors are sensitive to angle, but not nearly as much as hall-effect sensors. In the future we'd like to try out time-of-flight type sensors.

pjch17
2020-07-22 14:09
I hadn't thought of that, but it could be interesting to recreate the instrument with a different shape. Maybe a not-round shape would break the drum-like associations.

pjch17
2020-07-22 14:10
Thanks @a.mcpherson!

dianneverdonk
2020-07-22 14:13
@a.mcpherson this is what we might need for the in-progress NIME Publications Ecosystem: an active linking 'system' like you're at right now :smiley:

lukedahl
2020-07-22 14:15
Very interesting work! As I've understood Johnson's work (which may be idiosyncratic), we experience the music as an agent who encounters various forces. And the listeners imagine themselves as experiencing the same forces. I find it interesting that here the musician is the creator of the forces. (I guess not a clear question!)

a.r.jensenius
2020-07-22 14:16
@aes I like the concept "mid-air" performance. How/why did you decide on this term?

ko.chantelle
2020-07-22 14:16
You mentioned metaphors related to physical relationships. Were there any metaphors related to colour or light? Like a dark or bright sound?

s.holland
2020-07-22 14:16
Really like this, Anders & Mads - super neat application of conceptual metaphor

xname
2020-07-22 14:17
can I get a link to the paper?

niccolo.granieri
2020-07-22 14:17
Hi @xname, proceedings can be found on the conference hub,

julian
2020-07-22 14:17
About that bow analogy: how did the participants perceive that the plate has to be tilted to start the bowing and tilted in the opposite direction to stop? Also, did I understand correctly that the bow also has damping, so that if you just stop interacting with the plate, the sound fades out?

c.n.reed
2020-07-22 14:17
Would you say that the musician is the creator of the forces or rather the director of existing forces in this case (I'm interested from the perspective of the relationship to existing schema)?


g.moro
2020-07-22 14:18
thanks

a.mcpherson
2020-07-22 14:18
Very interesting work! I'm reminded of another NIME paper from Ward and Torre, "Constraining Movement as a Basis for DMI Design and Performance", https://www.nime.org/proceedings/2014/nime2014_404.pdf That paper used Laban Movement Analysis, and I wonder whether you had any experience with how the Laban approach might differ?

s.holland
2020-07-22 14:18
Lots of relevant work by Zbikowski?

christian.frisson
2020-07-22 14:19
Have you considered providing haptic feedback with mid-air devices such as these from Ultraleap (https://www.ultraleap.com/) or wearable devices?

c.n.reed
2020-07-22 14:19
Re diversity here: looking at speakers of other languages who may have alternate directional schema based on different language

tom.mitchell
2020-07-22 14:20
Hi! Really nice use of force metaphors to communicate affordances - hard in mid-air interaction. Some of the schema seem to lend themselves more easily to hand gestures and look more obvious than others. Can you comment on any observations on which ones worked well and otherwise and why this might be?

s.holland
2020-07-22 14:20
Likely relevant stuff in Katie Wilkie's PhD?

jmalloch
2020-07-22 14:21
Hi @s.holland - any work in particular you recommend?

dianneverdonk
2020-07-22 14:21

erik.nystrom
2020-07-22 14:24
Great paper @aes & Mads. This is a very coherent application of Johnson's conceptual metaphors. I wonder to what extent your applications are anchored in mid-air interfaces as opposed to sound synthesis mapping - if the latter, would it be straightforward to apply these concepts to other kinds of physical interfaces?

marije
2020-07-22 14:25
The Weaving Codes project seems related to this! http://kairotic.org/ - also looked at the Khipu

aes
2020-07-22 14:27
Thank you. I think this is somewhat similar to the question by @c.n.reed. We thought of music as a (set of) forces that the user would be able to interact with.

laddy.cadavid-hinojos
2020-07-22 14:27
It's beautiful!! Thanks for the reference!

mario.buoninfante
2020-07-22 14:28
it seems like there are knobs on the instrument, could you tell us a bit about them? if they're not knobs, what are they?

manolimoriaty
2020-07-22 14:28
is someone's microphone on?

tragtenberg
2020-07-22 14:28
Congratulations @laddy.cadavid-hinojos! Good to see NIME inspired by non-european cultures. Our cultural heritage is our wealth.

niccolo.granieri
2020-07-22 14:29
Solved!

abi
2020-07-22 14:29
Whew, seems to be muted now!

manolimoriaty
2020-07-22 14:29
thanks, it was making following the presentation v difficult!

aes
2020-07-22 14:29
This looks very relevant, I'll definitely have a look at that - thank you!

a.r.jensenius
2020-07-22 14:30
Great project @laddy.cadavid-hinojos - looks amazing! I might have missed it in your presentation, but could you explain a little more about how the knotting influences the sound?

m.ortiz
2020-07-22 14:31
Do the wires become 'looser' over time and change the programmed response?

pjch17
2020-07-22 14:31
Great question. How are the knots sensed and does the position, size, shape and type matter? What kind of synthesis algorithm and parameters are they mapped to?

shatin
2020-07-22 14:32
Fascinating presentation. Would like to hear more about the mapping of the knots to their sonic embodiment!

konstantinos.vasilako
2020-07-22 14:32
Live coding also involves changing the source code of a running process, with the performer shaping their next steps according to the real-time response - suspending, hacking, and replacing algorithms as they go. How does the interface tackle this?

xname
2020-07-22 14:32
got it many thanks

matthew.mosher
2020-07-22 14:33
Very nice instrument! It looks like the string you are actively tying with gets disconnected. Does this mean that the 'active' cord does not make sound until it is reconnected?

xname
2020-07-22 14:35
however, I have missed the title of Andre's paper

xname
2020-07-22 14:35
Anders' paper

aes
2020-07-22 14:35
We have thought about various forms of feedback (visual, haptic) and how they would affect the user experience. I will have another look at the other products from ultraleap. But the premise where the feedback is mainly/only auditory can be a very stimulating design problem.

alarcon.xime
2020-07-22 14:36
A very beautiful instrument and concept! Do you always use all strings?

niccolo.granieri
2020-07-22 14:36
You can find all the titles of the papers in chronological order on the Timetable :slightly_smiling_face:

aes
2020-07-22 14:36
Absolutely. :+1:

fcac
2020-07-22 14:37
Great project, @laddy.cadavid-hinojos! I think that carving the local culture contributes a lot to exciting instrument designs. =D

xname
2020-07-22 14:37
I am sorry, I cannot find it, computer a bit stuck

niccolo.granieri
2020-07-22 14:38
"Force dynamics as a design framework for mid-air musical interfaces"

a.r.jensenius
2020-07-22 14:39
@laddy.cadavid-hinojos did you show the instrument to any traditional khipu performers (are there any/many?)? What did they think about your new version?

christian.frisson
2020-07-22 14:39
Thanks Anders!

alarcon.xime
2020-07-22 14:40
@laddy.cadavid-hinojos congratulations! Your project is so beautiful. Looking forward to the performance. I was wondering about the use of all the cords, and variations, and how you appropriate the instrument - it is mysterious too, in the number of strings used and the number of knots per string. I saw Cecilia Vicuña's Khipu, and it is great to see your version and approach!

laddy.cadavid-hinojos
2020-07-22 14:41
thank you for your question. Not during the performance, since they are fastened in the box once the strings are knotted. Once the performance is over, I undo the knots, and since it is conductive rubber, it recovers its initial length after a couple of hours.

joe.wright
2020-07-22 14:43
Nice appearance of "the lick" there

laddy.cadavid-hinojos
2020-07-22 14:44
Thank you very much. I think you mean the string I'm connected to so I can knot the others - this string works as a bridge to close the circuit and produce the signals while I'm knotting. This string has no sound.

aes
2020-07-22 14:46
Thanks! Our (very limited) pilot study suggested that Counterforce (example 3) and Attraction (example 4) were the most intuitive schemas. The paper has some more info on this. One reason could be that the response to gestures is transparent but still allows for complexity. The most simple one, Path Restraint (example 1), was perhaps too simple, while the more complex Compulsion (example 5) was perhaps too complex. However, this would likely change when a user would have more time to experiment with the interface, so there is certainly a limit to how much our initial study can say. :slightly_smiling_face:

aes
2020-07-22 14:47
Noted, thank you!

dianneverdonk
2020-07-22 14:47
Cool to see analysis figures where breath and speech/singing are related

x
2020-07-22 14:48
what are your personal goals / aims for this work?

laddy.cadavid-hinojos
2020-07-22 14:48
unfortunately the practice of traditional khipu weaving was almost eliminated in its entirety by colonization. There are still indigenous groups in the Andes that use the khipus, but I have not been able to have contact with them, although of course I would love to receive their feedback

shatin
2020-07-22 14:48
Beautiful work! How many electrodes are needed to register all of these aspects of the vocal production?

tragtenberg
2020-07-22 14:49
Maybe our local culture's idiosyncrasies are a more powerful source for NIME inventions than our personal idiosyncrasies since it has already passed through a "natural selection" and evolution in many dimensions (materials, sounds, movements, forms, stories) and among a lot of people.

x
2020-07-22 14:49
And... do you need any test subjects in london?

shatin
2020-07-22 14:49
Also, what about accessibility of this equipment for vocalists and composers who would like to try it?

m.ortiz
2020-07-22 14:49
I assume the EMG plots you showed were amplitude (RMS?), have you looked at other features to extract?

julian
2020-07-22 14:49
I worked with those conductive rubber bands a few years ago. I found that they ripped pretty easily. Are they more durable nowadays or is it still pretty finicky to work with them?

julian
2020-07-22 14:50
@pjch17

js.vanderwalt
2020-07-22 14:50
Not a question, but I'm wondering if this technique could also be used to investigate brass and woodwind playing in a similar way.

a.martelloni
2020-07-22 14:50
How obtrusive are the electrodes? Do they impair any singing movement?

a.martelloni
2020-07-22 14:51
Me me me me meeeeee (test subject!)

ko.chantelle
2020-07-22 14:51
Are there differences in the EMG data between different styles of singing? Like pop vs opera?

joe.wright
2020-07-22 14:52
This would be really exciting to explore. Throat position is really difficult to see when teaching wind instruments too

pjch17
2020-07-22 14:52
The tilting mapping was definitely an obstacle for the participants, and almost none figured it out by themselves. However, after the mapping was explained, they had no problem interacting with it.

aes
2020-07-22 14:52
We didn't use colour or brightness in this study, but those are arguably common cognitive metaphors used to conceptualize timbre and other abstract aspects of music/sound. How would brightness/colour relate to gesture? Smaller containers perhaps being darker?

quinnjarvisholland
2020-07-22 14:52
Applications for voice therapy for trans folks?

pjch17
2020-07-22 14:53
The sound is damped, so if you release the plate the sound will decay exponentially.

laurel.pardue
2020-07-22 14:53
You can also put reflective surfaces on, or just white to improve your range. (I had to duck out half-way through the video so apologies if this was covered.)

js.vanderwalt
2020-07-22 14:53
Yes, exactly, as a brass player there is a certain amount of mystique here!

s.holland
2020-07-22 14:54
Is there any potential synergy with any other forms of real time imaging?

quinnjarvisholland
2020-07-22 14:54
Thank you!

c.n.reed
2020-07-22 14:54
Hoping someone would notice :joy:

pjch17
2020-07-22 14:55
Also, there is a disconnect in the bow analogy because with a real bow you have to keep the bow moving, whereas with the daïs you can keep the disk at a static angle. It could be interesting to try out velocity-based mappings.

emmafrid
2020-07-22 14:55
@c.n.reed Thank you for an interesting presentation! Not sure if you are already familiar with Andreas Selamtzis work but it might be relevant: https://www.kth.se/profile/selamt/page/publications :slightly_smiling_face:

joe.wright
2020-07-22 14:55
Love it

laurel.pardue
2020-07-22 14:55
That'd be way cool. Good stuff!

joe.wright
2020-07-22 14:55
That was a really cool presentation, thanks!

laddy.cadavid-hinojos
2020-07-22 14:56
thank you very much @alarcon.xime for attending and listening, so nice to see you here. For this version I used 9 strings because it had a special meaning for me, but actually I am making a smaller one of 5. I always try to use all the strings, but I also like the idea to keep some mystery there. I'll be waiting for you today at 21:00 for the performance streaming.

pjch17
2020-07-22 14:56
That's a great idea @laurel.pardue, such a simple (possible) fix. Next problem is the exponential response that causes low spatial resolution at far distances.

c.n.reed
2020-07-22 14:57
Thank you! :slightly_smiling_face: Glad to hear it was useful here - looking forward to building more voice NIMEs with everyone now!

joe.wright
2020-07-22 14:59
When you are able to test again, would you be interested in exploring this alongside saxophone playing?

joe.wright
2020-07-22 15:00
A sensing interface for singing while playing would open up loads of possibilities for existing extended techniques

mario.buoninfante
2020-07-22 15:02
really like that all the electronics are not really invasive, as in you can get rid of them if you don't need them :slightly_smiling_face: was that a decision from the beginning? will you keep this approach?

c.n.reed
2020-07-22 15:02
Yes! I'm actually planning some studies of embouchure positioning and respiration at the larynx with Isabelle Cossette at McGill - I'm also a hornist, so for sure will be looking into brass work and happy to lend this to anyone looking into other applications.

laddy.cadavid-hinojos
2020-07-22 15:02
They are delicate but for this project they have proved to be durable, as they are not exposed to any sharp objects that could weaken them or to very extreme stresses. It is also important to untie them after each performance

alucas02
2020-07-22 15:03
How would you rate the precision of the fingerboard sensors? Would they be suitable for controlling a parameter such as pitch?

alarcon.xime
2020-07-22 15:04
Hola @laddy.cadavid-hinojos a pleasure! It is a unique instrument, ancestral. Ritualistic. Ah, interesting the 9 strings! A nice number to weave connections and codes. And another of 5. It makes sense using all the strings, but yes, there is a mystery of which ones, and the starting and ending. Good! See you there!

laurel.pardue
2020-07-22 15:04
What about wear on the fingerboard? On a normal fingerboard, you create divots. On my handmade sensors, they have a decent life, but eventually I'll wear it out. Have you hit problems with that yet?

info041
2020-07-22 15:05
so you can also control visual effects through different strings and effects? If so curious if you had explored early developments by Steina Vasulka in this regard?

c.n.reed
2020-07-22 15:06
to follow up for anyone who missed it: it's two electrodes per muscle, one in the middle of the muscle body and one at the end near where the muscle attaches to the bone. The circuit measures the action potential between them, which changes during contraction, using a differential amplifier. A third electrode is used for reference for further help with common mode rejection of general body noise (the body is VERY noisy!). Only one reference electrode is needed, so it would be an additional two for each additional muscle. My current setup works with two muscles simultaneously. The laryngeal muscles are very small, so it's just important to get correct positioning to avoid cross-talk between them. I usually focus somewhere above and below the voicebox to avoid this and it works well :slightly_smiling_face:

c.n.reed
2020-07-22 15:07
thank you for your feedback - it means a lot to me!

steventkemper
2020-07-22 15:07
+1 for adding LEDs, it could be a nice way to visualize sensor data, processing, etc. (also, people love LEDs)

travis.west
2020-07-22 15:07
Is the lack of precision due to the analog sensors (e.g. too noisy signal), or the digital acquisition (e.g. limited ADC resolution)?

aes
2020-07-22 15:07
Thank you, this is a very interesting question. I'm not sure which interfaces you have in mind? But I'm certainly thinking about combining the leap motion with a physical controller/instrument (to add gestural manipulation of timbre, for instance). In this prototype, the sound synthesis we used was not particularly sophisticated - we decided the force schemas first and then came up with the sound designs. I could imagine that, for instance, Attraction or Counterforce could be implemented musically/sonically in many different ways. Force dynamics would then be the framework for conceptualizing the interaction between user and sound.

marije
2020-07-22 15:07

s.holland
2020-07-22 15:08
His book "Conceptualizing Music" is a blockbuster, but he also wrote more recent papers. Katie Wilkie probably cited many such papers in her CMJ paper. But I'm not sure you are missing any key ideas. Really liked your imaginative, concrete use of force schemas - seems like a really fruitful direction

laurel.pardue
2020-07-22 15:08
RGB is even better!

c.n.reed
2020-07-22 15:08
If you'd be interested in getting a hold of this or helping with future studies, please do let me know and keep in touch. I'd like to make a few copies of the setup I have and distribute it soon!

s.holland
2020-07-22 15:09
ooh sorry, crossing the streams - I meant great work by Anders & Mads - but hi Joseph!!

pjch17
2020-07-22 15:09
Thanks for a great session. I was super happy to be part of it!

marije
2020-07-22 15:09
Thanks to all the presenters!

lamberto.coccioli
2020-07-22 15:09
Many thanks for chairing Echo!

laddy.cadavid-hinojos
2020-07-22 15:09
Thank you for your comment. More than a NIME, what is very important for me is the cultural heritage of the interface, and the way we can give back its practical value, which has many times been displaced by anthropological concerns linked to western thought

c.n.reed
2020-07-22 15:10
As well, the first iteration of the circuit is available in the paper and can be reproduced fairly easily (I am in no way/shape/form an electrical engineer!). Happy to share all updates to that system which have been made since January :slightly_smiling_face:

eskimotion
2020-07-22 15:10
Thank you all for being inspiring and stimulating, it was a great experience, hope to meet you live somewhere :heart:

g.moro
2020-07-22 15:10
@pete's installation involving *whale song* and *live interactive cymatics* will be running *live* for the next hour over the coffee break. Please call by! Instructions and Zoom link here: http://www.danpollardmusic.co.uk/liquid-noise

manolimoriaty
2020-07-22 15:10
great session, thank you!

julian
2020-07-22 15:10
which work are you referring to? I'm looking through the website but can't find any connection to a string instrument?

travis.west
2020-07-22 15:10
It would be great if there were a github repo or similar where people could check out the hardware!

aes
2020-07-22 15:10
Thank you everyone, an inspiring session :-) Thank you for the comments and questions too. If you have a Leap Motion controller, you can try our prototype - source code and instructions can be found here: https://github.com/sparkletop/forces

eskimotion
2020-07-22 15:11
Thank you :sweat_smile:

laddy.cadavid-hinojos
2020-07-22 15:12
it's been an honor having you at the head of our table, I really admire your work.

ahsu
2020-07-22 15:13
Thank you to all the presenters!

c.n.reed
2020-07-22 15:14
In this case just the voltage itself but yes, also with RMS which is good for representing workload in the muscle - I have not currently looked at other features as I'm working with the voltage stream for synthesis applications at the moment, but there is much more work in other domains regarding classifications of gestures and I'm hoping to see whether this is possible once I am able to collect a bit more data on this (COVID-19 situation pending). Open to all ideas on this as well!
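[The moving-window RMS measure mentioned above can be sketched in a few lines. This is a hypothetical illustration, not the presenter's actual code: the window length and the synthetic signal are invented for the example.]

```python
import numpy as np

def rms_envelope(emg, window=128):
    """Moving-window RMS of a raw EMG voltage stream.

    RMS tracks overall muscle workload better than the raw
    (zero-mean, oscillatory) voltage signal itself.
    """
    emg = np.asarray(emg, dtype=float)
    emg = emg - emg.mean()            # remove DC offset
    kernel = np.ones(window) / window  # moving-average kernel
    return np.sqrt(np.convolve(emg ** 2, kernel, mode="same"))

# Example: a burst of muscle activity in the middle of a quiet signal
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.05, 1000)
signal[400:600] += rng.normal(0.0, 0.5, 200)
env = rms_envelope(signal)
# The envelope rises markedly inside the burst and stays low outside it
```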

laurel.pardue
2020-07-22 15:14
@pjch17 No problem! You can linearise the response if you train it. Measure the response from specific distances and then do a linear fit. You'll have to verify that your sensors are working similarly (some reflectance sensors have a LOT of variation in response between sensors, and I haven't used that particular make), but training off a couple of sensors should be sufficient.
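[The calibration described above - measure readings at known distances, fit, invert - might look like this. The distance/reading numbers are invented for illustration, and the fit assumes a roughly exponential reading-vs-distance response as discussed in this thread.]

```python
import numpy as np

# Hypothetical calibration data: raw IR readings at known distances (mm).
# Real sensors should be measured individually, per the advice above.
distances_mm = np.array([10, 20, 30, 40, 50, 60], dtype=float)
readings     = np.array([900, 520, 300, 175, 100, 60], dtype=float)

# Assuming reading ≈ exp(a * distance + b), fit log(reading) linearly
# against distance, then invert the fit to estimate distance.
a, b = np.polyfit(distances_mm, np.log(readings), 1)

def reading_to_distance(raw):
    """Linearised distance estimate (mm) from a raw sensor reading."""
    return (np.log(raw) - b) / a

# A raw reading between two calibration points should map to a
# distance between them (200 lies between the 30 mm and 40 mm readings).
est = reading_to_distance(200.0)
```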

ko.chantelle
2020-07-22 15:15
Yes, I had LEDs on my thesis instrument, the GLOBE. Feedback from that research indicates that audiences really LOVE bright colours, lights, and all things shiny. With TRAVIS I don't think I would add LEDs directly onto the instrument, but if the LEDs are part of a different set piece it would be cool to control them with TRAVIS. https://www.chantelleko.com/masters-thesis.html

eskimotion
2020-07-22 15:15
:blush: thank you! I admire your work too, maybe we can meet and talk in Linz sometime

c.n.reed
2020-07-22 15:15
I can do this during this week! Should have probably done so beforehand :sweat_smile:

eskimotion
2020-07-22 15:16
Thanks for sharing! I definitely want to try to play with it.

steventkemper
2020-07-22 15:17
Thanks for sharing--this project looks amazing!

steventkemper
2020-07-22 15:17
I agree with your idea about controlling the LEDs of a set piece

ko.chantelle
2020-07-22 15:19
Ah, when I heard this question in the zoom call, I thought you were referring to the conductivity/resistivity of the strips. It's been about a year of playing and still no physical wear/tear of the fingerboard. Also the strips are still maintaining their initial conductivity/resistivity levels so far. If the strips damage, they're designed to be replaceable. If there is physical damage to the whole fingerboard, I can reprint and replace the main piece as well.

steventkemper
2020-07-22 15:19
Good luck with your continued work on this

c.n.reed
2020-07-22 15:20
They do not for me, although I have already anticipated some may find them uncomfortable and this is something I'm planning to test once I can get others. They are cabled and attached using a conductive paste (Ten20) which helps them adhere to the skin, and then are secured using non-woven fabric tape. This is really light but perhaps not so pretty - I'd love to get some wearable design help to make a sort of collar and attachment for the board/Bela, which I have crazy ideas of in my head (coming from the insane costuming of the opera world).

ko.chantelle
2020-07-22 15:20
Thanks!

laurel.pardue
2020-07-22 15:21
Nice! Wish this was in person, as I'd love to have played with it and checked it out. Also, if you _are_ interested in getting pitch-level info, I've got a paper on doing that with audio fusion, and it sounds like it'd work really well with what you're doing.

c.n.reed
2020-07-22 15:23
I'm hoping! I was speaking with someone the other day about incorporating this into accessibility for voice training and to give some control to the voice for people who may have lost some vocal functions as well. Because EMG detects the neural activation, it's a measurement of intention of movement, rather than the result. I imagine (and hope) it could be used in voice therapy to help users visualise and get feedback about these tiny muscles and match that with the auditory feedback of hearing themselves sing/speak, so they can reproduce the sound they want and learn to control it better :slightly_smiling_face:

pjch17
2020-07-22 15:25
Absolutely, and I do apply linearization. However, with an exponential sensor you're not gonna escape the fact that for much of the distance you want to measure, only a few bits of your dynamic range are gonna be used in your raw reading.

c.n.reed
2020-07-22 15:25
For sure! I'm really interested in breath mechanics as well as some of the methods which have employed ultrasound to look at vocal formant shapes, as well as of course audio analysis, again going off this idea of blended control forms. At the moment I don't have too much access to other imaging devices, but will start bothering some otolaryngologists in the future, for sure!

c.n.reed
2020-07-22 15:27
Yes! Thank you Emma :slightly_smiling_face: I'm well familiar with Andreas's work - would be incredible to get some electroglottography in this mix

c.n.reed
2020-07-22 15:29
It's crazy how much the tongue moves around! :smile: Thanks for your feedback Dianne - if you check the paper you'll see I refer to the lovely Bellyhorn as a great example of direct control! Thanks for your inspiration

laurel.pardue
2020-07-22 15:31
Fair enough. I've usually avoided using the edge of the range for exactly that reason.

c.n.reed
2020-07-22 15:31
Absolutely! Once I've gotten a solid way to get this system out into the wild, I'd be happy to send one to you and work on some embouchure sensing and laryngeal mechanics there! I mentioned a bit further down - I'm also a French horn player, so could offer some comparison there as well!

joe.wright
2020-07-22 15:32
Ah cool that would be really exciting

x
2020-07-22 15:33
really great work

joe.wright
2020-07-22 15:33
Comparison of the mechanics would be interesting, and where there are similarities and differences for overtone / timbral control

c.n.reed
2020-07-22 15:34
This is something I am currently exploring - I would love to work with others who are more heavily focused in other styles. I sing a lot of pop but my background is in more classical styles so it might be that I move in different ways than others do, and exploring those differences would be great for educational purposes! As far as technique and gesture, a lot of it is transferable across styles. The examples I showed today would still appear in a different style as they are a bit more basic - lowering the larynx for low notes, text and breathing. So, hopefully applicable in a lot of contexts with just the basics!

c.n.reed
2020-07-22 15:35
Let's keep in touch for sure - I'd love to look into this

joe.wright
2020-07-22 15:35
I'll DM you my email, that would be great

aes
2020-07-22 15:36
We would be very happy to hear any feedback :-)

ko.chantelle
2020-07-22 15:36
When making sensors from scratch from raw materials there will always be a little noise. When testing the resistance of the strips with a multimeter the signal isn't precise to begin with; the values I wrote down for the changing resistance across the strips are approximations. But I noticed that the dimensions of the printed strip affect how jittery the signal is. In my paper I wrote about how the very first test strip was 4mm-5mm wide and the values on the multimeter were bouncing around so much that I couldn't even make a guess as to what the resistance was. But when I printed the next strip 3mm wide, the values were stable enough that I could write down an approximate resistance across the length of the strip. I may have been able to fine-tune it further, but I was constrained by the physical dimensions of the fingerboard: the strips needed to be large enough that they wouldn't accidentally snap, and wide enough that the string would always come in contact with the strip when pressed.
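For reference, the resistance of a printed strip scales with length over width under a simple sheet-resistance model (the sheet-resistance value here is made up, not measured from the actual strips):

```python
def strip_resistance(sheet_ohms_per_sq, length_mm, width_mm):
    """Sheet-resistance model: R = Rs * (L / W), independent of the square's size."""
    return sheet_ohms_per_sq * (length_mm / width_mm)

# Hypothetical sheet resistance for a printed conductive trace
RS = 300.0  # ohms per square (made-up value)

# A strip running the length of a fingerboard, at two widths
wide = strip_resistance(RS, 270, 4)    # 4 mm wide strip
narrow = strip_resistance(RS, 270, 3)  # 3 mm wide strip
print(wide, narrow)  # narrower strip -> proportionally higher total resistance
```

So narrowing the strip from 4 mm to 3 mm raises the end-to-end resistance by a factor of 4/3, which can change how the reading behaves relative to the measurement noise floor.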

c.n.reed
2020-07-22 15:37
Thanks everyone for the lovely feedback and input - I'll get back to everyone as fast as I can, and please do let me know your contact details if you're interested in trying this out or helping with further studies or anything else I might be able to work with! :smile: This is my first NIME and I'm having a ball, so happy that everyone enjoyed the presentation and looking forward to applying this sensing in all sorts of ways!

eskimotion
2020-07-22 15:40
i will! looking forward to the rest of the conference, enjoy!

aes
2020-07-22 15:43
Thanks, likewise :slightly_smiling_face:

ko.chantelle
2020-07-22 15:43
I'm not familiar with Vasulka, but during the first year of my masters I did take a course on the linking of visuals and sound within Max MSP. From that course artists such as Nick Cope, Jean Piche, Diego Garro, and Ryoichi Kurokawa come to mind. There's probably more that I've looked at that I can't remember at the moment.

ko.chantelle
2020-07-22 15:45
Yes, that paper is in my references! I thought it was really cool! I think it would be so neat to have all the augmented instruments side by side for a playing test.

laurel.pardue
2020-07-22 15:48
If only... Someone did a Kickstarter for a commercial version of the sensor I made. We both started development around the same time. Don't know if you have seen it, but if you are interested, I know @m.ortiz is getting a set for his cello, so he will have the link.

ko.chantelle
2020-07-22 15:56
I've answered in the zoom session, but thought I should also type it out just in case. No, on TRAVIS II the sensor data is not precise enough for mapping exact pitches, but that isn't a problem for my style of composing/programming. I tried it with TRAVIS I with the softpots, and even for those it was very difficult to do because of how close together the semitones are on the violin. I could only map exact pitches that were whole tones apart. Which is why on TRAVIS I, I composed a study entirely out of the whole tone scale (though in retrospect it does sound somewhat bland compared to my other compositions): https://www.youtube.com/watch?v=If24BvJWEF8&list=PLXtepoOoQnoh3Q7xgdMMCLdrrm4uUpFV9&index=7&t=0s

travis.west
2020-07-22 16:00
Very interesting. I will be sure to check out the full paper.

ko.chantelle
2020-07-22 16:01
I answered this in the zoom session, but I thought I should type it out as well. Yes, for TRAVIS II it was a goal to be able to remove all the electronics so that what's left is not too noticeable. This came from a personal need to not have too much stuff to carry throughout the day as I run between rehearsals. It also made flying between Calgary and my hometown Victoria easier, since I could bring one carry-on violin instead of trying to talk my way into bringing two on board.

julian
2020-07-22 16:16
oh thanks for the references!

ko.chantelle
2020-07-22 16:31
Very cool! I wonder (when it's possible to get participants) if you can also map out developmental milestones through different ages, or for people who are transitioning genders. Last year I saw a music/drama interdisciplinary performance of another grad student who is a singer and a trans man. He had recently started hormone therapy and the performance was a reflection on his experiences of how the hormones affected his voice. He said that there wasn't much medical research in that area. But luckily the hormones didn't damage his voice and only lowered it so he went from the alto section to the tenor/bass of the choir.

g.moro
2020-07-22 16:37
@pjch17 if you were using an analog sensor you could build a logarithmic or expo amplifier ... or if you plug it straight into Bela's analog inputs maybe the 16bit ADC there (which is probably better than the one on your device) will be enough?

g.moro
2020-07-22 16:39
Though I am not sure how far you could go ... I never really had to measure more than a few millimeters.

pjch17
2020-07-22 16:53
@g.moro Well, the sensor I'm using is digital, so the log-to-lin amp isn't an option. It's a good idea though :slightly_smiling_face: Just keep in mind that analog IR distance sensors often have a nasty non-linearity at close proximity, so that's another thing you'll have to deal with.

g.moro
2020-07-22 16:55
Yes they are non-monotonic at very small distances but that's normally below 1mm or so

g.moro
2020-07-22 17:05
I am also wondering if you have looked at stretch sensors, so that the rubber bands can become your position sensors! :slightly_smiling_face: sorry, that instrument looks so great that I keep thinking about it

erik.nystrom
2020-07-22 17:11
Thanks for your reply! I'm very interested in ideas such as counter-forces and thresholds etc. as sound design concepts, and can imagine applying these even with very simple one-dimensional controllers such as faders or knobs, though in that case the physicality is very much inferred from the sound/multimodal feedback. Was great to see such a methodical application of the metaphors in your paper/system.

ko.chantelle
2020-07-22 17:15
Oh, I didn't see someone else already mentioned this.

ko.chantelle
2020-07-22 17:22
I searched through my computer and found a list from that course. I think we were given this list to choose from for a presentation assignment.

tragtenberg
2020-07-22 17:44
Definitely! There is a lot of work to do to reconquer our cultures

aes
2020-07-22 18:10
Thanks a lot. I'd be happy to hear about similar work in the future :slightly_smiling_face:

jmalloch
2020-07-22 19:12
Hi Simon :slightly_smiling_face:

a.r.jensenius
2020-07-22 20:06
Thanks for the reply. Fascinating! I am also personally interested in this, because my wife is from Chile, and I have been trying to learn more about traditional instruments of the region.

tom.mitchell
2020-07-23 07:26
Hi @c.n.reed, thanks for your talk yesterday, really interesting! Reminded me of some research studies by Jess McIntosh, sensing and classifying hand movements and gestures from muscle activity in the wrist. Jess did some work with https://research-information.bris.ac.uk/ws/portalfiles/portal/67056500/Jess_McIntosh_EMPress.pdf, http://library.usc.edu.ph/ACM/CHI%202017/1proc/p1923.pdf and https://dl.acm.org/doi/pdf/10.1145/3126594.3126604. Full thesis here: https://research-information.bris.ac.uk/en/studentTheses/exploring-the-practicality-of-wearable-gesture-recognition

eskimotion
2020-07-23 08:39
*Papers for session 03: Sensors, Actuators and Haptics*
The Daīs: A Haptically Enabled NIME for Controlling Physical Modeling Sound Synthesis Algorithms (Paper 119 in proceedings)
Force dynamics as a design framework for mid-air musical interfaces (Paper 70 in proceedings)
Knotting the memory//Encoding the Khipu: Reuse of an ancient Andean device as a NIME (Paper 94 in proceedings)
Surface Electromyography for Direct Vocal Control (Paper 88 in proceedings)
Touch Responsive Augmented Violin Interface System II: Integrating Sensors into a 3D Printed Fingerboard (Paper 32 in proceedings)

info041
2020-07-23 10:47
I mentioned Vasulka because she was one of the pioneers, if not the pioneer, of controlling video interactions through the different strings of her violin, which she calls the ZETA violin. I would check out her work with the ZETA as it might be relevant to your goals/interests with video interaction http://www.vasulka.org/Steina/Steina_ViolinPower/ViolinPower.html

ko.chantelle
2020-07-23 15:03
Thank you for the updated link. But when I clicked the video, the file wouldn't play on my computer. I've found another source here, which I think is the same: https://vasulkakitchen.org/en/steina-violin-power

julian
2020-07-23 17:30
:wow:

julian
2020-07-23 17:30
nice thanks

c.n.reed
2020-07-24 19:28
@shatin I've compiled schematics and some basic Bela code for digital filtering here: https://github.com/courtcourtaney/vocal-EMG But yes, I'm still hoping at some point to get everything into a "pre-packaged" setup and send it around to people to try out/test in other contexts. I'll keep in touch about that - just let me know where I can reach you normally :slightly_smiling_face:
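As an illustration of the kind of digital filtering typically used here (a generic surface-EMG envelope sketch in Python with one-pole filters and a made-up sample rate - not necessarily what the linked repo implements): remove baseline drift with a high-pass, rectify, then low-pass into a smooth amplitude envelope.

```python
import math

SR = 1000.0  # hypothetical sample rate in Hz

def one_pole_coeff(cutoff_hz):
    """Feedback coefficient for a one-pole low-pass at the given cutoff."""
    return math.exp(-2.0 * math.pi * cutoff_hz / SR)

class EMGEnvelope:
    """DC-block (high-pass) -> full-wave rectify -> one-pole low-pass."""
    def __init__(self, hp_cutoff=20.0, lp_cutoff=5.0):
        self.hp_a = one_pole_coeff(hp_cutoff)
        self.lp_a = one_pole_coeff(lp_cutoff)
        self.hp_state = 0.0
        self.env = 0.0

    def process(self, x):
        # High-pass: subtract a low-passed copy (removes baseline drift)
        self.hp_state = self.hp_a * self.hp_state + (1 - self.hp_a) * x
        hp = x - self.hp_state
        # Rectify and smooth into an amplitude envelope
        self.env = self.lp_a * self.env + (1 - self.lp_a) * abs(hp)
        return self.env

# A constant offset (electrode drift) dies away; oscillating activity does not
emg = EMGEnvelope()
quiet = [emg.process(0.5) for _ in range(2000)][-1]                    # pure DC offset
active = [emg.process(0.5 + ((-1) ** n) * 0.3) for n in range(2000)][-1]
print(quiet, active)
```

The same structure ports directly to a Bela `render()` loop, since each stage is just a couple of multiply-adds per sample.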