joe.wright
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

joe.wright
2020-07-18 13:59
@joe.wright set the channel purpose: Paper Session 14: HCI / New Interfaces / Instruments

niccolo.granieri
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

hassan.hussain5
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

overdriverecording
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

lamberto.coccioli
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

jonathan.pearce
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

richard.j.c
2020-07-18 13:59
has joined #papers14-hci-new-interfaces-instruments

eskimotion
2020-07-20 09:25
has joined #papers14-hci-new-interfaces-instruments

edmund.hunt
2020-07-20 09:25
has joined #papers14-hci-new-interfaces-instruments

acamci
2020-07-20 17:01
has joined #papers14-hci-new-interfaces-instruments

aaresty
2020-07-20 17:21
has joined #papers14-hci-new-interfaces-instruments

10068197
2020-07-20 17:21
has joined #papers14-hci-new-interfaces-instruments

a.nonnis
2020-07-20 17:22
has joined #papers14-hci-new-interfaces-instruments

a.macdonald
2020-07-20 17:23
has joined #papers14-hci-new-interfaces-instruments

andreas
2020-07-20 17:24
has joined #papers14-hci-new-interfaces-instruments

dianneverdonk
2020-07-20 17:25
has joined #papers14-hci-new-interfaces-instruments

likelian
2020-07-20 17:25
has joined #papers14-hci-new-interfaces-instruments

ko.chantelle
2020-07-20 17:25
has joined #papers14-hci-new-interfaces-instruments

anika.fuloria
2020-07-20 17:26
has joined #papers14-hci-new-interfaces-instruments

clemens.wegener
2020-07-20 17:26
has joined #papers14-hci-new-interfaces-instruments

michon
2020-07-23 08:21
Papers for this session:
• *Curating Perspectives: Incorporating Virtual Reality into Laptop Orchestra Performance* (Paper 30 in proceedings)
• *New Interfaces for Spatial Musical Expression* (Paper 47 in proceedings)
• *Star Interpolator – A Novel Visualization Paradigm for Graphical Interpolators* (Paper 10 in proceedings)
• *TorqueTuner: A self contained module for designing rotary haptic force feedback for digital musical instruments* (Paper 52 in proceedings)
• *Non-Rigid Musical Interfaces: Exploring Practices, Takes, and Future Perspective* (Paper 3 in proceedings)

lja
2020-07-23 18:33
@michon Curating Perspectives is being aired in paper session 11, I believe; shouldn't the duplicate be removed from this session?

x
2020-07-24 16:33
@l.mice What made you choose the form of the instrument?

marije
2020-07-24 18:43
this should be in #papers12-evaluation-of-nimes ?

michon
2020-07-24 19:54
@lja Thanks for pointing that out! For some reason that change didn't make it to the final program. I just updated the program of the session.

michon
2020-07-24 19:55
Updated (and accurate) papers list for this session:
• *New Interfaces for Spatial Musical Expression* (Paper 47 in proceedings)
• *Star Interpolator – A Novel Visualization Paradigm for Graphical Interpolators* (Paper 10 in proceedings)
• *TorqueTuner: A self contained module for designing rotary haptic force feedback for digital musical instruments* (Paper 52 in proceedings)
• *Non-Rigid Musical Interfaces: Exploring Practices, Takes, and Future Perspective* (Paper 3 in proceedings)

vincze
2020-07-25 11:23
@michon Hey, could you please add the times and links for these paper presentations? Thanks



marije
2020-07-25 12:00
*14:45 - 16:00 (UTC+1)*

hassan.hussain5
2020-07-25 13:41
we're starting in 5 mins!

niccolo.granieri
2020-07-25 13:41
Whoop whoop, I will miss these daily paper session reminders!

hassan.hussain5
2020-07-25 13:41
Also, as mentioned previously: when asking a question in response to a paper, please indicate in your message which paper presentation you are responding to, either by mentioning the title of the paper or by using @ to direct it to the presenter. This will make it easier for people to follow the presentations and the Q&A later (since we are in different time zones). And please keep replies to a question in a thread!

mario.buoninfante
2020-07-25 13:47
the audio is not working

mario.buoninfante
2020-07-25 13:48
@michon

mario.buoninfante
2020-07-25 13:48
all fine now :slightly_smiling_face:

michon
2020-07-25 13:49
Yup

marije
2020-07-25 13:55
@Ivica the earliest NISME I'm aware of was made in 1970, described here: http://oro.open.ac.uk/48743/

vincze
2020-07-25 13:58
@Ivica Seems really cool, nice presentation. When controlling the system by hand, is there only aural feedback of what one is doing, or also visual feedback? And how is it scaled: does one need to move through the entire space to exploit its full potential?

ico
2020-07-25 13:58
Thanks for the heads-up, Marije. Happy to look into it further.

vincze
2020-07-25 14:02
@ico Sounds great, thank you for clarifying :slightly_smiling_face:

cagri.erdem
2020-07-25 14:02
Thanks a lot @ico. Does spatialization influence the perceived "shapes" in any different way when compared to the shape of the movement (or the sound)?

marije
2020-07-25 14:05
2nd paper!

jmalloch
2020-07-25 14:06
Thanks for the great presentation @ico. Not a question, but here are some related pubs from IDMIL:
• Georgios Marentakis, Joseph Malloch, Nils Peters, Mark T. Marshall, Marcelo M. Wanderley and Stephen McAdams. "Influence of Performance Gestures on the Identification of Spatial Sound Trajectories in a Concert Hall". In Proceedings of the International Conference on Auditory Display (ICAD), 2008.
• Mark T. Marshall, Joseph Malloch and Marcelo M. Wanderley. "Gestural Control of Sound Spatialization for Live Musical Performance". In Miguel Sales Dias, Sylvie Gibet, Marcelo M. Wanderley and Rafael Bastos, eds. Gesture-Based Human-Computer Interaction and Simulation: 7th International Gesture Workshop, GW 2007, Lisbon, Portugal, May 23-25, 2007, Revised Selected Papers, LNCS, vol. 5085, Springer Verlag, 2009, pages 227–238. DOI: 10.1007/978-3-540-92865-2_25.
• Mark T. Marshall, Joseph Malloch and Marcelo M. Wanderley. "Non-Conscious Control of Sound Spatialization". In Proceedings of the International Conference on Enactive Interfaces (ENACTIVE), 2007, pages 377–380.
• Mark T. Marshall, Joseph Malloch and Marcelo M. Wanderley. "A Framework for Gesture Control of Spatialization". In International Gesture Workshop, 2007.
• Mark T. Marshall, Nils Peters, Alexander Refsum Jensenius, Julien Boissinot, Marcelo M. Wanderley and Jonas Braasch. "On the development of a system for gesture control of spatialization". In Proceedings of the International Computer Music Conference (ICMC), ICMA, 2006, pages 360–366.

ico
2020-07-25 14:06
@cagri.erdem yes it does. We have a pending paper in Audio Mostly where we have conducted a series of scientific studies to observe how per-loudspeaker manipulation of sound affects the user's capacity to perceive the ensuing aural shapes.

ico
2020-07-25 14:07
Thank you for sharing @jmalloch. Looking forward to checking these out.

cagri.erdem
2020-07-25 14:08
Great! It's very interesting to investigate how such shapes are transformed between different domains. Looking forward to reading this paper as well as the upcoming one at AM.

jmalloch
2020-07-25 14:10
Your Cube facility looks amazing btw!

ico
2020-07-25 14:10
:+1:

mario.buoninfante
2020-07-25 14:10
@ico any particular reason why Max has been chosen over Pd-L2ork?

marije
2020-07-25 14:10
@gibsond great overview of interpolation approaches!

ico
2020-07-25 14:11
Thanks! I completely agree with you :slightly_smiling_face:

ico
2020-07-25 14:14
@mario.buoninfante This was mainly because we have not yet integrated pd-l2ork into the Cube environment, because Max has a lot of objects that are user-study friendly, and because the UI widgets in pd-l2ork 1.x are fairly inefficient. These days a lot of the focus is on Purr-Data (a.k.a. Pd-L2Ork 2.x), which will address many of the previous bottlenecks. Hope this helps.

harrap
2020-07-25 14:16
Have you thought about "wrapping" the star with an outline to make the shape more "recognizable" at a glance? There is a fair amount of work on that in scientific visualization.
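A rough illustration of the "wrapped star" idea, as a minimal sketch with invented values and labels (not taken from the paper): draw a spoke for each parameter and then connect the spoke tips with a closed outline so the overall shape reads at a glance.
```python
import numpy as np
import matplotlib.pyplot as plt

values = np.array([0.8, 0.3, 0.6, 0.9, 0.4, 0.7])   # dummy parameter values
angles = np.linspace(0, 2 * np.pi, len(values), endpoint=False)

ax = plt.subplot(projection="polar")
for a, v in zip(angles, values):
    ax.plot([a, a], [0, v], color="tab:blue")        # the star's spokes
# "wrap" the star: close the polygon through the spoke tips
ax.plot(np.append(angles, angles[0]), np.append(values, values[0]), color="tab:orange")
plt.show()
```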

mario.buoninfante
2020-07-25 14:16
Yep, that makes sense. From what I saw in the presentation the UI plays quite an important role, and I suppose that since the software already deals with multiple channels, having the UI slow things down is not an option, even on more powerful modern machines.

mario.buoninfante
2020-07-25 14:17
Interestingly, Sensel have just released a Pd module for the Morph :)

ico
2020-07-25 14:18
Thanks, @marije, for pointing out Stockhausen's seminal work. If we really go back in time, we could also talk about cori spezzati and antiphonal choirs, as well as conscious spatial choices in music. However, none of these provide an ability to directly manipulate sound from within the space, and as such in my view they do not apply to this context.

gibsond
2020-07-25 14:21
@harrap Hopefully I manage to answer your question at the end of the presentation?

harrap
2020-07-25 14:22
Yes, perfectly. I hadn't thought about the visual overlay confusion that would result.

harrap
2020-07-25 14:24
In case you haven't seen it, the Borderlands synth on iOS uses a vaguely similar configuration, and IIRC it only shows the "star" when you actually touch the node. It is a granular synth and the parameters are grain parameters etc. It also uses a spatial node interpolator based on overlaying the nodes on polygons that are sound clips/sources. Just in case you are looking for parallels.

ico
2020-07-25 14:27
Yes, my team worked on that and it will be included by default in Purr-Data :slightly_smiling_face:

dianneverdonk
2020-07-25 14:28
@mathias.kirkegaard Thanks for presenting the TorqueTuner! You suggested that the standalone haptic knob would be suitable as a device for DMI designers and others who are interested. Is it purchasable or fairly easy to build? I'd like to be a tester, somehow, or see if I could use it in one of the projects/performances I'm making, perhaps. Did you already have any feedback from users of the knob?

gibsond
2020-07-25 14:28
@harrap That sounds interesting. It's not one I have come across before, but I'll definitely take a look at it. Thanks for the pointer.

a.mcpherson
2020-07-25 14:28
@mathias.kirkegaard Nice work. Technical question: the stepper runs at 12V and has a holding current up to 2A. You're powering the device via a boost converter from a 3.7V battery. That means in principle up to 6A of current draw from the battery. Do you use any special kind of battery or power converter, and have you found power draw to be a limitation?
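As a rough sanity check of the figures in this question (the 90% converter efficiency is an assumption, not a number from the paper):
```python
# Worst-case battery-side current for a 12 V / 2 A load fed from a 3.7 V LiPo
motor_power = 12.0 * 2.0                  # 24 W at the motor
ideal_current = motor_power / 3.7         # ~6.5 A with a lossless boost converter
real_current = motor_power / (3.7 * 0.9)  # ~7.2 A assuming ~90% converter efficiency
print(round(ideal_current, 1), round(real_current, 1))
```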

mark.durham
2020-07-25 14:29
On the TorqueTuner: I thought this was a great presentation, thanks. Two questions: (a) what is the rough cost of producing a single device? (b) it looked like the motor was producing vibration feedback to the user from the resulting sound; is that the case? If so, is there scope for this being part of an accessible instrument for the deaf?

a.mcpherson
2020-07-25 14:30
GitHub link on the slide just now: https://github.com/IDMIL/TorqueTuner

timo.dufner
2020-07-25 14:30
following...

julian
2020-07-25 14:36
Awesome work @mathias.kirkegaard! How easy is it to set up force feedback with libmapper? Sorry if I missed it in your presentation, but can you just configure a "resistance" parameter or so?

marije
2020-07-25 14:36
The Osaka work did provide a controller to direct the spatialisation.

boem.alberto
2020-07-25 14:37
@mathias.kirkegaard very cool work! It's so true that prototyping tools for force feedback must be enhanced. Just to add another previous example, this is a toolkit used for prototyping many force-feedback interfaces produced and used in Japan: http://arcdevice.com/products/DATK/DATK.html

jmalloch
2020-07-25 14:37
We experimented with similar mappings using past versions of the T-Stick with embedded vibration actuators (https://josephmalloch.wordpress.com/portfolio/vibration-feedback/). Latency was high but could benefit from current work at the IDMIL on embedding synthesis in the T-Stick.

mathias.kirkegaard
2020-07-25 14:39
It is pretty straightforward :D We generalized the control of the force feedback as transfer functions embedded on the device, and libmapper can then be used to modify these by scaling, translating, etc.
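A minimal sketch of what such an embedded transfer function could look like: a sinusoidal detent profile whose scale and offset stand in for the kind of modifications a mapping layer like libmapper might apply. The function and parameter names here are invented for illustration and are not taken from the TorqueTuner firmware.
```python
import numpy as np

def detent_torque(angle_rad, n_detents=12, strength=1.0, offset=0.0):
    """Toy transfer function: knob angle in, torque command out.
    n_detents sets how many 'clicks' per revolution; strength and offset
    are the sort of parameters a mapping layer could scale or translate."""
    return strength * np.sin(n_detents * angle_rad) + offset

angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
print(detent_torque(angles, strength=0.5))
```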

mathias.kirkegaard
2020-07-25 14:40
Hi, @boem.alberto , cool thanks for sharing!

ico
2020-07-25 14:41
Good point. That, and the fact that Stockhausen was arguably the first to recognize localization as the fifth dimension of musical expression, indeed warrants that his work be mentioned in the NISME paper.

dianneverdonk
2020-07-25 14:41
Is there an English version of the site, or am I missing a knob for it?

mathias.kirkegaard
2020-07-25 14:41
+1 for Joseph Malloch's comment. Vibrotactile feedback would require dedicated vibration motors such as the one in the picture above.

ico
2020-07-25 14:41
:+1:

dianneverdonk
2020-07-25 14:42
I could use Google's translate suggestion, but I was just wondering.

marije
2020-07-25 14:43
Great work, @boem.alberto !

a.r.jensenius
2020-07-25 14:44
Very nice @boem.alberto. I am surprised that there are so few examples of non-rigid interfaces in the NIME proceedings. Did you notice any difference over time? For example, are there more of them in recent years? (this is probably answered in the paper...)

mark.durham
2020-07-25 14:44
Thanks @mathias.kirkegaard and @jmalloch for the replies. The section I was looking at is at 3:39 in your video here: https://www.youtube.com/watch?v=V8WDMbuX9QA&list=PLz8WNY_I2S5QAKDKa_tb57PzlQNY3ikbe&index=3

mark.durham
2020-07-25 14:45
So what is driving the feedback there?

matthew.mosher
2020-07-25 14:46
@boem.alberto Thanks for putting all this together! Very helpful!

v.zappi
2020-07-25 14:46
Thank you all! Here is a link to a website that summarizes objectives, methods and findings: https://deformableui.com/nime2020/

dianneverdonk
2020-07-25 14:48
@boem.alberto great! Have you perhaps found this material? https://tg0.co.uk/ Found it a while ago, looks really interesting, haven't looked into it really deeply yet, but seems promising.


jmalloch
2020-07-25 14:48
I think vibration actuators and force feedback are complementary – it would be fantastic to build T-Sticks with both!

christian.frisson
2020-07-25 14:48
@boem.alberto @v.zappi and colleagues, among the devices you surveyed, do many change shape fast and strongly enough to provide haptic (force) feedback?

boem.alberto
2020-07-25 14:50
Thanks everyone! But this is really a collective effort between me, @v.zappi, @leprotto.giacomo and Giovanni Maria Troiano (not here).

dianneverdonk
2020-07-25 14:50
@boem.alberto et al. really interesting research, thanks a lot!

decampo
2020-07-25 14:50
@boem.alberto very good analysis! I am wondering whether the physical entanglement of non-rigid interfaces might work best when mirrored by parameter entanglement (i.e. complex many-to-many mappings)?

marije
2020-07-25 14:51
@boem.alberto hysteresis in the data is probably also a challenge in the mapping approaches; did you encounter comments about that?

dianneverdonk
2020-07-25 14:51
@boem.alberto (I liked the slow voice :slightly_smiling_face:)

cagri.erdem
2020-07-25 14:52
Thank you @boem.alberto et al for the great work. It is very interesting how you link non-rigidity with deformation of sound. What do you think about the musical and embodied motivation and/or pleasure in playing such interfaces?

mathias.kirkegaard
2020-07-25 14:52
The feedback is pre-defined as haptic effects embedded on the device, which can be modified through libmapper. In this case it is a haptic detent; see the snapshot from the paper below. At every detent, the device transmits a trigger signal via libmapper to SC, which runs a Karplus-Strong-like physical model.
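For readers unfamiliar with the synthesis side, here is a tiny Karplus-Strong pluck in Python that could be triggered once per detent; it is an illustrative sketch under that assumption, not the SuperCollider patch used in the demo.
```python
import numpy as np

SR = 44100

def karplus_strong(freq_hz, dur_s=0.5, sr=SR, damping=0.996):
    """Minimal Karplus-Strong: a noise burst circulating in a damped delay line."""
    n = int(sr * dur_s)
    delay = int(sr / freq_hz)
    buf = np.random.uniform(-1, 1, delay)   # excitation noise burst
    out = np.empty(n)
    for i in range(n):
        out[i] = buf[i % delay]
        # averaging adjacent samples lowpasses the loop; damping shortens the decay
        buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out

pluck = karplus_strong(220.0)   # one of these per detent trigger
```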

mario.buoninfante
2020-07-25 14:53
amazing! :slightly_smiling_face:

mathias.kirkegaard
2020-07-25 14:55
@dianneverdonk Thank you! They are pretty easy to build, and I will make sure to update the assembly instructions on GitHub to improve them further! Everything is off-the-shelf components :smile: We have not had any feedback yet, but let's keep in touch about replicating it. My email is:

jmalloch
2020-07-25 14:55
Hi @christian.frisson - I know Avrum Hollinger was considering vibration as a feedback modality for the Ballagumi (using piezoelectric properties of silicone) but it was never implemented...

marije
2020-07-25 14:56
Martin Marier used a parameter interpolation approach. In the Web (STEIM) the entanglement of the sensors was enough to then do more 'straightforward' mapping of the signals.

mark.durham
2020-07-25 14:57
I see - makes sense now. I'll also read the paper, thanks.

dianneverdonk
2020-07-25 14:57
@v.zappi curious for the link to the paper you just mentioned!

mathias.kirkegaard
2020-07-25 14:57
@a.mcpherson Thanks! Yes, up to 6A current draw, but the LiPos used can deliver much more. They run dry very fast, though :confused: I am aware that the power supply could use some optimisation.

a.mcpherson
2020-07-25 14:58
Do you need to use LiPos designed for RC cars or drones to get that kind of current draw? I don't have a lot of experience with LiPos, but I believe that many of the consumer battery packs have built-in current limiting to avoid fires.

christian.frisson
2020-07-25 14:59
Hey @jmalloch thanks I have seen this intriguing Ballagumi on a shelf, but never in action :slightly_smiling_face:

ico
2020-07-25 14:59
Thank you! :+1:

dianneverdonk
2020-07-25 14:59
@boem.alberto and @v.zappi, I'm thinking about the term 'material-mapping' (choosing the material together with the sound output and the lessons someone has in mind) regarding your research :smile:

niccolo.granieri
2020-07-25 15:00
As previously mentioned, here is a link to a Zoom room that will be open all day long for you to pop in and chat with other NIME attendees! https://us04web.zoom.us/j/75307801251?pwd=VTF3MFJ4UTNaY1psTHQ4Qllkckhndz09

juan.jpma
2020-07-25 15:00
Thanks for the shoutout to our Soma design paper, @v.zappi! Also, this year @a.nonnis presented a poster on Olly, a textile-based deformable musical installation for children.

v.zappi
2020-07-25 15:01
Indeed a great year to start a new chapter in the exploration of non-rigid musical interfaces!

florent.berthaut
2020-07-25 15:02
@v.zappi @boem.alberto Thank you for the very interesting survey! Have you looked at virtual deformable interfaces (e.g. 3D sculpting)? Would they be interesting as a way to investigate the separate effects of haptic feedback and shape modification?

jmalloch
2020-07-25 15:02
For the Spine, the physical entanglement meant that mappings were essentially entangled and integral, even if you tried to design separable mappings. Some of the composers we worked with kept trying to design separable mappings though!

boem.alberto
2020-07-25 15:03
@christian.frisson not really. Actually, 100% of the ones surveyed in this paper are input-only. The only haptic feedback is provided by the material and the structure. There are some in non-musical examples; I can send you references later.

v.zappi
2020-07-25 15:04
Very good point! We considered looking at those too [we have a list] but we decided to start from physical interfaces, to first frame big-picture questions. Hopefully more to come in the future on that!

marije
2020-07-25 15:04
@boem.alberto @v.zappi My performance Chrysalis (performed at NIME 2017 in Copenhagen) also makes use of non-rigid sensing: https://marijebaalman.eu/projects/wezen-chrysalis.html I guess the next step might be to also look at performances employing these kinds of interfaces, and not just papers?

marije
2020-07-25 15:07
And perhaps the Wind Instruments I've made also fall into that category: https://marijebaalman.eu/projects/wind-instruments.html https://marijebaalman.eu/projects/v-l-i-g.html

jmalloch
2020-07-25 15:07
Great work @boem.alberto @v.zappi et al. :slightly_smiling_face: I think you missed an interface from NIME 2006 – it was quite primitive but employed an interesting method (ambient light) for sensing deformation of a large spring: Denis Lebel and Joseph Malloch. "The G-Spring Controller". In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2006, pages 85–88.

florent.berthaut
2020-07-25 15:08
@v.zappi It makes a lot of sense to start with this, there's already a lot to be done. I'd be interested to know what your experts would say about virtual versions

jmalloch
2020-07-25 15:08
Axel Mulder's work comes to mind.

boem.alberto
2020-07-25 15:08
@jmalloch yes, that's super relevant

boem.alberto
2020-07-25 15:09
and not only for virtual interfaces but also for physical interfaces mapping deformations to sound. Still great work.

florent.berthaut
2020-07-25 15:09
@jmalloch Yes! Exactly the one that pops into my mind whenever I think about 3D sculpting.

v.zappi
2020-07-25 15:10
These are invaluable comments that add to those we collected, thank you guys! @dianneverdonk here is the paper I mentioned, it presents a study in which participants explore ready-made non-rigid interfaces [they are relieved from the burden of design] and the "evolution" of their approach is studied over time [not very long time, but still...]: G. M. Troiano, E. W. Pedersen, and K. Hornbæk. Deformable Interfaces for Performing Music. In _Proc. CHI_, 2015. https://dl.acm.org/doi/pdf/10.1145/2702123.2702492?casa_token=3sSk13lmQToAAAAA:-JiaFvCSPAqB0Dwll7CkA16mLm9uPa-14O5Y8xpaRWeXImLA1WTK5qO758y5LHoHkjMOP-im9wn6

marije
2020-07-25 15:11
arguably The Web (STEIM) can be regarded as a non-rigid interface?

marije
2020-07-25 15:14
@boem.alberto @v.zappi Where do you see the border between non-rigid interfaces and wearables, if there is any?

cagri.erdem
2020-07-25 15:15
Perfect question @marije, I was just thinking about that.

boem.alberto
2020-07-25 15:15
@marije yes, that's a very good point that we discussed during the process. We decided to concentrate on published works for two reasons: they are archived, and they provide a certain amount of the information needed to understand the content. We are aware that probably many things were left out, but that's certainly the next challenge, I believe, and not only for this type of interface. If you have any suggestions, we will be super happy to include them, at least in the online database we are curating: http://www.deformableui.com

christian.frisson
2020-07-25 15:16
Thanks for your reply @boem.alberto! I guess the overview on shape-changing interaction from Alexander et al. in your references is a great hub paper for non-musical deformable haptic references.

v.zappi
2020-07-25 15:16
This is a great point! We faced the issue of "access" and documentation, as performances are much more difficult to track down and thoroughly understand [IF there are videos, it is often difficult to understand what's going on]. But all the emphasis on a new NIME publication will likely fix, at least in part, these issues. We'll do our best to seek out artistic works. Now that I think about it, I remember your performance... shame on me!

marije
2020-07-25 15:16
Also, the work of Afroditi Psarra is relevant here, I think: http://www.afroditipsarra.com/

boem.alberto
2020-07-25 15:19
@marije that's an important point. In another work we already defined what we consider non-rigid: it depends on an interplay of elements, mostly the materials, but above all the input and interaction techniques. We basically considered non-rigid interfaces to be those that use changes of shape in any form, and that therefore afford stretching or squeezing. So there can be non-rigid wearables (there are many examples), but not every wearable is non-rigid just because it is "soft". More info here :slightly_smiling_face: : http://www.albertoboem.com/wp-content/uploads/2017/04/Towards_Non_Rigid_HCI__A_Review_of_Deformable_Interfaces_and_Interactions-2.pdf

a.nonnis
2020-07-25 15:19
...Also that of Birgitta Cappelen and Anders-Petter Andersson ("Expanding the role of the instrument"), and that of Lara Grant: http://lara-grant.com/ In any case, it was a great overview!

v.zappi
2020-07-25 15:29
Indeed, several performers mentioned embodiment as a target or a "feature", both directly and indirectly. In some cases, interaction was even deemed "playful", "enjoyable", "joyful". But we also noticed this seemed to be one end of a spectrum, whereas other experts experienced high challenge and even frustration, which stands at the opposite end. It seems that some designs managed to provide a sweet spot between excessive difficulty and toy-like entertainment, suggesting that virtuosity can be achieved with non-rigid interfaces too. But the relative novelty of the technologies, the technical limitations and the increased psychophysiological burden call for much more in-depth research in the field.

v.zappi
2020-07-25 15:30
Axel Mulder's work inspired part of my PhD!

v.zappi
2020-07-25 15:41
@christian.frisson, @jmalloch is correct: some participants reported initial plans to design shape-changing features for feedback, but the technology was [and still is] characterized by many limitations that moved these ideas to the back burner. As I mentioned during the Q&A, a first example of deformable output [i.e., shape change] in a musical context popped up this year at NIME! Juan Martinez Avila et al., Soma Design for NIME. We are also personally working on the use of shape change for haptic feedback, as it seems one of the most promising applications for this paradigm.

mathias.kirkegaard
2020-07-25 15:44
Yes exactly, LiPos for RC/drones usually have a very high discharge rate. We used this one: https://www.robotshop.com/en/37v-2000mah-5c-lipo-battery.html which (at full charge) can theoretically deliver 10A continuously for about 12 minutes.
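The run-time figure follows from the capacity and C-rating alone (rough numbers, ignoring voltage sag and converter losses):
```python
capacity_Ah = 2.0                                 # 2000 mAh
max_current_A = 5 * capacity_Ah                   # "5C" continuous rating -> 10 A
runtime_min = capacity_Ah / max_current_A * 60    # ~12 minutes at full load
print(max_current_A, runtime_min)
```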

marije
2020-07-25 15:48
thanks!

christian.frisson
2020-07-25 15:51
Thanks @boem.alberto and @v.zappi for the references! I'll follow your work!

jmalloch
2020-07-25 16:18
For the Spine I also sketched plans for actuation, but it wasn't central to the design. The dancers learned to 'puppeteer' the spines when hand-held which had a similar visual effect but obviously didn't provide them any force feedback.

a.nonnis
2020-07-25 17:21
Thanks @juan.jpma for mentioning our work! That's very kind of you :relaxed: Your work was very interesting indeed! However, I would like to emphasize that Olly is a very limited musical instrument in its current state, and some might even argue with that. :slightly_smiling_face: The focus for me was not so much on the technical aspects and complexity allowed by the instrument as on its benefits for social activities and inclusion at a broader level, following a multidisciplinary approach that focuses on the environment and the changes we can make to it (including the design of TUIs) so that they promote social integration. Music in my case was more of a medium than the final aim. I was fortunate enough to be able to exploit the intrinsic power of music because I worked with children who liked it. However, I agree with @v.zappi et al. that it's important to acknowledge the opportunities and challenges offered by the different mediums while still considering the different and varied applications in which they are deployed. Nonetheless, I might argue that when working with different deformable interfaces, standardization of mappings and other parameters might not always be necessary. In my humble opinion, perhaps we should exploit more the unpredictability offered by some materials and work together with it instead of just exerting extreme control over them.

schwarz
2020-07-25 17:40
erm, what about Pierre Schaeffer's gesture-controlled spatialization, 1950s?

juan.jpma
2020-07-25 18:14
Yes! The unpredictability of these kinds of materials makes for really interesting musical results.

marije
2020-07-25 18:17
ah yes, that one too.

marije
2020-07-25 18:20
yes, Jacques Poullin (1951), the potentiomètre d'espace.

marije
2020-07-25 18:20
I should remember my own PhD thesis better

marije
2020-07-25 18:24
"the new process is a dialectic of sound in space and I think that the term _spatial music_ could fit it better than stereophony", as Pierre Schaeffer said

v.zappi
2020-07-25 18:34
@a.nonnis absolutely! Indeed, what was stressed by nearly all our participants was the need for more accessible technologies and tools to alleviate what we defined as the "burden of design", rather than for standardized mapping strategies. We believe that access to more mature technologies and collaborations with researchers/practitioners in material science/nanotech may indeed allow musicians/designers to shift most of their focus back onto creative exploration, mappings, metaphors and music. Hopefully this comes across clearly in the paper's discussion! We really appreciate comments like yours and are eager to receive more insights on interfaces that we missed and that came out recently! This includes comments that may be in contrast with how we interpreted the material collected so far. We are playing a very small role in non-rigid interfaces for musical expression; the community is doing all the heavy lifting!

dianneverdonk
2020-07-25 19:03
Thanks!! We'll be in touch, I'll look into your work/Git package soon! My email address is , feel free to email me about anything around your projects. Cheers!

schwarz
2020-07-25 20:49
Thanks, @gibsond, for the extensive bibliography on preset interpolation! Another variant using RBFs is hidden in the Freed, MacCallum, Schmeder, Wessel NIME 2010 paper on hybridization interfaces.

schwarz
2020-07-25 20:54
Also, the visualisation reminded me of how one can display the original dimensions in a 2D PCA projection (the "factor loadings").
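A minimal sketch of that idea, assuming scikit-learn and matplotlib and using made-up preset data: project the presets with PCA and overlay each original parameter's loading as an arrow (a biplot), so the original dimensions stay visible in the 2D view.
```python
import numpy as np
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

presets = np.random.rand(50, 6)          # 50 presets x 6 synth parameters (dummy data)
pca = PCA(n_components=2).fit(presets)
xy = pca.transform(presets)

plt.scatter(xy[:, 0], xy[:, 1], s=15)
for i in range(presets.shape[1]):
    lx, ly = pca.components_[0, i], pca.components_[1, i]
    plt.arrow(0, 0, lx, ly, head_width=0.02)      # "factor loading" for parameter i
    plt.text(lx, ly, f"param{i}")
plt.show()
```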

gibsond
2020-07-27 10:47
@schwarz Thanks. I looked at this paper a number of years back, but I think I need to revisit it. Thanks for the reminder.

gibsond
2020-07-27 10:52
@schwarz That's an interesting idea... I'll need to spend a bit of time thinking about how a PCA could help users gain an insight (or maybe "inhear"?) into the sound. Thanks for your thoughts. I'll let you know where it leads me.