joe.wright
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

joe.wright
2020-07-18 14:07
@joe.wright set the channel purpose: Paper Session 15: Interactive Music Systems / Coding and Robotics

niccolo.granieri
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

hassan.hussain5
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

overdriverecording
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

lamberto.coccioli
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

jonathan.pearce
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

richard.j.c
2020-07-18 14:07
has joined #papers15-interactive-music-systems-coding-and-robotics

eskimotion
2020-07-20 09:25
has joined #papers15-interactive-music-systems-coding-and-robotics

edmund.hunt
2020-07-20 09:25
has joined #papers15-interactive-music-systems-coding-and-robotics

acamci
2020-07-20 17:01
has joined #papers15-interactive-music-systems-coding-and-robotics

aaresty
2020-07-20 17:21
has joined #papers15-interactive-music-systems-coding-and-robotics

10068197
2020-07-20 17:21
has joined #papers15-interactive-music-systems-coding-and-robotics

a.nonnis
2020-07-20 17:22
has joined #papers15-interactive-music-systems-coding-and-robotics

a.macdonald
2020-07-20 17:23
has joined #papers15-interactive-music-systems-coding-and-robotics

andreas
2020-07-20 17:24
has joined #papers15-interactive-music-systems-coding-and-robotics

dianneverdonk
2020-07-20 17:25
has joined #papers15-interactive-music-systems-coding-and-robotics

likelian
2020-07-20 17:25
has joined #papers15-interactive-music-systems-coding-and-robotics

ko.chantelle
2020-07-20 17:25
has joined #papers15-interactive-music-systems-coding-and-robotics

anika.fuloria
2020-07-20 17:26
has joined #papers15-interactive-music-systems-coding-and-robotics

clemens.wegener
2020-07-20 17:26
has joined #papers15-interactive-music-systems-coding-and-robotics

marije
2020-07-25 12:01
Adnan Marquez-Borbon *Collaborative Learning with Interactive Music Systems* _*paper 113*_
Christodoulos Benetatos, Joseph VanderStel, Zhiyao Duan *BachDuet: A Deep Learning System for Human-Machine Counterpoint Improvisation* _*paper 125*_
Alex McLean *Algorithmic Pattern* _*paper 50*_
Michael Sidler, Matthew C Bisson, Jordan Grotz, Scott Barton *Parthenope: A Robotic Musical Siren* _*paper 56*_
Ning Yang, Richard Savery, Raghavasimhan Sankaranarayanan, Lisa Zahray, Gil Weinberg *Mechatronics-Driven Musical Expressivity for Robotic Percussionists* _*paper 26*_

marije
2020-07-25 12:04
*17:00 - 18:15 (UTC+1)*

hassan.hussain5
2020-07-25 15:46
FINAL PAPER SESSION starting in 15 mins... you probably know the drill by now, but as a reminder: when asking a question in response to a paper, please indicate in your message which paper presentation you are responding to. You can do this by mentioning the title of the paper or by using @ to direct it to the presenter. This will make it easier for people in different time zones to follow the presentations and the Q&A later. And please keep replies to a question in its thread!

hassan.hussain5
2020-07-25 15:48

alex
2020-07-25 16:01
Hello all

jim.murphy
2020-07-25 16:11
@adnan.marquez How well do you think these approaches would scale to courses with larger bodies of students? Is there a maximum cohort size, do you think?

p.stapleton
2020-07-25 16:11
@adnan.marquez did improvisation play a role in your group workshops, and if so how?

mcbisson
2020-07-25 16:12
@adnan.marquez For paper 113, can you talk a little bit about the individual student experience in the trial?

o.green
2020-07-25 16:12
@adnan.marquez I might have missed this in the talk, but what sorts of musical backgrounds did the students have? For example, did they have existing instrumental practices, and were the kinds of ideas the group came up with influenced (in any clear way) by these backgrounds?

harrap
2020-07-25 16:12
Did the physical/gestural signaling seen in the video happen because of an agreement beforehand, or was it emergent?

juan.jpma
2020-07-25 16:14
@adnan.marquez @o.green beat me to it, did you observe particular approaches to improvisation based on their primary instrument backgrounds?

marije
2020-07-25 16:18
2nd paper running now

adnan.marquez
2020-07-25 16:18
Hi @harrap, for this particular example we did have to agree beforehand as a way to guide the exercise. But he had tried it without cues before.

marije
2020-07-25 16:20
and it is paper 126 in the proceedings. Apologies for putting the wrong number up earlier

marije
2020-07-25 16:21
seems to be a mistake in the bibtex file...

sdbarton
2020-07-25 16:25
@c.benetatos this is great. I wonder what happens when you move into harmonically ambiguous areas that are not full modulations?

steventkemper
2020-07-25 16:25
@c.benetatos What happens if you feed the BachDuet system sequences from outside 18th-century European Baroque music? Is the system extensible to other types of music?

rschramm
2020-07-25 16:26
@c.benetatos Congratulations, I am amazed by your results. Would it be possible to extend the work to 4 voices? Sorry, I haven't read the paper yet; maybe this is a nonsense question.

ko.chantelle
2020-07-25 16:27
@c.benetatos is it in future work to adapt the BachDuet for other instrumental inputs besides the keyboard?

sdbarton
2020-07-25 16:28
@c.benetatos this would be cool on a Disklavier

msidler
2020-07-25 16:28
@c.benetatos did you perform any tests where the human lead voice was unaware if the duet partner was a human or BachDuet?

s.holland
2020-07-25 16:29
Pretty damn good. You probably said in the paper (& maybe talk) but what was the training set? @c.benetatos

lamberto.coccioli
2020-07-25 16:30
@c.benetatos hate to admit it, but I was fooled in your HH/HM example!

edmund.hunt
2020-07-25 16:30
@c.benetatos fascinating paper. Does BachDuet always perform the bass part, or would it be able to improvise a soprano line over a human bass line? Sorry if I missed this in the full paper.

decampo
2020-07-25 16:31
How much background did the human player have in baroque music?

noris
2020-07-25 16:33
@c.benetatos impressive! couldn't tell which was which (HH/HM) in the test

edmund.hunt
2020-07-25 16:33
@c.benetatos Thank you.

jim.murphy
2020-07-25 16:35
Paper 50 (Algorithmic Pattern) running now (by @alex)

lja
2020-07-25 16:40
@alex this system looks like it has a ton of components! how would someone go about learning it?

alex
2020-07-25 16:40

alex
2020-07-25 16:40
there's a link to the "learning tidalcycles" course there too

alex
2020-07-25 16:41
which is based on pre-recorded videos

alex
2020-07-25 16:41
I showed a colour example first but it's designed for making music

sdbarton
2020-07-25 16:42
@maclea11 this is really neat. I'm thinking about the possibilities of integrating notions of 'good' patterns from psychological and musicological research, à la Garner, W. R. (1970). Good patterns have few alternatives. _American Scientist_, _58_, 34–42; and Toussaint, G. T. (2016). _The Geometry of Musical Rhythm: What Makes a "Good" Rhythm Good?_ Chapman and Hall/CRC.
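[An aside for readers following up the "good rhythm" references: many of the rhythms Toussaint analyses are Euclidean rhythms, which spread k onsets as evenly as possible over n steps; they are also built into TidalCycles. A minimal sketch, as an illustration only, not code from any of the papers:]

```python
def euclid(k, n):
    """Distribute k onsets as evenly as possible over n steps
    (a Euclidean rhythm, as analysed by Toussaint)."""
    return [1 if (i * k) % n < k else 0 for i in range(n)]

# euclid(3, 8) yields the tresillo pattern: [1, 0, 0, 1, 0, 0, 1, 0]
```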

dianneverdonk
2020-07-25 16:42
@alex: thanks for your inspiring presentation. Unfolding what algorithmic pattern can be in relation to iterative making processes really spoke to me. And what you have also done in this presentation is turn 'algorithmic pattern' into a soft, gentle, open and playful container of a word/definition. It lost a bit of the bold, solid appearance it often seems to have. Very inspiring, thanks a lot!

p.stapleton
2020-07-25 16:42
@maclea11 "interacting with the code to see where it takes me" is a great line. Are you ever surprised by where you end up? If so, can you give an example?

jim.murphy
2020-07-25 16:44
The patterns that you showcase appear to be quite grid-based. In this case, do you think 'pattern' could be interchanged with the word 'grid'?

harrap
2020-07-25 16:44
It would be interesting to wire it to an automated loom, and perhaps then live feed the loom output into some kind of noisy sensor as an installation piece.

js.vanderwalt
2020-07-25 16:44
@alex How important is it that the eventual pattern is perceptible? Patterns in visual examples are easy to see, is that the case with rhythmic patterns?

marije
2020-07-25 16:45
livecoding an etextile weave, that is then used as a sensor?

ko.chantelle
2020-07-25 16:45
I know there are other instrument-shaped MIDI controllers, like MIDI violins and MIDI saxophones. I've never tried them, so I don't know how good they are, but they could open up the opportunity to include a more diverse group of participants without having to work too hard on audio pitch tracking.

harrap
2020-07-25 16:46
a physical weave that then flows over a surface to an image array that is deliberately (perhaps interactively) skewed.

a.r.jensenius
2020-07-25 16:46
Great @alex. Like your comments about the 3d structure!

marije
2020-07-25 16:49
In textiles, patterns are not exclusively linked to grid-based things. Patterns for cutting cloth, which is then sewn, are just drawn.

v.zappi
2020-07-25 16:49
The Hybrid Live Coding website seems the perfect place where to search for and learn about projects of this kind: https://hybrid-livecode.pubpub.org/workshop2020 TOPLAP is a great and related hub too, I am quite sure I read about something similar in the forum: https://toplap.org/

matsuura.nonymous
2020-07-25 16:49
@alex What do you think about an iterative process in which updating a pattern can itself be coded? Could that be included in algorithmic pattern?

jim.murphy
2020-07-25 16:50
Ahhh -- really good point!

alex
2020-07-25 16:50
@jim.murphy To clarify, I think patterns often exist within a grid, but not necessarily, and the terms aren't interchangeable. Pattern is more about how things are made.
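[The distinction drawn here, pattern as a way of making rather than a grid, echoes TidalCycles' underlying representation, where a pattern is a function queried over time rather than a stored sequence. A toy Python sketch of that idea; Tidal itself is written in Haskell, and the names `seq` and `stack` here are invented for illustration:]

```python
# Toy model of the "pattern as function of time" idea: a pattern
# maps a cycle number to (onset, value) events, so nothing forces
# events onto a shared grid.
def seq(*vals):
    """Subdivide each cycle evenly among the given values."""
    n = len(vals)
    return lambda cycle: [(cycle + i / n, v) for i, v in enumerate(vals)]

def stack(*pats):
    """Layer patterns; a polyrhythm needs no shared grid."""
    return lambda cycle: sorted(ev for p in pats for ev in p(cycle))

# Query cycle 0 of a 2-against-3 layering:
events = stack(seq("bd", "sn"), seq("hh", "hh", "hh"))(0)
```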

jim.murphy
2020-07-25 16:51
Paper 56 running now (Parthenope: A Robotic Musical Siren)

alex
2020-07-25 16:51
Yep! I'm not sure if I'd describe such a pattern as an _algorithmic_ one though.

marije
2020-07-25 16:51
@alex ... just to keep them apart!

alex
2020-07-25 16:52
@js.vanderwalt A briefer answer - very important!

alex
2020-07-25 16:52
In fact @marije and I did exactly this!

marije
2020-07-25 16:53
The instructions for how to arrive at them may be algorithmic.

alex
2020-07-25 16:53
well the loom was semi-automated, but driven by code.

marije
2020-07-25 16:54
yes, and I did use the sound as picked up by a piezo mic from it.

alex
2020-07-25 16:54
@matsuura.nonymous Yes, that's a nice thought - meta-pattern! It's the sort of thing you do in weaving, with lift plans and so on.

mark.durham
2020-07-25 16:55
@maclea11 Have you ever thought of adding a visual representation of the current pattern to Tidal?

marije
2020-07-25 16:55
that does need an iteration though, as I had certain assumptions about the rhythms that would be produced by the loom (pauses when one part was not moved), so the final result wasn't yet as cool as I thought it would be.

c.benetatos
2020-07-25 16:56
That's very true. The quality of the generated result is very much affected by the abilities of the user on the keyboard. We wanted to try a MIDI flute, but we didn't have access to one. Thanks

alex
2020-07-25 16:57
I've experimented a little Mark but I think others have done better with a/v tidal patterns, e.g. Atsushi Tadokoro and Malitzin Cortes

c.benetatos
2020-07-25 16:57
Good question. BachDuet currently doesn't support more than 2 voices, but we could achieve that with some changes to the neural net. This would be very interesting to investigate. Thanks

c.benetatos
2020-07-25 16:58
Indeed. If only I had one !! Thanks

steventkemper
2020-07-25 16:59
Congrats on the creation of Parthenope! What kind of dynamic range does the instrument have?

gndunning
2020-07-25 16:59
Yeah Parthenope is great!

jim.murphy
2020-07-25 16:59
Why not use a loudspeaker and physical modelling approaches? What does this offer that a loudspeaker approach does not?

alex
2020-07-25 16:59
@marije yes, let's revisit; I have made that change in response to our performance

c.benetatos
2020-07-25 17:00
Soprano-Bass duets extracted from Bach Chorales. Thanks !

matsuura.nonymous
2020-07-25 17:00
@msidler @mcbisson @jordantaylorg007 @sdbarton I want to add some historical context for a programmable siren: YAMAHA made a "Music Siren" from the 1950s to the 1990s, placed on the rooftops of shopping malls, office buildings and schools. 5 or 6 are still working in Japan today, and I've heard one of them: it was surprisingly loud, beyond what can be realized with an electromagnetic speaker, and can be heard from hundreds of meters away! Though I could not find any article available in English, here is a collection of recordings of them https://www.youtube.com/playlist?list=PLKnSy08QbO9ZoeDAJqoYpQPAlQ_ESEzsz

marije
2020-07-25 17:00
Cool!

ahsu
2020-07-25 17:01
@msidler @mcbisson @jordantaylorg007 @sdbarton Congrats Parthenope team!! Looking forward to hearing more compositions too!! Great work!

c.benetatos
2020-07-25 17:01
That's good to hear : - ) . Thanks !

steventkemper
2020-07-25 17:01
@msidler @mcbisson @jordantaylorg007 @sdbarton Are there issues with mechanical noise/noise floor?

v.zappi
2020-07-25 17:02
@msidler, to me it sounds like some kind of acoustic chiptune... talking about idiosyncrasies:heart:

g.moro
2020-07-25 17:02
@msidler @mcbisson @jordantaylorg007 @sdbarton is it playable directly with a keyboard? Is the latency low enough?

alex
2020-07-25 17:02
true!

c.benetatos
2020-07-25 17:02
All of the participants are very familiar with baroque music, classical music theory, and improvisation in general. 2 of them also had very good jazz backgrounds, and 5 of them are familiar with classical music improvisation. Thanks

mark.durham
2020-07-25 17:02
Oh nice - looking some of that up. I guess it's also a question of listening or looking at patterns as you say. Thanks for the reply

g.moro
2020-07-25 17:03
@msidler thanks for your answer, I guess I was worried about the mechanical latency. I think you mentioned 50ms during the presentation?

marije
2020-07-25 17:03
Perhaps of interest as a reference, Edwin van der Heide's Pneumatic Sound Field: https://www.evdh.net/pneumatic_sound_field/

info041
2020-07-25 17:03
really cool instrument, Parthenope group, I would be interested to try it :slightly_smiling_face: I like the physicality of it and would not want to replace it with a synthesiser!

adnan.marquez
2020-07-25 17:03
@msidler @mcbisson @jordantaylorg007 @sdbarton Sounds great! How important were the idiosyncrasies you found in further developing compositions and/or interactions?

ko.chantelle
2020-07-25 17:04
Yes it looks like a really useful tool. I would like to try it someday, but my keyboard skills are rather lacking

c.benetatos
2020-07-25 17:04
Good to know : - ) Thanks !

g.moro
2020-07-25 17:04
me too!

jim.murphy
2020-07-25 17:05
Thanks so much for the inspiring papers and questions! Paper 26 running now: Mechatronics-Driven Musical Expressivity for Robotic Percussionists (@richard)

mcbisson
2020-07-25 17:06
Hi Steve, thanks for the question. We had some issues with mechanical vibrations from the spinning disks but were able to mitigate most of them. We used rubber gaskets and a foam bottom piece to absorb the vibrations from the instrument. In addition, the volume of the instrument is loud enough to play over the minor mechanical sounds left over.

mcbisson
2020-07-25 17:07
Thank you so much!

mcbisson
2020-07-25 17:07
Thank you very much!

msidler
2020-07-25 17:08
If you are interested, the paper details sound level testing including the background noise level, spinning disk noise level, and volume of each of the three nozzles.

mcbisson
2020-07-25 17:08
Thank you so much!

p.stapleton
2020-07-25 17:10
@maclea11 by the way, I loved your set in the lunchtime concert, although I now can't get M.I.A. out of my head.

msidler
2020-07-25 17:10
Thanks for the question! Yes, it is playable in real time with a small latency introduced by the actuation time of the solenoids which is ~25ms (the 50ms number in the presentation was for on and off).
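[The ~25 ms actuation time mentioned here is the kind of fixed mechanical latency that robotic-instrument software often handles by scheduling note commands ahead of the beat. A hypothetical sketch of that compensation step; the function name and numbers are illustrative, not from the Parthenope paper:]

```python
SOLENOID_LATENCY_MS = 25  # approximate actuation time quoted above

def compensate(onsets_ms, latency_ms=SOLENOID_LATENCY_MS):
    """Shift command times earlier so strikes land on the beat.
    Onsets earlier than the latency can't be compensated and are
    clamped to time zero."""
    return [max(0, t - latency_ms) for t in onsets_ms]

# compensate([100, 500, 10]) -> [75, 475, 0]
```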

mcbisson
2020-07-25 17:11
They were very important. In my experience, composing actually became a very interactive process. I could walk in with a distinct idea of what I wanted to create, but upon playback I had to completely adapt my composition to fit the instrument's strengths. As we played different musical gestures on the instrument, we were able to find the gestures it excels at.

jordantaylorg007
2020-07-25 17:11
Thank you!

lauren.s.hayes
2020-07-25 17:12
Nice paper @alex. I have several indigenous students here in the southwest working with code. Do you have some good references/examples that you've come across from indigenous scholars/that center indigenous knowledge on patterning in relation to code that could be helpful here? Thanks!

marije
2020-07-25 17:13
Hey Paul, you are aware that we have a McLean and a MacLean here, right?

mario.buoninfante
2020-07-25 17:13
Interestingly, this seems to suggest that a 7-bit range is good enough to represent the human dynamic range!?

steventkemper
2020-07-25 17:13
Thanks for your work, Shimon sounds great! Ning Yang and @richard, how do the authors define 'expressivity'? Does expressivity for robotic instruments necessarily equate to similarity to human performance?

adnan.marquez
2020-07-25 17:14
Great! It would be interesting to further explore those unexpected spaces. Thanks!

msidler
2020-07-25 17:14
Thank you!

sdbarton
2020-07-25 17:15
@richard nice developments. Did you compare these actuators vs. voice coils, à la the MechDrum? Van Rooyen, R., Schloss, A., & Tzanetakis, G. (2017). _Voice Coil Actuators for Percussion Robotics_.

jim.murphy
2020-07-25 17:15
A funny hack that I did a few years ago for improving dynamic range with solenoids and motorized stops: https://quod.lib.umich.edu/i/icmc/bbp2372.2014.103/1

msidler
2020-07-25 17:15
Thanks! Our ability to compose for it was slightly limited due to COVID, but we hope to be able to continue working with it in the future

msidler
2020-07-25 17:17
Thanks for the link. It has an amazing sound!

mcbisson
2020-07-25 17:18
Where can we find the complete version of that demo video?

g.moro
2020-07-25 17:19
Live now: Bela drop-in Q&A. Come say "`hi`" and "`?`"  ! https://us04web.zoom.us/j/75698918087?pwd=VjdzcEtPN1RTZnNSbDNHaStzZlFTdz09

cagri.erdem
2020-07-25 17:19
@richard congrats on the groundbreaking work. Speaking of the relationship between human expressivity and dynamic range, studies on p-centers vs. loudness could be fruitful to check out, such as https://asa-scitation-org.ezproxy.uio.no/doi/full/10.1121/10.0000724

adnan.marquez
2020-07-25 17:19
Thanks @jim.murphy!

niccolo.granieri
2020-07-25 17:20
As previously mentioned, here is a link to a zoom room that will be open all day long for you to pop in and chat with other NIME attendees! https://us04web.zoom.us/j/75307801251?pwd=VTF3MFJ4UTNaY1psTHQ4Qllkckhndz09

marije
2020-07-25 17:20
and check the installations, if you haven't done so yet: https://nime2020.bcu.ac.uk/installations/

v.zappi
2020-07-25 17:20
Thanks to all the authors!

jim.murphy
2020-07-25 17:20
Thanks for a wonderful session!

abi
2020-07-25 17:20

g.moro
2020-07-25 17:20
oh OK, thanks. Does that value vary depending on the interval you play (e.g. if the motor has to speed up/down in the meantime)?

jim.murphy
2020-07-25 17:20
Very exciting to see the innovation happening in musical robotics/mechatronics - the level of attention to detail on these instruments is accelerating so much from year to year. Such an exciting field to be in, and great to watch it continue to evolve!

richard
2020-07-25 17:20
Thanks for asking something that lets us share our videos :slightly_smiling_face: we recorded 4 'music videos'

p.stapleton
2020-07-25 17:21
Oops! Sorry about that.

richard
2020-07-25 17:21
You can see the others on this youtube channel https://www.youtube.com/user/GTCMT/videos

sdbarton
2020-07-25 17:22
good job chairing the session @jim.murphy, now do you go back to bed?

p.stapleton
2020-07-25 17:23
@alex, see above.

matsuura.nonymous
2020-07-25 17:24
Sometimes I imagine whether we could build a synthesizer without electronics, using only pneumatic energy, in a form different from an organ. Maybe a combination with a pneumatic/fluidic amplifier would be fantastic http://www.douglas-self.com/MUSEUM/COMMS/auxetophone/auxetoph.htm

richard
2020-07-25 17:25
Thanks for the question. It's a really interesting point; we do at times explore what robotic expressivity actually means. The more accurate, and likely more interesting, approach would be custom definitions of robotic expressivity. This paper is really a new example of the baseline use of the new motors, so we just outsourced the meaning of expressivity. But in the future it would be great to explore it more.

sdbarton
2020-07-25 17:25
It is also really different in person, visually and acoustically, which is one of the motivations for building these kinds of instruments in the first place.

sdbarton
2020-07-25 17:26
neat, thank you

sdbarton
2020-07-25 17:26
hi-tech, lo-fi !

ko.chantelle
2020-07-25 17:27
When I saw this question, I thought it also opens up a bit of a can of worms, because how well a person perceives musical patterns is largely based on their musical education and cultural background. A Western classical musician who has gone through musicianship training can aurally recognize chord-progression patterns much more easily than a non-musician. And traditional Indian and African music is much more rhythmically complex than Western music, so a person trained in those styles might be able to aurally recognize more complex patterns, whereas someone else may find them chaotic.

alex
2020-07-25 17:27
Thanks! Good question, although a little hard for me to get my head round, as we don't really have indigenous knowledge in the UK

alex
2020-07-25 17:29
It's definitely worth looking up Audrey Bennett on "heritage algorithms"

richard
2020-07-25 17:29
Thanks for pointing this out. Of course it's not; we just end up at MIDI to fit better into the wider discourse and for comparison with our past systems. We did much longer dynamic-range and listening tests. While we scale it to between 0 and 127, there was a much wider range than that. More subtle tests would better probe dynamic range, but for this test we really focused on the loudest and softest, as well as on matching a group of different volumes performed by humans
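[The scaling Richard describes, a wider measured range mapped into MIDI's 0-127, can be sketched as a linear map with clamping. This is a generic illustration with made-up calibration values, not the team's actual procedure:]

```python
def level_to_velocity(level_db, lo_db, hi_db):
    """Linearly map a measured level onto MIDI's 7-bit velocity
    range, clamping values outside the calibrated span."""
    frac = (level_db - lo_db) / (hi_db - lo_db)
    return max(0, min(127, round(frac * 127)))

# With a hypothetical calibrated span of 50-100 dB:
# level_to_velocity(50, 50, 100)  -> 0
# level_to_velocity(100, 50, 100) -> 127
```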

info041
2020-07-25 17:30
Great, hope to experience it one day!

alex
2020-07-25 17:31
E.g. "Ethnocomputational creativity in STEAM education: A cultural framework for generative justice"

richard
2020-07-25 17:31
Thanks for sharing; unfortunately we didn't compare them physically, our comparison with voice coils only extended to checking the literature.

richard
2020-07-25 17:33
Thanks for sharing, will check this out for sure

js.vanderwalt
2020-07-25 17:34
That's interesting, and I take your point. On further reflection, it seems to me that Alex is talking about the activity of pattern-_ing_ as much as the patterns themselves: about producing patterns as much as consuming (seeing/hearing) them.

alex
2020-07-25 17:35
I love Paola Torres Núñez del Prado's work connecting code, music, weaving and mathematics

alex
2020-07-25 17:36
Including on quipu structures

alex
2020-07-25 17:42
For the general complexity of Andean textile structures, you could look at the work of Denise Arnold https://www.amazon.co.uk/Andean-Science-Weaving-Structures-Techniques/dp/0500517924

alex
2020-07-25 17:43
I suppose the problem is that pattern is often implicit/tacit, rather than written as explicit codes.

alex
2020-07-25 17:43
Hope that's useful @lauren.s.hayes, I'll think on it more!

lauren.s.hayes
2020-07-25 17:47
Thanks @alex! I will take a look.

sdbarton
2020-07-25 17:56
@jim.murphy the reasons that we make acoustic musical machines in general involve visual identity, physical gesture, spatialization, acoustic richness, and physical idiosyncrasies. The latter two are particularly important in the case of Parthenope. There is so much sonic nuance when you play with the instrument in person, which emerges as you modulate parameters such as the temporal interval of notes. I wouldn't argue that this isn't replicable in software, but it certainly would be difficult. Another primary reason is design philosophy: we are starting with physical phenomena and then exploring how to harness them (as opposed to building sounds "from scratch"). The sonic identity was shaped as the instrument was designed and built. These choices were constrained by physical realities such as disc size and mass. The lack of such constraints in software leads to different kinds of choices. In software, you have different kinds of constraints imposed by those who built the system. In our case, we are free from some of that, as we are designing and building a new system. Different materials, different methods, different results.

sdbarton
2020-07-25 17:59
Let us know if you are ever in Worcester, MA!

alex
2020-07-25 19:11
Yes definitely a can of worms! I think we'd probably agree that listening is a creative act, and therefore that the listener might recognise different pattern structures than any that the composer was thinking about..

alex
2020-07-25 19:12
I remember accidentally attending a recital of church organ music and couldn't recognise any patterns, it just sounded completely alien

alex
2020-07-25 19:13
Heh, glad you enjoyed it. I was going in a more improvised direction, but after the crash I fell back on more of a 'set piece' with the M.I.A. sample

alex
2020-07-25 19:14
I tried to answer your question in the stream, in case you missed it yes I am very often surprised by where I end up. Let me try to find an example..

alex
2020-07-25 19:17
One of my favourite improvised bits is here from around 23:30 https://www.youtube.com/watch?v=Vomnc9R-7mw

alex
2020-07-25 19:18
I'm just trying things out to see how they fit together. In the end it really comes together when I add a steady kick on top

alex
2020-07-25 19:33
As I mentioned in the talk, although weaves are notated with grids, the actual weave structures are not constrained to grids.

alex
2020-07-25 19:44
Reflecting on this a bit.. I think "grid-based" is often used in a pejorative sense in e.g. electroacoustic circles. From my perspective it doesn't really make sense to define some musics as grid-based and others as not. Discrete symbols and continuous gestures are present in all music, in mutual support.

p.stapleton
2020-07-25 19:52
Thanks @alex, looking forward to checking out that video.

marije
2020-07-25 19:54
true

jim.murphy
2020-07-26 04:55
Straight to bed! Dreamt of mechanical sirens! :grinning: