OK let’s get going then. These are recent and subtle ones, some of my favourites. For more classic ones, check IRCAM’s database, Brahms (http://brahms.ircam.fr/), which might give you hundreds.
Alex Harker’s Fluence, where the algorithmically composed ‘fixed media’ follows the clarinet through real-time description (but so much more too!): http://www.alexanderjharker.co.uk/Releases.html
Sam Pluta’s latest trombone piece for RAGE THORMBONES - they literally play descriptors through mic’ed mutes: http://www.sampluta.com/compositionMatrixForGeorgeLewis.html
Aaron Einbond’s abuses and subversions of CataRT over the last 10 years - check his website: https://aaroneinbond.wordpress.com/projects/
If I may, my last 7 pieces all use descriptor-based real-time granulation (and sometimes musaiking), which allows me to analyse a ring buffer and then compose the granular process (e.g. swarms of quiet pitched material, or pulsed loud transient echoes). I talk about it a bit in this informal talk about calibration (to allow such use, one has to think of ways to normalise the input!): https://www.youtube.com/watch?v=QhjeoCwqed0 and in this paper: http://eprints.hud.ac.uk/id/eprint/33478/
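To give a flavour of the process outside my actual patches, here is a very rough offline sketch in Python/numpy (nothing to do with my real implementation; the grain sizes, descriptors and thresholds are made-up placeholders): grains are cut from a ring buffer, described by loudness and spectral centroid, and only those matching a composed profile (here, quiet and dark) are kept for resynthesis.

import numpy as np

SR = 44100
GRAIN = 2048   # grain length in samples (placeholder)
HOP = 1024     # analysis hop (placeholder)

def describe(grain, sr=SR):
    # Two toy descriptors per grain: RMS loudness and spectral centroid.
    windowed = grain * np.hanning(len(grain))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(grain), 1.0 / sr)
    rms = np.sqrt(np.mean(grain ** 2))
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return rms, centroid

def select_grains(ring, rms_max=0.05, centroid_max=2000.0):
    # Keep grain start points whose descriptors match a 'quiet, dark' profile.
    # In practice the thresholds would be calibrated/normalised to the input.
    picks = []
    for start in range(0, len(ring) - GRAIN, HOP):
        rms, centroid = describe(ring[start:start + GRAIN])
        if rms < rms_max and centroid < centroid_max:
            picks.append(start)
    return picks

# Stand-in for 4 seconds of captured live input; a real patch would fill
# this ring buffer continuously and resynthesise only the selected grains.
ring = np.random.randn(SR * 4) * 0.01
print(select_grains(ring)[:10])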
I can put you in contact with those people off-list, should that help you, but again these are just personal favourites (and among the subtlest); there are a lot of pieces doing things like that, to various degrees of success.
Good luck!
p
On 7 Mar 2020, at 21:03, Vinicius Cesar <oviniciusc.oliveira@gmail.com> wrote:
Dear Pierre,
By machine listening I mean the retrieval of musical and sound attributes through audio descriptors: from simple implementations like pitch and envelope tracking to more complex methods involving descriptors like MFCCs, spectral centroid, etc. I'm looking for mixed works that make use of these types of processes to create interaction between instrumental writing and electroacoustic processing.
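To be concrete, the kind of analysis I have in mind is roughly what this small Python sketch computes offline (assuming the librosa library; the file name and parameter values are just placeholders):

import librosa

# Placeholder file name; any recording of the instrumental part would do.
y, sr = librosa.load("performance.wav", sr=None, mono=True)

f0 = librosa.yin(y, fmin=65.0, fmax=2093.0, sr=sr)        # pitch tracking
rms = librosa.feature.rms(y=y)                            # envelope / loudness
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)        # MFCCs (timbre)
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # spectral centroid

print(f0.shape, rms.shape, mfcc.shape, centroid.shape)

In the mixed works I am after, frames like these would drive the electronics in real time rather than be computed offline.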
On Sat, Mar 7, 2020 at 3:23 PM Pierre Alexandre <p.a.tremblay@hud.ac.uk> wrote:
Dear Vinicius
There are a lot of works doing so indeed! Maybe if you define a threshold of what you consider machine listening, that would help me point at various works in various places?
p
On 7 Mar 2020, at 18:10, Vinicius Cesar <oviniciusc.oliveira@gmail.com> wrote:
Hello,
My name is Vinicius and I am a Brazilian composer. I am currently working on master's research at the State University of Campinas in São Paulo, on compositional strategies and interactive music systems in mixed works, and I'm looking for mixed works that make use of machine listening/music information retrieval. The pieces can be for any instrumentation, but must be notated in a score. If anyone has pieces with these characteristics, or knows composers who work with this kind of music, please contact me!
All the best,
Vinicius