Call for Participation: New Interfaces for Musical Expression 2020, Royal Birmingham Conservatoire, UK
We would like to invite you to be part of NIME 2020 – The International Conference on New Interfaces for Musical Expression. We welcome submissions of original research, both scientific and artistic. A non-exhaustive list of NIME-related topics is found below.
Topics
Original contributions are encouraged in, but not limited to, the following topics:
Musical interfaces designed by/with disabled/neurodiverse musicians
Musical interfaces in/as education
Increasing musical choices for disabled musicians through new accessible interfaces
Easier/cheaper approaches to the design of bespoke accessible instruments (that can be adapted to a user’s requirements), in a world dominated by mass production
Strategies that improve the reach and replicability of one-off accessible instrument projects, particularly those that are unlikely to have full commercial potential
Musical interfaces tailored to formally trained musicians
Novel controllers, interfaces or instruments for musical expression
Augmented, embedded and hyper instruments
Technologies or systems for collaborative music-making
Mobile music-making
Music-related human-computer interaction and mapping strategies
Sensor and actuator technologies, including haptics and force feedback devices
Explorations of relationships between motion, gesture and music
Evaluation and user studies of new interfaces for musical expression or commercially available “off the shelf” interfaces
Musical robotics
Interactive sound art and installations
Performance rendering and generative algorithms
Machine learning in musical performance
Artificial intelligence and new interfaces for musical expression
Web-based and/or telematic music performance
Software frameworks, interface protocols, and data formats, for supporting musical interaction
Historical, theoretical or philosophical discussions about designing or performing with new interfaces
Supporting cultural diversity through musical interfaces
Discussions about the artistic, cultural, and social impact of new interfaces
Pedagogical perspectives or reports on student projects in the framework of NIME-related courses
Practice-based research approaches/methodologies/criticism relating to the use of musical interfaces
Important Dates
24 January 2020: Paper, Poster, Music, Installation and Workshops submission deadline
31 January 2020: Final submission upload deadline (no extension)
15-22 March 2020: Notification of acceptances/rejections
22 March 2020: Early-bird registration opens
10 April 2020: Non-Paper Demo submission deadline
15 April 2020: Camera-ready submission and presenter registration deadline
30 April 2020: Early-bird registration deadline
21 July 2020: Pre-conference workshops
22-24 July 2020: The conference
25 July 2020: The unconference
More information about the conference can be found at http://nime2020.bcu.ac.uk/
NIME 2020 Royal Birmingham Conservatoire Organising Committee
Hi, I'm happy to share the final release of [vstplugin~] v0.2.0. Binaries are available on Deken. The source code is here: https://git.iem.at/pd/vstplugin/-/releases
Note that you can also load LV2 plugins with lv2vst: https://github.com/x42/lv2vst. I've tested this on Debian and generally it seems to work fine.
Have fun!
Christof
---
Changelog:
new features:
* VST2 shell plugin support (e.g. "Waves", "Blue Ripple Sound")
* (experimental) VST3 support including sample accurate automation and auxiliary inputs/outputs for side-chaining
* soft-bypass
* faster search/probe (parallel)
* cache search/probe results in a file to speed up subsequent searches
* [param_set( and [param_get( now also accept parameter names instead of indices (whitespace in parameter names is replaced with underscores); see the example patch after the changelog.
* set the editor window position with the [pos( message.
changes:
* switched whole project to CMake
* removed 'vstsearch' object because of the new cache file system.
* removed [precision( message (the processing precision can only be set at creation time).
internal changes:
* use .ini like syntax for plugin info
* hard-bypass prefers the plugin's bypass method
* single event loop shared by all plugins
* probably many more...
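As a quick illustration of the two new messages, here is a minimal patch sketch pasted as plain .pd text. This is not from the release itself; "myplugin" and the parameter name "Dry_Wet" are placeholders, so substitute whatever your plugin actually exposes:

#N canvas 0 0 520 320 12;
#X text 30 10 sketch only - plugin and parameter names are placeholders;
#X obj 30 250 vstplugin~;
#X msg 30 60 open myplugin;
#X msg 30 100 param_set Dry_Wet 0.5;
#X msg 180 100 param_get Dry_Wet;
#X msg 30 140 pos 100 100;
#X connect 2 0 1 0;
#X connect 3 0 1 0;
#X connect 4 0 1 0;
#X connect 5 0 1 0;

Saving those lines as a .pd file and opening it in Pd gives you clickable message boxes for trying out the calls.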
Hi,
On October 29th I'm giving a talk about r_cycle, an open-source
library/middleware for creative coding with the Launchpad, at the
Novation pop-up shop in Shoreditch, London.
About the library/middleware:
r_cycle allows full, bi-directional control of the Launchpad from Pd:
the user can interact with the device and create custom UIs in real
time, on the fly, simply by creating specific objects (part of the
library) in a Pd patch.
It provides various 'widgets' and audio/MIDI/utility objects, all
written in Pd vanilla, so it does not require any externals.
For example, creating the [KEYBOARD] object in your patch generates a
'chromatic keyboard' on the Launchpad (a 1-octave/12-note keyboard
represented on the pads).
The widget is interactive (i.e. pressing a pad that belongs to the
keyboard UI on the Launchpad makes the object in the patch output one
or more values), and it can be removed simply by deleting the
[KEYBOARD] object in Pd.
It is possible to use multiple widgets at the same time (you can fill
the whole Launchpad layout) and also to call multiple scenes; a rough
sketch of the idea follows below.
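Purely as a hypothetical sketch (the widget's actual creation arguments
and outlet layout are defined by the r_cycle documentation, not here), a
patch that just prints whatever the keyboard widget outputs could look
roughly like this, pasted as plain .pd text:

#N canvas 0 0 450 240 12;
#X text 20 10 hypothetical sketch - check the r_cycle docs for the real interface;
#X obj 20 60 KEYBOARD;
#X obj 20 110 print keyboard-out;
#X connect 1 0 2 0;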
If you're in London on Tuesday and want to know more, here are more
details about the event:
https://www.eventbrite.co.uk/e/novation-london-learn-hackspace-r-cycle-crea…
Cheers,
Mario
**xCoAx 2020: 8th International Conference on**
**Computation, Communication, Aesthetics and X**
8-10 July 2020, Graz, Austria
http://xcoax.org/
Please feel free to distribute this call to your networks.
**Submission deadline: 31 January 2020**
xCoAx is an exploration of the intersection where computational tools
and media meet art and culture, in the form of a multi-disciplinary
enquiry on aesthetics, computation, communication and the elusive X
factor that connects them all.
xCoAx has been an occasion for international audiences to meet and
exchange ideas, in search of interdisciplinary synergies among computer
scientists, artists, media practitioners, and theoreticians at the
thresholds between digital arts and culture. Starting in 2013 in
Bergamo, xCoAx has since also taken place in Porto, Glasgow, Lisbon,
Madrid, and Milan.
The focus of xCoAx is on the unpredictable overlaps between the freedom
of creativity and the rules of algorithms, between human nature and
machine technology, with the aim of evolving towards new directions in
aesthetics.
In 2020 xCoAx will take place in Graz, Austria, directly adjoining the
symposium **Algorithms that Matter (ALMAT)**. On the two days preceding
xCoAx, ALMAT gathers artists and researchers to present and discuss the
relations between algorithmic agency and artistic research, the practice
of algorithmic experimentation and its impact on compositional process.
ALMAT has issued a separate call for participation -
https://almat.iem.at/symposium.html - and reduced combi-tickets will be
available for xCoAx + ALMAT.
**Topics**
You are invited to submit theoretical, practical or experimental
research work that includes but is not limited to the following topics:
Computation, Communication, Aesthetics, X, Algorithms / Systems /
Models, Artificial Aesthetics, Artificial Intelligence, Audiovisuals /
Multimodality, Creativity, Design, Interaction, Games, Generative Art /
Design, History, Mechatronics / Physical Computing, Music / Sound Art,
Performance, Philosophy of Art / of Computation, Technology / Ethics /
Epistemology.
**Submission Categories**
Papers, artworks, performances, ongoing doctoral research (all details
for submission at http://xcoax.org/). For this edition of xCoAx, a
special call for performances for MUMUTH's Ligeti Hall is also being
launched.
**Publications**
All accepted works will be published in a proceedings book with ISBN
(see http://proceedings.xcoax.org for previous editions) and the authors
of the best papers will be invited to a special issue of the
Scopus-indexed Journal of Science and Technology of the Arts
(http://artes.ucp.pt/citarj/).
**Keynote Speakers**
Yuk Hui
Špela Petrič
**Doctoral Symposium Chairs**
Marko Ciciliani
**Organizing Committee**
André Rangel: CITAR / i2ADS
David Pirrò: Institute of Electronic Music and Acoustics, University of
Music and Performing Arts, Graz
Hanns Holger Rutz: Institute of Electronic Music and Acoustics,
University of Music and Performing Arts, Graz
Jason Reizner: Bauhaus-Universität Weimar
Luís Nunes: i2ADS / Faculty of Fine Arts, University of Porto
Luísa Ribas: CIEBA / Faculty of Fine Arts, University of Lisbon
Mario Verdicchio: Università degli Studi di Bergamo
Miguel Carvalhais: INESC TEC / Faculty of Fine Arts, University of Porto
**Local Organizing Committee**
David Pirrò: Institute of Electronic Music and Acoustics, University of
Music and Performing Arts, Graz
Hanns Holger Rutz: Institute of Electronic Music and Acoustics,
University of Music and Performing Arts, Graz
Daniele Pozzi: Institute of Electronic Music and Acoustics, University
of Music and Performing Arts, Graz
**Contacts**
info@xcoax.org
http://xcoax.org
https://twitter.com/xcoaxorg
http://facebook.com/xcoax.org
https://www.instagram.com/xcoaxorg/
**ALMAT - Algorithms that Matter**
**Symposium on Algorithmic Agency in Artistic Practice**
6-7 July 2020, Graz, Austria
https://almat.iem.at/symposium.html
Please feel free to distribute this call to your networks.
**Submission deadline: 31 January 2020**
Artists and scientists have worked with digital computers for over
seventy years, and algorithmic practices have existed for far longer.
But in recent years, increased computing power, decreased costs and the
miniaturisation of machines have created a new quantity and quality of
everyday exposure and of economic and political criticality, and with
it a wave of public attention and discourse. As artist-researchers, how
do we incorporate this new situation into our practices, and more
importantly, how does this changed situation retroact on our
understanding of the role of digital art, sound art and artistic
practice itself?
Rather than understanding algorithms as existing and transparent tools,
the ALMAT Symposium is interested in their genealogical, processual
aspects and their transformative potential. We seek critical approaches
that avoid both mystification and commodification, that aim at opening
the black box of "wonder" that is often presented to the public when
utilising algorithms.
The foundation for the symposium is given by the eponymous project ALMAT
- Algorithms that Matter. ALMAT is an artistic research project by Hanns
Holger Rutz and David Pirrò funded by the Austrian Science Fund (FWF AR
403-GBL) and hosted by the Institute of Electronic Music and Acoustics
(IEM) at the University of Music and Performing Arts Graz.
ALMAT 2020 will take place (06–07 July) adjoining the 8th Conference on
Computation, Communication, Aesthetics & X – xCoAx (08–10 July). xCoAx
is an exploration of the intersection where computational tools and
media meet art and culture, in the form of a multi-disciplinary enquiry
on aesthetics, computation, communication and the elusive X factor that
connects them all. xCoAx has issued a separate call for participation –
http://www.xcoax.org – and reduced combi-tickets are available for xCoAx
+ ALMAT.
**Call for Contributions**
The ALMAT Symposium calls for artistic research contributions in the
following two categories:
1. Contributions exploring the symposium's theme and the questions
arising from it. This may include:
- What are the material qualities specific to algorithms and
algorithmic practices? How does the algorithmic become malleable
as material?
- Are there particular affordances of the algorithmic?
- How does algorithmic agency unfold, how can it be observed,
formulated, or communicated? Which alternatives to traditional
concepts such as control/controller could be formulated?
- How does the reconfigurative "intrinsic" or "speculative"
movement of algorithms extend to or retroact on the artist or
recipient, how does it shape their interactions?
- How can artistic experimentation with algorithms be communicated
to an audience, how may it help sensitise and empower people to
take ownership of the algorithmic?
- What are the thresholds of heteronomy/autonomy, and what makes an
algorithmic practice become generative?
- What are philosophical, technological, aesthetic or artistic
consequences of acknowledging the agency of algorithms?
2. Contributions that explicitly refer to the research, the
experiences and the case studies of the ALMAT research
project. Contributions may be commentary, continuation, critique
or, more in general, a response to one or more aesthetic and
theoretic manifestations and artefacts reflected in the project's
documentation. The project's (ongoing) documentation is an online
hypertext starting at the Continuous Exposition:
https://www.researchcatalogue.net/view/381565/381566. In
particular, we identified a number of works that are
good candidates for responses, as they will be visible or audible
during the symposium (see submission page).
For more information and details of contribution formats and
application process, please refer to:
https://www.researchcatalogue.net/view/381565/698006
Contact: For any questions, please write to <almat@iem.at>.
To Pd announce:
Pd 0.50-2 is out. It fixes key-binding trouble on macOS (the 64-bit
version only). In particular, the backspace key no longer acts as
"delete" as it did in 0.50-1.
cheers
Miller
[please forward]
Georgia Tech School of Music
Guthman Musical Instrument Competition
2020 Call for Submissions
Deadline extended to October 6, 2019
http://guthman.gatech.edu
Georgia Tech's 2020 Margaret Guthman Musical Instrument Competition is an
annual event aimed at identifying the world's next generation of musical
instruments and unveiling the best new ideas in musicality, design,
engineering, and impact.
The Guardian called the competition "The Pulitzer of the New Instrument
World," and The New York Times described the "special, otherworldly sound
that you can feel permeating your soul" which became the hallmark of the
competition. Fast Company explained how Guthman's "Futuristic Instruments
will change how we make music," and Atlanta Magazine suggested that "at
the Guthman Competition, innovative instruments just might predict the
future of music."
The Guthman Competition will take place March 6-7, 2020 at Georgia Tech's
Ferst Center for the Arts, in Atlanta, Georgia.
The deadline for submissions is October 6, 2019. Approximately fifteen
semi-finalists will be invited to demonstrate, discuss, and perform with
their instruments as they compete for $10,000 in cash prizes.
Submit Your Instrument at: http://guthman.gatech.edu/guthman-submissions
///////////////////////////////////////////////////
:::: This Year's Judges ::::
REBECCA FIEBRINK - Reader, Goldsmiths, University of London
JORDAN RUDESS - Composer, Keyboardist, Member of Dream Theater
DAVE SMITH - Engineer, Musician, Founder of Sequential
///////////////////////////////////////////////////
To learn more about the competition, visit guthman.gatech.edu or our
Facebook page at https://www.facebook.com/guthmancompetition
We’d love to hear your comments!
I've just done a big update to the easyflow library.
You can already get easyflow v0.3.0 on Deken or on GitHub: https://github.com/HenriAugusto/easyflow
Since this is a big update, be sure to read the changelog: https://github.com/HenriAugusto/easyflow/blob/master/changeLog.md
(There are some minor backwards-incompatible changes.)
Some highlights of new abstractions:
- [dictionary] - storing and retrieving (key, value) pairs (a hypothetical usage sketch closes this announcement)
- [listCollect] - useful for reconstructing lists after processing them element-wise
- [iterator] - list iterator
- [public] [private] [member] [getMember] [setMember] [method] [callMethod] [methodInput] [methodOutput] [objAccess] - an experimental workflow to give more control over data sharing across patches
big bugfix:
- the library can now be used on operating systems with case-sensitive file systems
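As a purely hypothetical sketch of the [dictionary] workflow mentioned
above (the "set"/"get" messages are placeholders invented for
illustration; the real interface is documented in the easyflow help
patches), pasted as plain .pd text:

#N canvas 0 0 480 280 12;
#X text 20 10 hypothetical sketch only - message names are placeholders (see the easyflow help patches);
#X obj 20 160 dictionary;
#X msg 20 60 set freq 440;
#X msg 150 60 get freq;
#X obj 20 220 print dict-out;
#X connect 2 0 1 0;
#X connect 3 0 1 0;
#X connect 1 0 4 0;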
Dear list,
my "audiolab" abstraction library is now available on deken. You'll
need Pd-0.50 or later to run this.
Please report any bugs on github: https://github.com/solipd/AudioLab
cheers,
philipp
Hey all
On Mon, 2018-04-30 at 11:20 -0400, William Brent wrote:
> Has anyone done something like this using [struct]/[polygon], even
> just for sequence display purposes and not editable via mouse
> clicking/dragging? I did a quick search of the archives but haven't
> found anything.
Late, but nevertheless, I had another go at this:
https://github.com/reduzent/unpunch/
Roman