(Apologies for cross-postings)
ICAD 2020 Call for Submission of Papers, Extended Abstracts, Workshops, and Tutorials
26th International Conference on Auditory Display
University of Florida, Gainesville, FL, USA
June 7-11, 2020
THEME: “SAFE AND SOUND”
Sound is used in a wide variety of applications to alert listeners to the
status of a person or environment. At ICAD 2020, we want to highlight
sonification work that is used to maintain awareness in some capacity
(outdoor navigation, hospitals, air traffic control, etc.). Papers are not
limited to this theme; we value and embrace all types of
submissions, including papers, posters, multimedia (video/audio), demos,
and concert pieces.
First held in 1992, ICAD is a highly interdisciplinary academic conference
with relevance to researchers, practitioners, musicians, and students
interested in the design of sounds to support tasks, improve performance,
guide decisions, augment awareness, and enhance experiences. It is unique
in its singular focus on auditory displays and the array of perception,
technology, and application areas that this encompasses. Like its
predecessors, ICAD 2020 will be a single-track conference, open to all,
with no membership or affiliation requirements.
ICAD 2020, the 26th International Conference on Auditory Display, will be
held at the University of Florida, June 7 to 11, 2020. The graduate student
ThinkTank (doctoral consortium) will be held on Sunday, June 7, before the
main conference.
PAPERS AND EXTENDED ABSTRACTS
The ICAD 2020 committee is seeking papers and extended abstracts that will
contribute to knowledge of how sonification can support awareness in
various contexts. For details on topics of interest, proposal format,
submission instructions, and additional conference information, please
visit http://icad2020.icad.org, where details will be updated as they
become available.
WORKSHOPS AND TUTORIALS
ICAD workshops and tutorials provide in-depth opportunities for conference
attendees to discuss and explore important aspects of the field of auditory
display with like-minded researchers and practitioners. Sessions can range
from applications and programming methodologies to interdisciplinary
research skills, emerging research areas, challenge problems, and
sonification/compositional maker-sessions.
IMPORTANT DATES:
- Monday, March 2, 2020 - Deadline for submission of full papers
- Monday, March 9, 2020 - Deadline for submission of workshops and tutorials
- Monday, March 9, 2020 - Deadline for submission to the student ThinkTank
- Monday, April 6, 2020 - Deadline for submission to the sonification concert and installations
- Friday, April 10, 2020 - Notification of decisions
- Monday, April 20, 2020 - Extended abstract submission (some full paper submissions may be recommended for the extended abstract category)
Papers Chair - Bruce Walker - icad2020papers@icad.org
Workshop Chair - Derek Brock - icad2020workshops@icad.org
Sponsorship Chair - Myounghoon (Philart) Jeon - icad2020sponsorship@icad.org
Think Tank Chair - Areti Andreopoulou - icad2020thinktank@icad.org
Communications Chair - Katie Wolf - icad2020accessibility@icad.org
Steering Chair - Matti Gröhn - icad2020steering@icad.org
Conference Chair - Kyla McMullen - icad2020chair@icad.org
--
Kyla McMullen
Chair of ICAD 2020: http://icad2020.icad.org
First Call for Papers: 2020 Joint Conference on AI Music Creativity
(CSMC + MuMe)
Oct 22-24 2020 @ KTH and KMH, Stockholm, Sweden
http://kth.se/aimusic2020
The computational simulation of musical creativity continues to be an
exciting and significant area of academic research, and is now making an
impact in commercial realms. Such systems pose several theoretical and
technical challenges, and are the result of an interdisciplinary effort
that encompasses the domains of music, artificial intelligence,
cognitive science and philosophy. This can be seen within the broader
realm of Musical Metacreation, which studies the design and use of such
generative tools and theories for music making: discovery and
exploration of novel musical styles and content, collaboration between
human performers and creative software “partners”, and design of systems
in gaming and entertainment that dynamically generate or modify music.
The 2020 Joint Conference on AI Music Creativity brings together for the
first time two overlapping but distinct research forums: The Computer
Simulation of Music Creativity conference
(https://csmc2018.wordpress.com, est. 2016), and The International
Workshop on Musical Metacreation (http://musicalmetacreation.org, est.
2012). The principal goal is to bring together scholars and artists
interested in the virtual emulation of musical creativity and its use
for music creation, and to provide an interdisciplinary platform to
promote, present and discuss their work in scientific and artistic contexts.
The three-day program will feature two keynotes, research paper
presentations, demonstrations, discussion panels, and two concerts.
Keynote lectures will be delivered by Professor Emeritus Dr. Johan
Sundberg (Speech, Music and Hearing, KTH,
https://scholar.google.co.uk/citations?user=UXXUEcoAAAAJ)
and Dr. Alice Eldridge (Music, Sussex University, UK,
https://scholar.google.co.uk/citations?user=uvFGFagAAAAJ).
Topics
We encourage submissions of work on topics related to CSMC and MuMe,
including, but not limited to, the following:
Systems
* systems capable of analysing music;
* systems capable of generating music;
* systems capable of performing music;
* systems capable of (online) improvisation;
* systems for learning or modeling music style and structure;
* systems for intelligently remixing or recombining musical material;
* systems in sound synthesis, or automatic synthesizer design;
* adaptive music generation systems;
* music-robotic systems;
* systems implementing societies of virtual musicians;
* systems that foster and enhance the musical creativity of human users;
* music recommendation systems;
* systems implementing computational aesthetics, emotional responses, novelty and originality;
* applications of CSMC and/or MuMe for digital entertainment: sound design, soundtracks, interactive art, etc.
Theory
* surveys of state-of-the-art techniques in the research area;
* novel representations of musical information;
* methodologies for qualitative or quantitative evaluation of CSMC and/or MuMe systems;
* philosophical foundations of CSMC and/or MuMe;
* mathematical foundations of CSMC and/or MuMe;
* evolutionary models for CSMC and/or MuMe;
* cognitive models for CSMC and/or MuMe;
* studies on the applicability of music-creative techniques to other research areas;
* new models for improving CSMC and/or MuMe;
* emerging musical styles and approaches to music production and performance involving the use of CSMC and/or MuMe systems;
* authorship and legal implications of CSMC and/or MuMe;
* future directions of CSMC and/or MuMe.
Paper Submission Format
There are three formats for paper submissions:
* Full papers (8 pages maximum, not including references);
* Work-in-progress papers (5 pages maximum, not including references);
* Demonstrations (3 pages maximum, not including references).
The templates will be released in early 2020, and the EasyChair submission
link will open soon thereafter. Please check the conference website for updates:
http://kth.se/aimusic2020
Since we will use single-blind reviewing, submissions do not have to be
anonymized. Each submission will receive at least three reviews. All
papers should be submitted as complete works. Demo systems should be
tested and working by the time of submission, rather than speculative. We
encourage audio and video material to accompany and
illustrate the papers (especially for demos). We ask that authors arrange
their own web hosting of audio and video files and provide URLs to all
such files within the text of the submitted paper.
Accepted full papers will be published in a proceedings with an ISBN.
Furthermore, selected papers will be invited for expansion and
consideration for publication in the Journal of Creative Music Systems
(https://www.jcms.org.uk).
Important Dates
Paper submission deadline: August 14 2020
Paper notification: September 18 2020
Camera-ready paper deadline: October 2 2020
Presentation and Multimedia Equipment:
We will provide a video projection system as well as a stereo audio
system for use by presenters at the venue. Additional equipment required
for presentations and demonstrations should be supplied by the
presenters. Contact the Conference Chair (bobs@kth.se) to discuss any
special equipment and setup needs/concerns.
Attendance
At least one author of each accepted submission must register for the
conference by Sep. 25, 2020, and attend the conference to present their
contribution. Papers with no registered author will be withdrawn. Please check the
conference website for details on registration: http://kth.se/aimusic2020
About the Conference
The event is hosted by the Division of Speech, Music and Hearing, School
of Electrical Engineering and Computer Science (KTH), in collaboration
with the Royal College of Music (KMH).
Conference chair: Bob L. T. Sturm, Division of Speech, Music and
Hearing, KTH
Paper chair: Andy Elmsley, CTO Melodrive
Music chair: Mattias Sköld, Department of Composition, Conducting and
Music Theory, KMH
Panel chair: Oded Ben-Tal, Department of Performing Arts, Kingston
University, UK
Publicity chair: André Holzapfel, Division of Media Technology and
Interaction Design, KTH
Sound and music computing chair: Roberto Bresin, Division of Media
Technology and Interaction Design, KTH
Questions & Requests
Please direct any inquiries/suggestions/special requests to the
Conference Chair (bobs@kth.se).
--
Kind regards
Andre Holzapfel,
Ph.D. (Computer Science)
Ph.D. (Ethnomusicology)
Assistant Professor
KTH Royal Institute of Technology
School of Electrical Engineering and Computer Science
Media Technology and Interaction Design
www.kth.se/profile/holzap
Several people asked for an update of the Slice//Jockey project. It dates
from Pd-Extended times; SliceJockey3 finally works with vanilla Pd.
Find it here:
www.katjaas.nl/slicejockey/slicejockey.html
It needs a few external libraries, which are available from deken for
all current platforms. The project has been verified to work on Linux
(Intel and ARM), macOS, and Windows (tested through Wine).
ELSE 1.0 beta 26 is out! The highlight is a massive update to the [slider2d]
abstraction, which now relies on updates I made to the [canvas.mouse]
object and a new [savechange] object. Sadly, compatibility has been broken
here. There's also a new variant object, a two-dimensional circular slider
called [circle] - for more details, check:
https://github.com/porres/pd-else/releases/tag/v1.0-beta26 (some binaries
are already available via Pd).
Now for a heads-up: Pd 0.51 is coming out, and the next update to the ELSE
library is going to break compatibility of dozens of objects! This is because
[inlet~] will now allow messages to be sent to the main left inlet. This
will also fix some issues in objects like [else/bl.square~]. Anyway,
hopefully this upcoming massive overhaul will also get me very close to a
new "release candidate" phase.
My tutorial has also been updated to be compatible with ELSE beta 26, see:
https://github.com/porres/Live-Electronic-Music-Tutorial/releases/tag/v1.0-…
Examples affected by the new [circle] object are:
https://github.com/porres/Live-Electronic-Music-Tutorial/blob/master/Exampl…
and
https://github.com/porres/Live-Electronic-Music-Tutorial/blob/master/Exampl…
I've updated my "audiolab" abstraction library.
There are two more objects, "pp.twisted-delays2~" and "pp.spacer~" -
sorry for the stupid names.
all best,
Philipp
Dear list,
I would usually post this message on the Facebook group because I don't
think it merits a list announcement, but I have been off social media for
a while and enjoying it, so I hope not to annoy anyone with this email. I
have redesigned my preset patching system so that it works completely "out
of the box" with Pd vanilla. I am still building documentation and little
GIF step-by-step tutorials, so the GitHub repository is still in the
making. However, I decided to share it now to get some feedback, and maybe
find someone interested in using it.
Basically, it is a little "lib" of abstractions that wrap around Pd
vanilla's GUIs, plus a "msc" one for use with non-vanilla GUIs.
https://github.com/JRSV/RSVP.V3.full-vanilla/blob/master/README.md
best
--
José Rafael Subía Valdez
www.jrsv.net
Hello,
I am happy to announce the release of PuREST JSON 1.4.3, code name:
Y U no stable API.
This release is a minor update and contains changes to the build scripts
and bugfixes. It will be the last in the 1.x.x branch, the next release
will introduce breaking changes.
PuREST JSON is a library for working with RESTful HTTP webservices, and
JSON data.
Authentication and authorization for webservices are available with
basic HTTP auth, cookie authentication, and OAuth. As an example for
OAuth authenticated webservices, a Twitter client is included.
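If you have not used the library before, here is a minimal sketch of a
patch in Pd file form that issues a GET request and decodes the JSON
reply. The base URL and endpoint are placeholders of mine, not a real
service, and the exact message interface of [rest] is described in the
wiki linked below:

#N canvas 100 100 480 340 12;
#X text 170 60 click to fetch - placeholder endpoint;
#X msg 60 60 GET status;
#X obj 60 120 rest https://api.example.com/;
#X obj 60 180 json-decode;
#X obj 60 240 print result;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;

Clicking the message box should fetch https://api.example.com/status and
hand the reply to [json-decode], which unpacks the JSON data; the result
is printed to the Pd console.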
This release is available from deken under the name "purest_json" for
Windows (32 and 64 bit), Linux (i386, x86_64 and ARM), and OS X.
Changes since 1.4.2:
- Update of build scripts and documentation
- Usage of CI for build
See the full changelog at
https://github.com/residuum/PuRestJson/blob/1.4.3/Changelog.txt
Github repository:
https://github.com/residuum/PuRestJson
Source code packages:
https://github.com/residuum/PuRestJson/releases
Full documentation:
https://github.com/residuum/PuRestJson/wiki
Build instructions for all platforms:
https://github.com/residuum/PuRestJson/wiki/Compilation
Have fun,
Thomas
--
"We left all that stuff out. If there's an error, we have this
routine called panic, and when it is called, the machine crashes,
and you holler down the hall, 'Hey, reboot it.'" (Dennis Ritchie)
http://www.residuum.org/
Call for Participation: New Interfaces for Musical Expression 2020, Royal Birmingham Conservatoire, UK
We would like to invite you to be part of NIME 2020 – The International Conference on New Interfaces for Musical Expression. We welcome submissions of original research, both scientific and artistic. A non-exhaustive list of NIME-related topics is found below.
Topics
Original contributions are encouraged in, but not limited to, the following topics:
Musical interfaces designed by/with disabled/neurodiverse musicians
Musical interfaces in/as education
Increasing musical choices for disabled musicians through new accessible interfaces
Easier/cheaper approaches to the design of bespoke accessible instruments (that can be adapted to a user’s requirements), in a world dominated by mass production
Strategies that improve the reach and replicability of one-off accessible instrument projects, particularly those that are unlikely to have full commercial potential
Musical interfaces tailored to formally trained musicians
Novel controllers, interfaces or instruments for musical expression
Augmented, embedded and hyper instruments
Technologies or systems for collaborative music-making
Mobile music-making
Music-related human-computer interaction and mapping strategies
Sensor and actuator technologies, including haptics and force feedback devices
Explorations of relationships between motion, gesture and music
Evaluation and user studies of new interfaces for musical expression or commercially available “off the shelf” interfaces
Musical robotics
Interactive sound art and installations
Performance rendering and generative algorithms
Machine learning in musical performance
Artificial intelligence and new interfaces for musical expression
Web-based and/or telematic music performance
Software frameworks, interface protocols, and data formats, for supporting musical interaction
Historical, theoretical or philosophical discussions about designing or performing with new interfaces
Supporting cultural diversity through musical interfaces
Discussions about the artistic, cultural, and social impact of new interfaces
Pedagogical perspectives or reports on student projects in the framework of NIME-related courses
Practice-based research approaches/methodologies/criticism relating to the use of musical interfaces
Important Dates
24 January 2020: Paper, Poster, Music, Installation and Workshops submission deadline
31 January 2020: Final submission upload deadline (no extension)
15-22 March 2020: Notification of acceptances/rejections
22 March 2020: Early Bird Registration Opens
10 April 2020: Non-Paper Demo submission deadline
15 April 2020: Camera-ready submission and presenter registration deadline
30 April 2020: Early-bird registration deadline
21 July 2020: Pre-conference workshops
22-24 July 2020: The conference
25 July 2020: The unconference
More information about the conference can be found at http://nime2020.bcu.ac.uk
NIME 2020 Royal Birmingham Conservatoire Organising Committee
Hi, I'm happy to share the final release of [vstplugin~] v0.2.0. Binaries are available on Deken. The source code is here: https://git.iem.at/pd/vstplugin/-/releases
Note that you can also load LV2 plugins with lv2vst: https://github.com/x42/lv2vst. I've tested this on Debian and generally it seems to work fine.
Have fun!
Christof
---
Changelog:
new features:
* VST2 shell plugin support (e.g. "Waves", "Blue Ripple Sound")
* (experimental) VST3 support including sample accurate automation and auxiliary inputs/outputs for side-chaining
* soft-bypass
* faster search/probe (parallel)
* cache search/probe results in a file to speed up subsequent searches
* [param_set( and [param_get( now also accept parameter names instead of
indices (whitespace is bashed to underscores) - see the sketch below.
* set editor window position with the [pos( message.
changes:
* switched whole project to CMake
* removed 'vstsearch' object because of the new cache file system.
* removed [precision( message (the processing precision can only be set at creation time).
internal changes:
* use .ini like syntax for plugin info
* hard-bypass prefers the plugin's bypass method
* single event loop shared by all plugins
* probably many more...
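
To illustrate the name-based parameter messages and the new [pos( message,
here is a minimal sketch of a patch in Pd file form. Note that "GChorus"
and "depth" are hypothetical plugin and parameter names for illustration -
substitute whatever your own plugin search finds:

#N canvas 100 100 480 360 12;
#X text 210 60 hypothetical plugin name;
#X msg 70 60 open GChorus;
#X msg 70 100 param_set depth 0.5;
#X msg 70 140 pos 200 200;
#X obj 70 30 adc~;
#X obj 70 230 vstplugin~;
#X obj 70 290 dac~;
#X connect 1 0 5 0;
#X connect 2 0 5 0;
#X connect 3 0 5 0;
#X connect 4 0 5 0;
#X connect 5 0 6 0;

Per the changelog, a parameter name containing spaces would be written
with underscores, e.g. [param_set dry_wet 0.3(.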