Hi all,
I want to share a special audio limiter that I've been using for years
in my recordings.
It's "special" because it *can't* be used live or with short delays. :(
It's for use with already recorded material, and a special side-chain
file must be generated. For this I made an external. My first C thing.
Prior to this external I used to generate the side-chain file in an
[array], but I ran into the single-precision limitation for files bigger
than 16777216 samples (see the short C illustration below).
It has no attack/release time settings; that magic happens in the
side-chain file.
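For context, here is a minimal C sketch (not part of pulqui itself) of why
single-precision indexing breaks down at that size: a 32-bit float has a
24-bit significand, so whole-number sample positions above 2^24 = 16777216
can no longer all be represented exactly.

    #include <stdio.h>

    int main(void)
    {
        /* 2^24 is the last point up to which every integer sample
           position fits exactly in a 32-bit float; one sample past
           it rounds away. */
        float pos  = 16777216.0f;  /* 2^24, exactly representable  */
        float next = pos + 1.0f;   /* rounds back down to 16777216 */
        printf("pos  = %.1f\n", pos);
        printf("next = %.1f\n", next);
        printf("equal? %s\n", (pos == next) ? "yes" : "no");
        return 0;
    }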
Files and a brief explanation at:
https://github.com/Lucarda/pd-pulqui
Or get it from Deken. Search "pulqui".
:)
--
Telepathic message assisted by machines.
Howdy, I have big updates to the ELSE library and my Live Electronics
Tutorial!
Raspberry Pi binaries for ELSE are still missing, but they'll be up soon.
Find the available releases on GitHub until they show up in Deken:
https://github.com/porres/pd-else/releases/tag/v1.0-beta18
There are 30 new objects in ELSE. This is the first release with over 300
objects; there are already 324 of them! I'm shocked at how much it's
growing... I realize now how unique this library is in the number of
objects devoted to DSP and audio processing. There are 219 tilde objects;
that's more than Pd itself has, or all of cyclone... So, what's the
most exciting new stuff?
- There's a bunch of reverb objects: [mono.rev~], [stereo.rev~],
[echo.rev~], [giga.rev~], [free.rev~] (check also [fdn.rev~], which isn't
really ready yet).
- There are a few dynamic processors as well: [compress~], [duck~] and
[expand~].
- A couple of FX like [chorus~] and [phaser~].
- A [spectrograph~] object.
- Some objects that deal with lists ([regroup], [pick], [sort],
[scramble], [merge], [unmerge] & [rand.seq]).
- A [trigger2] object that behaves like [trigger] in Purr Data and Max,
which people from time to time request for Pd as well (if Pd's trigger
ever includes such functionality, I'll delete this object).
Check the complete changelog here:
https://github.com/porres/pd-else/releases/tag/v1.0-beta18
As usual, there are breaking changes while we're in the beta stage.
As for my Live Electronics Tutorial, which is based on this library, most
of the changes reflect the new objects from the ELSE library. Most
importantly, the reverb section just got a major upgrade. I'm also now
including an appendix with a quick start on Data Structures! Check it out
at
https://github.com/porres/Live-Electronic-Music-Tutorial/releases/tag/v1.0-…
cheers
ICAD 2019 — Call for Submissions to the ICAD 2019 Algorave.
25th International Conference on Auditory Display
Northumbria University, Newcastle-upon-Tyne, UK
23–27 June, 2019
https://icad2019.icad.org
https://twitter.com/ICAD2019
Theme/Special Focus of ICAD 2019 is "Digital Living: Sonification for Everyday Life".
Digital technology and artificial intelligence are becoming embedded in the objects all around us, from consumer products to the built environment. Everyday life happens where People, Technology, and Place intersect. Our activities and movements are increasingly sensed, digitised and tracked. Of course, the data generated by modern life is a hugely important resource not just for companies who use it for commercial purposes, but it can also be harnessed for the benefit of the individuals it concerns. Sonification research that has hit the news headlines in recent times has often been related to big science done at large publicly funded labs with little impact on the day-to-day lives of people. At ICAD 2019 we want to explore how auditory display technologies and techniques may be used to enhance our everyday lives. From giving people access to what’s going on inside their own bodies, to the human concerns of living in a modern networked and technological city, the range of opportunities for auditory display is wide.
ALGORAVE
For the first time at an ICAD meeting the conference programme includes an Algorave event. "Algoraves focus on humans making and dancing to music. Algorave musicians don’t pretend their software is being creative, they take responsibility for the music they make, shaping it using whatever means they have. More importantly the focus is not on what the musician is doing, but on the music, and people dancing to it. Algoraves embrace the alien sounds of raves from the past, and introduce alien, futuristic rhythms and beats made through strange, algorithm-aided processes. It’s up to the good people on the dancefloor to help the musicians make sense of this and do the real creative work in making a great party.” (https://algorave.com/about).
The ICAD 2019 committee seeks submissions to this event which will provide an exciting complement to the other conference categories. For details on proposal format, submission instructions, and additional conference information please visit https://icad2019.icad.org/call-for-participation.
IMPORTANT DATES:
Thursday 12th May 2019 — Deadline for submissions to the Algorave tracks.
Algorave Chair:
Shelly Knotts
icad2019algorave(a)icad.org
Conference Chairs:
Paul Vickers and Matti Gröhn
icad2019chairs(a)icad.org
ABOUT ICAD
First held in 1992, ICAD is a highly interdisciplinary conference with relevance to researchers, practitioners, artists, and graduate students working with sound to convey and explore information. The conference is unique in its specific focus on auditory displays and the range of interdisciplinary issues related to their use. Like its predecessors, ICAD 2019 will be a single-track conference, open to all, with no membership or affiliation requirements.
--
Dr Paul Vickers BSc PhD CEng MIEE FHEA
Co-chair of ICAD 2019: https://icad2019.icad.org
---------- Forwarded message ---------
From: 'Monty Adkins' via CEC-Conference <cec-conference(a)googlegroups.com>
Date: Wed, 3 Apr 2019 at 10:42
Subject: [cec-c] Marie curie
To: cec-conference(a)googlegroups.com <cec-conference(a)googlegroups.com>
The School of Music, Humanities and Media invites expressions of interest
for the Marie Curie Post-Doctoral Fellowship at the University of
Huddersfield.
We have world-leading resources for Music and Music Technology focused
around the Centre for Research in New Music.
We are interested in receiving applications in all areas of Music and Music
Technology.
Please make sure to read the weblinks to ensure your eligibility for the
two schemes.
Many thanks
Prof. Monty Adkins
University of Huddersfield
UK
Dear all,
Andrew McPherson and I are writing a paper on the relationships between music programming languages and the digital instruments, compositions and performances they support. As part of that, we'd like to invite your participation in an online survey about your digital tools and musical practices. The survey contains 27 questions, and we expect it will take around 20 minutes of your time.
https://sopi.aalto.fi/dmiip/idiomaticity/survey/
Thanks for your participation, apologies for cross-posting, and please feel free to share the link with any other designers and musicians you think might be interested.
best,
Koray
-------------------------------------
M.Koray Tahiroğlu
Department of Media,
Aalto University
School of Arts, Design and Architecture
http://sopi.aalto.fi/
http://mlab.taik.fi/~korayt
tel. +358 50 4088441
ICAD 2019 — Call for Submission of Concert pieces and Installations/Demos
25th International Conference on Auditory Display
Northumbria University, Newcastle-upon-Tyne, UK
23–27 June, 2019
https://icad2019.icad.org
https://twitter.com/ICAD2019
Theme/Special Focus of ICAD 2019 is "Digital Living: Sonification for Everyday Life".
Digital technology and artificial intelligence are becoming embedded in the objects all around us, from consumer products to the built environment. Everyday life happens where People, Technology, and Place intersect. Our activities and movements are increasingly sensed, digitised and tracked. Of course, the data generated by modern life is a hugely important resource not just for companies who use it for commercial purposes, but it can also be harnessed for the benefit of the individuals it concerns. Sonification research that has hit the news headlines in recent times has often been related to big science done at large publicly funded labs with little impact on the day-to-day lives of people. At ICAD 2019 we want to explore how auditory display technologies and techniques may be used to enhance our everyday lives. From giving people access to what’s going on inside their own bodies, to the human concerns of living in a modern networked and technological city, the range of opportunities for auditory display is wide.
CONCERT AND INSTALLATIONS
The ICAD 2019 committee is seeking submissions to the ICAD 2019 Sonification Concert and proposals for Sonification Installations/Demos that will contribute to knowledge of how sonification can support everyday life.
For details on topics of interest, proposal format, submission instructions, and additional conference information please visit https://icad2019.icad.org/call-for-participation
IMPORTANT DATES:
Thursday 18th April 2019 — Deadline for submissions to the Concert and Installations/Demos tracks.
Concert Chairs:
Bennett Hogg with John Bowers, Tim Shaw, and David Worrall
icad2019concert(a)icad.org
Conference Chairs and Installation Chairs:
Paul Vickers and Matti Gröhn
icad2019chairs(a)icad.org
icad2019installations(a)icad.org
ABOUT ICAD
First held in 1992, ICAD is a highly interdisciplinary conference with relevance to researchers, practitioners, artists, and graduate students working with sound to convey and explore information. The conference is unique in its specific focus on auditory displays and the range of interdisciplinary issues related to their use. Like its predecessors, ICAD 2019 will be a single-track conference, open to all, with no membership or affiliation requirements.
--
Dr Paul Vickers BSc PhD CEng MIEE FHEA
Co-chair of ICAD 2019: https://icad2019.icad.org
(Apologies for cross-postings, please re-distribute at will)
_______________________________________________________________________
Audio Mostly 2019: A Journey in Sound
18th to 20th September 2019
University of Nottingham, Nottingham, UK
www.audiomostly.com
Facebook: https://www.facebook.com/AudioMostly/
Twitter: https://twitter.com/AudioMostly @AudioMostly
_______________________________________________________________________
AUDIO MOSTLY 2019
Audio Mostly is an audio-focused interdisciplinary conference on design and interaction with sound and technology, which embraces applied theory and practice-based research.
It is an annual conference which brings together thinkers and doers from academia and industry who share an interest in sonic interaction and the use of audio for interface design. This remit covers product design, auditory displays, computer games and virtual environments, new digital musical instruments, educational applications and workplace tools, as well as the topics listed below. It further includes fields such as the psychology of sound and music, cultural studies, systems engineering, and everything in between in which sonic Human-Computer Interaction plays a role.
Audio Mostly 2019 will be an inclusive event for all, bringing together a whole range of people and communities. It will be a lively and sociable mix of oral and poster paper presentations, demos, and workshops. We welcome submissions from industry, academia and interested parties in each of these categories.
As in previous years, the Audio Mostly 2019 proceedings will be published by the Association for Computing Machinery (ACM) (to be confirmed) and made available through their digital library. Regular papers, posters and demos/installations will be double-blind peer reviewed. It is envisaged that there will be a special issue of a journal relating to the conference, as with previous years.
CONFERENCE THEME - A Journey in Sound
The special theme for the conference this year is A Journey in Sound, and we would particularly welcome papers relating to this theme. We have different experiences of sound and music throughout our lives: there are sounds that remind us of different places and people, and playlists and songs that take us back to certain times and events. Throughout our lives we are interacting with sounds and music; we are on a journey in sound. The theme is open to interpretation, but people might think about the following in relation to it:
* Sonic aspects of digital stories, documentaries and archives
* The soundtrack to our lives. Archiving and sharing sound
* The emotional potential of a sound, how might this be used to support interaction
* The different uses of music across different settings
* The re-use of recollections and memories by composers & sound designers
* The development of musical tools that can let us express our experiences over time
* Socio-technical uses of AI to create highly personalised soundtracks that respond to one's context
* Adaptive music use in journeys, time and the creative use of data
Audio Mostly 2019 encourages the submission of regular papers (oral/poster presentation) addressing such questions and others related to the conference theme and the topics presented below.
LIST OF TOPICS
The Audio Mostly conference series is interested in sound Interaction Design & Human-Computer Interaction (HCI) in general. The conference provides a space to reflect on the role of sound/music in our lives and how to understand, develop and design systems which relate to sound and music - we are particularly interested in this from a broad HCI perspective. We encourage original regular papers (oral/poster presentation) addressing the conference theme or other topics from the list provided below. We welcome multidisciplinary approaches involving fields such as music informatics, information and communication technologies, sound design, music performance, visualisation, composition, perception/cognition and aesthetics.
* Accessibility
* Aesthetics
* Affective computing applied to sound/music
* AI, HCI and Music
* Acoustics and Psychoacoustics
* Auditory display and sonification
* Augmented and virtual reality with or for sound and music
* Computational musicology
* Critical approaches to interaction, design and sound
* Digital augmentation (e.g. musical instruments, stage, studio, audiences, performers, objects)
* Digital music libraries
* Ethnographic studies
* Game audio and music
* Gestural interaction with sound or music
* Immersive and spatial audio
* Interactive sonic arts and artworks
* Intelligent music tutoring systems
* Interfaces for audio engineering and post-production
* Interfaces or synthesis models for sound design
* Live performing arts
* Music information retrieval & Interaction
* Musical Human-Computer Interaction
* New methods for the evaluation of user experiences of sound and music
* Participatory and co-design methodologies with or for audio
* Philosophical or sociological reflections on Audio Mostly related topics
* Psychology, cognition, perception
* Semantic web music technologies
* Spatial audio, interaction design and ambisonics
* Sonic interaction design
* Sound and image interaction: from production to perception
* Sound and soundscape studies
SUBMISSION INSTRUCTIONS
Regular paper, poster, demo and workshop contributions must be submitted via the EasyChair Audio Mostly 2019 submission portal (https://easychair.org/conferences/?conf=am2019).
All Audio Mostly 2019 papers should be submitted using the 2017 ACM Master Article Template specified for your contribution. Authors should use the ACM Computing Classification System (CCS) to provide the proper indexing information in their papers (see instructions on the 2017 ACM Master Article Template page). All papers must be submitted in PDF format.
IMPORTANT DATES
(Papers & Posters)
Deadline for Submissions: 24th May 2019
Notification of Acceptance: 14th June 2019
Camera-ready submissions: 9th August 2019
Early Registration Deadline: 10th August 2019
Conference: 18 - 20 September 2019
Call for Workshops
IMPORTANT DATES (Workshops)
Deadline for Submissions: 24th May 2019
Notification of Acceptance: 14th June 2019
Workshops: 17th September 2019
Call for Demos
IMPORTANT DATES (Demos & Installations)
Deadline for Submissions: 1st July 2019
Notification of Acceptance: 15th July 2019
Submission Deadline: 22nd July 2019
Submission Site
https://easychair.org/conferences/?conf=am2019
LOCATION
This year, the conference is hosted by the Mixed Reality Lab (in the School of Computer Science) and the Department of Music at the University of Nottingham. The conference will be located on University Park.
University Park is The University of Nottingham's largest campus at 300 acres. Part of the University since 1929, the campus is widely regarded as one of the largest and most attractive in the country. Set in extensive greenery and around a lake, University Park is the focus of life for students, staff and visitors, and is conveniently located only two miles from the city centre. The campus is well connected: the nearest airport is East Midlands Airport, and the local train stations are Nottingham and Beeston.
-
-
Zürcher Hochschule der Künste
Zurich University of the Arts
-
Dr. Daniel Hug (Dipl.-Des.)
Co-Director Master Sound Design
__
Toni-Areal, Pfingstweidstrasse 96, P.O. Box, CH-8031 Zurich
-
Phone +41 78 768 59 49
daniel.hug(a)zhdk.ch
-
www.zhdk.ch/sounddesign
-
Hi all,
I have an exhibition opening soon in London, UK.
https://sonicelectronicsfestival.org/exhibition/
Featuring
- raytracing in curved space
- badly-played tetris
- generative techno
- audiovisual sliding tile puzzle
- interactive graph-directed IFS
- zooming hybrid fractals
Opening 11th April 2019 6pm, until 27th April, at Chalton Gallery,
96 Chalton Street, Camden, London NW1 1HJ, UK. Check website for times.
Two of the works are realized with Pure-data, also using pdlua and Gem.
Thanks also to Andy Farnell for his CheeseBox/AnalogueDrumEngine patch.
Curated by Laura Netz.
Supported using public funding by the National Lottery through Arts
Council England.
Claude
--
https://mathr.co.uk