Dear List,
Version 2.3 of the Click Tracker is out.
This version was generously supported by the Quatuor Bozzini
(https://www.quatuorbozzini.ca/), and mainly reflects improvements
informed by user feedback.
The new features are divided into 3 categories:
Syntax features:
- added meters with mixed denominators
- added "x Y" command to repeat inputed events
- added fermatas
GUI features:
- new GUI layout
- removed "record" button
- added reset button for pickup bar
GUI features for the application and Max patch:
- added file drop to open a score
- changing the window size now scales the contents
- added new control keys (g, l, t, u), which can also be combined with Shift to reset
As in the previous version, you can use it in any of the following ways:
- as an Android app (https://bit.ly/click-tracker-mob or
https://bit.ly/clicktracker-playstore)
- as a standalone desktop app on Windows (http://bit.ly/ClickTracker2-3Win)
or Apple (http://bit.ly/ClickTracker2-3Apple)
Due to Apple's recent security settings, you'll need to allow Pd and
the other externals to run on your system.
WARNING: M1 users will need to run the program with Rosetta.
- as the traditional Pure Data patch (https://bit.ly/ClickTracker2-3)
- as a Max/MSP patch on Windows (http://bit.ly/ClickTracker2-3MaxWin) or
Apple (http://bit.ly/ClickTracker2-3MaxApple)
For more information, refer to the Click Tracker's website at
http://j.mp/click-tracker.
You can also visit the Click Tracker on Facebook -
http://j.mp/clicktrackerfb, or check out the click track library at
http://jmmmp.github.io/clicktracker/index-library.
With best regards,
João Pais
--
Click Tracker Mobile - https://bit.ly/click-tracker-mob
Click Tracker Website - http://j.mp/click-tracker
Click Tracker Library - https://bit.ly/ClickTrackerLibrary
Facebook - http://j.mp/clicktrackerfb
Hey all,
If anyone is anywhere near Margate (England, on the north-eastern tip of the south-east peninsula), I will be exhibiting a quadraphonic interactive sound installation at the Margate School from the 18th to the 22nd of March.
This is an artist-in-residence program, and there are three artists exhibiting on those dates: myself with a sensor-driven Pd sound artwork, a 3D CGI designer, and an artist who works with light sculpture.
The Margate School is an art college housed in a very large department store (deserted by Woolworths) in Margate, England, which is also a great seaside resort. I am capturing sound clips from the building and creating a set of Pd patches that will interact with the audience through infra-red sensors.
Please come if you are in the area.
Best wishes,
Ed Kelly
Driftwood - the latest album by Lone Shark, now available at https://synchroma.bandcamp.com/releases
For Lone Shark releases, Pure Data software and published Research, go to http://sharktracks.co.uk
Dear all,
the iem is co-hosting the 1st international conference on "Data Art and
Climate Action" (together with the Hong Kong School of Creative Media).
Today we have a matinee concert that includes performances in Pd.
If you like, you are warmly invited to join our live stream at
https://go.iem.at/daca22
The concert starts at 10:45 CET (in about 45 minutes), so be quick to make
up your mind.
For the program notes, see the attached PDF.
fsamd
IOhannes
Hi,
I am happy to announce a new bug fix release for [vstplugin~] - a Pd
external for hosting VST2 and VST3 plugins on Windows, macOS and Linux.
It is available on Deken (search for "vstplugin~"). Please upgrade!
Here is the full change log: https://git.iem.at/pd/vstplugin/-/releases
Please report any issues at https://git.iem.at/pd/vstplugin/-/issues
Have fun!
Christof
Please pardon cross-posting,
The following announcement may be of particular interest to graduate
students seeking PhD opportunities. Please share widely, as appropriate.
Virginia Tech Human-Centered Design (HCD), a unique individualized
interdisciplinary PhD program, in collaboration with the Linux Laptop
Orchestra (L2Ork) has a new fully funded graduate research assistantship
available. Supported as part of a grant from the Office of Naval Research,
the assistantship is open to a PhD-level graduate student who will be
fully funded for up to 4 years (contingent on adequate progress) to work on
a project that combines knowledge of sound and music with that of
cybersecurity and K-12 education. Desired expertise includes:
- Solid C and JS programming skills
- Familiarity with emscripten and building Web apps
- Solid knowledge of acoustics, psychoacoustics, and digital signal
processing using visual dataflow programming languages such as Pd-L2Ork/Pd
or Max
While the student will be required to work on the funded project, they will
also have an opportunity to develop facets of the project into their own
dissertation, as well as to explore their own unique research trajectories.
Students who possess only a subset of the desired expertise are also welcome
to inquire and apply. Before applying, students are strongly encouraged to
contact me to ensure they have the right skillset.
Virginia Tech offers top-tier research facilities in the areas of spatial
sound, immersion, and telematics, including the $30M Moss Arts Center and the
Institute for Creativity, Arts, and Technology's Cube with its 140+
loudspeaker array, multi-projection surfaces, and high-resolution motion
capture system. This space is supported by a constellation of other labs,
including the Perform studio with its 24.4 Genelec system and motion capture
setup, DISIS with its 24.2 system, the Create Studio with access to
cutting-edge fabrication tools, and many more. Most of the audio facilities
on campus are networked using the Dante protocol.
If you would like to learn more about the HCD iPhD program please visit
https://hcd.icat.vt.edu
For additional info on related opportunities, visit:
https://www.icat.vt.edu
http://ci.icat.vt.edu
https://l2ork.icat.vt.edu
Questions? Please feel free to email me at <ico(a)vt.edu>
--
Ivica Ico Bukvic, D.M.A.
Director, Creativity + Innovation
Director, Human-Centered Design iPhD
Institute for Creativity, Arts, and Technology
Virginia Tech
Creative Technologies in Music
School of Performing Arts – 0141
Blacksburg, VA 24061
(540) 231-6139
ico(a)vt.edu
ci.icat.vt.edu | l2ork.icat.vt.edu | ico.bukvic.net
Dear all,
We are hiring three early career researchers for the project AMBIENT:
Bodily Entrainment to Audiovisual Rhythms
<https://www.uio.no/ritmo/english/projects/ambient/index.html>. The
project will study how the sonic and visual "background" of indoor
environments influences people's bodily behaviour.
* 1-2 Doctoral Research Fellowships in Audiovisual Rhythms
<https://www.jobbnorge.no/en/available-jobs/job/217521/1-2-doctoral-research…>
* 1-2 Post-Doctoral Research Fellowships in Audiovisual Rhythms
<https://www.jobbnorge.no/en/available-jobs/job/217519/1-2-post-doctoral-res…>
Application deadline: *15 March 2022*.
We are looking for people with backgrounds in musicology, music
technology, sound studies, psychology, sound and music computing,
computer science, human movement science, or another relevant field. The
aim is to put together a cross-disciplinary team that together covers
the following methods: sound analysis, video analysis, interviews,
questionnaires, motion capture, physiological sensing, statistics,
signal processing, machine learning, interactive (sound/music) systems.
Please forward to relevant candidates. Do not hesitate to get in touch
if you want to know more about the positions or the project.
--
Alexander Refsum Jensenius [he/him]
Professor, Department of Musicology, University of Oslo
https://people.uio.no/alexanje
Deputy Director, RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
https://www.uio.no/ritmo/english/
Director, fourMs Lab
https://fourms.uio.no
Chair, NIME Steering Committee
https://www.nime.org
New master's programme: "Music, Communication & Technology"
http://www.uio.no/mct-master
Dear Pd community,
in July this year we, the Center for Haptic Audio Interaction Research
(CHAIR for short), released a VST3i plug-in: EXC!TE SNARE DRUM. It's an
exciter-resonator physical-modelling snare drum synth. The exciter can be
triggered via MIDI. The plug-in is a free download.
The PRO version of the plug-in (20€) adds, on top of the MIDI triggers,
an audio sidechain which allows direct excitation of the waveguide
resonator.
Video here:
https://www.chair.audio/product/excte-snare-drum-pro/
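For readers unfamiliar with the idea, here is a minimal, generic
exciter-resonator sketch in C: a short noise burst (the "exciter") feeds a
damped feedback delay line (a crude waveguide-style "resonator"). This only
illustrates the principle, it is not the CHAIR algorithm, and every constant
in it is made up for demonstration.

/* Generic exciter-resonator illustration (NOT the CHAIR algorithm). */
#include <stdio.h>
#include <stdlib.h>

#define SR    48000          /* sample rate */
#define DELAY 240            /* delay length in samples -> roughly a 200 Hz mode */
#define NSAMP (SR / 2)       /* render half a second */

int main(void) {
    float line[DELAY] = {0}; /* delay line acting as the resonator */
    float lp = 0.0f;         /* one-pole low-pass state (damping) */
    int w = 0;               /* current position in the delay line */

    for (int n = 0; n < NSAMP; n++) {
        /* exciter: 5 ms of white noise, then silence */
        float exc = (n < SR / 200) ? (rand() / (float)RAND_MAX - 0.5f) : 0.0f;

        /* read the oldest sample, damp it, feed it back with the exciter */
        float fb = line[w];
        lp = 0.5f * fb + 0.5f * lp;          /* gentle high-frequency damping */
        float out = exc + 0.98f * lp;        /* feedback gain < 1 keeps it stable */
        line[w] = out;
        w = (w + 1) % DELAY;

        printf("%f\n", out);                 /* dump samples as text */
    }
    return 0;
}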
The VST3/AU is built using Steinberg's VST3 SDK and libpd. Yes, the
audio synthesis is done in Pd.
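To give an idea of what running the synthesis in Pd via libpd looks like from
the C side, here is a minimal hosting sketch. It is not taken from the
plug-in's source; the patch name "snare.pd", the channel counts and the buffer
sizes are assumptions made up for illustration.

/* Minimal libpd hosting sketch (not the plug-in's actual code): open a
 * patch and compute a few blocks of audio.  "snare.pd" is a placeholder. */
#include <stdio.h>
#include "z_libpd.h"

#define TICKS 16                            /* 64-sample ticks per process call */

int main(void) {
    float inbuf[64 * TICKS * 2] = {0};      /* interleaved stereo input (silence) */
    float outbuf[64 * TICKS * 2];           /* interleaved stereo output */

    libpd_init();
    libpd_init_audio(2, 2, 48000);          /* 2 in, 2 out, 48 kHz */

    /* switch DSP on, i.e. send [; pd dsp 1( */
    libpd_start_message(1);
    libpd_add_float(1.0f);
    libpd_finish_message("pd", "dsp");

    if (!libpd_openfile("snare.pd", ".")) { /* hypothetical patch name */
        fprintf(stderr, "could not open patch\n");
        return 1;
    }

    /* a plug-in host would do this from its audio callback */
    for (int i = 0; i < 100; i++)
        libpd_process_float(TICKS, inbuf, outbuf);

    return 0;
}

In a real plug-in, the call to libpd_process_float() would live in the host's
audio callback, with MIDI events forwarded through libpd's messaging functions.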
We are quite happy with the overall performance and negligible CPU
overhead. We were able to run 70 instances of the plug-in before we
started to hear dropouts. Nobody needs 70 parallel snares :)
The core of the audio synthesis is open source; in fact, you can open Pd,
search for "CHAIR" in Deken, and get the example patches with the snare
drum sound.
There is also a paper about the algorithm: "Efficient Snare Drum Model
for Acoustic Interfaces with Piezoelectric Sensors"
A somewhat longer story of the plug-in's development can be read in our
journal article: "EXC!TE SNARE DRUM — Making an Audio Plugin with Pure Data
inside". Both can be found here:
https://discourse.chair.audio/t/its-science-papers-about-our-work/44
We are thankful that we were able to build upon the work that has gone
into Pd (and libpd on top of that). If you have contributed to Pd or libpd
in any form and would like to try the PRO version of the plug-in, please
send me a quick email.
Thank you Miller, Dan, IOhannes, Peter, Pierre, Christof, Antoine and
everyone else.
Max (+ Clemens, Philipp and Sebastian)