To Pd-announce:
Pd version 0.53-0test2 is available from http://msp.ucsd.edu/software.htm
or (source only) via github: https://github.com/pure-data/pure-data
This contains more bug fixes and documentation updates.
Iohannes is still working on an update to the string translation
mechanism; I'm planning to incorporate that into the final release,
hopefully early next week.
cheers
Miller
To Pd-announce:
Pd version 0.53-0test1 is available from http://msp.ucsd.edu/software.htm
or (source only) via github: https://github.com/pure-data/pure-data
This is a test version that mostly just fixes bugs in version 0.52, including
one serious problem reading soundfiles.
cheers
Miller
YAY!!! I'm really proud of this one.
This took kind of a while, so there's lots and lots of new stuff and
breaking changes. This is the first release to reach and exceed 450 objects
(459 in total now; next milestone: 500 objects?). The total number of examples
in the tutorial is now 477. Here are the release's highlights:
- MOST ESPECIALLY, WE FINALLY HAVE DECENT COMPILED BANDLIMITED OSCILLATORS
([bl.saw~], [bl.tri~], [bl.vsaw~], [bl.square~], [bl.imp~], [bl.imp2~]),
thanks to Tim Schoen from PlugData, who has become a great partner in the
development of ELSE, since ELSE is part of PlugData ;) This was one of the
last milestones for a proper final release, so maybe we're close. (A quick
C sketch of the general idea behind bandlimiting follows this list.)
- We now have a new compiled GUI object, [numbox~], for monitoring and
generating signals (it replaces [display~], which was removed). This is
thanks to Tim Schoen again.
- The [bicoeff] object has been renamed to [bicoeff2], and [bicoeff] is now
a new GUI object based on the "filtergraph" external (this one is still
very experimental and not yet in acceptable shape).
- All objects with random generators have been revised to take a seed value
and to generate a unique seed each time you open the patch. I made a PR to
Pure Data so we get something similar for [random], [noise~] and [array
random].
- [brown~] now also has an impulse signal input to generate random steps.
- There are 13 new objects, most importantly (and not yet mentioned): [brown]
(control-rate brownian motion), [bl.osc~] (a bandlimited oscillator based on
wavetables), [blip~] (a bandlimited impulse generator) and [scala] (for
importing tunings in the Scala software format).
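For anyone curious what "bandlimited" means in practice: the ELSE objects are
compiled externals (and [bl.osc~] is wavetable-based, as noted above), but the
basic idea can be shown with a tiny C sketch that builds a sawtooth by summing
only the harmonics that fit below the Nyquist frequency. This is purely
illustrative and is not taken from the ELSE source.

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Illustrative only (not ELSE code): fill one signal block with a
       bandlimited sawtooth by additive synthesis, stopping at the last
       harmonic below the Nyquist frequency so nothing aliases. */
    static void bl_saw_block(float *out, int n, double *phase,
                             double freq, double samplerate)
    {
        int nharm = (int)(samplerate / (2.0 * freq)); /* highest safe harmonic */
        for (int i = 0; i < n; i++) {
            double sum = 0.0;
            for (int k = 1; k <= nharm; k++)
                sum += sin(2.0 * M_PI * k * *phase) / k;
            out[i] = (float)(sum * 2.0 / M_PI);       /* roughly -1..1 */
            *phase += freq / samplerate;              /* advance and wrap phase */
            if (*phase >= 1.0)
                *phase -= 1.0;
        }
    }

Wavetable-based objects like [bl.osc~] typically precompute tables of such
partial sums so they don't have to evaluate all those sines per sample.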
Complete changelog is here:
https://github.com/porres/pd-else/releases/tag/v1.0-rc3
It's up on deken, have fun.
** sorry for xx-posting **
--
PikselXX is scheduled for November 17–20, 2022
Piksel 20-Year Anniversary
Dear friends,
We are glad to announce the call for projects for the 20th anniversary
edition of Piksel!
To celebrate the anniversary, we are opening a new track for texts: if you are
a previous Piksel participant and want to share your experience with us,
this section is yours. Selected articles from the open call,
together with some curated texts from Piksel artists and colleagues,
will be included in the Piksel 20 years book.
Piksel will go hybrid again. Screen-based artworks and PikselSavers are
primarily intended for the Piksel XX Cyber Salon. Ideas for
collaborative online/physical activities are welcome. This year we want
to return to physicality, so in line with our green strategy we encourage
you to propose art installations that can be built in Bergen, minimizing
international transport.
Please feel free to submit your projects to any one of the open tracks:
Presentations, workshops, concerts, installations, and the texts call.
The deadline is August 31!
Please use the online submission form at: https://pretalx.com/piksel22/
Piksel22 is supported by the Municipality of Bergen, Arts Council
Norway, Vestland fylkeskommune and others.
more info: https://piksel.no/
**Piksel is an international festival for electronic art and
technological freedom. Part workshop, part festival, it is organised in
Bergen, Norway, and involves participants from more than a dozen
countries exchanging ideas, coding, presenting art and software
projects, doing workshops, performances and discussions on the
aesthetics and politics of art and free technologies.**
--
**open CALL for PROJECTS**
For the exhibition and other parts of the program we currently seek
projects in the following categories:
**1. Installations**
Projects to be included in the exhibitions.
The works must be realized by the use of free and open source
technologies.
**2. Audiovisual performance**
Live art realized by the use of free software and/or open/DIY
hardware. We encourage audiovisual projects, online “orchestra”
collaborations with local actors, and so on.
**3. Presentations**
Innovative DIY/open hardware and audiovisual software tools or software
art released under a free/open license. (Also includes presentations of
artistic projects realized using free/open technologies.)
**4. Workshops**
Hands-on workshops utilizing free software and/or open/DIY hardware for
artistic use. Workshops can also be held virtually.
**5. PikselSavers**
Video and software art based on the screensaver format – short
audiovisual (non)narratives made for endless looping. Possible thematic
fields include but are not limited to: sustainable resource
allocation, renewable technologies, energy harvesting, fair trade
hardware, free content, open access, open data, DIY economy, shared
development. The works must be realized by the use of free/open source
technologies.
**6. Texts**
Anecdotes and reflections from Piksel's 20-year history for the
anniversary book. We are especially interested in hearing about
collaborations and projects that were initiated as a result of artists
meeting at the festival.
**Deadline extended to August 31, 2022**
Please use the online submission form at: https://pretalx.com/piksel22/
--
In an effort to understand a bit more about neural networks, I wrote a
Pd external by translating Python code to C from the book "Neural
Networks from Scratch in Python". After a couple of months of working on
this, I ended up with [neuralnet].
This is an object written in pure C, without any dependencies, for
creating densely connected neural networks for classification,
regression, and binary logistic regression. You can choose among
different activation and loss functions, optimizers, and other settable
features. I've created some example patches, several replicating examples
from the aforementioned book and others of my own.
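To give a rough idea of what a densely connected layer boils down to, here is a
minimal C sketch of a single layer's forward pass (a matrix-vector product plus
a bias, followed by an activation). The function name, memory layout, and choice
of ReLU are my own assumptions for illustration; this is not the actual
[neuralnet] source or API.

    /* Hypothetical sketch, not [neuralnet] code: forward pass of one densely
       connected layer, out = relu(W * in + b), with weights stored row-major
       (one row of n_in weights per output neuron). */
    static void dense_forward(const float *w, const float *b, const float *in,
                              float *out, int n_in, int n_out)
    {
        for (int j = 0; j < n_out; j++) {
            float sum = b[j];                      /* start from the bias */
            for (int i = 0; i < n_in; i++)
                sum += w[j * n_in + i] * in[i];    /* weighted sum of inputs */
            out[j] = sum > 0.0f ? sum : 0.0f;      /* ReLU activation */
        }
    }

Training adds a backward pass (gradients of the loss with respect to the
weights) and an optimizer step on top of this, which is where the choice of
loss function and optimizer mentioned above comes in.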
The code is on GitHub (https://github.com/alexdrymonitis/neuralnet), and
Linux amd64 and armv7-32 (Raspberry Pi) binaries are uploaded to deken.
I don't have a Mac or Windows machine, and I don't know how to compile
for these architectures on a Linux machine (or if that is even
possible). I would be grateful if anyone could compile for either of these
platforms and upload the binaries to deken.
I would also love to get feedback, both on how the object performs and
on the source code itself.
Hi all,
flite (a text-to-speech external) is now good enough for a release. It includes:
- a single binary without dependencies
- 5 built-in voices
- loading of .flitevox voice files (English only)
- reading from a text file
- threaded functions for "synthesis" and "read file"
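For anyone who wants to see what driving the underlying flite C library looks
like, here is a rough standalone sketch of typical usage (synthesizing a string
with a built-in voice). This is not the pd-flite source, and the exact
voice-registration call depends on which voices your flite build includes.

    #include <flite.h>

    /* The built-in "kal" voice; assumed to be compiled into this flite build. */
    cst_voice *register_cmu_us_kal(const char *voxdir);

    int main(void)
    {
        flite_init();                                  /* initialize the library */
        cst_voice *voice = register_cmu_us_kal(NULL);  /* load the default voice */
        /* Synthesize a string; the third argument is "play" for direct audio
           output or a file name to write the waveform to. */
        flite_text_to_speech("Hello from Pure Data.", voice, "hello.wav");
        return 0;
    }

Linking such a test program typically needs the voice, language, and lexicon
libraries in addition to libflite itself; the exact names vary by distribution.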
You can get it from Deken or at
https://github.com/Lucarda/pd-flite/releases/tag/0.3.2
There are no Mac M1 or Linux ARM builds, but compiling (and uploading to
Deken) shouldn't be a problem for those who want to.
:)
--
Telepathic message assisted by machines.
Please pardon the cross-posting. My COMPEL project collaborators and I
would appreciate it if you would please distribute this email widely among
various communities whose work is rooted in computer music, starting with
composers, performers, and instrument and installation designers.
Dear all,
As part of the preparations for the Workshop on NIME Archiving
<https://nime.pubpub.org/pub/oyi0po4b>, to be held on 28 June, we are looking for
volunteers to fill out one or more records of *artifacts* (defined broadly)
in this survey:
*https://forms.gle/A8zNrFVxs9N4aBcp9*
The idea is to check whether the categories developed for the COMPEL archive
make sense from the community's perspective. We ask that you please
consider filling out the survey *before 24 June* so that we have a couple
of days to look at the results before the workshop. Feel free to make entries
even if you cannot make it to the workshop!
Given that this effort may benefit the broader computer music community,
please note that *both the survey and the workshop are open to anyone
who is interested in participating*, regardless of whether they are registered
for the conference. Since NIME is an online-only conference this year, *the
Zoom link will be shared with all survey contributors and conference
participants soon*.
The workshop will continue discussions in the community on how to best
preserve information from the NIME conferences, the NIME community, and the
computer music community at large. The workshop will follow up on threads
from the NIME publication ecosystem workshop
<https://nime2020.bcu.ac.uk/nime-publication-ecosystem-workshop/> (NIME
2020, Birmingham), the ICMC 2018 paper
<https://dblp.org/rec/conf/icmc/BukvicO18.html>, a SEAMUS 2018 conference
presentation, and the NIMEhub workshop
<https://www.duo.uio.no/handle/10852/50604> (NIME 2016, Brisbane). As we
rebuild the COMPEL platform to sidestep technological limitations of the
old infrastructure, the main task is to find an open,
future-oriented, engaging, and institutionally recognized archiving
solution for the activities of the community that ensures *reproducibility* of
archived artifacts. While NIME publications are archived according to the FAIR
principles <https://www.go-fair.org/fair-principles/>, currently no
solutions exist for archiving information about instruments/interfaces and
other hardware/software-based artifacts produced in the community. Neither
do we have a system for describing and preserving compositions/pieces,
installations, performances, and workshops. We believe that this challenge
affects the computer music community at large. The goal of this workshop
and forum discussions
<https://forum.nime.org/t/survey-and-workshop-on-nime-archiving/306> is to
propel the project forward and expand community engagement.
Thank you for your consideration and participation. Should you have any
questions, please do not hesitate to contact one of the workshop organizers
<https://nime.pubpub.org/pub/oyi0po4b>.
Best,
Ico
--
Ivica Ico Bukvic, D.M.A.
Director, Creativity + Innovation
Director, Human-Centered Design iPhD
Institute for Creativity, Arts, and Technology
Virginia Tech
Creative Technologies in Music
School of Performing Arts – 0141
Blacksburg, VA 24061
(540) 231-6139
ico(a)vt.edu
ci.icat.vt.edu | l2ork.icat.vt.edu | ico.bukvic.net