Hi, ofelia v1.0.8 is now available.
You can use GEM with ofelia thanks to Arnaud Courcelle, who created the
[ofGemwin] and [ofGemhead] abstractions.
Please try out "ofelia/examples/gem/main.pd" to learn how to use GEM with
ofelia. I've only tested this on macOS by the way.
Also, I added some utility objects for creating message dialogs and file
open/save dialogs.
Screenshot:
https://github.com/cuinjune/ofxOfelia/blob/master/doc/textbox_screenshot.png
You can find these objects inside the [pd system] subpatch in the UTILS
section of "ofelia/help-intro.pd".
Changes:
* added [ofGetFboTexID] and [ofGetImageTexID]
* added GEM abstractions and example to the "examples/gem" directory
* added ofRectangle-related objects and help files
* renamed [ofGetDollarArgs] to [ofGetCanvasArgs]
* added [ofSetCanvasArgs], [ofRemoveCanvas], [ofGetCanvasIndex]
* added [ofKeyCodeListener]
* added ofSystemUtils objects and help files
Upcoming features:
* Video player and grabber
* SVG loader
* GUI abstractions
* Raspberry Pi support
More info about ofelia: https://github.com/cuinjune/ofxOfelia
Cheers!
Zack
===================
MUME 2018 Concert - CALL FOR WORKS
===================
Overview
Creators of musically metacreative systems are invited to submit works for
a concert of musical metacreation, as part of the Ninth International
Conference on Computational Creativity (ICCC), at the University of Salamanca,
Salamanca, Spain. Submitted works can be of any musical style, but must
involve a performative element and must involve the use of computational
creativity techniques in the creation of the work. Examples from previous
MuMe concerts include, but are not limited to: reproduction of musical
style using machine learning; the evolution of musical structures using
interactive genetic algorithms; rule-based systems; systems based on
emergence or self-organisation; and systems that perform music data mining
to create remixes.
All up-to-date information can be found here
<http://musicalmetacreation.org/mume-2018-concert/>.
Types of Work
Any work related to MuMe that can be presented in a performance context
will be considered. This includes:
- Improvising agents that perform alongside live musicians.
- Automated composition systems that generate music in real time or
pre-compose music for performers, as well as automated DJ systems.
- Systems that use AI methods for other creative objectives, such as
harmonising, intelligent looping, or finding suitable matches for a target
phrase.
- Systems that generate lyrics.
- Music that uses evolution, emergence or ecosystemic concepts.
Submission Process
Please submit a two-page A4-portrait PDF describing your work before the
submission deadline. Email your submission to chairs(a)musicalmetacreation.org.
Submissions should not be anonymised. Your submission should:
- Describe the work.
- Explain how the work relates to the MuMe theme.
- Give a relatively detailed technical description of the system*.
- Give detailed performance requirements. As necessary, this may include a
stage plan, equipment needs, technical support needs, and performing
musicians (you can use additional PDF pages for any supporting items).
- Provide a link to any additional documentary material such as audio or
video recordings, or online software (highly recommended).
- Give credits and a biography (name, affiliation, short biography of up to
150 words).
* It is common to receive submissions that are highly ambiguous about what
the proposed system actually does. Works will be rejected if this is the
case. The description does not need to be highly technically detailed but
it should remove any fundamental ambiguity.
Selection Process
All submissions will be reviewed by three independent members of the
selection committee. Once these reviews have been completed, the reviewers
and the chairs will discuss the works and reviews. The chairs will
provide a meta-review, the three original reviews, and a final decision.
Reviewers will be anonymous to the authors, but not to each other. The
final decision lies with the chairs, in consultation with the committee.
Upon Acceptance
Artists are required to be in attendance and set up their own equipment. At
least one artist must register for the ICCC conference and MuMe workshop.
A schedule for performances, rehearsals, and sound checks will be made
available in the week before the concert. Rehearsals and sound checks will
be on Sunday 24th June unless otherwise arranged.
Follow us on Twitter @MetaMusical.
Workshop Organizers
===================
Pr. Philippe Pasquier (Workshop Chair)
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Canada
http://metacreation.net/
https://www.kadenze.com/programs/generative-art-and-computational-creativity
Pr. Arne Eigenfeldt
School for the Contemporary Arts
Simon Fraser University, Canada
Dr. Oliver Bown
Design Lab, Faculty of Architecture, Design and Planning
The University of Sydney, Australia
Kıvanç Tatar
School of Interactive Arts and Technology,
Simon Fraser University, Vancouver, Canada.
----------------------
http://musicalmetacreation.org
--
Kıvanç Tatar
----------------------------------
PhD Candidate
Interactive Arts and Technology
Simon Fraser University, Vancouver, Canada
Email: kivanctatar(a)gmail.com
Website: https://kivanctatar.com/
Hi list, ofelia v1.0.7 is now available.
This version includes a GLSL shader loader.
GLSL is a high-level programming language, similar to C/C++, for programming
several parts of the graphics card. With GLSL you can write short programs
called shaders, which are executed on the GPU.
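To give a flavour of what such a shader looks like, here is a generic GLSL
fragment-shader sketch (not one of the ofelia examples; the u_resolution
uniform name is just a placeholder that the host patch would have to supply):

#version 120
uniform vec2 u_resolution;    // viewport size in pixels (placeholder name)

void main()
{
    // color each pixel by its normalized screen position
    vec2 uv = gl_FragCoord.xy / u_resolution;
    gl_FragColor = vec4(uv.x, uv.y, 0.5, 1.0);
}

A fragment shader like this runs once per pixel on the GPU; a matching
vertex shader runs once per vertex.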
Please try out the examples in the "ofelia/examples/shader" directory.
Changes:
* added a primMode argument to the cone, cylinder, plane, and sphere mesh commands
* added ofShader-related objects and help files
* added shader examples to the "examples/shader" directory
* added draggableShapes example to the "examples/input" directory
Upcoming features:
* Video player and grabber
* SVG loader
* GUI abstractions
More info about ofelia: https://github.com/cuinjune/ofxOfelia
Cheers!
jit_expr is a clone of the Pure Data expr/expr~/fexpr~ objects. It
just-in-time compiles its expressions, so they should be much more optimized
than the originals. If all works as designed, they should use less CPU than
the equivalent vanilla (non-expr) patching and have a significant CPU
advantage over the original expr objects.
This release has no functional differences from the previous 0.1 release,
and there is no update needed for Linux. All this does is support more
versions of macOS; the 0.1 release was built for 10.13 only.
Thanks go to Marco Matteo Markidis for doing this compilation!
I've put the external, compiled for 64-bit macOS 10.10+, up on deken: in
Pd, go to the Help menu, choose "Find externals", search for "jit_expr", and
click on the 0.1.1 version to install. Unfortunately it shows up below 0.1
in the current implementation of "Find externals", but a fix for the
ordering has already been implemented for a future release.
Please report any issues here:
https://github.com/x37v/jit-expr/issues
I would still love help building Windows and 32-bit Linux versions of the
externals.
The source code can be found in the git repo:
https://github.com/x37v/jit-expr
-Alex Norman
Hello everyone,
The jmmmp library of abstractions has been updated.
The newest object is Granulator, a Pure Data port of Robert Henke's
Granulator.
You can download it through deken, or follow the link below.
For more information and a tarball, visit
http://puredata.info/downloads/jmmmp/releases/0.5
Best,
João Pais
Hi list,
ofelia v1.0.6 is now available.
This version includes [pdgui] abstractions which emulate pd's built-in GUI.
Please try out "ofelia/examples/gui/pdguiExample.pd".
Screenshot:
https://github.com/cuinjune/ofxOfelia/blob/master/doc/pdgui_screenshot.png
More GUI abstractions will be added in the near future.
You can also customize the look/behavior of the GUI if you know the basics
of ofelia.
Changes:
* [ofCreateFbo] auto MSAA scaling is disabled
* fixed bug for mesh editor and getter objects
* [ofReceive], [ofValue] can change name dynamically
* the float inlet is removed from [ofGetCanvasName], [ofGetDollarZero],
[ofGetDollarArgs], and [ofGetPatchDirectory], as it was problematic when used
in cloned abstractions
* [ofGetPos], [ofGetScale] are renamed to [ofGetWindowPos],
[ofGetWindowScale]
* [ofGetTranslate], [ofGetRotate], [ofGetScale] are added
* [pdgui] abstractions are added to the "examples/gui" directory
* [ofMap] has a 5th argument that enables/disables clamping (see the sketch
after this list)
* [ofGetElapsedTime], [ofGetLastFrameTime] return time in seconds
* [ofGetElapsedTimeMillis], [ofGetLastFrameTimeMillis] are added
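For reference, the mapping [ofMap] performs mirrors openFrameworks' ofMap();
a minimal C sketch of the behaviour (my own illustration, not ofelia source)
looks like this:

/* Linearly rescale value from [inMin, inMax] to [outMin, outMax];
   when clamp is nonzero, constrain the result to the output range. */
static float map_value(float value, float inMin, float inMax,
                       float outMin, float outMax, int clamp)
{
    float t = (value - inMin) / (inMax - inMin);   /* normalized position */
    float out = outMin + t * (outMax - outMin);
    if (clamp) {
        float lo = outMin < outMax ? outMin : outMax;
        float hi = outMin < outMax ? outMax : outMin;
        if (out < lo) out = lo;
        if (out > hi) out = hi;
    }
    return out;
}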
Upcoming features:
* More GUI abstractions
* GLSL shader loader
* Video player and grabber
More info about ofelia: https://github.com/cuinjune/ofxOfelia
Cheers
Hi all,
In an effort to get organized and share work more effectively, I made git
repos for some ongoing projects and some new ones. I've gotten to a
stopping place for now, and uploaded the following items to deken. Note
that these were packaged with deken 4.0, so you may need to update to find
them. Here's a quick rundown:
********
[convolve~]: a partitioned impulse response convolution reverb
- version 0.11 uses FFTW to allow non-power-of-2 window sizes, and therefore
finer control over delay. It includes an "eq" method for shaping the
spectrum of the reverb in 25 Bark-frequency bands. It also accepts a second
argument to specify an IR array for analysis at creation.
https://github.com/wbrent/convolve_tilde.git
********
********
[DRFX]: a dynamic routing system for DSP effects. (Pd-vanilla abstraction)
- [DRFX] automatically creates a signal routing system and associated
controls (routing matrix) based on inputs and effect modules that you
specify. This allows you to make any type of series or parallel connection
chain between your inputs and effects, and change routing on the fly. It
also saves/loads complex routing presets, including effect parameter
settings.
https://github.com/wbrent/DRFX.git
********
********
[martha~]: an oscillator bank designed to accept output from [sigmund~]'s
sinusoidal tracking function. (Pd-vanilla abstraction)
- Version 0.6 adds vibrato functionality and an option to toggle between
oscillators and band-pass filtered white noise.
https://github.com/wbrent/martha_tilde.git
********
********
[missive~]: a vector synth object. (Pd-vanilla abstraction)
- [missive~] is a vector synth object that uses a wavetable index to
crossfade between neighboring wavetables in a set. Wavetable sets can have
an arbitrary number of wavetables, and are composed of individual .wav
files that each contain one wavetable cycle. The length of each wavetable
can be arbitrary because [missive~] adds the extra guard points required
for Pd's 4-point interpolation scheme.
https://github.com/wbrent/missive_tilde.git
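To illustrate why those guard points are needed (a C sketch of the general
technique, not [missive~]'s actual code): Pd's 4-point interpolator reads one
sample before and two samples after the current index, so a single-cycle
table gets three extra samples copied around its ends.

#define CYCLE_LEN 8                  /* samples in one wavetable cycle */
static float table[CYCLE_LEN + 3];   /* cycle data in table[1..CYCLE_LEN] plus 3 guard points */

static void fill_guard_points(void)
{
    table[0] = table[CYCLE_LEN];         /* the sample "before" the cycle = its last sample */
    table[CYCLE_LEN + 1] = table[1];     /* first sample after the end */
    table[CYCLE_LEN + 2] = table[2];     /* second sample after the end */
}

/* 4-point interpolated read in the style of Pd's tabread4~; phase is in [0, CYCLE_LEN) */
static float read4(float phase)
{
    int i = (int)phase + 1;              /* +1 skips the leading guard point */
    float frac = phase - (float)((int)phase);
    float a = table[i - 1], b = table[i], c = table[i + 1], d = table[i + 2];
    float cmb = c - b;
    return b + frac * (cmb - 0.1666667f * (1.0f - frac) *
           ((d - a - 3.0f * cmb) * frac + (d + 2.0f * a - 3.0f * b)));
}

With the guard points in place, read4() never indexes outside the table, even
right at the cycle boundaries.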
********
********
[streamStretch~]: a time-stretching/pitch-shifting/layering pastiche
effect. (Pd-vanilla abstraction)
- [streamStretch~] buffers multiple copies of incoming live audio and
creates overlapping streams of time-stretched & transposed output that
trail the input to achieve a variety of results.
https://github.com/wbrent/streamStretch_tilde.git
********
********
[timbreID]: an audio analysis and classification library
- version 0.7.3 adds a few methods to the [tabletool] object: NRT
overlap-add, permutations, and sequential output of table contents (like
[list-drip] for tables). As of version 0.7, timbreID uses FFTW to allow for
large and non-power-of-2 window sizes. Several basic time-domain objects
were also added at 0.7.
https://github.com/wbrent/timbreID.git
********
********
[timeStretch~]: a polyphonic time compression/expansion sample player.
(Pd-vanilla abstraction)
- [timeStretch~] is built around the I07.phase.vocoder.pd example patch
from Pd's built-in documentation, and adds functionality for changing
sample arrays on the fly, 16-voice polyphony, predetermined playback
duration, and indefinitely suspending time post-transient until a "release"
command is issued.
https://github.com/wbrent/timeStretch_tilde.git
********
********
[tune~]: a real-time pitch correction object. (Pd-vanilla abstraction)
- [tune~] tunes an input signal to any desired MIDI pitch while keeping
formant structure relatively intact. It is an adaptation of the built-in
Pure Data documentation patch I10.phase.bash.pd. While the original patch
demonstrates the technique in separate analysis and playback stages,
[tune~] is designed for real-time pitch correction.
https://github.com/wbrent/tune_tilde.git
********
--
William Brent
www.williambrent.com
“Great minds flock together”
Conflations: conversational idiom for the 21st century
www.conflations.com
Hey, these have finally been uploaded to deken (the old one, so no upgrade to
the new deken plugin is needed) - we're only missing 32-bit Linux binaries.
> Date: Sat, 10 Mar 2018 12:54:47 -0300
> From: Alexandre Torres Porres <porres(a)gmail.com>
> To: pd-announce(a)lists.iem.at
> Subject: [PD-announce] ELSE 1.0 Beta 8 released
>
> Hi, it's here: https://github.com/porres/pd-else/releases/tag/v1.0-beta8
> (and will hopefully be up in deken eventually). This is still
> experimental, meaning (besides instability) I'll still change things and
> compromise backwards compatibility (I'm trying to get all such planned
> changes done, but I still have about two more releases to go before
> moving to the next/final stage).
>
> cheers
jit_expr is a clone of the Pure Data expr/expr~/fexpr~ objects. It
just-in-time compiles its expressions, so they should be much more optimized
than the originals. If all works as designed, they should use less CPU than
the equivalent vanilla (non-expr) patching and have a significant CPU
advantage over the original expr objects.
I've put the external, compiled for 64-bit macOS and 64-bit Linux, up on
deken: in Pd, go to the Help menu, choose "Find externals", and search for
"jit_expr".
After installing the external, you should be able to make any of your expr
family of objects just-in-time compile by loading the library with [declare
-lib jit_expr] and then prefixing the object name with "jit/", for example
[jit/fexpr~ $x1[0] + $y1[-1]].
I believe they are feature complete with the originals but I'd love to know
if there is anything that I'm missing or any bugs that you discover.
I'm not exactly sure how to profile Pure Data patches. If anyone has a good
approach, or original expr~/fexpr~ patches that use a lot of CPU that you
can share, let me know.
Compiling the expression takes a little bit of time, so the initial
instantiation of the object/expression will be a bit slower than the
original, FYI.
Please report any issues here:
https://github.com/x37v/jit-expr/issues
BTW, if you're curious to see the LLVM assembly produced by your
expression, send the |print( message into the leftmost inlet of your
object, then check the Pd console.
I would love help building Windows and 32-bit Linux versions of the
externals. I'm guessing we could also do Raspberry Pi/ARM builds, but we'd
need some changes to the source code, as it uses LLVM and explicitly
generates code for x86 right now.
The source code can be found in the git repo:
https://github.com/x37v/jit-expr
-Alex Norman
Dear all,
A Doctoral Research Fellowship in Sound and Music Computing is available
at the University of Oslo:
https://www.jobbnorge.no/en/available-jobs/job/148855/
* Deadline: 04.04.2018
* Start date: September 2018
* Duration: 3+1 years
* Salary: 436 900-490 900 NOK
The fellowship is connected to the new Nordic Sound and Music Computing
Network <https://nordicsmc.create.aau.dk/> (NordicSMC) funded by the
Nordic Research Council. This network brings together a group of
internationally leading sound and music computing researchers from
institutions in five Nordic countries: Aalborg University, Aalto
University, KTH Royal Institute of Technology, University of Iceland,
and University of Oslo. The network covers the field of sound and music
from the “soft” to the “hard,” including the arts and humanities, and
the social and natural sciences, as well as engineering, and involves a
high level of technological competency.
We invite PhD proposals that focus on sound/music interaction with
periodic/rhythmic human body motion (walking, running, training, etc.).
The appointed candidate is expected to carry out observation studies of
human body motion in real-life settings, using different types of mobile
motion capture systems (full-body suit and individual trackers). Results
from the analysis of these observation studies should form the basis for
the development of prototype systems for using such periodic/rhythmic
motion in musical interaction.
The appointed candidate will benefit from the combined expertise within
the NordicSMC network, and is expected to carry out one or more
short-term scientific missions to the other partners. At UiO, the
candidate will be affiliated with RITMO Centre for Interdisciplinary
Studies in Rhythm, Time and Motion
<http://www.hf.uio.no/ritmo/english/>. This interdisciplinary centre
focuses on rhythm as a structuring mechanism for the temporal dimensions
of human life. RITMO researchers span the fields of musicology,
psychology and informatics, and have access to state-of-the-art
facilities in sound/video recording, motion capture, eye tracking,
physiological measurements, various types of brain imaging (EEG, fMRI),
and rapid prototyping and robotics laboratories.
Qualification requirements
* A Master's Degree or equivalent in sound and music computing, music
technology, computer science, or other relevant field. The applicant
is required to document that the degree corresponds to the profile
for the post. The Master's Degree must have been obtained by the
time of application.
* Experience with development of real-time sound and music systems
(Max, PD, SuperCollider, or similar)
* Excellent skills in written and oral English
* Personal suitability and motivation for the position
Please circulate through your networks and forward to relevant
candidates. Apologies for cross posting.
--
Alexander Refsum Jensenius, Ph.D.
Assoc. Professor, Department of Musicology, University of Oslo
<http://people.uio.no/alexanje/>
Deputy Director, RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
<http://www.uio.no/ritmo/>
New book: "A NIME Reader"
<http://link.springer.com/book/10.1007/978-3-319-47214-0>
New master's programme: "Music, Communication & Technology"
<http://www.uio.no/mct-master/>