Hi all,
I'm pleased to announce the release of Camomile, a plugin that lets you
load Pure Data patches inside a DAW.
Features:
Support for multiple plugin instances.
Automatic creation of GUIs (toggle, slider, radio, comment, numbox)
Automatic creation of parameters (name, label, range, minimum, maximum,
etc.)
Up to 16 audio channels
Up to 64 parameters
Support for MIDI input and output
Support for play head information
Support for BPM information
Available for Linux, Mac and Windows.
I'll be happy to receive your feedback.
Download and further information on the wiki:
https://github.com/pierreguillot/Camomile/wiki
Have a good weekend!
Hello everyone,
Version 1.4 of the Click Tracker is out, with the support of Carl Ludwig
Hübsch. The new features are:
- new website
- beat direction forward/backward
- big display for beats + bar number
You can download the new version at http://j.mp/clicktracker14.
For more information, see the Click Tracker's website at
http://j.mp/click-tracker.
You can also visit the Click Tracker on facebook -
http://j.mp/clicktrackerfb - or on google plus -
http://j.mp/clicktrackergp.
With best regards,
João Pais
--
jmmmpais(a)gmail.com | skype: jmmmpjmmmp
https://www.facebook.com/jmmmpais
PD Berlin meetings now take place at c-base (Rungestrasse 20, Berlin)
every first and third Thursday of the month, starting at 20:00. The next
one is tomorrow, Thursday, 17 March.
Looking forward to it, and thanks to c-base for hosting us.
What we do:
The meetings don't replace workshops (which are offered separately ;) );
the scope of these meetings can be any (or none) of the following:
patch/project showcase
programming
work on general Pd documentation
help for beginners (installation, programming tips, ...)
or something else you want to do ...
Just come along, bring your laptop and sit down with
us. The bar is open during the meetings. The only prerequisites are
a computer and a good mood. If Pd isn't installed on it yet, we'll
download and install it.
http://puredata.info/community/groups/pd-berlin/pd-berlin-users-group/
We thank c-base for providing the space and network for the
meetings.
Hi guys,
I'm Zack, and I'd like to announce the release of GemmaLib.
I've been working on GemmaLib for quite a while now and I'm happy to
finally share it with you.
GemmaLib is a set of Pd abstractions that allows you to easily create
high-quality musical apps that run on mobile devices through the Gemma app
(currently available on iOS). Feel free to check out this YouTube video to
learn more: https://www.youtube.com/watch?v=FvPFKOkDOfo
You can get GemmaLib via http://www.iceblinkdigital.com/developers/ or via
https://github.com/iceblinkdigital/GemmaLib-1.0.1
I'm looking forward to hearing your feedback!
Zack
*Call for Papers & Call for Demos*
Submission deadline for Papers & Demos extension: *1 March 2016 (5:00pm
GMT+2)*
http://moco16.movementcomputing.org/index.php/papers/authors
=======================================
*International Workshop on Movement and Computing (MOCO'16)*
July 5-6 2016, Thessaloniki, Greece
MINES ParisTech, France
Paris 8 University, France
University of Macedonia, Greece
Aristotle University of Thessaloniki, Greece
http://moco16.movementcomputing.org/
Following the success of the two previous editions of the International
Workshop on Movement and Computing, MOCO'14 at IRCAM (Paris, France) in 2014
and MOCO'15 at Simon Fraser University (Vancouver, Canada) in 2015, we
are pleased to announce MOCO'16, which will be hosted in Thessaloniki,
Greece. MOCO'16 will be organized by MINES ParisTech (France) in
co-operation with Paris 8 University (France), the University of
Macedonia, Thessaloniki (Greece) and the Aristotle University of Thessaloniki
(Greece).
The vision of MOCO'16 is to bring together academics, researchers,
engineers, designers, technologists, technocrats, creative artists,
anthropologists, museologists, ergonomists and other practitioners
interested in the phenomenon of the symbiosis between the human and the
creative process, e.g. dancer and digital media, musician and instrument,
craftsman and object, etc. This symbiosis takes the form of an interactional and
gravitational relationship, where the human element is both a trigger and a
transmitter, connecting perception (mind/environment interaction and
cognition), knowledge (theoretical understanding of a process) and gesture
(semantic motor skills).
MOCO'16 invites researchers with experience in capturing the combined
key elements of perception, knowledge and gesture/movement. MOCO'16 will be
of interest to artists who work on elucidating the intersection
between art, meaning, cognition and technology by unlocking the hidden
components of human creativity. The workshop also provides a forum for
industrial partners, for whom the movement and gestures of
workers/operators constitute key elements of ergonomics and
health, to see and present state-of-the-art technologies.
A key feature of the MOCO'16 Workshop will be to open some of its
demonstrations and artistic activities to the public-at-large in order to
provide this extended audience with the opportunity to be informed about
current scientific issues and topics by experts in an informal setting.
*Suggested Topics*
=======================================
* Movement in Digital and Performing Arts, which focuses on the use of and
interaction between arts and movement in domains such as music,
dance, song, graffiti, painting, etc.
* Technical and Craftsmanship Gestures, highlighting the importance of
gestures in the professional context, whether technical or cultural.
* Interaction, Communication and Design of User Experience, which puts the
emphasis on gestures and movement as interfaces between humans and machines.
* Analysis and Modelling, centred on the use of mathematical, statistical
or methodological tools for a better understanding of gestures and movement.
These topics overlap and are in no way exhaustive, so we also welcome
contributions focusing on other areas, with titles which might include any
of the following keywords:
* Finger-based interaction
* Embodied and whole body interaction design
* Professional movement and gesture
* Movement analysis and analytics
* Movement expression in avatar, artificial agents, virtual humans or robots
* Sonification and visualization of movement and gesture
* Modeling movement, gesture and expressivity
* Sensori-motor learning with audio-visual feedback
* Motion-driven narrative
* Dance and technology
* Movement representation
* Embodiment and embodied cognition
* Mediated choreography
* Mechatronics and creative robotics
* Movement in affective computing
* Music and movement
* Somatic practice and design
* Dance and neuroscience
* Vocal tract movements in singing voice
* Design for movement in digital art
* Movement computation in ergonomics, sports, and health
*Participation in the workshop*
=======================================
The workshop is an opportunity to present a research or study or details of
collaborative work. Participants will have the opportunity to offer a
presentation of the results of their research on one of the themes of the
workshop and to interact with their scientific/ artistic peers, in a
friendly and constructive environment.
If you are interested in offering an oral presentation of your work, please
submit a paper and/or a demo and/or a poster.
*Submission*
=======================================
The submission categories are:
* Long paper with oral presentation (8 pages maximum)
* Research note with oral presentation (4 pages maximum)
* Extended abstracts with poster presentation (2 pages maximum)
* Demonstration (one of the above paper types, 2 pages minimum, plus the
Demo proposal form)
All submissions should be in PDF format and should use the MOCO'16 template
(adapted from the ACM SIGCHI template):
http://moco16.movementcomputing.org/index.php/papers/authors
It is possible for participating authors to submit a demonstration proposal
in addition to their regular paper submission by completing the Demo
proposal form and sending it along with their submission. Together with the
demo proposal form, authors have to provide a link to a video about their
work. The demo proposal form is mandatory for all demo submissions and must
include details about technical set-up and space requirements.
Online submission: All submissions must be made through the Open Conference
System (OCS)
All submissions must be anonymous and will be peer-reviewed. The MOCO
proceedings will be indexed and published in the ACM digital library.
*Important Dates*
=======================================
Submission deadline for Papers & Demos extension: *1 March 2016 (5:00pm
GMT+2)*
Notification: 20th April 2016
Early bird registration: 30th May 2016
Early program: 10th June 2016
*Venue*
=======================================
University of Macedonia
156 Egnatia Street, GR-546 36 Thessaloniki, Greece
and
Aristotle University Research Dissemination Center
3rd September Avenue, GR-546 36 Thessaloniki, Greece
*MOCO Steering Committee*
=======================================
* Thecla Schiphorst, SIAT, SFU, Vancouver, Canada
* Philippe Pasquier, SIAT, SFU, Vancouver, Canada
* Sarah Fdili Alaoui, UPSud, INRIA, Ex-SITU, Orsay, France
* Frederic Bevilacqua, Ircam, Paris, France
* Jules Françoise, SIAT, SFU, Vancouver, Canada
Contact email: moco-share(a)sfu.ca
*MOCO'16 Organizing Committee*
=======================================
* Sotiris Manitsaris, General Conference Chair, MINES ParisTech, Paris,
France
* Leontios Hadjileontiadis, General Scientific Chair, Aristotle University
of Thessaloniki, Greece
* Jean-François Jégo, General Artistic Chair, Paris 8 University, France
* Vincent Meyrueis, General Demo Chair, Paris 8 University, France
* Athanasios Manitsaris, Local Committee Chair, University of Macedonia,
Thessaloniki, Greece
Contact email: moco16(a)uom.edu.gr
Cordialement | Regards | Με τιμή,
Dr. Sotiris *Manitsaris*
Senior Researcher | Research Project Leader
Centre for Robotics | MINES ParisTech | PSL Research University
A : 60, boulevard Saint Michel | 75272 Paris cedex 06 | France
T : +33 01 40 51 91 69 | M : sotiris.manitsaris(a)mines-paristech.fr
W : LinkedIn <https://fr.linkedin.com/in/sotirismanitsaris> | <http://moco16.multimedia.uom.gr>
--
Participate at MOCO'16 <http://moco16.movementcomputing.org/>
3rd International Workshop on Movement and Computing
5-6 July 2016 | Thessaloniki | Greece
The CfP <http://moco16.movementcomputing.org/index.php/papers/call-for-papers> is now open!
Apologies for duplicate postings...
Anyone with problems with the deadlines, please email me.
-David
The 22nd International Conference on Auditory Display (ICAD 2016)
Australian National University, Canberra, July 2-July 8 2016
CALL FOR PAPERS, POSTERS, SONIFICATIONS, INSTALLATIONS,
COMPOSITIONS, WORKSHOPS, PANELS AND DEMONSTRATIONS
Co-chairs: Dr David Worrall, Australian National University, and
Dr Stephen Barrass, University of Canberra
Please check the conference website for updates:
http://icad.org/icad2016/
ICAD is a highly interdisciplinary academic conference with relevance to
researchers, practitioners, musicians, and students interested in the
design of sounds to support tasks, improve performance, guide decisions,
augment awareness, and enhance experiences. It is unique in its singular
focus on auditory displays and the array of perception, technology, and
application areas that this encompasses. Like its predecessors, ICAD
2016 will be a single-track conference, open to all, with no membership
or affiliation requirements.
ICAD 2016, the 22nd International Conference on Auditory Display, will be
held at the Australian National University in Canberra, Australia, from
July 2 to 8, 2016. The conference venue is the ANU School of Music, in
the downtown centre of Canberra. Workshops and the graduate student
ThinkTank (doctoral consortium) will be on the weekend of July 2 and 3,
before the main conference.
Note that ICAD is back-to-back with the conference on New Interfaces for
Musical Expression (NIME) which will be held in Brisbane the following
week, so international attendees can attend two international
conferences for the one trip to Australia!
THEME: SONIC INFORMATION DESIGN
The designed world is rapidly replacing the natural world. Design has
been called the "third culture" and has been distinguished from the
Sciences and Arts by Nigel Cross in terms of
* /things to know/: the natural world in science, human experience in
art, and the artificial world in design.
* /ways of knowing/: rationality and objectivity in science,
reflection and subjectivity in art, and imagination and practicality
in design.
* /ways of finding out/: experiment and analysis in science, criticism
and evaluation in art, and modelling and synthesis in design.
This year's theme - Sonic Information Design - has the aspiration that
artificial sounds may be designed to make the world a better place. Like
other design disciplines, Sonic Information Design takes a synergetic
user-centred view of the relationship between artefacts, those that are
affected by them, and the social contexts in which they occur. A Design
orientation pays particular attention to the phenomenology of user
experience - including physical, cognitive, emotional, and aesthetic
issues; the relationship between form, function, and content; and
emerging concepts such as fun, playfulness and design futures.
Practice-based research is considered as a generative process of
exploration, speculation and discovery, with outcomes that can be
provisional, contingent and aspirational, while aiming for richer, more
situated understandings that lead to the advancement of knowledge and
the proliferation of new realities.
Sonic Information Design draws on theoretical approaches from multiple
disciplines to guide hypothesis testing at multiple points during an
iterative process - what Bill Gaver calls "humble theory". Sonic
Information Design recognises usefulness as critical for evaluating
artefacts, and the perceptual alignment with data characteristics as
critical for effective designs.
ICAD 2016 invites contributions that take a design approach, introduce
design theory and apply design methods to Auditory Display and Data
Sonification, with a view to building a conceptually robust foundation
for Sonic Information Design.
TOPICS
Topics for ICAD2016 include new and emerging themes, as well as more
traditional ICAD ones. Themes include but are not limited to:
* Sonic Information Design
* Stream-based Sonification and Auditory Scene Design
* Acoustic Sonification
* Small Data (personal, intimate) sonification and the quantised self
* Sonification, soundscape and screensound
* Sonification in Health and Environmental Data (soniHED)
* Musification - sonifications and music
* Sonification, personal fabrication and maker culture
* Sonification in the Internet of Things
* Auditory Data Mining and Big Data sonification
* 3D and Spatial Audio
* Aesthetics, Philosophy, and Culture of Auditory Displays
* Accessibility
* Applications
* Design Theory and Methods
* Evaluation and Usability
* Human Factors and Interaction
* Mappings from Data to Sound
* Psychology, Cognition, Perception, and Psychoacoustics
* Sonification and Exploration of Data through Sound
* Sound as Art
* Technologies and Tools
*Presentations will be organised according to four major themes:*
* Auditory Data Mining
  * Interactive Sonification, including for sports and health.
* Musification and Aesthetics
* Auditory Perception, including streaming, spatialisation and
inter/poly modality.
KEY DATES (2016)
29 February Submission Deadline for Full Papers, Posters and
Extended Abstracts
14 March Submission Deadline for Workshop proposals
28 March Acceptance Notification of Papers, Posters and Extended
Abstracts
04 April Submission Deadline for Sonifications / Installations /
Compositions / Extended Abstracts
11 April Acceptance Notification of Workshop proposals
09 May Submissions Deadline for Camera-Ready materials
16 May Acceptance Notification of Sonifications / Installations /
Compositions
2-3 July Conference ThinkTank and Workshops
4-8 July Conference Proper (Programme details TBA)
PUBLICATION
We are aiming to select papers for a special issue of a leading
journal. Details to follow.
WORKSHOPS
Proposals are invited for half-day and full-day workshops.
*Deadline for Submission of Workshop proposals: 14 March 2016*
INSTALLATIONS
Installations at ICAD 2016 will be afforded their own individual
space and, depending on the number of submissions, will likely be
featured for an entire day. Spaces available include
o A public but relatively quiet space
o An entrance foyer space
o A pub and a café space (with table-top Bluetooth speakers if
applicable)
*Deadline for submission of Installation proposals: 4 April 2016*
EXTRA-CURRICULAR ACTIVITIES
We have organised a rich array of natural and cultural activities to
ensure your trip down under is not all work and no play!
MORE INFORMATION
Visit the conference website (currently in development) for updates
and other information:
http://icad.org/icad2016/
CORRESPONDENCE
Please address correspondence to: icad2016chair _at_ icad.org
WELCOME!
We look forward to your joining us in making this a wonderful conference!
--
------------------------------------------------------------------------
Prof. Dr. David Worrall
International Audio Laboratories Erlangen
Fraunhofer-Institut für Integrierte Schaltungen IIS
Email: david.worrall(a)iis.fraunhofer.de
Adjunct Senior Research Fellow
School of Music, Australian National University
david.worrall(a)anu.edu.au
personal website: avatar.com.au <http://avatar.com.au> | NetSon
<http://avatar.com.au/netson>
Co-Chair, ICAD 2016 <http://icad.org/icad2016>
Canberra 2–8 July
icad.org/icad2016/ <http://icad.org/icad2016/>
======================
2nd CALL FOR SUBMISSION
======================
((( MUME 2016 )))
4th International Workshop on Musical Metacreation
http://www.musicalmetacreation.org
June 27, 2016
MUME 2016 is to be held in Paris at Université Pierre et Marie Curie
(UPMC), in conjunction with the Seventh International Conference on
Computational Creativity, ICCC 2016.
=== Important Dates ===
Workshop submission deadline: May 1, 2016
Notification date: June 1, 2016
Camera-ready version: June 10, 2016
Workshop date: June 27, 2016
======================
We are delighted to announce the 4th International Workshop on Musical
Metacreation (MUME 2016) to be held June 27, 2016, in conjunction with the
Seventh International Conference on Computational Creativity, ICCC 2016.
MUME 2016 builds on the enthusiastic response and participation we received
for the past editions of the MUME series:
- MUME 2012 (held in conjunction with AIIDE 2012 at Stanford):
http://musicalmetacreation.org/index.php/mume-2012/
- MUME 2013 (held in conjunction with AIIDE 2013 at Northeastern):
http://musicalmetacreation.org/index.php/mume-2013/
- MUME 2014 (held in conjunction with AIIDE 2014 at North Carolina):
http://musicalmetacreation.org/index.php/mume-2014/
Metacreation involves using tools and techniques from artificial
intelligence, artificial life, and machine learning, themselves often
inspired by cognitive and life sciences, for creative tasks. Musical
Metacreation explores the design and use of these tools for music making:
discovery and exploration of novel musical styles and content,
collaboration between human performers and creative software “partners”,
and design of systems in gaming and entertainment that dynamically generate
or modify music.
MUME aims to bring together artists, practitioners, and researchers
interested in developing systems that autonomously (or interactively)
recognize, learn, represent, compose, generate, complete, accompany, or
interpret music. As such, we welcome contributions to the theory or
practice of generative music systems and their applications in new media,
digital art, and entertainment at large.
Topics
======
We encourage paper and demo submissions on MUME-related topics, including
the following:
-- Models, Representation and Algorithms for MUME
---- Novel representations of musical information
---- Advances or applications of AI, machine learning, and statistical
techniques for generative music
---- Advances of A-Life, evolutionary computing or agent and multi-agent
based systems for generative music
---- Computational models of human musical creativity
-- Systems and Applications of MUME
---- Systems for autonomous or interactive music composition
---- Systems for automatic generation of expressive musical interpretation
---- Systems for learning or modeling music style and structure
---- Systems for intelligently remixing or recombining musical material
---- Online musical systems (i.e. systems with a real-time element)
---- Adaptive and generative music in video games
---- Techniques and systems for supporting human musical creativity
---- Emerging musical styles and approaches to music production and
performance involving the use of AI systems
---- Applications of musical metacreation for digital entertainment: sound
design, soundtracks, interactive art, etc.
-- Evaluation of MUME
---- Methodologies for qualitative or quantitative evaluation of MUME
systems
---- Studies reporting on the evaluation of MUME
---- Socio-economic impact of MUME
---- Philosophical implications of MUME
---- Authorship and legal implications of MUME
Submission Format and Requirements
=================================
Please make submissions via the EasyChair system at:
https://easychair.org/conferences/?conf=mume2016 .
The workshop is a full day event that includes:
- Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
- Presentations of POSITION PAPERS and WORK-IN-PROGRESS PAPERS (5 pages
maximum)
- Presentations of DEMONSTRATIONS (3 pages maximum) which present outputs
of systems (working live or offline).
All papers should be submitted as complete works. Demo systems should be
tested and working by the time of submission, rather than speculative.
We encourage audio and video material to accompany and illustrate the
papers (especially for demos). We ask that authors arrange their own web
hosting for audio and video files and include URL links to all such files
within the text of the submitted paper.
Submissions do not have to be anonymized, as we use single-blind reviewing.
Each submission will be reviewed by at least three program committee
members.
Workshop papers will be published as the MUME 2016 Proceedings and will be
archived with an ISBN. Submissions should be formatted using the
AAAI two-column format; see instructions and templates here:
http://www.aaai.org/Publications/Author/author.php
Submissions should be uploaded via the MUME 2016 EasyChair portal:
https://www.easychair.org/conferences/?conf=mume2016
For complete details on attendance, submissions and formatting, please
visit the workshop website: http://www.musicalmetacreation.org
Presentation and Multimedia Equipment:
==========================================
We will provide a video projection system as well as a stereo audio system
for use by presenters at the venue. Additional equipment required for
presentations and demonstrations should be supplied by the presenters.
Contact the Workshop Chair to discuss any special equipment and setup
needs/concerns.
Attendance
=======================================
It is expected that at least one author of each accepted submission will
attend the workshop to present their contribution. We also welcome those
who would like to attend the workshop without presenting. Workshop
registration will be available through the ICCC2016 conference system.
Questions & Requests
======================================
Please direct any inquiries/suggestions/special requests to the Workshop
Chair, Philippe Pasquier (pasquier(a)sfu.ca).
Workshop Organizers
===================
Prof. Philippe Pasquier (Workshop Chair)
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Canada
Prof. Arne Eigenfeldt
School for the Contemporary Arts
Simon Fraser University, Canada
Dr. Oliver Bown
Design Lab, Faculty of Architecture, Design and Planning
The University of Sydney, Australia
Kıvanç Tatar (MUME Administration and Publicity Assistant)
School of Interactive Arts and Technology,
Simon Fraser University, Vancouver, Canada.
----------------------
http://www.musicalmetacreation.org
======================
--
Kıvanç Tatar
----------------------------------
Researcher, Metacreation Lab
Interactive Arts and Technology
Simon Fraser University, Vancouver, Canada
GSM : 1 778 858 6073
Email: kivanctatar(a)gmail.com
Website: https://kivanctatar.wordpress.com/
Howdy all,
I’m pleased to announce that the PdParty BETA has started!
After almost two months of work, there have been quite a number of improvements and I feel the app is now close to version 1.0 status.
I just need your help to find bugs, suggest improvements, and create demo scenes.
How this works
1. Send me your name & email*
2. I add you to the tester list
3. You should receive a notification email
4. Download the free TestFlight app from the App Store
5. Open TestFlight and install the latest PdParty build
*Those of you who participated in the alpha testing should already have received an email.
Info
PdParty User Guide <https://github.com/danomatika/PdParty/blob/master/doc/guide/PdParty_User_Gu…>
PdParty Composer Pack <http://docs.danomatika.com/PdParty_composerpack.zip>
Happy Patching!
--------
Dan Wilcox
@danomatika <https://twitter.com/danomatika>
danomatika.com <http://danomatika.com/>
robotcowboy.com <http://robotcowboy.com/>
Hi everyone,
I'd like to announce that we've released version 1.0.0 of Pd for Android to
a Maven repository on JCenter.
No changes were made to the API of pd-for-android; the changes concern the
way it is obtained and used. It is now much simpler to integrate the library
into Android apps, and the project can easily be used with Android Studio,
which has replaced Eclipse as the standard tool for developing Android
apps.
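For anyone who wants a quick feel for the new workflow, here is a minimal
sketch of pulling the library from JCenter and opening a patch from an
Activity. The Gradle coordinates, class names and method signatures below are
illustrative assumptions based on the pd-for-android samples, so please check
the project README for the exact, up-to-date setup.

// build.gradle (coordinates are an assumption; the README lists the exact ones):
//   repositories { jcenter() }
//   dependencies { compile 'org.puredata.android:pd-core:1.0.0' }

import java.io.File;
import java.io.IOException;

import org.puredata.android.io.PdAudio;
import org.puredata.core.PdBase;

import android.app.Activity;
import android.os.Bundle;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        try {
            // 44.1 kHz, no input channels, stereo output, 8 Pd ticks per buffer.
            PdAudio.initAudio(44100, 0, 2, 8, true);
            // Open a patch that was previously copied to the app's internal storage.
            PdBase.openPatch(new File(getFilesDir(), "test.pd"));
            PdAudio.startAudio(this);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    protected void onDestroy() {
        PdAudio.release();
        PdBase.release();
        super.onDestroy();
    }
}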
For further information please see the project's page:
https://github.com/libpd/pd-for-android
Best wishes,
Tal Kirshboim