Call for projects to be developed within the framework of the
international workshop Interactivos?'13 Nuvem Autonomías: Rural Sciences
Nuvem, in collaboration with Medialab-Prado in Madrid (Spain), issues a
call for the selection of 5 residency proposals to be collaboratively
developed at the rural hacklab of Visconde de Mauá (Rio de
Janeiro, Brazil) from September 5 to 20, 2013.
Researchers, professionals, and amateurs can submit proposals on
communications, food, drink, health, energy, transportation, or
bio-construction from a rural point of view.
Deadline: August 3, 2013
For submissions and more information:
http://nuvem.tk/
--
Nerea García Garmendia
Medialab-Prado
Plaza de las Letras
Calle Alameda, 15. 28014 Madrid
difusion(a)medialab-prado.es
http://www.facebook.com/MedialabPradoMadrid
Twitter: @medialabprado <https://twitter.com/medialabprado>
"Before printing this document, make sure it is really necessary. Thank you for your cooperation!"
(Apologies for cross-postings)
Call for Projects 2014-2015: Musical & Artistic Research Residency Program
Submission Deadline: October 1st, 2013 (Midnight Paris Time)
Details and submission procedure: http://www.ircam.fr/rm-residence.html?&L=1
The 5th edition of Ircam's Musical & Artistic Research Residency program is now open for online submissions for the 2014-2015 academic year. Ircam (Institute for Research and Coordination in Acoustics and Music) offers experimental environments in which composers and artists strive to expand their artistic experience on one side, while scientists work to extend research and technological paradigms toward new artistic expression on the other. In addition, candidates can apply for the INEDIT Track, a collaborative project between Ircam, Grame and LaBRI funded by the French National Research Agency.
For its 5th edition, Ircam is inviting composers and artists to submit projects for the 2014-2015 Research Residency program. The program is open to international artists, regardless of age or nationality, who wish to carry out experimental research using Ircam's facilities and extensive research environment. Submission is online only. Each submission will be evaluated by an international panel of experts including researchers, composers, computer musicians and artists. Selected candidates will be granted a residency at Ircam for a specific period (three to six months), working in association with a team or project at Ircam. In addition, laureates receive the equivalent of 1,200 euros per month to cover their expenses in France.
Arshia Cont
Director, Research/Creativity Interfaces Department,
Ircam - Centre Pompidou, Paris, France.
http://www.ircam.fr/irc.html?L=1
http://youtu.be/veaq_9XUAV0
Check it out, spread the word
Electroacoustic Studies (EA) at Concordia immerses students in the world of sound perception, creation and capture. Whether live or in the studio, the EA program explores the infinitely complex, invisible domain of sound, helping students to really hear and to use the well-equipped facilities for their own compositions, performances and recordings. The Concordia Laptop Orchestra and frequent opportunities to work collaboratively with other artists in music, media, video, dance, animation and theatre make EA one of the most active communities on campus. Watch this video to see how leading faculty inspire students to navigate and innovate in a genre that is constantly redefining itself.
Eldad
Hello,
I am happy to announce version 0.13.0 of PuREST JSON, codename "heady stuff".
PuREST JSON is a library for working with RESTful HTTP web services and
JSON data from Pure Data.
Authentication and authorization for web services are available via
basic HTTP auth, cookie authentication, and OAuth. As an example of an
OAuth-authenticated web service, a Twitter client is included.
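PuREST JSON is used from within Pure Data patches rather than from textual code, so the following is only a rough illustration of the kind of workflow the library wraps: issue a REST request with basic HTTP auth and decode the JSON response. The sketch uses Python and the well-known requests library; the endpoint URL, credentials, and "items" field are hypothetical placeholders, not part of PuREST JSON's own interface.

```python
# Illustrative only: the same REST + JSON + basic-auth workflow that
# PuREST JSON exposes to Pure Data patches, sketched here in Python.
# The URL, credentials, and response fields are made up for the example.
import requests

response = requests.get(
    "https://api.example.com/v1/items",   # hypothetical REST endpoint
    auth=("username", "password"),        # basic HTTP auth
    headers={"Accept": "application/json"},
    timeout=10,
)
response.raise_for_status()               # fail loudly on HTTP errors

data = response.json()                    # decode the JSON body
for item in data.get("items", []):
    print(item)
```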
Changes in this version:
- Setting HTTP headers is now possible
- Cancelling requests while waiting is now possible (experimental)
- Switched Makefile to library template 1.0.14
- Switched to semantic versioning
Cancelling requests is still an experimental feature: it does not yet
work reliably and most probably still contains memory leaks. But since
proper cancellation may involve a lot of refactoring, I have released
this version anyway.
Github repository: https://github.com/residuum/PuRestJson
Source code packages: https://github.com/residuum/PuRestJson/releases
Binaries for Windows and Debian i386, amd64, and armhf:
http://ix.residuum.org/pd/purest_json.html
Build instructions for all platforms:
https://github.com/residuum/PuRestJson/wiki/Compilation
Have fun,
Thomas
--
"Theoretically, [the amount of money in circulation] is watched
carefully by clever, serious economists. In practice, all the world's
money is one big swirling, whirling pool." (Cory Doctorow - For The Win)
http://www.residuum.org/
Hi
Let me share the recording of a netpd session I had yesterday with Sqgl
from Sydney. I like it myself, so I thought it might be worth sharing:
http://www.netpd.org/sessions/2013-07-09_prism-breaks.mp3
Sqgl and I have regular sessions every Tuesday at ~08:00 UTC
(this is 10 in the morning for most of Central Europe and the evening in
Western Australia). Everyone is invited to join.
Roman
---
http://www.netpd.org
---
Workshop "Bodynet - How to make a network of bodies?"
Call for Collaborators
Call to collaborate in the development of the five selected projects
<http://medialab-prado.es/article/bodynet_como_hacer_una_red_de_cuerpos_proy…>
for the workshop within the European project METABODY
<http://www.metabody.eu/> - Media Embodiment Tékhné and Bridges of Diversity.
Projects deal with topics such as performances and their relationship
with the body, heteronormativity, or the blurry boundaries between reality
and fiction through technology.
Free registration. Deadline: July 23, 2013. Limited space.
Dates: July 24-31, 2013.
Venue: Medialab-Prado in Madrid (Spain)
Collaborators' Profile
The call is open to anyone interested. However, given the nature of the
projects, we are also looking for profiles such as cultural minorities,
anthropologists, sociologists, and biologists (how do body networks
operate in this and other societies and ecosystems?), social workers or
care specialists (body, means and functional diversity), and hackers and
developers (how to engage the body in a digital network based on the
model of the Brain Talk Communities).
List of selected proposals
http://medialab-prado.es/article/bodynet_como_hacer_una_red_de_cuerpos_proy…
More information:
http://medialab-prado.es/article/bodynet_colaboradores
--
Nerea García Garmendia
Medialab-Prado
Plaza de las Letras
Calle Alameda, 15. 28014 Madrid
difusion(a)medialab-prado.es
http://www.facebook.com/MedialabPradoMadrid
Twitter: @medialabprado <https://twitter.com/medialabprado>
"Before printing this document, make sure it is really necessary. Thank you for your cooperation!"
Call for Participation -- please distribute widely
=============================
Musical Metacreation 2013
DEADLINE EXTENSION
Submissions Now Due July 9
=============================
((( MUME 2013 )))
2nd International Workshop on Musical Metacreation
http://www.metacreation.net/mume2013/
Held at the Ninth Annual AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE'13)
Northeastern University, Boston, Massachusetts, USA
October 14-15, 2013
----------------------
News:
New Deadline for Paper and Demo Submissions:
*** July 9, 2013 ***
New Info for Interested Industry Presenters:
http://www.metacreation.net/mume2013/industry
======================
We are delighted to announce the 2nd International Workshop on Musical Metacreation (MUME2013) to be held October 14 and 15, 2013, in conjunction with the Ninth Annual AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE'13). MUME2013 builds on the enthusiastic response and participation we received for the inaugural workshop in 2012, which received 31 submissions, 17 of which were accepted (a 55% acceptance rate). This year the workshop has expanded to 2 days.
Thanks to continued progress in artistic and scientific research, a new possibility has emerged in our musical relationship with technology: Generative Music or Musical Metacreation, the design and use of computer music systems which are "creative on their own". Metacreation involves using tools and techniques from artificial intelligence, artificial life, and machine learning, themselves often inspired by cognitive and life sciences. Musical Metacreation suggests exciting new opportunities to enter creative music making: discovery and exploration of novel musical styles and content, collaboration between human performers and creative software "partners", and design of systems in gaming and entertainment that dynamically generate or modify music.
MUME brings together artists, practitioners and researchers interested in developing systems that autonomously (or interactively) recognize, learn, represent, compose, complete, accompany, or interpret music. As such, we welcome contributions to the theory or practice of generative music systems and their applications in new media, digital art, and entertainment at large. Join us at MUME2013 and take part in this exciting, growing community!
Topics
======
We encourage paper and demo submissions on topics including the following:
* Novel representations of musical information
* Systems for autonomous or interactive music composition
* Systems for automatic generation of expressive musical interpretation
* Systems for learning or modelling music style and structure
* Systems for intelligently remixing or recombining musical material
* Advances or applications of AI, machine learning, and statistical techniques for musical purposes
* Advances or applications of evolutionary computing or agent and multiagent-based systems for musical purposes
* Computational models of human musical creativity
* Techniques and systems for supporting human musical creativity
* Online musical systems (i.e. systems with a real-time element)
* Adaptive and generative music in video games
* Methodologies for, and studies reporting on, evaluation of musical metacreations
* Emerging musical styles and approaches to music production and performance involving the use of AI systems
* Applications of musical metacreation for digital entertainment: sound design, soundtracks, interactive art, etc.
Format and Submissions
======================
The workshop will be a two day event including:
* Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
* Presentations of POSITION PAPERS and TECHNICAL IN-PROGRESS WORK (5 pages maximum)
* Presentations of DEMONSTRATIONS (3 pages maximum)
* One or more PANEL SESSIONS (potential topics include international and networked collaborations, evaluation methodologies, generative music in art vs. games)
* Presentations by INDUSTRY PARTNERS
Workshop papers will be published in a Technical Report by AAAI Press and will be archived in the AAAI digital library.
Submissions should be made in AAAI, 2-column format; see instructions here: http://www.aaai.org/Publications/Author/author.php
We also invite companies involved in Musical Metacreation and its application to present their work and challenges to the MUME community. Each industrial partner selected will be given a timeslot to present/demo during the workshop. Interested industry representatives, for more info see: http://www.metacreation.net/mume2013/industry
For complete details on attendance, submissions and formatting, please
visit the workshop website:
*** http://www.metacreation.net/mume2013/ ***
Important Dates
===============
Submission deadline: July 9, 2013
Notification date: August 6, 2013
Accepted author CRC due to AAAI Press: August 14, 2013
Workshop date: October 14-15, 2013
Workshop Organizers
===================
Dr. Philippe Pasquier (Workshop Chair)
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Vancouver, Canada
Dr. Arne Eigenfeldt
School for the Contemporary Arts
Simon Fraser University, Vancouver, Canada
Dr. Oliver Bown
Design Lab, Faculty of Architecture, Design and Planning
The University of Sydney, Australia
Graeme McCaig (Administration & Publicity Assistant)
School of Interactive Arts and Technology (SIAT)
Simon Fraser University, Vancouver, Canada
Program Committee
=================
Gérard Assayag - IRCAM-France
Al Biles - Rochester Institute of Technology - USA
Tim Blackwell - Department of Computing, Goldsmiths College, University of London - UK
Alan Blackwell - Cambridge University - UK
Oliver Bown - The University of Sydney - Australia
Andrew Brown - Queensland Conservatorium, Griffith University - Australia
Jamie Bullock - Integra Lab, Birmingham Conservatoire - UK
Karen Collins - University of Waterloo - Canada
Nick Collins - University of Sussex - UK
Darrell Conklin - University of the Basque Country - Spain
Arne Eigenfeldt - Simon Fraser University - Canada
Jason Freeman - Georgia Institute of Technology - USA
Guy Garnett - University of Illinois - USA
Toby Gifford - Griffith University - Australia
Luke Harrald - Elder Conservatorium of Music, The University of Adelaide - Australia
Bill Hsu - Department of Computer Science, San Francisco State University - USA
Robert Keller - Harvey Mudd College - USA
Nyssim Lefford - Audio Technology, Luleå University of Technology - Sweden
George Lewis - Department of Music, Columbia University - USA
Aengus Martin - Faculty of Engineering, The University of Sydney - Australia
James Maxwell - Simon Fraser University - Canada
Graeme McCaig - School of Interactive Arts and Technology, Simon Fraser University - Canada
Jon McCormack - Centre for Electronic Media Art, Monash University - Australia
James McDermott - Complex and Adaptive Systems Laboratory, University College Dublin - Ireland
Alex McLean - ICSRiM - University of Leeds - UK
Kia Ng - ICSRiM - University of Leeds - UK
Philippe Pasquier - School of Interactive Arts and Technology, Simon Fraser University - Canada
Marcus Pearce - Queen Mary, University of London - UK
Robert Rowe - New York University - USA
Benjamin Smith - Case Western Reserve University - USA
Richard Stevens - Leeds Metropolitan University - UK
Michael Sweet - Berklee College of Music - USA
Peter Todd - Indiana University - USA
Dan Ventura - Brigham Young University - USA
Ivan Zavada - Conservatorium of Music, The University of Sydney - Australia
----------------------
http://www.metacreation.net/mume2013/
======================
(apologies for x-post)
Jun 30 - 7:30pm
Michael Schimmel Center for the Arts, 3 Spruce Street, New York
Free entrance
"Ominous", an incarnated sound sculpture performance
Live acoustic sounds from the performer's body are digitally sculpted
through a choreography of physical gesture. The result is an unstable sonic
object which oscillates between high density and violent release. The
listeners see through sound the sculpture which their sight cannot perceive.
http://marcodonnarumma.com/works/ominous/
This piece for the Xth Sense biophysical instrument was commissioned by the
international Competition for Live Electronics, organized by the European
Conference of Promoters of New Music.
Other performing artists: Pauline Oliveros, Laurie Anderson, Joshua Light
Show, Maja S.K. Ratkje, Juraj Kojs, Mari Kimura with Tomoyuki Kato.
Press release
"Part of a shared evening co-presented with Pace University and
Harvestworks as part of the New York Electronic Art Festival.
Marco Donnarumma’s *Ominous* is part of an evening of concerts and
performances, providing audiences with an opportunity to experience
cutting-edge electronic artwork from artists working across the arts and
technology spectrum. The New York Electronic Art Festival brings talent
into Lower Manhattan from around the globe to showcase new technologies and
new artistic practices, and to celebrate the transformational intersection
of the two."
http://www.rivertorivernyc.com/artists/marco-donnarumma/
hope to see some of you there,
best wishes,
--
Marco Donnarumma
New Media + Sonic Arts Practitioner, Performer, Teacher, Director.
Embodied Audio-Visual Interaction Research Team.
Department of Computing, Goldsmiths University of London
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Portfolio: http://marcodonnarumma.com
Research: http://res.marcodonnarumma.com
Director: http://www.liveperformersmeeting.net
Post-Doctoral Research Fellowship (up to 3 years)
"Sound-Producing Actions"
University of Oslo, Department of Musicology, fourMs lab
http://uio.easycruit.com/vacancy/982967/62048?iso=no
## Description ##
One postdoctoral research fellowship is available at the Department of
Musicology for work within the fourMs research project. fourMs is an
interdisciplinary research project with participation of the Departments
of Musicology, Informatics and Psychology at the University of Oslo. As
suggested by the acronym of the four M's, Music, Mind, Motion, and
Machines, the project is focused on the relationships between body
motion, mind and technology in music, and is strongly interdisciplinary
in its approach and activities. See the fourMs website for more information:
http://www.fourms.uio.no
The fellowship is within the area of sound-producing actions, broadly
defined as body motion that produces sound in interaction with various
instruments and/or by the human vocal apparatus. Applicants should have
competence within one or more relevant areas for research on sound
producing actions such as motion capture, processing and classification
of motion capture data, biomechanics, motor control, and, more
generally, embodied music cognition. Familiarity with basics of musical
acoustics and sound synthesis would be an additional advantage in view
of modeling the links between sound-producing actions and the resultant
sound. The postdoctoral fellow shall work closely together with the
other researchers in the fourMs group, taking part in joint studies,
experiments, publications and dissemination efforts, including both
scientific and more artistic and practice-related sub-projects.
The main purpose of the post-doctoral research fellowship is to qualify
the fellow for work in top academic positions within their discipline. The
postdoctoral fellowship period is three years. The position entails a
compulsory workload of 10% per year for teaching and supervision duties.
Please see regulations concerning other conditions of employment for
research fellowship positions. The proposed starting date for the
fellowship is fall 2013.
## Qualifications and Personal Skills ##
In assessing the applications, special emphasis will be placed on the
quality and extent of previous research, relevance of scientific
qualifications for the announced fellowship position and personal
qualifications for team-work and international collaboration, the
quality of the project description, and on the assumed academic and
personal ability on the part of the candidates to complete the project
within the given time frame.
## Requirements ##
- A PhD degree in an area of relevance for research on sound-producing
actions.
- The doctoral dissertation must have been submitted for evaluation by
the closing date of the application.
## We offer ##
- Salary level 57-64 (NOK 473 400 - 538 700, depending on level of
expertise)
- Broad training possibilities within a stimulating academic environment
- Attractive welfare arrangements
- Favorable pension arrangement
## Submissions ##
Applicants must submit the following attachments with the electronic
application, preferably in pdf format:
- letter of application
- Curriculum Vitae
- list of publications
- project description of 3-5 pages, including a schedule of
activities. The project description must clarify how the applicant will
approach the post-doctoral project theme theoretically and methodologically,
and specify how the project will be completed within the given time frame.
Applicants who graduated from a foreign higher education institution are
expected to submit an explanation of their institution's grading system.
Please be aware that all documents should be in English or a
Scandinavian language. Please also refer to the Regulations concerning
Post-Doctoral Research Fellowships.
Qualified and short-listed applicants will be invited for an interview.
The University of Oslo has an agreement for all employees, aiming to
secure rights to research results.
According to the Norwegian Freedom of Information Act, information about
an applicant may be included in the public list of applicants, even if
non-disclosure has been requested.
The University of Oslo aims to achieve a balanced gender composition in
the workforce and to recruit people with ethnic minority backgrounds.
## General ##
Application deadline: 5 August 2013
Expected Start Date: Fall 2013
Reference number: 2013/6355
Contacts:
Administrative Head of Department Ellen Wingerei
Telephone: +47 22 84 44 28
fourMs project leader Professor Rolf Inge Godøy
Telephone: +47 22 85 40 64
Head of Department Alexander Refsum Jensenius
Telephone: +47 22 84 48 34
http://uio.easycruit.com/vacancy/982967/62048?iso=no
--
Alexander Refsum Jensenius, Ph.D.
Head of Department
Department of Musicology
University of Oslo
http://www.hf.uio.no/imv/english/