Interactive Music Notation and Representation Workshop

June 30, 2014 - 9:30 to 13:00
Goldsmiths, University of London, London, UK

@NIME 2014

www.nime2014.org

Call for participation

Computer music tools for music notation have long been restricted to conventional approaches and dominated by a few systems, mainly oriented towards music engraving. Over the last decade, driven by artistic and technological evolutions, new tools and new forms of music representation have emerged. The recent advent of systems like Bach, MaxScore or INScore (to cite just a few) clearly indicates that computer music notation tools have become mature enough to diverge from traditional approaches and to explore new domains and usages such as interactive and live notation.

The aim of the workshop is to gather artists, researchers and application developers, to compare the views and the needs inspired by contemporary practices, with a specific focus on interactive and live music, including representational forms emerging from live coding. Special consideration will be given to new instrumental forms emerging from the NIME community.

Registration

Workshop attendees must register for NIME for at least one of Tuesday, Wednesday or Thursday (see here); there is no separate fee for the workshop day. You must also send an email to dfober@gmail.com with your contact details to be registered for the workshop itself.

Program

  • 9:30
    • Introduction
    • Animated Notation Dot Com: 2014 Report
      Ryan Ross Smith
    • Timelines in Algorithmic Notation
      Thor Magnusson
    • Breaking the Notational Barrier: Liveness in Computer Music
      Chris Nash
  • 10:30
    • Quid Sit Musicus: Interacting with Calligraphic Gestures
      Jérémie Garcia, Gilbert Nouno, Philippe Leroux
    • Non-Visual Scores for Ensemble Comprovisation
      Sandeep Bhagwati
    • Interactive and real-time composition with soloists and music ensembles
      Georg Hajdu
    • A javascript library for collaborative composition of leadsheets
      Daniel Martín, François Pachet
    • (Pre)compositional strategies and computer-generated notation in surface/tension (2012) for oboe and piano or ensemble
      Sam Hayden
    • On- and off-screen: presentation and notation in interactive electronic music
      Pete Furniss
    • John Cage Solo for Sliding Trombone, a Computer Assisted Performance approach
      Benny Sluchin, Mikhaïl Malt
    • Deriving a Chart-Organised Notation from a Sonogram Based Exploration: TIAALS (Tools for Interactive Aural Analysis)
      Michael Clarke, Frédéric Dufeu, Peter Manning
  • 12:30
    • Discussion
  • 13:00 - end

Organisation

  • Dominique Fober - Grame - Lyon
  • Jean Bresson - Ircam - Paris
  • Pierre Couprie - IReMus, Université Paris-Sorbonne - Paris
  • Yann Geslin - INA/GRM - Paris
  • Richard Hoadley - Anglia Ruskin University - Cambridge

Contact

Please send enquiries to dfober@gmail.com

Detailed program

Animated Notation Dot Com: 2014 Report

Ryan Ross Smith - Rensselaer Polytechnic Institute

Animatednotation.com [ANDC] was created to document the emerging field of animated notation (an umbrella term encompassing real-time, spontaneous, moving, generative and related forms of music notation), to provide a platform that encourages discourse, and to place the field within a broader historical context. This brief talk will cover the precedents that motivated the creation of ANDC, and report on its current state and future directions.


Timelines in Algorithmic Notation

Thor Magnusson - University of Sussex

Computer code is a form of notational language. It prescribes actions to be carried out by the computer, often by systems called interpreters. When code is used to write music, we are therefore operating with a programming language as a relatively new form of musical notation. Here code and traditional notation are somewhat at odds, since code is written as text, without any representational timeline. This can pose problems, for example for a composer who is working on a section in the middle of a long piece but has to repeatedly run the code from the beginning, or make temporary arrangements to work around this difficulty in the compositional process. In short: code does not come with a timeline, but is rather the material used for building timelines. This article explores the context of creating linear “code scores” in the area of musical notation.
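
As a toy illustration of this last point (a minimal sketch in TypeScript, not drawn from the talk; all names are invented), the “code score” below plays nothing when run, but instead builds an explicit timeline of events that can then be entered at any point:

  interface TimedEvent {
    time: number; // seconds from the start of the piece
    note: string; // what to play at that time
  }

  // The "code score": running it plays nothing by itself; it builds an
  // explicit timeline that a scheduler (or a score display) can consume.
  function buildTimeline(): TimedEvent[] {
    const score: TimedEvent[] = [];
    let t = 0;
    const play = (note: string, dur: number) => {
      score.push({ time: t, note });
      t += dur;
    };
    play("C4", 1);
    play("E4", 1);
    play("G4", 2);
    return score;
  }

  // Because the timeline is now explicit, a composer can enter mid-piece
  // instead of re-running the code from the beginning:
  const fromSecondTwo = buildTimeline().filter(e => e.time >= 2);
  console.log(fromSecondTwo); // [{ time: 2, note: "G4" }]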


Breaking the Notational Barrier: Liveness in Computer Music

Chris Nash - Centre for Music and Science (CMS), Faculty of Music, University of Cambridge

There are many roles for notation in music. In one role, notation provides an asynchronous, persistent and formal mode of unidirectional communication between artists, performers, teachers and scholars. But in the creative process, low-fidelity forms of notation (e.g. sketching) provide a more informal, more interactive tool for composers to explore and experiment with new musical ideas. Accordingly, with the advent of interactive computer music systems, a disjunction has arisen between these roles and what we expect of digital notations and UIs. This talk focuses on the challenges and opportunities within digital music systems for supporting liveness and flow in notation-mediated interaction. Using real-world examples and findings from extensive user studies, it emphasises the central role of rapid, incremental feedback from the domain (e.g. sound, music). The benefit of such feedback is demonstrated through the example of tracker-style sequencers, where interaction with a text-based musical notation is driven by rapid edit-audition cycles that enable flow and a high level of liveness and engagement during composition. The Cognitive Dimensions of Notations framework (Green and Petre, 1996) is discussed in a musical context, and a new framework based on modelling the feedback loops within an interactive system (Nash, 2014) is introduced to highlight common paradigms in modern music tools such as sequencers and DAWs, discussing notation in the context of theories of liveness (Tanimoto, 1990) and flow (Csikszentmihalyi, 1996).
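
A minimal sketch of the tracker paradigm mentioned above (the row format and names are invented, not those of any particular tracker): the pattern is plain editable text, and playback simply steps through its rows at a fixed rate, which is what keeps the edit-audition cycle so short.

  // One pattern: each row holds a note name and an instrument number,
  // or "---" for an empty row (the previous note sustains).
  const pattern: string[] = [
    "C-4 01",
    "--- --",
    "E-4 01",
    "G-4 01",
  ];

  // Playback is a simple walk over the rows; editing a row and
  // re-auditioning it is therefore nearly instantaneous.
  function audition(rows: string[], rowsPerSecond = 4): void {
    rows.forEach((row, i) => {
      const note = row.split(" ")[0];
      if (note !== "---") {
        console.log(`t=${(i / rowsPerSecond).toFixed(2)}s play ${note}`);
      }
    });
  }

  audition(pattern);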


Quid Sit Musicus: Interacting with Calligraphic Gestures

Jérémie Garcia, Gilbert Nouno, Philippe Leroux - INRIA, UPSud, CNRS (LRI) - STMS Lab: IRCAM-CNRS-UPMC - McGill University & CIRMMT

Illuminated manuscripts of medieval music contain rich decorations in addition to handwritten neumatic notation. Our project investigated the use of such handwritten symbols in the composition of the piece Quid sit musicus by Philippe Leroux. We designed several tools that combine computer-aided composition with interactive paper to support both the composition process and the performance of the piece. Interactive paper creates new opportunities to interact with handwritten elements and explore computer-based musical processes. In this workshop, we will present how the composer used the calligraphic notation as compositional material and demonstrate the tools we developed for this project.

The interactive paper interface captures the composer’s handwritten gestures over the original manuscript and extracts both graphical and dynamic features before transmitting the data to external computer-aided composition environments. We expanded the composer’s library in OpenMusic and used a reactive extension of the software to create patches that respond to the pen interactions. The composer used this environment to design several processes that derive the piece’s harmony, rhythms, melodies and harmonic gestures from the pen data, with direct audio-visual feedback. By clicking over an existing form with the pen, the composer can select and explore the features of previously drawn shapes. We also created patches in Max to synthesize sounds and control the spatialization from the shapes’ data, such as their perimeters. We will use these patches for the performance of the piece, which will be premiered in June at Ircam. During this project, Philippe Leroux defined new meanings for the neumatic musical notation of the medieval manuscript. The interactive paper interface, combined with computer-based composition environments, allowed the composer to use handwritten symbols as gestures with dynamic properties and as interactive elements to trigger, control and explore musical processes.
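
As an assumed sketch (in TypeScript, not the project’s actual code) of the two kinds of features mentioned above: a graphical feature such as a shape’s perimeter and a dynamic feature such as average pen speed both follow directly from the pen’s sampled positions and timestamps.

  interface PenSample { x: number; y: number; t: number } // position and timestamp

  // Graphical feature: the perimeter (total path length) of a stroke.
  function perimeter(stroke: PenSample[]): number {
    let length = 0;
    for (let i = 1; i < stroke.length; i++) {
      length += Math.hypot(stroke[i].x - stroke[i - 1].x,
                           stroke[i].y - stroke[i - 1].y);
    }
    return length;
  }

  // Dynamic feature: average pen speed over the stroke's duration.
  function averageSpeed(stroke: PenSample[]): number {
    const duration = stroke[stroke.length - 1].t - stroke[0].t;
    return duration > 0 ? perimeter(stroke) / duration : 0;
  }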


Non-Visual Scores for Ensemble Comprovisation

Sandeep Bhagwati - Matralab - Montréal

A visual interface is not always the ideal method of conveying information to a musician – it may well be an unwelcome distraction. Many soloists in classical music learn their part by heart in order to concentrate fully on interpretation. Having to look at an interactive score interface may often occupy more of the player's attention than the actual shaping of sound or musical architecture. Indeed, in performances of interactive score-based music, the performer is visibly much more “glued” to the visual score interface than in traditional classical composed music or, obviously, in improvisation. The need for non-visual scores becomes obvious when we consider an ensemble of moving performers, or non-standard, reactive or reconfigurable spatial arrangements of musicians (as in several works by the author). After extensive (and still ongoing) exploration of interactive on-screen scores, the author has decided, in recent and upcoming research-creation projects, to explore the artistic viability of non-visual interaction with musicians: either with purely aural scores or via a body-suit score interface, to be developed with a major four-year research-creation grant recently awarded by the Canadian Social Sciences and Humanities Research Council. The talk will describe the intellectual and technical context of non-visual scores, the tools, and the current state of thinking about different approaches to live-scoring music (one of which is his comprehensive definition of comprovisation), and will give an insight into a collaborative, emerging research-creation project, inviting listeners to contribute to this aspect of interactive music notation.


Interactive and real-time composition with soloists and music ensembles

Georg Hajdu

In “Extreme Sight-reading, Mediated Expression, and Audience Participation: Real-Time Music Notation in Live Performance”, Jason Freeman described how real-time composition can be taken to new extremes: composers “waiting to create the score until during the performance”. In my presentation I shall describe several scenarios involving real-time composition, arrangement and/or part extraction to which classically trained musicians have been exposed. For this, I have integrated MaxScore (developed by Nick Didkovsky and myself) into my networked multimedia performance environment Quintet.net, allowing scores and parts to be delivered over the network and displayed on individual computer screens or tablets.

The compositions in question are three pieces of mine: Ivresse '84 (2007) for violinist and 4 laptop performers, schwer…unheimlich schwer (2009/11) for bass clarinet, viola, piano and percussion, and Swan Song (2011/12) for cello, percussion and multimedia. In contrast to the first two pieces, Swan Song uses a fixed score to control all aspects of the performance (musicians, audio and video); nonetheless, MaxScore's real-time capabilities were a boon during the rehearsals as changes could be made on the fly without the need to print new parts on paper, and annotations and suggestions could be fixed for future reference.

Furthermore, I shall focus on the current state of the art of real-time composition in man-machine networks, as well as on future developments affording participating musicians more control over the composition and editing process.


A javascript library for collaborative composition of leadsheets

Daniel Martín, François Pachet - Sony CSL - Paris

A leadsheet is a score composed of a melody (most of the time monophonic) and a grid of chord labels. Leadsheets are widely used to represent popular music songs, as found in jazz and bossa nova song books. Existing score editors can be divided into desktop-based editors for individuals (e.g. Sibelius and Finale) and web-based, socially oriented ones like NoteFlight (www.noteflight.com), a social network in which users can create and share scores. NoteFlight has a web-friendly score editor with which users can comment on and recommend scores, and edit them when they have permission. Chromatik is a social network for sharing scores, where scores can be uploaded in PDF format and annotated by users. However, these systems cannot be easily embedded in other applications or easily extended. Furthermore, they do not provide annotation tools for commenting on specific parts of a score.

The Leadsheet Editor (LE) enables users to edit and play leadsheets intuitively. The LE provides an interface for editing melodies and chord progressions, as well as a JSON format for representing and storing leadsheets. Users can make suggestions for modifications to the most critical parts of leadsheets, such as notes, chords, bars, chord transitions and structure (see figure). The LE is implemented in JavaScript and can be easily embedded in any web-based application, such as MusicCircles, developed in the Praise project. It has also been used in the FlowMachines project to enter over 10,000 jazz and bossa nova songs. The modifications made by users are stored as suggestions that the original author can later visualize and apply or discard. They can be accompanied by text or audio examples. The LE also proposes features for editing a leadsheet collaboratively.
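
The abstract does not specify the LE’s actual JSON schema; as a hypothetical sketch (in TypeScript, with invented field names), a leadsheet as defined above (a monophonic melody plus a grid of chord labels) and a targeted suggestion might look like this:

  interface LeadsheetBar {
    chords: string[];                         // chord labels in this bar
    melody: { pitch: string; dur: number }[]; // monophonic melody, dur in beats
  }

  interface Leadsheet {
    title: string;
    bars: LeadsheetBar[];
  }

  const song: Leadsheet = {
    title: "Example",
    bars: [
      { chords: ["Cmaj7"],     melody: [{ pitch: "E4", dur: 2 }, { pitch: "G4", dur: 2 }] },
      { chords: ["Dm7", "G7"], melody: [{ pitch: "F4", dur: 4 }] },
    ],
  };

  // A suggestion targets a specific element of the score rather than
  // the whole document, e.g. proposing a chord substitution:
  const suggestion = { bar: 1, chordIndex: 1, replaceWith: "Db7", comment: "tritone sub" };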


(Pre)compositional strategies and computer-generated notation in surface/tension (2012) for oboe and piano or ensemble

Sam Hayden - Trinity Laban Conservatoire of Music and Dance

My recent cycle of acoustic works (misguided (2011) for ELISION Ensemble; surface/tension (2012) for oboe and piano/ensemble, composed in collaboration with Christopher Redgate as part of his AHRC research project ‘New Music for a New Oboe’; and a new String Quartet (2013-14) composed for Quatuor Diotima) has involved different solutions to the same broad initial problem: using computer-assisted composition (CAC) techniques via OpenMusic to enable the proliferation of diverse surface materials whilst maintaining an underlying formal coherence. In the case of surface/tension, the composition and notation of entirely acoustic music is inseparable from CAC tools and purpose-built acoustic instruments. surface/tension evolved from a dialectical relationship between the unique sonic possibilities inherent in the new Redgate-Howarth oboe and the CAC techniques inherent in my (pre)compositional strategies, prototypes (OM patches) of which were used to compose misguided. The underlying material for surface/tension was the product of two distinct CAC techniques, each yielding a different kind of ‘found object’ which became a starting point for the piece: (a) the spectral analysis of multiphonics, used to generate inharmonic, non-tempered microtonal pitch fields (using AudioSculpt and OM); and (b) the algorithmic generation of artificial spectra (equal-tempered microtonal non-octavating scales) and complex structures of embedded rhythmical subdivisions. The collaborative process and the formalized (pre)compositional strategies directly shaped both the notation of hyper-virtuosic material and the approach to form, taking the piece in unanticipated directions. Towards the latter stages of surface/tension, OM was used to interpolate between the artificial OM-generated spectra and the pitch fields derived from multiphonics, related concepts that became central to the composition of the String Quartet. The use of such digitally mediated notational tools has helped to develop a specific conception of material, aiding the creation of new musical ideas, sounds and modes of expression beyond existing paradigms.
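
The interpolation itself was realized in OpenMusic patches; purely as an illustration of the underlying idea (a generic sketch in TypeScript, not the actual patch), two equal-length pitch fields can be interpolated pairwise in the log-frequency (cents) domain:

  // Convert between Hz and MIDI cents (A4 = 440 Hz = 6900 cents) so that
  // the interpolation is linear in pitch rather than in frequency.
  const toCents = (hz: number): number => 6900 + 1200 * Math.log2(hz / 440);
  const toHz = (cents: number): number => 440 * Math.pow(2, (cents - 6900) / 1200);

  // alpha = 0 yields fieldA, alpha = 1 yields fieldB; the fields are
  // assumed to be equal-length lists of frequencies in Hz.
  function interpolateFields(fieldA: number[], fieldB: number[], alpha: number): number[] {
    return fieldA.map((hzA, i) =>
      toHz((1 - alpha) * toCents(hzA) + alpha * toCents(fieldB[i])));
  }

  // Midway between a harmonic field and an inharmonic one:
  console.log(interpolateFields([220, 440, 660], [233.1, 466.2, 740.0], 0.5));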


On- and off-screen: presentation and notation in interactive electronic music

Pete Furniss - Edinburgh College of Art, University of Edinburgh

At an April 2014 performance at the Dialogues Festival in Edinburgh, I presented three works for clarinet and computer (one written, two improvised), each engendering a sense of interactive play in both performer and audience. Here I examine the communication of the material for each piece, as it was presented by the composer: firstly by a combination of text description, on-screen prompts and a GUI; secondly by traditional notation (with the aid of a guide stave, cue numbers and a sound file for testing); finally, by a stand-alone application. In each case I was fortunate to work in close personal contact with the composer, which afforded the significant benefit of discourse and contextualisation.

It has been observed (Berweck, 2012; Dudas & Furniss, 2014) that software interfaces for live electronic performance have tended, on the whole, to be designed for use by either the composer or a technical specialist. An increasing number of musicians are choosing to play with full, onstage control of electronic elements (where practical), leading to the need for a more nuanced approach, towards “expressive, higher-order music notations” (Stowell & McLean, 2013). Using example images, audio and demonstration at the clarinet, the presentation will deal with the requirements of a performer in managing synchronicity, audio input/output, trust, attention, peripheral vision and cueing within an interactive environment.

Andrew May's Ripped Up Maps (Fig. 1) is an interactive improvisation for any instrument and computer. It has a ‘score’ of musical and technical text instructions, and an on-screen graphical interface in Max/MSP. Here, a user-adapted ‘presentation mode’ in Max 6 provides enhanced visual feedback, clearly legible from a distance of 1.5-2 m from the laptop screen, achieved with blocks of distinct colour and large-format graphics on an anti-glare background. On-screen résumés of both software and score instructions are optionally available in the display. Again employing a personally adapted GUI, the interface for Richard Dudas' Prelude No. 1 for clarinet and computer (Fig. 2) clarifies the process of setting up audio and software, and provides clear and relevant visual feedback. A traditionally notated score is used here, with numbered cues, some of which were repositioned to provide accuracy and reassurance. Martin Parker's gruntCount (Fig. 3) is presented as a stand-alone application, with a graphical interface showing various plotted journeys through pre-selected sound-processing modules, as well as software and audio settings and a non-linear timeline, controlled by the sound of the instrument at a set threshold. There are several versions of the piece built into the software, to which further adjustments can be made. Performers may also create new plots, effectively providing the potential to expand the app into a tool for collaborative composition.


John Cage Solo for Sliding Trombone, a Computer Assisted Performance approach

Benny Sluchin, Mikhaïl Malt - IRCAM

John Cage left an abundant and varied body of work, which remains enigmatic, contested and not entirely understood. During the 1950s, in order to define the process of his thinking, he employed three different terms: indeterminacy, unintentionality and chance. The first two refer to the unintended result of a performance. A computer interface designed to assist performances in which these notions are the essence will be demonstrated.
The Concert for Piano and Orchestra (1957-58) is one of Cage's most significant works. The instrumental parts of the “orchestra”, composed of isolated events that are to be put together “according to a program” and following the composer's instructions, are especially difficult to approach.
The presentation will focus on the trombone part of the Concert, using a CAP (Computer Assisted Performance) interface that follows the composer's concepts and the structure of the work. The performance of two short examples will be included. Each instrument part is a collection of punctual musical events (see Figure 1), displayed on 12 music sheets. Each event is a compound one, having relative pitch, dynamics, playing modes and other indications. The events are distributed variously on each sheet, from extreme density (a system with around 10 events) to empty staves. Analyzing Cage's instructions and his musical and aesthetic points of view, we realize that traditional scores make the player's performance difficult, especially regarding the “unintentional” choice of different musical objects. The material and sequential nature of paper scores is an obstacle to the realization of Cage's main idea: that the player could go freely, without constraint and without intention, through the score. Three ways of displaying the music will be shown. A further possibility to organize the events according to logical criteria is available. The memorization of a “version” is also a way to test and analyze one of the infinite possible realizations of the work.
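
As an illustrative sketch of such an “unintentional” traversal (in TypeScript; the event counts and data layout are invented, and this is not the CAP interface's code), the events of the 12 sheets can be pooled and visited in a chance ordering that no paper page sequence constrains:

  interface CageEvent { sheet: number; id: number } // one punctual notated event

  // Flatten the 12 sheets (of varying density, including empty ones)
  // into a single pool of events.
  function buildPool(eventsPerSheet: number[]): CageEvent[] {
    return eventsPerSheet.flatMap((n, sheet) =>
      Array.from({ length: n }, (_, id) => ({ sheet: sheet + 1, id })));
  }

  // One "version": a chance-ordered walk through the pool (Fisher-Yates
  // shuffle). Memorizing (storing) a version allows it to be tested and
  // analyzed later, as described above.
  function unintentionalVersion(pool: CageEvent[]): CageEvent[] {
    const order = [...pool];
    for (let i = order.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [order[i], order[j]] = [order[j], order[i]];
    }
    return order;
  }

  // Hypothetical per-sheet event counts, from dense systems to empty staves:
  const version = unintentionalVersion(buildPool([10, 0, 7, 3, 12, 5, 0, 8, 2, 9, 4, 6]));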


Deriving a Chart-Organised Notation from a Sonogram Based Exploration: TIAALS (Tools for Interactive Aural Analysis)

Michael Clarke (1), Frédéric Dufeu (1), Peter Manning (2) - (1) CeReNeM, University of Huddersfield - (2) Durham University

The TaCEM project (Technology and Creativity in Electroacoustic Music), funded by the AHRC for 30 months (2012-2015), investigates eight case studies from the electroacoustic repertoire. Additionally, a generic software environment is being developed: TIAALS (Tools for Interactive Aural Analysis). Traditional modes of notation often prove limited for representing electroacoustic music and its structural elements. TIAALS enables its users to build interactive aural analyses of any recorded work. Like Sonic Visualiser or EAnalysis, TIAALS builds a sonogram of the work under consideration. This facilitates a fully interactive exploration of the piece through a time-frequency representation, as well as the graphical creation of interactive sound objects and structures (figure 1).
One important specificity of TIAALS is its chart maker, on which the user can arrange objects extracted from the sonogram representation, hence enabling a form of notation for the relevant sound objects and structures that is independent of their original time and frequency bounds (figure 2). Objects represented on the chart are fully interactive, so that the user can elaborate and present a representation both visually and aurally, in both the sonogram and chart-maker views.
Developed in Cycling '74's Max, TIAALS integrates well with the other pieces of software developed for specific case studies, and its future releases will expand on relevant representation features. This proposal aims at presenting, in the context of the workshop, the current beta version of TIAALS and at opening a discussion of its further perspectives regarding the notation of music that exists primarily in sound rather than in a score.
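
TIAALS itself is developed in Max; purely for reference (a generic sketch in TypeScript, not TIAALS code), the standard technique behind any sonogram view is short-time Fourier analysis, yielding one magnitude per time frame and frequency bin:

  // samples: mono audio; returns one magnitude column per analysis frame.
  // A naive DFT is used for clarity; real tools use an FFT.
  function sonogram(samples: number[], frameSize = 1024, hop = 512): number[][] {
    const columns: number[][] = [];
    for (let start = 0; start + frameSize <= samples.length; start += hop) {
      const mags: number[] = [];
      for (let k = 0; k < frameSize / 2; k++) {        // bins up to Nyquist
        let re = 0, im = 0;
        for (let n = 0; n < frameSize; n++) {
          const hann = 0.5 - 0.5 * Math.cos((2 * Math.PI * n) / frameSize);
          const phase = (-2 * Math.PI * k * n) / frameSize;
          re += samples[start + n] * hann * Math.cos(phase);
          im += samples[start + n] * hann * Math.sin(phase);
        }
        mags.push(Math.hypot(re, im));                 // one time-frequency cell
      }
      columns.push(mags);
    }
    return columns;
  }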


AFIM INEDIT ANR EFFICAC(e) NIME2014
