BEAM 2011 had three Open Call opportunities for artists wishing to share their work at this year’s festival. Categories for these opportunities were:
OPEN SPACE – A platform for artists to exhibit or perform their own work at the festival. All OPEN SPACE submissions had to refer to the broad themes of music & technology, interactivity, electronic music, music produced through unusual interfaces (including sonic robots!) and homemade instruments.
BEAM TRAIN – Podcasts, recordings and interactive games that will be available through the BEAM website to captivate and entertain festival-goers on their tube journey from central London to Uxbridge Station.
BEAM AV Gallery – A category for anyone who wanted to share related work but couldn’t get to the festival. This work will be uploaded to the BEAM AV Gallery.
All installations and Open Space works were open 5-7.45pm Friday, 1-7pm Saturday and 10am-1.45pm Sunday.
The BEAM BURSARY category was for work we felt was outstanding, that was totally in line with the concepts of BEAM: exploring music that moves, experimenting with new ways to produce sound and works which attempted to engage with the audience in fun or innovative ways. These artists will also present in the pecha kucha session on Saturday morning (starts 11am).
BEAM Bursary Performances
- Marco Donnarumma – Music for Flesh II
- Christian Bannister – Subcycle Labs
- Christos Michalakos – Frrriction: drums + live electronics
BEAM Bursary Installations
- Arne van den Berg – A self-playing sitar
- Ed Wright – Hopscotch: an interactive sound game
- Jack F. X. Pavlik – 10 Waves: an incredible kinetic sculpture
- Mike Cook – Arduinocaster, Chaotic Pendulum and Hexagonal Monome
Arne van den Berg brings an installation from The Hague: the self-playing sitar. Here’s his description from his vimeo channel: The idea is to make computer music with an analogue output and a live, visual performance. Historically we have seen mechanical instruments like the pianola or the barrel organ, and composers (Xenakis) who generated their scores with the computer, to be studied and played by musicians. This piece is a reaction to those phenomena. In modern times it is possible, and challenging, to replace the musician with a computer. And so the sitar is played by electromotors, which are switched and programmed with an Arduino. In order to make the piece less dead than the computer itself, I built some chance factors into it, using elastic wires between the balls and the motors. This generates chaotic behaviour, and as a result the piece never sounds the same twice.
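The chance element he describes, a programmed switching pattern perturbed by the elastic coupling, can be sketched in miniature (hypothetical Python standing in for the Arduino code; the pattern and jitter values are illustrative, not taken from the installation):

```python
import random

def strike_times(pattern, jitter=0.05, seed=None):
    """Simulate the sitar's motor strikes: a fixed programmed pattern
    (seconds between strikes) perturbed by the elastic-wire coupling,
    modelled here as random timing jitter."""
    rng = random.Random(seed)
    t = 0.0
    times = []
    for interval in pattern:
        # each strike lands near its programmed time, but never exactly on it
        t += interval + rng.uniform(-jitter, jitter)
        times.append(round(t, 3))
    return times
```

Run twice with different seeds and the "score" comes out differently each time, which is the point of the piece.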
Subcycle Labs: The Subcycle instrument is a series of gestural processes and techniques that allows the musician to shape audio and visual aspects of live performance, simultaneously and in real time. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound directly and visually. Ultimately this project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides. Subcycle Labs is bridging the gap between sound visualization and musical instrument. The project is focused on bringing the techniques and processes of electronic music and studio production to live performance.
Christian Bannister is the founder and creative director of Arxi Creative. He has been conceptualizing, designing, and developing award-winning interactive experiences for over fifteen years. He became interested in new media and interactive design as a crossover from his experience composing and performing electronic music in the early 90’s. His process has evolved to include both visual design and development. In 2001 Christian was awarded an MFA from the UCLA department of Design | Media Arts. The program helped him discover his interest in site-specific interactive projects and installations, develop his formal skills, and break away from more conventional industry processes. In 2006 Christian relocated to Portland to work with Second Story Interactive Studios, where he found new challenges developing rich media for exhibit design and architecture, working on projects as diverse as the Library of Congress in Washington DC and the Grammy Museum in Los Angeles. In 2010 Christian formed Arxi Creative to explore his interest in creative coding and installation projects. Documentation online: project blog: http://www.subcycle.org/ | professional website: http://www.arxi-creative.com/ | flickr set: http://www.flickr.com/photos/16552313@N00/4288846805/in/set-72157626593767729/lightbox/#/photos/16552313@N00/5719560044/in/set-72157626593767729/lightbox/ | video documentation: http://www.vimeo.com/user2148150/videos
BEAM thought Christos Michalakos’s massively energetic and inventive combination of drums and beautifully controlled electronics sounded like it would make an amazing LIVE PERFORMANCE at BEAM 2011.
Frrriction by Christos Michalakos
Christos Michalakos (b. 1983) is a composer and improviser hailing from northern Greece. Working predominantly with percussion and live electronics in recent years, his music explores the relationship between the acoustic and electronic sound worlds, either shifting exclusively between one or the other, or intricately merging timbres to create a unique sonic experience. A keen improviser, he can be found performing assertive solo works, or collaborating on projects ranging from a percussion/piano/electronics based duo with Lauren Hayes, to large-scale free jazz and improvisation ensembles such as Edimpro, the University of Edinburgh’s free improvisation orchestra. His collaboration with visual artist Parag K Mital has resulted in numerous audio-visual works, most notably Polychora I (2009) performed at Soundings Festival, and The Trial (2010) performed at the Dialogues Festival, Edinburgh. Whether exploring percussion performance or detailed digital sonic landscapes, his sound is unified by its crispness and rhythmic suggestion. This is achieved through an examination of methods for developing and augmenting his drum kit, and forms part of his PhD research at the University of Edinburgh under the supervision of Dr. Michael Edwards. Performances in 2011 include Sound Thought Festival (Glasgow), Soundings (Edinburgh), Sonorities Festival (Belfast) and NIME (Oslo).
Hopscotch: Installation for motion tracking and audio
The game of hopscotch exists in a number of forms in many cultures: the unifying theme is movement through a space marked out on the ground. In some cases it is a simple counting game or form of exercise; in others it is seen as symbolizing the quest for spiritual enlightenment, or even as an allegory of reincarnation. Experiences range from the playful to the profound, the functional to the transcendental, often taking the individual out of society whilst at the same time uniting those playing and watching. This installation uses 14 sound objects (triggered in 2 lines of 7), a simple webcam, and a computer program I developed myself to create a physical and auditory ‘play’ area. It can also be a lot of fun… Watch a previous participant playing Ed’s hopscotch here
Ed Wright was born in Buckinghamshire, England in 1980. He finished his PhD in music with Andrew Lewis at Bangor University in July 2010, where he currently lectures. His work is mainly focused toward the electro-acoustic end of the musical spectrum, although he writes for and plays real instruments as well. He performs on violin, viola and voice, as well as laptop, and works with a number of school/student groups promoting performance of both older and more modern music. Recent highlights include a ‘mention’ in the Prix Bourges for his piece Con-chords, a number of commissions including a 3-day sound installation piece in the dark solitary confinement cell of Ruthin gaol, airplay on BBC Radio 1 and S4C television, and a gig in an old gunpowder factory. Ed lives in North Wales (U.K.) with Emma, their daughter Alena (3 years old at the time of writing), Ben the dog and Bess the cat. Ed’s music is available on the Blipfonica label.
Jack F. X. Pavlik – 10 Waves: an incredible kinetic sculpture
“In sculpture you can have an idea, an exact idea, but the final result will come from experimentation and the limits of the material. In this mutual give and take, a transformative process takes place, where there is an evolution of both form and idea. In my work I combine flexible and rigid materials, materials that merge a biomorphic, surreal pattern of black shapes and lines with geometric metal frames and structures that work at the same time to move, and contain, the active forms of the piece, creating what I hope is a humorous, surreal form of sculpture”.
Jack Pavlik is from Minneapolis, United States. He studied sculpture at the University of Minnesota in Minneapolis, is currently a graduate student at the Minneapolis College of Art and Design, and has exhibited in the United States and abroad, with exhibitions in Germany, Ireland, and England. He is currently represented by the Kinetica Museum in London, England.
Music for Flesh II (Marco Donnarumma, 2011–present) is a solo sonic piece for augmented muscle sounds, demonstrating an experimental coupling between the unheard sounds of muscle gestures and corresponding sound synthesis played back through loudspeakers. In Music for Flesh II the only musical instrument available to the performer is his own body. The performer’s voluntary muscle contractions produce kinetic energy: an acoustic sound which is captured by the Xth Sense biosensor and deployed as the only sonic source. At the same time, the biological signal undergoes feature extraction, which provides several control parameters for the real-time processing of the muscle sounds. The performer uses his body not only as a controller but, more importantly, as a real musical instrument; he creates music in real time by exciting his muscle fibres. This paradigm attempts to inform classical gestural control of music and musical performance itself. During the execution both performer and listeners can perceive an authentic auditory and cognitive intimacy; the neat and natural responsiveness of the system prompts a suggestive and unconventional coupling of sound and gesture. Performances and talks in 2011 include Non-Bio Boom at Inspace, Imagine Create Festival, Linux Audio Conference, ICMC and the 4th Pure Data Convention.
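One simple example of the kind of feature extraction described, an RMS envelope computed over successive frames of the captured muscle sound, can be sketched as follows (hypothetical Python; the Xth Sense extracts several features, and this is only one illustrative possibility):

```python
def rms_envelope(signal, window=64):
    """Turn a raw (muscle-sound) signal into a slowly varying control
    envelope by taking the RMS of successive non-overlapping windows.
    An envelope like this could then drive a processing parameter."""
    env = []
    for i in range(0, len(signal) - window + 1, window):
        frame = signal[i:i + window]
        env.append((sum(x * x for x in frame) / window) ** 0.5)
    return env
```

A stronger contraction yields louder frames and so a higher envelope value, which is the sense in which the biological signal provides control parameters.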
A new media and sonic artist, performer and teacher, Marco Donnarumma was born in Italy and is based in Edinburgh, UK. His works are regularly presented internationally. He has exhibited and performed in 25 countries across South America, Europe, India, China and Australia (among others: Venice Biennale, WRO Biennale, Némo, Mapping, Piksel, Re-New, Laboral, Pure Data Convention, Linux Audio Conference, EMAF, Visionsonic, Live!iXem, Carnival of e-Creativity, Netaudio). His projects have been featured on Wired, We Make Money Not Art, Rhizome, Turbulence.org and Micro Art in China. He received a BA (cum laude) in New Technologies for Arts and Performance at the Venice Academy of Fine Arts, Italy. Presently, he investigates experimental paradigms for embodied interaction in performative environments at The University of Edinburgh, supervised by composer and sound artist Dr. Martin Parker. Donnarumma teaches workshops and gives talks for international academic institutions and venues on a regular basis – including NK Berlin, University of London, Glasgow Centre for Contemporary Arts, UPR Universidad de Puerto Rico, Trinity College, NUI Maynooth and Ulster University in Ireland, Gotland University Institute for New Media Art and Technology in Sweden, Academy of Fine Arts of Brera and Pierluigi da Palestrina Conservatory in Italy.
Mike Cook makes things. Here he is in his own words and below are some links to his creations…
I spent 21 years as a Physics lecturer at Man Met University until they closed down the Physics department in 1998 and I had to get a proper job. During my time lecturing I wrote for computer magazines, mainly Micro User and Acorn User, over a period of 15 years, producing over 200 electronics construction articles. I then went on to work for Pace PLC designing digital set-top boxes, and then for PAC designing access control security products. I am now working as a freelance consultant in electronics art and medical electronics.
Chaotic Pendulum – Most pendulums are regular, but the addition of magnets to nudge the pendulum makes this one produce chaotic motion.
Hexagonal Monome – The well-known monome is an undedicated controller whose function depends on the host software. By constructing a version with a hexagonal layout in place of the more conventional Cartesian layout, new possibilities are opened up for sequencing and keyboard mapping.
Arduinocaster – Looking vaguely like a guitar, the Arduinocaster is an Arduino-based MIDI instrument modelling a six-string guitar. An easy method of chord control and string picking ensures this instrument is easy to play.
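The difference a hexagonal layout makes comes down to neighbourhoods and pitch mappings: each cell has six neighbours instead of four, so sequences can branch in more directions. A minimal sketch (hypothetical Python; the pitch mapping is purely illustrative, not Mike's):

```python
def hex_neighbours(q, r):
    """Six neighbours of a cell in axial hex coordinates, versus the
    four orthogonal neighbours of a Cartesian monome grid."""
    return [(q + dq, r + dr) for dq, dr in
            [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]]

def hex_to_midi(q, r, base=60):
    """One possible keyboard mapping: a fourth (5 semitones) along one
    axis and a major third (4 semitones) along the other."""
    return base + 5 * q + 4 * r
```

With a mapping like this, walking a sequence to any of a cell's six neighbours gives a different, consistent interval, which is the kind of new sequencing possibility the hexagonal layout opens up.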
BEAM OPEN SPACE 2011
BEAM OPEN SPACE was created as a platform for festival goers to exhibit or perform their own work at BEAM. All OPEN SPACE submissions had to refer to the broad themes of music & technology, interactivity, electronic music, music produced through unusual interfaces (including sonic robots!) and homemade instruments. Below are the successful Open Space artists, whose work you can see throughout the BEAM weekend.
Open Space Installations
- Mogees – Bruno Zamborlin
- Bjoern Erlach & Dohi Moon – Pulb
- Monomatic – Modular Music Box
- Mor Bakal – Sonic Sketch
Open Space Performances
- Mute Transcript
- Neal Spowage – Electronic Dumbell
- Stuart Freeland – Shubunkin
- William Cheshire
- Roger Thomas – SARAH (original software)
- Jag – DIN is noise (software demo)
Bruno Zamborlin – MOGEES Mosaicing Gestural surface
Mogees is an interactive gestural surface for real-time audio mosaicing. When the performer touches the surface, Mogees analyses the incoming audio signal and continuously looks for its closest segment within a sound database. These segments are played one after the other over time, a technique called concatenative synthesis. For instance, with a series of voice samples loaded, a graze on the surface could correspond to a whisper while a scratch would trigger more shouted sounds.
The wooden surface can be “played” with any tool, including the hands, and Mogees will always try to find a corresponding sound. It can also be applied to other sound sources such as the voice or acoustic/electric instruments.
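The lookup step of concatenative synthesis described above can be sketched as a nearest-neighbour search over stored feature vectors (hypothetical Python; the real Mogees analysis and segmentation are far richer than this):

```python
def closest_segment(features, database):
    """Given the feature vector of the incoming audio frame, return the
    database segment whose stored features are nearest by Euclidean
    distance. Playing the winners one after another over time is the
    'mosaicing' in audio mosaicing."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(database, key=lambda seg: dist(features, seg["features"]))
```

So a gentle graze, whose analysis frame sits near the quiet end of the feature space, retrieves whisper-like segments, while a scratch retrieves the shouted ones.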
Mogees has been developed in collaboration with Norbert Schnell and it is currently used in the Airplay project by the IRCAM composer Lorenzo Pagliei.
DIN is noise
S. Jagannathan is a free software maker from Chennai, India living in London, UK.
din is a free software musical instrument for GNU/Linux that lets you use your computer mouse as the bow of a traditional instrument such as the Sarangi. You can draw and sculpt Bezier curve waveforms, create gating and modulation (FM and AM) patterns, and create delay feedback and volume patterns. You can also create an unlimited number of drones and sculpt their waveforms. It uses JACK to output audio, and supports MIDI, OSC and an IRC bot for input. din can be extended and customized with Tcl scripts.
Some goals of the din project: * be a high quality free software musical instrument on a totally free platform like GNU/Linux; * use readily available input devices like the computer mouse & keyboard to make high quality music.
Website: http://dinisnoise.org | Screenshots: http://www.dinisnoise.org/screenshots/ | Selected videos: din drones demo (143 simultaneous voices created from scratch): http://vimeo.com/19017469 | microtonal chord progressions: http://vimeo.com/20054608 | din new users guide: http://vimeo.com/19391709
Grégoire Mariault and Anna Louise Hale: HeadBox #1 + #2
“HeadBox #1” & “HeadBox #2” (narrative-based experimental sound pieces) is an installation I have been making in collaboration with Anna Louise Hale for her degree show; she is studying fine art at Byam Shaw and graduates in June. It is about “space, time and the politics of location”. Materialised as two head boxes (meant to be in different spaces) with two different narrative sound stories playing inside, it is an exploration of the home, based on a journey around its functional areas. Both music pieces were inspired by living spaces (the kitchen and the bathroom) and were first made from realistic sound recording and mixing. They were then modulated and mastered in order to give a different perception of reality. Everything was done in Ableton Live using the basic effects and some VST plug-ins with a keyboard.
People passing by will be able to put their head inside the box’s darkness (a symbol of the mind) in order to experiment with and focus on the concept of “interior and exterior”. The kitchen piece is based around the idea of cooking and how the kitchen is used individually (opening cupboards, pouring rice, washing hands and so on), with occasional other voices heard in the background. The bathroom is more of a personal space, and its piece is based on water: showering, going to the toilet, brushing teeth and so on.
Biography: My name is Grégoire Mariault; I am 21 years old and French. I was born in Chartres, close to Paris, on the 14th of May 1989. I studied engineering at high school and then graphic design for 3 years in Paris before moving to London in September 2010. This gap year has been an opportunity for me to focus mainly on sound design, music and illustration, spending 9 months working on different personal projects prior to my enrollment at Central Saint Martins College of Art & Design in September 2011. I started learning viola around the age of 7 at the music conservatory of Lucé, and played in Germany and France with a symphony orchestra around the age of 14. I still practise the viola, as well as guitar and piano; music has always been part of my free time since childhood, until I realised that doors can be opened, and the rest of the story is all about walking through new spaces and perspectives towards different goals.
Monomatic – Modular Music Box
The Modular Music Box consists of several interconnected, plug’n’play devices that collectively reproduce the functionality of the familiar 19th century clockwork musical instrument. At the heart of the piece is a custom-made electro-magnetic rotary sequencer. Melodies are stored on interchangeable, 10” acrylic disks embedded with small magnets arranged in a regular circular grid. By rotating these over a ‘play head’ of magnetic field sensors the device effectively replicates but supersedes the set of pins on the revolving cylinder that pluck the tuned teeth of a steel comb in the traditional instrument.
Additional units include a self-contained and controllable sound source (to hear and effect the musical output); a clockwork-like key (to ‘wind’ the device so it can ‘play’ the melody); and an animated representation of a dancing ballerina automaton – realised as a modern-day interpretation of the praxinoscope – a popular visual parlour toy of its era. Inspired by the design of second-generation monome.org controllers these modular components draw on their minimalist aesthetic and a similar restricted material palette of walnut, brushed aluminium, translucent acrylic and orange LEDs.
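The rotary sequencer at the heart of the piece can be modelled as a play head scanning magnet positions on a rotating step grid (a hypothetical Python sketch of the principle, not the actual hardware or its sensor protocol):

```python
def play_disk(disk, steps_per_rev=16):
    """Model the electro-magnetic rotary sequencer: `disk` is a set of
    (step, track) positions where a magnet is embedded in the acrylic.
    As the disk rotates past the play head of field sensors, each
    angular step yields the tracks (notes) that fire, just as the pins
    of a music-box cylinder pluck the comb's teeth."""
    for step in range(steps_per_rev):
        yield sorted(track for s, track in disk if s == step)
```

Because the disks are interchangeable, swapping a disk swaps the melody, which is exactly how the cylinder worked in the 19th-century instrument.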
The work explores themes such as the ‘materialisation of data’, ‘the tactile digital’ and the ‘invisibility of technology’ while attempting to recapture something of the sense of craft and workmanship, refined aesthetics and genteel appreciation of the iPods of their day.
The Modular Music Box was first exhibited at Analogue is the New Digital, AND Festival, Manchester, UK in October ’10 and has since been shown at Kinetica Art Fair, London, UK in February ’11; Maker Faire, Newcastle, UK in March ’11; and most recently at the Roundhouse, London, UK as part of the Sonic Maze exhibition of Netaudio London in May ’11.
Monomatic is a collaboration, experimental playground and halfway house between the work of Nick Rothwell and Lewis Sykes.
NICK ROTHWELL is a composer, performer, software architect, programmer and sound designer. He has built performance systems for projects with Ballett Frankfurt, Vienna Volksoper and Braunarts, and has worked at STEIM (Amsterdam), CAMAC (Paris) and ZKM (Karlsruhe). He has composed soundtracks for choreographers Aydin Teker (Istanbul) and Richard Siegal (Laban Centre), and has performed with Laurie Booth (Dance Umbrella, New Territories), and at the Different Skies Festival (Arcosanti, Arizona), the ICA, and the Science Museum’s Dana Centre. He is currently working in software for Wayne McGregor|Random Dance (Sadler’s Wells), in sound for body>data>space alongside CIANT (Prague) and Kibla (Slovenia), and in audiovisuals for the current Future of Sound tour.
LEWIS SYKES is a musician and music technologist, interaction designer, digital media producer and curator, experimental visuals enthusiast, and qualified Youth & Community Worker specialising in the Arts. A veteran bass player of the underground dub-dance scene of the 90s, he performed and recorded with acts such as Emperor Sly, Original Hi-Fi, Somatik, Pfink and Radical Dance Faction, was a partner in the respected underground dance label Zip Dog Records, and more recently played with the progressive AV collective The Sancho Plan. Lewis is Director of Cybersonica – an annual celebration of music, sound art and technology, now in its eighth year – and between 2002 and 2007 was Coordinator of the independent digital arts agency Cybersalon, founding Artists-in-Residence at the Science Museum’s Dana Centre. Lewis has just started a practice-led PhD at MIRIAD, Manchester Metropolitan University, exploring the aesthetics of sound and vibration.
Mor Bakal – Sonic Sketch
Mor Bakal is just graduating from the Goldsmiths BA Design course. Her piece, Sonic Sketch, is a meeting of sound and visual art and was made with musical and technical support from Itay Cohen.
M.B.: ‘Sound is an integral part of our everyday life. It affects us psychologically, physiologically, behaviourally and cognitively, whether we’re aware of it or not. Through an interactive installation, the project complements drawings with a sonic gesture, which in turn enables the user to explore their actions in an added dimension.’
The ‘Mute Transcripts’ are a set of improvisations for acoustic instruments and laptop that explore various ways of creating different perspectives from a set of live acoustic gestures. Each ‘Transcript’ uses an acoustic instrument as its single source of material, and the BEAM event will use the piano as its input.
‘Mute’ is a set of code written in MaxMSP. At its core is a set of empty buffers that feed off the live instrumentalist, only to be regurgitated into the sonic texture in various stages of digestion. Both the aural and visual triggers sent from the performer to the audience are tweaked and refracted in this fashion, hence the title ‘Mute’: the patch has no voice of its own; it can only communicate through the gestures of another, through ‘Transcript’-ion.
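The empty-buffer idea can be sketched as a buffer that is silent until it has been fed and can only replay transformed versions of what it has heard (hypothetical Python standing in for the MaxMSP patch; reversal stands in for one possible "stage of digestion"):

```python
class MuteBuffer:
    """An initially empty buffer that records the live player and can
    only 'speak' by regurgitating that material later, transformed.
    With nothing recorded, it has nothing to say."""
    def __init__(self):
        self.frames = []

    def record(self, frame):
        """Feed the buffer a frame captured from the live instrumentalist."""
        self.frames.append(frame)

    def regurgitate(self):
        """Return the recorded material in a digested form (here, reversed)."""
        return list(reversed(self.frames))
```

Until the pianist plays, the patch is mute; everything it later contributes is a refraction of the pianist's own gestures.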
Thomas Byrne is a composer and pianist who works with both acoustic and electronic sound. Thomas studied music at King’s College London graduating with a first-class degree (2009), and subsequently completed his MMus at the same college with a specialization in acoustic composition. He also participated in the Acanthes 2010 summer school where he studied electronic composition with Emmanuel Jourdan and Mikhail Malt from IRCAM.
During this time he has enjoyed numerous performances and recordings of his compositions with the Lontano Ensemble and the student ensemble Stranded, in addition to various other solo performances. Other projects have included a collaboration with Diverse Deeds, a London-based poetry and music group which juxtaposes spoken poetry, mixed-media works and improvised music.
Thomas’ music draws inspiration from the study of environmental and biological rhythms, and how their interaction can trigger sets of reactions or patterns. Consequently his programming focuses on creating live, artificially intelligent software that manipulates and reacts to acoustic sound.
Louis d’Heudieres graduated from King’s College London in 2009 with a 1st in Music, specialising in composition and winning the Purcell Prize. He now works as a freelance composer, sound designer, and music teacher. His works have been performed by the Lontano Ensemble and his first opera was premiered at Grimeborn Festival 2010. In March 2011 he was sound designer for Daisy Evans’ Silent Opera project, showcased as part of the Old Vic Young Voices ‘Coming Up’ programme in London.
His current work aims to explore the new means of musical interactivity and expression afforded by the increasingly complex electronic devices and processes available to us today. He starts a Masters course in composition at the Royal College of Music this September.
The Mute Transcripts were performed to great acclaim at an event organised by NonClassical in May 2011.
Neal Spowage – Electronic Dumbell
This performance of the Electronic Dumbell musical instrument has been developed with two choreographers, Angélina Jandolo and Danai Pappa to demonstrate new possibilities of the Dumbell using their expertise in performance and gesture. Neal will also be able to give practical demonstrations of the Dumbell throughout the weekend. The Electronic Dumbell was completed at STEIM in September 2010 and first performed for a devised improvisation in November 2010 at the Phoenix Square in Leicester.
Neal’s project was led by the intention of building an electronic musical instrument that takes advantage of the gestural expressions of a musician. It is not a controller, an interface or an extended instrument: it is a self-contained musical instrument. It is constructed using bricolage, hardware hacking, circuit bending, D.I.Y. electronics, iterative design and sculptural presence. Some components are shop-bought or machined, such as the electronics and the speaker mounting plates; others are found or readymade objects, such as the central handle (a piano key), the amplifier, the speakers (from the same electric piano as the piano key) and the lampshades.
There are two independent oscillators, one for each speaker. Their default oscillation is altered when the instrument is moved, triggering short circuits. The musician can tip and spin the Dumbell, like the mace of a drum major, and can move around and interact with the performance space, taking advantage of acoustic reflections. Tightening and loosening one’s grip on the handle alters the pitch of the generated sound. The sounds are high, bird-like, and have the feel of a modern ‘chip’ instrument.
Neal Spowage was born in Scunthorpe and lives in Leicester. He is a musician, artist and part-time lecturer at De Montfort University. He is currently studying for a PhD in Music Technology under Dr. John Richards and Professor Simon Emmerson. Neal gained a Bachelor’s degree at DMU in Multimedia Design in 2002, and a Master’s degree in Music, Technology and Innovation in 2008. For the past four years he has played with and composed for the Dirty Electronics Ensemble at De Montfort University. He has given seminars, presentations, artist talks and performances at De Montfort University, STEIM, the University of East Anglia, Royal Musical Association study days and The Sheffield Access Space. His musical instruments were exhibited at the Solder Soldiers 2 exhibition, held at the Fabrika Gallery in Leicester from 14th to 26th June 2010. He released an album with the ‘Screaming Banshee Aircrew’ in 2009 called ‘SUGAR’. He now plays and records in an alternative rock band/art project called ‘Luxury Stranger’, who have recently toured Europe with the Chameleons Vox and will be releasing new recordings in the next year.
He has had various jobs in his life including Chicken Chaser, Pizza Chef, Cross-Stitch Pattern Designer, Assistant Antiques Dealer, Assistant to the Scunthorpe Community Musician, and a Military Aeronautical Graphic Designer.
Pulb by Dohi Moon and Bjoern Erlach (CCRMA)
Pulb is an installation in which a piano soundboard is played by a machine that drops water onto the strings to excite them. The strings respond with very soft vibrations to the impact of the drops. These vibrations are picked up with piezoelectric discs and amplified through speakers. The resulting sounds are very different in character from plucked piano strings or strings struck by a hammer. By adjusting the placement of the pickups, different sounds can be produced, ranging from loud percussive hits to very soft, harp-like tones. We searched for interesting constellations of the pickups while having the machine drop water in different patterns. Much of the instrument’s charm is due to the element of chance and the non-uniformity of the sounds produced by the individual strings. The path the water takes through the air onto the soundboard varies slightly from drop to drop; sometimes the drops split and hit the strings in more than one position. Instead of trying to achieve the precision of a sterile medical device, we inserted bent needles into the water outlets to amplify the organic and slightly non-deterministic behaviour. The organisation of the drops varies between random drops, resembling rain falling on the board, and repetitive patterns. If order is slowly introduced, patterns start to appear, at first seemingly accidentally, until recognisable repetitions of material occur.
Dohi Moon’s work has been performed in the United States, Norway, France, and Korea, by the Western Michigan Orchestra, the MSU Children’s Choir, the neoPhonia New Music Ensemble, the St. Lawrence String Quartet, the Nobilis Trio, Stephen Prutsman, Livia Sohn, Suren Bagratuni, Erica Ohm, and many well-known soloists. Her music won first prize at the 2009 Østfold Musikkråd’s competition in Norway and third prize in the 2009 InNova Musica – Music and Video Art Competition. She is currently a visiting scholar at CCRMA, Stanford University.
Bjoern Erlach was born in Germany, studied Sonology in The Hague and is currently a PhD candidate at CCRMA, Stanford University.
Roger Thomas – SARAH
SARAH (Semi-Autonomous Reactive Accompanist Hardware) is a performance technology project that began as an educational tool for the courses in experimental music I run at the Bishopsgate Institute in London. SARAH is an electronic duo partner that attempts to model human participation in free improvisation without recourse to software other than the firmware used by the system’s digital components. A performance will usually entail a single musician playing live with SARAH in real time.
SARAH differs from both capture-based processes (e.g. live sampling/looping) and the use of pre-recorded material in that SARAH’s response to a human performer is not repetitive, predetermined or chaotic. Capture and playback systems can be manipulated in live performance to an extent, but nothing will induce such systems to improvise an idiosyncratic, real-time response to external musical input which is unpredictable yet also demonstrably intentional. SARAH, by contrast, does exactly this.
Given the generally imprecise usage of the term I hesitate to describe SARAH’s way of working as ‘interactive’. What does seem to apply, however, is the word ‘transactive’. SARAH will improvise in response to any kind of audible input but is particularly receptive to material which is harmonically and/or timbrally complex; processed digital percussion sounds are currently favoured.
ROGER THOMAS first became involved in experimental and improvised music in the late 70s, subsequently performing with artists such as Maggie Nicols, Lol Coxhill, Philipp Wachsmann, Michael Parsons and David Bedford and participating in workshops run by Butch Morris, Eddie Prévost, John Russell and Paul Rutherford.
Although he has also performed in bands and orchestras and occasionally ‘composed’ (Then Three Come Along At Once for digitally processed bus sounds was premiered during the 2009 Spitalfields Festival; earlier pieces include Quiet but Complicated for improvising musicians and Smear Campaign for musicians and painters), he remains primarily interested in the exploration and development of improvisation, mainly with percussion and live electronics. More recently he gave the opening performance at the 2011 NoiseFloor Festival at the University of Staffordshire and contributed a piece to the online jukebox for the 2011 Audiograft Festival hosted by SARU at Oxford Brookes University.
He has lectured at Brunel University on numerous occasions and he currently teaches at the Bishopsgate Institute in London, where he runs an experimental music group. Widely published as a writer and editor, his work has appeared in many publications including Sound On Sound, The Wire, DJ magazine, Gramophone and BBC Music.
Stuart Freeland – Shubunkin
In 2006 I created Shubunkin, a creative persona that allows me to explore the potential of Soundbeam equipment in live performance. A Shubunkin performance incorporates the results of digital experimentation with sounds and samples. These sounds are stored on a computer and, in performance, released by movement within the range of the sensors/soundbeams. The digital experimentation enables me to create a sound landscape outside the scope of musicians using traditional instrumentation, and the Soundbeam equipment allows me to perform without being trapped behind a bank of technology that creates a barrier between stage and audience.
Soundbeam is a MIDI instrument that allows the ‘music maker/sound artist’ to trigger notes and melodies, samples or sounds by interrupting an invisible beam. The Soundbeam equipment is linked to a laptop running music software that triggers samples and sounds according to the MIDI note played by the soundbeams. Visuals are triggered simultaneously, using MIDI VJ software that allows the triggering of images, film and various live camera feeds. These clips are mixed, and movement within the Soundbeam sensors controls effects. Other sounds come from a theremin linked to an effects processor and a Kaoss Pad.
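The trigger chain described above (a beam interruption produces a MIDI note-on, which fires a stored sample) can be sketched as follows. The note numbers, sample names and `on_midi_note` helper are illustrative assumptions, not the actual Shubunkin configuration.

```python
# Hypothetical mapping from MIDI note numbers (one per beam zone)
# to stored samples; the real setup would be defined in music software.
SAMPLE_BANK = {
    60: "drone_layer.wav",   # beam zone 1 -> pad sample
    62: "metal_hit.wav",     # beam zone 2 -> percussive sample
    64: "voice_grain.wav",   # beam zone 3 -> processed voice
}

def on_midi_note(status: int, note: int, velocity: int):
    """Return the sample to fire for a note-on event, or None.

    A velocity of 0 is treated as note-off, per common MIDI practice.
    """
    is_note_on = (status & 0xF0) == 0x90 and velocity > 0
    if is_note_on and note in SAMPLE_BANK:
        return SAMPLE_BANK[note]
    return None

# Interrupting the beam at zone 2 sends note 62:
print(on_midi_note(0x90, 62, 100))  # -> metal_hit.wav
```

The same note messages can be routed in parallel to VJ software, which is how the audio and visual triggering stay synchronised.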
I have two sets: one is described as old-school drum ’n’ bass; the other is more experimental, involving the live sampling of ultrasound from the sensors and the re-shaping of those recordings into a piece that explores live digital synthesis and, musically, traditional Indian raga systems.
William Cheshire – ilanz
ilanz is a new piece for live electronics and live tape. It aims to create an immersive sound experience, focusing in on the shared act of listening. Listeners should feel free to engage with the piece on their own terms.
The work’s main structural principle is its movement from purely electronic sound to analogue sound. Essentially there are two versions of the same sonic material, the only difference being that one is transposed a fifth below the other. The transposed version runs (from a laptop) through a cassette tape; the original version remains purely electronic (i.e. just running through the laptop). The piece is a very slow transition from the original version to the transposed version. Everything is performed live.
The surface of the music remains fairly constant, whilst the underlying details are in constant flux. The piece explores the unique qualities of analogue tape, along with its timbral unpredictability. The gradual harmonic shift which occurs is also an area of focus.
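The fifth-below relationship and the slow transition can be put in numbers. Transposing down a perfect fifth corresponds to a frequency ratio of 2:3 in just intonation (equivalently, the same material at 2/3 tape speed); the sketch below pairs that with an equal-power crossfade of the kind a performer might use to move gradually between the two layers. The ratio choice and function names are assumptions for illustration, not the composer’s stated method.

```python
import math

# Assumed just-intonation ratio: a perfect fifth below = 2/3 of the
# frequency (the same material played back at 2/3 tape speed).
RATIO_FIFTH_DOWN = 2.0 / 3.0

def transposed(freq_hz: float) -> float:
    """Frequency of the tape layer, a perfect fifth below the original."""
    return freq_hz * RATIO_FIFTH_DOWN

def crossfade_gains(t: float, duration: float) -> tuple[float, float]:
    """Equal-power gains (original, transposed) at time t of the transition.

    At t = 0 only the original layer sounds; at t = duration only the
    transposed tape layer remains. cos/sin keep total power constant.
    """
    x = min(max(t / duration, 0.0), 1.0)
    return math.cos(x * math.pi / 2), math.sin(x * math.pi / 2)
```

Because the two gains always satisfy g1² + g2² = 1, the overall loudness stays steady while the harmonic centre drifts downward, matching the piece’s constant surface over shifting detail.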
William Cheshire is a sonic artist and composer based in London. His work sits within the fertile ground between contemporary classical composition and electronic music. Recent projects include a live electronic performance at the Barbican’s esteemed More Soup and Tart event, a string quartet for members of the Aurora Orchestra and a limited edition EP. He is currently based at the Guildhall School of Music and Drama, studying under Richard Baker.
For more details please see www.williamcheshire.co.uk