RESEARCH PLATFORM

Academy for Theatre and Digitality

Hreyfð - wearable gestural instrument

Completed Fellowship

Summary
A practice-based research project exploring the possibilities of instruments for live electronic music performance that connect sound with the physical body, movement, and space.

Hreyfð (meaning ‘she is moved’ in Icelandic) is a wearable speaker instrument that creates sound through gesture. The instrument is part of my ongoing research on live electronic music that links the physical body and gestures with music performance. The aim is to find methods of performing electronic music with vivid expression on audible, visual, and tactile levels.

Hreyfð uses microphones and speakers to create audio feedback that is processed by a microcontroller together with gyroscope signals. The feedback creates a field of audible possibilities that one can learn to play with practice. The instrument always has its own playful, unpredictable, and organic character, which the performer navigates through precise proximity and positioning. With the speakers attached to the body, the performer is directly the sound source. Thus, the physical body also acts as a sound object moving in space: physio-spatial sound, or a sonic choreography.

During the fellowship, I worked on the instrument design and performance possibilities of Hreyfð in collaboration with choreographer Marina Mascarell, costume designer Daphna Munz, and four dancers: Eli Cohen, Jin-Young Won, Joy Kammin, and Lukas Karvelis. At the end of the fellowship, I presented a short demonstration performance in which the dancers performed with four different instrument designs.

Research Questions

- How can an (electronic) instrument create new connections between bodies and sound?

- Are these relationships unique to music or performance, or can they teach us something about our human bodies?

- What is the role of instruments in electronic music performance?

- How can this role be a link to other mediums, such as dance?

- How can one learn to perform an unpredictable instrument?

Instrument Design


Point of departure

Hreyfð is part of my long-term research focusing on wearable gestural instruments. It is driven by an urge to create performative possibilities for electronic music that are expressive, precise, and transparent. This means creating electronic instruments with a hearing-seeing relationship similar to that of a person playing an acoustic instrument. When experiencing a music performance, regardless of one's knowledge of an instrument, most acoustic instruments share a visual aspect that demonstrates their function and expression. As an audience member, you make a cognitive link between what you see and what you hear, which shapes your experience and how “expressive” the performance feels.

At the same time, many music technology trends focus on the laptop as the primary instrument, often extended with controllers that mimic computer keyboards, pianos, or machines. They rarely mimic other instruments or other performance possibilities. In many ways, it resembles an office job, hence the common joke about DJs and electronic musicians sending emails while performing.

With expression and transparency as a starting point, the research has led me to instrument designs that use the embodiment of sound through the performer’s physicality to express music. Since electronic instruments are not tied to physical materials such as wood, strings, or air pressure, their means of performativity are not limited by those materials either. Thus, electronic instruments can transform the imperceptible: imaginary spaces become audible through movement, and the materiality of air and space becomes visible. The performer can therefore be simultaneously a performer of music, space, and movement.


Costume design

The instrument consists of two speakers, two lavalier microphones, a gyroscope sensor, and a microcontroller hub that connects them. During the fellowship, I experimented with how the location of the speakers and microphones would affect movements and visual aesthetics. The conclusion was to make four different versions so that the instrument would emphasise the physicality of each performer.

At this stage, the costume design is primarily functional. Together with Daphna Munz and Marina Mascarell, we experimented with materials and systems that would allow risk-free cabling on a dancer's body and make the cables an integral, organic part of their movements. Another consideration was that the speakers I used were very bulky, which made them hard to mount and very present in the instrument, both visually and in movement.

Generally, I wanted the costume design to be transparent to the instrument's function. All electronics are either mounted bare or cased in transparent boxes. Cables lie outside the clothing, held steady with straps, and each signal always runs along a single cable line. Visually, the electronics become an organic part of the performer's body: the cables become veins, and the bulky speakers become extra limbs. These are new body parts that the dancer now needs to learn to perform with.

Behaviour

The instrument forces certain behaviours in two ways.

Firstly, the speaker and microphone need to “see” each other to create sound; therefore, their location affects the performer’s movements. From the moment the performer puts on the instrument, it transforms their physicality, forcing them to cope with the instrument’s nature and to move body parts at precise speeds and locations to make the desired sound. This limits physical possibilities whilst forcing the performer to make new connections. For a non-trained dancer like me, this is an excellent way to teach your body to go into strange positions, e.g. lifting your foot up to your chest and wiggling it around in the air for different sonic textures.

Secondly, the instrument is based on audio feedback and tends to let the signal overload, sometimes resulting in painful squeaks. One can learn to avoid this with practice, but it is also a factor of the instrument design that I want to improve. Regardless, the feedback is an essential characteristic of the instrument, forcing a dialogue between the performer and the instrument with precision and concentration.


Sound design

The sound design of Hreyfð is very simple. The performer wears microphones and speakers on the body, and when these “see” each other, audio feedback arises. With this as the primary material, the sound source goes through an EQ process and is delayed by a few milliseconds. A gyroscope sensor controls the volume (Y-axis) and distortion (X-axis). Part of my research during the fellowship was exploring how slightly different time delays and EQ settings would drastically affect the sound while still maintaining its character.


Electronics, hardware and coding

Electronics: I used a Teensy 4.0 and a Teensy Audio Shield as the core for audio processing. The Teensy acts as an audio interface with input, output, and processing. I used two commercial electret microphones and two commercial battery-powered speakers as the basis of the instrument. The Teensy then ran a simple program that filtered and delayed the sound (the short delay is a crucial characteristic of the instrument's sound) and used the signal of a gyroscope to control the amplitude and a distortion effect. In the future, I imagine the code becoming more advanced and precise, as well as gaining more functions for the gyroscope.
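As an illustration of how such a chain can be wired up with the Teensy Audio Library, here is a minimal mono sketch. The object names, filter frequencies, and the 15 ms delay value are illustrative assumptions rather than the instrument's actual code (the basic version is linked below), and a distortion stage is omitted for brevity.

#include <Audio.h>
#include <Wire.h>
#include <SPI.h>
#include <SD.h>
#include <SerialFlash.h>

// Audio graph: mic input -> EQ (biquad) -> short delay -> amplifier -> speakers
AudioInputI2S        micIn;        // microphone signal via the Audio Shield
AudioFilterBiquad    eq;           // EQ that shapes the feedback
AudioEffectDelay     shortDelay;   // few-millisecond delay, key to the sound's character
AudioAmplifier       amp;          // amplitude, controlled by the gyroscope's Y-axis
AudioOutputI2S       speakerOut;   // body-worn speakers via the shield's output
AudioControlSGTL5000 codec;        // control chip on the Teensy Audio Shield

AudioConnection c1(micIn, 0, eq, 0);
AudioConnection c2(eq, 0, shortDelay, 0);
AudioConnection c3(shortDelay, 0, amp, 0);   // a distortion stage would sit around here
AudioConnection c4(amp, 0, speakerOut, 0);
AudioConnection c5(amp, 0, speakerOut, 1);

void setup() {
  AudioMemory(120);                  // the delay line needs generous audio memory
  codec.enable();
  codec.inputSelect(AUDIO_INPUT_MIC);
  codec.micGain(30);                 // dB, tuned by ear per performer
  codec.volume(0.6);

  eq.setHighpass(0, 120, 0.7);       // tame low-frequency rumble
  eq.setLowpass(1, 8000, 0.7);       // soften the harshest squeaks
  shortDelay.delay(0, 15);           // ~15 ms; small changes alter the character a lot
  amp.gain(0.0);                     // silent until the gyroscope opens it up
}

void loop() {
  // The gyroscope mapping would go here; see the sensor-ring sketch below.
}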

Sensor Ring: The sensor ring consists of a gyroscope sensor module and a 3D-printed ring. The module's data is used to change the audio processing according to the position of the hand.
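A hypothetical, standalone reader for such a sensor ring is sketched below. It assumes an MPU-6050-style I2C module (the exact part is not named here) and approximates the hand's position from accelerometer tilt, mapping the Y-axis to a volume value and the X-axis to a distortion amount as described above; the values are only printed rather than driving the audio processing.

#include <Wire.h>

const uint8_t IMU_ADDR = 0x68;   // default address of an MPU-6050-style module (assumption)

// Read one signed 16-bit value (high byte first) from the I2C buffer
int16_t read16() {
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(IMU_ADDR);
  Wire.write(0x6B);               // PWR_MGMT_1 register
  Wire.write(0x00);               // wake the sensor from sleep
  Wire.endTransmission();
}

void loop() {
  // Request the three accelerometer axes (registers 0x3B-0x40)
  Wire.beginTransmission(IMU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);    // repeated start, keep the bus
  Wire.requestFrom(IMU_ADDR, (uint8_t)6);

  int16_t ax = read16();
  int16_t ay = read16();
  int16_t az = read16();
  (void)az;                       // not needed for this simple mapping

  // +/-2 g range -> roughly 16384 counts per g; map tilt to 0.0 .. 1.0
  float volumeLevel      = constrain((ay / 16384.0f + 1.0f) * 0.5f, 0.0f, 1.0f);
  float distortionAmount = constrain((ax / 16384.0f + 1.0f) * 0.5f, 0.0f, 1.0f);

  // In the instrument, these values would set the amplifier gain and a distortion mix
  Serial.print("volume ");
  Serial.print(volumeLevel);
  Serial.print("  distortion ");
  Serial.println(distortionAmount);

  delay(20);                      // ~50 Hz control rate
}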

Hardware: All the hardware was made at the academy using the laser cutter, the 3D printer, and the electronics workshop. The electronics were put in a transparent plexiglass box and connected to the other modules through minijack plugs. I chose this design because I wanted the electronics to be visible to the audience. It also adds an aesthetic layer that contributes to the cyborg vibe the costume already has.

Code: The basic version of the instrument's code can be found here. For the different versions, I played around with the delay time, EQ parameters, and compression, and experimented with additional audio effects.

Future Perspectives

The instrument is still at an early stage. The next steps are improving its design, performative possibilities, and coding to make the instrument more expressive and reliable.

The long-term goal is to create a one-hour piece for four or more dancers in which each performer wears Hreyfð. Over the fellowship, my concepts for this piece have focused on the contrast between the human and organic on the one hand and the inhuman and mechanical on the other. There is an inescapable link between any wearable instrument and cyborgs, which has greatly influenced this work's design and concept. I have regularly questioned in which contexts I accept this cyborg connection and when to push against it. The human-machine relationship is a trendy topic nowadays, and staying critical is essential for me.

A significant influence on this human-machine theme has been the sci-fi classic Frankenstein. The story is about a man-made creature that even its maker fears. The creature longs for love and empathy, but in its failed attempts at compassion, it becomes the monster the public feared it was. The story tells us a lot about the fear of our own development and technology, and the fear of losing what is human. What does this teach us about the organic and the human? And what does it teach us about empathy? With this as the conceptual basis, I aim to make a sonic choreography in which the sound unravels the social behaviour of a group of cyborgs.

Photos on film by Jin-Young Won

