NAMI

SARA SITHI-AMNUAI

GOAL

I wanted to develop a new interface for trumpet that explores identity, privileges improvisation, and fosters live performance interaction between the performer’s body and their instrument through gesture and sound.

DESCRIPTION

Nami is a custom-built glove interface designed for live musical performance, specifically for trumpet players. The interface is fitted on the left hand of the performer and requires no modifications to the host instrument or to the performer. It is named after the Japanese word “nami,” meaning “wave,” and embodies the idea of music and life as fluid forms. Nami uses sensor technology to capture gestural data such as finger flexion and finger pressure, offering performers extended control and expressivity through triggering and sound processing. Nami also has a growing library of gestural vocabulary built upon the existing vocabulary trumpet players use when working with mutes, the subtle gestures found in gagaku/bugaku (traditional Japanese court music and dance), and the set of hand gestures developed by Butch Morris (creator of conduction) to facilitate structured free improvisation performances.

CULTURE

My cultural perspective spans, and is not limited to, the following identity descriptors:

  • performer

  • trumpet player

  • composer

  • improviser

  • Nikkei (Japanese American)

In future iterations, I hope to develop a glove from the perspective of a particular community and create a gestural vocabulary based on that perspective.

SOME INSPIRATIONS

  • Laetitia Sonami’s Lady’s Glove

  • Mi.Mu glove (Imogen Heap and team)

  • Pamela Z and Donald Swearingen’s N Degrees

INPUT

Finger pressure and flexion from one person (myself), captured through a Force Sensing Resistor (FSR) and three flex sensors, and processed using Arduino, Max, Wekinator, and Ableton Live.

In future iterations, I hope to add an inertial measurement unit (IMU) to track pitch, roll, and yaw, enabling more control and expressivity for the user.
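In the current setup, the Arduino streams the raw sensor readings to the computer, where Max handles the routing onward. Purely as a hypothetical sketch of that first step (not the actual Nami patch), the Python code below reads one FSR value and three flex-sensor values per line from the serial port; the port name, baud rate, and comma-separated message format are assumptions.

```python
# Illustrative sketch only: the actual Nami pipeline routes sensor data through Max.
# Assumes the Arduino prints one line per frame, "fsr,flex1,flex2,flex3", at 9600 baud.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumption: adjust to your Arduino's serial port
BAUD = 9600             # assumption: must match the Arduino sketch

def read_frames(port=PORT, baud=BAUD):
    """Yield (fsr, flex1, flex2, flex3) tuples parsed from the serial stream."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            parts = line.split(",")
            if len(parts) != 4:
                continue  # skip partial or malformed frames
            try:
                yield tuple(int(p) for p in parts)
            except ValueError:
                continue  # skip frames that are not plain integers

if __name__ == "__main__":
    for fsr, f1, f2, f3 in read_frames():
        print(fsr, f1, f2, f3)
```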

RULES

Wekinator

Dynamic Time Warping – to trigger a specific audio effect in Ableton Live based on a specific gesture/specific moment

Classification – to trigger and control a specific audio effect in Ableton Live based on a specific gestural event (can be a short or long event/movement)

Continuous (Neural Network) – to trigger and control a specific audio effect in Ableton Live based on a specific gestural event (can be a short or long event/movement)

As I trained Nami, I realized that Dynamic Time Warping was especially useful for triggering effects on and off, but it needed calibration of its match threshold because my movements did not always accurately match the trained samples. Using classification or a continuous (neural network) model ended up suiting my musical performances better because of their flexibility.
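In Nami itself, the exchange between the sensor data and Wekinator is handled in Max. As an illustrative stand-in only, the sketch below shows the equivalent OSC exchange in Python using the python-osc library and Wekinator’s default addresses: feature frames go to /wek/inputs on port 6448, and the trained classification or continuous model returns its values to /wek/outputs on port 12000 by default.

```python
# Illustrative sketch only: in Nami this routing is done in Max, not Python.
# Sends one frame of sensor features to Wekinator and prints the model's outputs,
# using Wekinator's default ports and OSC addresses.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

wek = SimpleUDPClient("127.0.0.1", 6448)  # Wekinator listens for inputs on port 6448

def send_features(fsr, flex1, flex2, flex3):
    """Forward one frame of sensor data to Wekinator as /wek/inputs."""
    wek.send_message("/wek/inputs", [float(fsr), float(flex1), float(flex2), float(flex3)])

def on_outputs(address, *values):
    """Handle /wek/outputs (a class label or continuous control values from the model)."""
    print(address, values)  # map these values to effect parameters downstream

dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", on_outputs)

if __name__ == "__main__":
    send_features(512, 300, 310, 295)  # hypothetical raw ADC values
    server = BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher)
    server.serve_forever()             # block and print incoming model outputs
```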

OUTPUT

This is a glove interface that uses sensor technology to capture gestural data such as finger flexion and finger pressure to offer extended control and expressivity to performers through triggering effects and sound processing. The output is MIDI data that can be mapped to any parameter within Ableton Live.
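The MIDI mapping itself happens within the Max and Ableton Live setup. Purely as a hypothetical illustration of that last step, the sketch below uses the mido library to scale a continuous model output (assumed to lie between 0.0 and 1.0) into a MIDI control change message that Ableton Live can learn through its MIDI map; the virtual port name and CC number are made up.

```python
# Illustrative sketch only: Nami's actual MIDI mapping happens in the Max/Ableton setup.
# Scales a continuous model output (assumed to be 0.0-1.0) into a MIDI CC message
# that Ableton Live can assign to any parameter via MIDI mapping.
import mido

CC_NUMBER = 20  # hypothetical controller number; any free CC works for Ableton's MIDI map

def open_port():
    """Open a virtual MIDI output that Ableton can enable as a control input."""
    return mido.open_output("Nami", virtual=True)  # assumption: requires an rtmidi backend

def send_cc(port, model_output):
    """Convert a 0.0-1.0 model output into a 0-127 CC value and send it."""
    value = max(0, min(127, int(round(model_output * 127))))
    port.send(mido.Message("control_change", control=CC_NUMBER, value=value))

if __name__ == "__main__":
    out = open_port()
    send_cc(out, 0.75)  # example: sets the mapped parameter to about three-quarters of its range
```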

MATERIALS

  • 1st iteration (not photographed): modified garden glove (cotton)

  • 2nd iteration (photographed): junior-sized modified golf glove (leather, nylon, spandex)

  • 3rd iteration (photographed): medium-sized modified 15-inch 1920s-style satin glove (89.3% nylon and 10.7% spandex)

  • 4th iteration (photographed): large modified arthritis/hand relief compression glove (copper-infused fibers, 83% nylon and 17% spandex)

CREATING THIS INTERFACE AND PERFORMING WITH IT BRINGS UP MULTIPLE QUESTIONS…

  • Who can access it/Who can wear it?

    • Design challenge: creating a glove that is flexible and can fit multiple hand sizes but also be form fitting

  • Performance practice

    • How do you play with one hand? What kinds of sounds can you make using one hand?

  • How does this affect the body?

    • Does this change traditional technique used by trumpet players? Does this cause balance issues or tension?

  • How can the materials embody the technology?

ACKNOWLEDGMENTS

Nami was created and developed during my time at CalArts in the Interface Design class led by Ajay Kapur and TA’d by Andrew Piepenbrink. Many thanks to Christine Meinders and the AI. Culture. Creativity class for their help in the development of Nami, specifically with Wekinator, and also to Sarah Reid, David Rosenboom, and Tim Feeney for their guidance on both the creative and technical aspects of Nami.