ASCAP 2020 Music & Design Challenge:
Creating, Composing, Experiencing Music with Emerging Tech
Extended reality technologies, machine intelligence, voice-controlled user interfaces, and enriched data management tools are just a few of the new tools available to today’s music creators and consumers.
The American Society of Composers, Authors, and Publishers (ASCAP) Lab seeks creative, cutting-edge concepts that apply emerging technologies to music composition and experience. Are there new modes of composition and creation that can be enabled with a novel application or combination of these technologies? Additionally, how do these technologies enable and unlock new ways to experience music?
A quick look
A more inclusive, collaborative experience for music creators
Collaborative music creation today faces three major limitations:
it is centered on musicians and creators with a background in music or instrumental skills
it is bound by conventional notions of what counts as a "musical instrument"
it mostly takes place in shared physical spaces
Through Life Jam, we aim to address these limitations, with a strong belief that the future of music creation and experience is limited only by our imagination. Life Jam reimagines the music experience, exploring new ways of listening and alternative forms of the "musical instruments" of the future. It aims to create a more inclusive, playful music co-creation experience, simply by using familiar objects from our daily lives to make music together on an accessible virtual platform.
Let's reimagine "musical instruments"
The concept of "cutting-edge technology" is not just about brand-new, advanced forms of technology. It is also about "cutting-edge" use of technology: how a tool is designed to enhance the meaningful, enjoyable elements of human interaction.
We asked ourselves:
How might we reimagine the keyboard in new digital mediums?
How might emerging technologies open up new ways to collaboratively compose?
This is how Life Jam was born. It combines machine learning with intuitive interaction with familiar objects from our everyday lives, which serve as our reimagined "musical instruments." The experience runs on a live virtual web platform, enabling playful collaborative music creation for anyone, from anywhere.
Objects of our lives, their sounds, and collaborative music creation
As our team explored unconventional instruments for music creation, we got excited by the idea of an augmented music playground inspired by objects found in our daily lives. We imagined that this everyday-life-inspired music experience would invite us to listen closely to the hidden yet intuitive sounds inside the objects and beings right around us, and to interact and connect with them more deeply through music.
Furthermore, everyday objects as musical instruments would be accessible to many people, particularly non-musicians. This aligns with our dream of a future where music creation is more collaborative and more easily accessible. Imagine, instead of only professional composers and musicians producing music, a collective experience in which audiences, consumers, and fans also create and enjoy music together.
On-the-spot sound recording
In addition to pre-assigned pairings between sounds and objects, players have the option to record their own original sound or music on the spot and attach it to an object of their choice. The beta version currently allows players to attach their recorded sound to any 'book' (the object that triggers the sound or music when shown in front of the webcam). We hope to support a wider variety of objects in the near future.
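The pairing logic described above can be sketched roughly as follows. This is a hypothetical illustration, not the project's actual code: the function name `soundForDetection`, the sound file paths, and the default pairings are all assumptions, with only 'book' accepting player recordings, as in the beta.

```javascript
// Hypothetical sketch of label→sound resolution. Default pairings are
// illustrative; per-player recordings override the default for supported
// labels ('book' only, as in the beta version).
const DEFAULT_SOUNDS = new Map([
  ['book', 'sounds/page-flip.mp3'],
  ['cup', 'sounds/clink.mp3'],
  ['bottle', 'sounds/pop.mp3'],
]);

// Beta: players may attach an on-the-spot recording to books only.
const RECORDABLE_LABELS = new Set(['book']);

function soundForDetection(label, recordings) {
  // `recordings` maps an object label to the URL of a player's recording.
  if (RECORDABLE_LABELS.has(label) && recordings.has(label)) {
    return recordings.get(label); // the player's own recording wins
  }
  return DEFAULT_SOUNDS.get(label) ?? null; // null: no sound paired yet
}
```

In a browser, the recording itself could be captured with the standard MediaRecorder API and stored as a blob URL in the `recordings` map.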
Apply sound effects with body movements
With Life Jam, physically touching and holding familiar everyday objects while making music with them brings physical interaction back into a virtual experience. We then asked ourselves: "Wouldn't it also be fun if we could use our own body movements to add interesting sound effects to the music?"
Using the same object-detection machine learning model, players' real-time positions can be mapped to different filter effects, which can be turned on and off with a toggle. The beta version of Life Jam applies high- and low-pass filters on top of the music being played, controlled by players' body movements, left to right and back and forth, in front of the webcam.
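A minimal sketch of this mapping might look like the following. It is an assumption-laden illustration, not the project's actual implementation: we assume the detection model returns a normalized bounding box for the player (`{x, y, w, h}` in the range 0 to 1), and we use horizontal position for the low-pass cutoff and apparent box size (a rough proxy for distance to the webcam) for the high-pass cutoff.

```javascript
// Hypothetical sketch: map a player's detected bounding box to filter cutoffs.
// Cutoffs are interpolated on a log scale so that equal physical movement
// produces a perceptually even sweep across the frequency range.

function logScale(t, minHz, maxHz) {
  // t in [0, 1] → a frequency between minHz and maxHz, log-interpolated.
  const clamped = Math.min(1, Math.max(0, t));
  return minHz * Math.pow(maxHz / minHz, clamped);
}

function filterParamsFromBox(box) {
  const centerX = box.x + box.w / 2; // 0 = left edge, 1 = right edge
  const size = box.w * box.h;        // larger box ≈ closer to the camera
  return {
    lowpassHz: logScale(centerX, 200, 20000),
    highpassHz: logScale(size, 20, 2000),
  };
}
```

In a browser, these values could drive the `frequency` parameter of Web Audio `BiquadFilterNode`s inserted between the music source and the output.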
Visual and Motion Design
Playful animations for a playful music experience
Colorful, playful animations are triggered alongside intuitive musical sounds as these everyday objects are shown in front of users' webcams and detected by the machine learning model.
Recordings of Jam-out Sessions
Sachiko Nakajima: concept development, sound design, back-end programming (Tokyo, Japan)
Chenhe Zhang: concept development, sound design, UI + motion design, front-end programming (New York)
Son Luu: concept development, back-end programming, visual design, project management (New York)
Special Thanks to
Vince Shao & Mark Lam
Shuju Lin, Shawn Every, Alden Jones, Ellen Nickles, Ada Jiang