Grants and Contributions:

Title:
Improvising with the dead
Agreement Number:
RGPIN
Agreement Value:
$115,000.00
Agreement Date:
May 10, 2017 -
Organization:
Natural Sciences and Engineering Research Council of Canada
Location:
British Columbia, CA
Reference Number:
GC-2017-Q1-03134
Agreement Type:
Grant
Report Type:
Grants and Contributions
Additional Information:

Grant or Award spanning more than one fiscal year. (2017-2018 to 2022-2023)

Recipient's Legal Name:
Tzanetakis, George (University of Victoria)
Program:
Discovery Grants Program - Individual
Program Purpose:

The end goal of the proposed research is rather simple to state: I would like to develop technologies to support the currently impossible experience of improvising music together with one or more musicians who are dead. This experience should be as close to actual music playing as possible. Although this goal is simple to state, it is challenging to achieve and requires important advances in different research topics from several disciplines.

Improvisation is present in all music cultures and can be identified and appreciated by most music listeners, even in cultures they are not familiar with. The processes that govern improvisation are not well understood and depend on the specific music culture studied. They are also intrinsically tied to the physical actions of playing instruments. For example, to understand and appreciate a solo by B.B. King, it is not sufficient to look at what notes are played; one also needs to consider the visceral connection between his body and the guitar. I believe we are at a time when the experience of improvising with a dead musician is moving from science fiction to the realm of possibility, as there are several recent key technological developments that can be leveraged for this purpose. Imagine being able to analyze the audio recordings of a legendary jazz musician, for example the incredible saxophone player Charlie Parker, in order to create a model of how he played and improvised. The model could be used to create a computer-controlled virtual Charlie Parker that a jazz student could play with and learn from. In order to approximate the playing experience as accurately as possible, augmented or virtual reality (AR/VR) devices would be used while playing an actual acoustic instrument.

There is a resurgence of AR/VR, with many key industrial players involved. The new interfaces for musical expression community has extensive experience building effective cyber-physical systems for music. Music information retrieval (MIR) techniques analyze millions of songs to extract content information and automatically recommend music. MIR techniques can also be used to extract information about what is being played in real time. My group has been active in these research areas and will continue to be, with this new focus. Students working on these projects will obtain experience in audio signal processing, machine learning, augmented and virtual reality, human-computer interaction (HCI), programming languages, and artificial intelligence (AI). As the virtual and physical worlds blend, new ways of interaction will be developed. The proposed work can also lead to advances in AI and in the tools used to build complicated, real-time AI systems, and shape how humans and such systems interact in the future. Finally, the advances envisioned in this proposal have the potential to radically transform how we create, distribute, and perceive music (and multimedia in general) content in the future.