We are fortunate to have received funding from UiO’s infrastructure board to purchase a new Yamaha Disklavier grand piano for the Department of Musicology. We have applied for this several times and are grateful to the university for making such an investment. To celebrate the new instrument, we organized the seminar New Horizons in Piano Performance Research, during which I gave a “lecture-recital”, talking a little about the history of piano research in the department and stress-testing the new instrument.

Our new Disklavier

We streamed the whole seminar, which can be seen here:

I have also uploaded my slides to Zenodo. In the following, I will summarize some of my comments and add a few more I forgot.

History

I began my presentation by drawing some lines back to the work of several people who have been influential in shaping piano research at the Department of Musicology, including my former piano teacher, Anne Eline Riisnæs, and former supervisor, Rolf Inge Godøy. Together, we began systematic studies of both sound-producing actions (in the fingers) and sound-facilitating motion (in elbows and shoulders).

Initially, we did this with only video cameras before progressing to sensor-based motion capture. We quickly realized that a Polhemus electromagnetic tracker was not ideal because all the metal in the piano distorted the measurements. We had more success with a home-built, accelerometer-based system that captured the motion of various body joints.

We purchased our first optical, marker-based motion capture system in 2008, which captured position in space. This was supplemented by a Qualisys system in 2009, allowing more accurate and precise measurements. This year, 2024, we upgraded to the newest Qualisys system, which significantly improves the tracking of tiny markers, for example, on the fingers.

Laura Bishop testing finger motion capture

Regarding pianos, we have had an upright Yamaha Disklavier from the beginning of these explorations. This allowed us to record MIDI data from the instrument, which is of great help in later analysis. However, because of its upright cabinet, it did not work well for camera-based recordings. For those, we had to rely on digital pianos that offered a clear line of sight to the fingers. With a Yamaha Disklavier grand piano, we finally have a state-of-the-art instrument for continuing our piano research with optical motion capture.

Exploring other piano-like instruments

As a pianist, I have always enjoyed playing different kinds of pianos. In my book Sound Actions, I describe my journey from playing an old upright piano at home to the grand pianos at university. In parallel, I have explored many other types of keyboard instruments, acoustic and digital. In the book, I pay particular attention to the differences between acoustic instruments’ action–sound couplings and the action–sound mappings in various electroacoustic instruments. The difference is that couplings are based on an instrument’s physical (mechanical and acoustical) properties, while mappings are designed and constructed. Neither is better than the other, but I argue that the distinction has cognitive implications for how we engage with and experience instruments.

The fascinating thing about our new Yamaha Disklavier grand piano is that it is a hybrid instrument. It is a regular acoustic grand piano that can be played in the usual way. But it also has built-in motors that allow it to play back recorded musical material or perform entirely new material. Finally, it has built-in speakers, turning it into an electroacoustic instrument. I have yet to explore all the different possibilities, but undoubtedly many interesting things can be done with such an instrument!
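For those curious about how such playback can be driven in practice, here is a minimal sketch using the mido Python library. The port and file names are placeholders and will differ on your system:

```python
# Minimal sketch: play a recorded MIDI file back through the Disklavier's
# built-in mechanics over MIDI. Assumes the `mido` library; the port name
# and file name are placeholders.
import mido

PORT_NAME = "Disklavier"   # list available ports with mido.get_output_names()

with mido.open_output(PORT_NAME) as port:
    # MidiFile.play() yields messages paced according to the file's timing
    for msg in mido.MidiFile("disklavier_take.mid").play():
        port.send(msg)
```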

Stress testing the Yamaha Disklavier grand piano

I decided to stress-test the instrument so as not to make my presentation sound like a sales pitch for Yamaha. My main criticism relates to its use of MIDI as the communication protocol. This makes sense in many ways; after all, MIDI is an established standard. However, it also comes with some limitations, particularly the constraints imposed by the 7-bit resolution of the MIDI protocol compared to Yamaha’s internal 10-bit resolution. This limits the number of dynamic levels available, that is, the gradations from soft to loud tones. The effect is quite noticeable when gradually increasing or decreasing the loudness of tones played over MIDI (check the video above).
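To make the bit-depth issue concrete, here is a small Python illustration. The simple truncation mapping is my own assumption for the sake of the example, not Yamaha’s documented conversion:

```python
# Illustration of the resolution loss described above: a 10-bit internal
# velocity scale (1024 steps) squeezed into MIDI 1.0's 7-bit range (128 steps).
def to_midi_velocity(internal: int, internal_bits: int = 10) -> int:
    """Map an internal velocity value onto the 7-bit MIDI 1.0 scale."""
    return internal >> (internal_bits - 7)   # truncation: 8 internal steps per MIDI step

# Eight successive internal steps in a slow crescendo...
ramp = range(512, 520)
print([to_midi_velocity(v) for v in ramp])
# -> [64, 64, 64, 64, 64, 64, 64, 64]  (they all collapse onto one MIDI value)
```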

The instrument also has limitations when it comes to rapid note repetitions and sequences. This is connected to its mechanics; the hammer needs time to move and reset before it can strike again. Still, it is good to know about this constraint when working with the instrument.
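A simple way to probe this limit is to repeat a single note at shorter and shorter inter-onset intervals and listen for when the mechanics stop keeping up. Here is a rough sketch of such a test using the mido Python library; the port name is again a placeholder:

```python
# Sketch of a repetition stress test: repeat one note at shorter and shorter
# inter-onset intervals and note where the action stops keeping up.
# Assumes the `mido` library; the port name is a placeholder.
import time
import mido

PORT_NAME = "Disklavier"
NOTE = 60                                  # middle C

with mido.open_output(PORT_NAME) as port:
    for interval in (0.25, 0.125, 0.0625, 0.03125):   # seconds between onsets
        print(f"Inter-onset interval: {interval * 1000:.0f} ms")
        for _ in range(16):
            port.send(mido.Message("note_on", note=NOTE, velocity=80))
            time.sleep(interval / 2)
            port.send(mido.Message("note_off", note=NOTE))
            time.sleep(interval / 2)
        time.sleep(1.0)                    # short pause between test blocks
```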

Stress testing the Disklavier

Given all the fuss about MPE and MIDI 2.0 in recent years, I hope Yamaha will upgrade their software to one of the newer protocols. I would also welcome support for the flexible Open Sound Control (OSC) protocol, but I would be happy with either MPE or MIDI 2.0, if only to get a finer dynamic resolution.
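As a back-of-the-envelope comparison, MIDI 2.0 specifies a 16-bit note-on velocity, which would be more than enough to carry a 10-bit internal value without loss:

```python
# Back-of-the-envelope comparison, assuming MIDI 2.0's 16-bit note-on velocity:
# a 10-bit internal value fits into 16 bits without loss, whereas the 7-bit
# MIDI 1.0 velocity throws away the three least significant bits.
internal = 517                      # some 10-bit internal velocity (0-1023)

midi1_velocity = internal >> 3      # 7-bit: 0-127, lossy
midi2_velocity = internal << 6      # 16-bit: 0-65535, reversible

print(midi1_velocity, midi2_velocity)          # 64 33088
print((midi2_velocity >> 6) == internal)       # True: no information lost
print((midi1_velocity << 3) == internal)       # False: the fine gradations are gone
```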

An instrument for AI experiments

“Everybody” talks about AI these days. New services make it possible to generate audio in the cloud. While that is fun, it is much more exciting to think about how our new Disklavier opens up new possibilities for AI experiments with a physical instrument.

Many people today only think of machine learning (and deep learning) as AI. However, I would argue that rule-based systems are also AI and have a long tradition in music. After all, different music theories (both Western and others) are, in many ways, based on algorithmic thinking and have been so for centuries.
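As a toy illustration of what such a rule-based (and decidedly non-learning) approach can look like, here is a small sketch that generates a melodic line from two hand-written rules. It is purely illustrative and not one of the systems mentioned here:

```python
# Toy illustration of a rule-based (non-learning) approach: generate a short
# melodic line from a handful of hand-written rules.
import random

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]      # MIDI note numbers, one octave

def next_note(previous: int) -> int:
    """Pick the next scale tone according to two simple rules."""
    candidates = [n for n in C_MAJOR
                  if n != previous                 # rule 1: no repeated notes
                  and abs(n - previous) <= 5]      # rule 2: no leaps beyond a fourth
    return random.choice(candidates)

melody = [60]
for _ in range(15):
    melody.append(next_note(melody[-1]))
print(melody)
```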

In the early 2000s, I spent quite some time exploring both rule-based and learning-based algorithms on our old Disklavier. My research later moved away from pianos, but I am excited about revisiting some of these ideas with the new instrument.

Exploring the intersection of art and science through music technology

Our new Disklavier is a vital addition to our department’s infrastructure. It is both old and new at the same time. It allows for systematic studies of piano technique and historical performance practices. It gives students access to excellent machine-based accompaniment for their performances. Perhaps most excitingly, it opens the door for new explorations of AI in music.