New publication: Non-Realtime Sonification of Motiongrams

Today I will present the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm. The paper is based on a new implementation of my sonomotiongram technique, optimised for non-realtime use. I presented a realtime version of the sonomotiongram technique at ACHI 2012 and a Kinect version, the Kinectofon, at NIME earlier this year. The new paper presents the ImageSonifyer application and a collection of videos showing how it works....

August 1, 2013 · 2 min · 225 words · ARJ

Kinectofon: Performing with shapes in planes

Yesterday, Ståle presented a paper on mocap filtering at the NIME conference in Daejeon. Today I presented a demo on using Kinect images as input to my sonomotiongram technique. Title: Kinectofon: Performing with shapes in planes. Links: Paper (PDF), Poster (PDF), Software, Videos (coming soon). Abstract: The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device....
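As a rough, hypothetical illustration of how the depth stream can be combined with the RGB stream (not necessarily the exact processing chain used in the Kinectofon), one could mask the RGB image with a depth threshold so that only objects within a "performance plane" in front of the sensor reach the motiongram/sonification stage. The function name, depth range, and array layout below are assumptions for the sketch.

```python
# Hypothetical sketch (not the exact Kinectofon pipeline): keep only the
# RGB pixels whose depth lies inside a chosen range in front of the sensor.
import numpy as np

def depth_masked_rgb(rgb, depth_mm, near=500, far=1200):
    """rgb: (H, W, 3) uint8 image, depth_mm: (H, W) depth in millimetres."""
    mask = (depth_mm > near) & (depth_mm < far)    # e.g. hands held inside the plane
    return rgb * mask[..., np.newaxis].astype(np.uint8)

# masked = depth_masked_rgb(rgb_frame, depth_frame)
# ...feed `masked` into a motiongram/sonification stage
```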

May 28, 2013 · 1 min · 193 words · ARJ

ImageSonifyer

Earlier this year, before I started as head of department, I was working on a non-realtime implementation of my sonomotiongram technique (a sonomotiongram is a sonic display of motion from a video recording, created by sonifying a motiongram). Now I have finally found some time to wrap it up and make it available as an OS X application called ImageSonifyer. The Max patch is also available for those who want to look at what is going on....

April 6, 2013 · 1 min · 200 words · ARJ

Are you jumping or bouncing?

One of the most satisfying things about being a researcher is seeing that the ideas, theories, methods, software, and other things you come up with are useful to others. Today I received the master’s thesis of Per Erik Walslag, titled Are you jumping or bouncing? A case-study of jumping and bouncing in classical ballet using the motiongram computer program, in which he has made excellent use of my motiongram technique and my VideoAnalysis software....

February 21, 2013 · 1 min · 146 words · ARJ

New publication: Some video abstraction techniques for displaying body movement in analysis and performance

Today the MIT Press journal Leonardo has published my paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance”. The paper is a summary of my work on different types of visualisation techniques for music-related body motion. Most of these techniques were developed during my PhD, but have been refined over the course of my post-doc fellowship. The paper is available from the Leonardo web page (or MUSE), and will also be posted in the digital archive at UiO after the six-month embargo period....

January 14, 2013 · 2 min · 231 words · ARJ

Performing with the Norwegian Noise Orchestra

Yesterday, I performed with the Norwegian Noise Orchestra at Betong in Oslo, at a concert organised by Dans for Voksne. The orchestra is an ad-hoc group of noisy improvisers, and I immediately felt at home. The performance lasted for 12 hours, from noon to midnight, and I performed for two hours in the afternoon. For the performance I used my Soniperforma patch based on the sonifyer technique and the Jamoma module I developed a couple of years ago (jmod....

December 13, 2012 · 1 min · 207 words · ARJ

Hi-speed guitar recording

I was in Hamburg last week, teaching at the International Summer School in Systematic Musicology (ISSSM). While there, I was able to test a newly acquired high-speed video camera (Phantom V711) at the Department of Musicology. [Image: The beautiful building of the Department of Musicology in Hamburg] [Image: They have some really cool drawings on the ceiling at the entrance of the Department of Musicology in Hamburg....

August 13, 2012 · 3 min · 614 words · ARJ

Paper #1 at SMC 2012: Evaluation of motiongrams

Today I presented the paper Evaluating how different video features influence the visual quality of resultant motiongrams at the Sound and Music Computing conference in Copenhagen. Abstract: Motiongrams are visual representations of human motion, generated from regular video recordings. This paper evaluates how different video features may influence the generated motiongram: inversion, colour, filtering, background, lighting, clothing, video size and compression. It is argued that the proposed motiongram implementation is capable of visualising the main motion features even with quite drastic changes in all of the above-mentioned variables....
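For readers unfamiliar with motiongrams, the following is a minimal, hypothetical Python/OpenCV sketch of the general idea (frame differencing, simple filtering, and collapsing each motion image to a single column). It is not the implementation evaluated in the paper; the function name, threshold, and axis choice are placeholders.

```python
# Minimal motiongram sketch: frame-difference a video and collapse each
# motion image to one column, then stack the columns over time.
import cv2          # assumed available: opencv-python
import numpy as np

def motiongram(video_path, threshold=10):
    cap = cv2.VideoCapture(video_path)
    columns = []
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
        if prev is not None:
            motion = np.abs(gray - prev)           # motion image
            motion[motion < threshold] = 0         # crude noise filtering
            columns.append(motion.mean(axis=1))    # collapse rows -> one column
        prev = gray
    cap.release()
    # time runs along the x-axis, vertical image position along the y-axis
    return np.stack(columns, axis=1).astype(np.uint8)

# mg = motiongram("dance.avi")
# cv2.imwrite("motiongram.png", mg)
```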

July 12, 2012 · 1 min · 166 words · ARJ

Record videos of sonification

I got a question the other day about how it is possible to record a sonified video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper at this year’s ACHI conference about the technique. It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max....
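For anyone working outside Max, a comparable result (the original video with the sonified audio as its soundtrack) can be obtained by writing the audio to a file and muxing the two with ffmpeg. This is only an illustrative alternative to the jit.vcr workflow described in the post, and the file names below are placeholders.

```python
# Illustrative alternative to the jit.vcr approach: mux a previously
# recorded sonification with the original video using ffmpeg
# (ffmpeg must be installed separately).
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "original_video.mov",    # source video
    "-i", "sonification.wav",      # audio produced by the sonification module
    "-c:v", "copy",                # keep the video stream as-is
    "-c:a", "aac",                 # encode the audio
    "-map", "0:v:0", "-map", "1:a:0",
    "-shortest",                   # stop at the shorter of the two streams
    "sonified_video.mp4",
], check=True)
```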

June 25, 2012 · 1 min · 159 words · ARJ

Sonification of motiongrams

A couple of days ago I presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain. The paper is actually based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper. See below for the full paper and video examples. The paper: Download paper (PDF 2MB). Abstract: The paper presents a method for sonification of human body motion based on motiongrams....
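Since the teaser only names the technique, here is a heavily simplified, hypothetical sketch of one way to sonify a motiongram: treat each column as a spectral frame and drive a bank of sine oscillators whose amplitudes follow the pixel values. The frequency range, frame duration, and function names are my own assumptions, not the mapping used in the paper or the Jamoma module.

```python
# Simplified motiongram sonification sketch: rows are mapped to oscillator
# frequencies, pixel brightness to oscillator amplitude, columns to time.
import numpy as np
from scipy.io import wavfile

def sonify_motiongram(mg, sr=44100, frame_dur=0.05, fmin=100.0, fmax=8000.0):
    n_bins, n_frames = mg.shape
    amps = mg.astype(np.float64) / 255.0            # pixel value -> amplitude
    freqs = np.linspace(fmin, fmax, n_bins)          # one frequency per image row
    samples_per_frame = int(sr * frame_dur)
    t = np.arange(n_frames * samples_per_frame) / sr
    # hold each frame's amplitudes constant for the duration of that frame
    env = np.repeat(amps, samples_per_frame, axis=1)
    audio = np.zeros_like(t)
    for k in range(n_bins):
        audio += env[k] * np.sin(2 * np.pi * freqs[k] * t)
    audio /= np.max(np.abs(audio)) + 1e-9            # normalise to [-1, 1]
    return (audio * 32767).astype(np.int16)

# wavfile.write("sonomotiongram.wav", 44100, sonify_motiongram(mg))
```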

February 3, 2012 · 2 min · 398 words · ARJ