Record videos of sonification

I got a question the other day about how to record a sonified video file based on my sonification module for Jamoma for Max. I wrote about my first experiments with the sonifyer module here, and also published a paper on the technique at this year's ACHI conference. It is quite straightforward to record a video file with the original video + audio using the jit.vcr object in Max....

June 25, 2012 · 1 min · 159 words · ARJ

New Laptop Orchestra Piece: Click-It

Yesterday I was teaching a workshop on laptop orchestra performance for the students in Live electronics at the Norwegian Academy of Music. I usually start such workshops by playing the piece Clix by Ge Wang (see e.g. here for a performance of it). It is a fun piece to play, and it is nice to show the students something other than Max patches. Unfortunately, while setting up for the workshop I had problems getting ChucK to work on my new laptop....

February 9, 2012 · 2 min · 329 words · ARJ

Sonification of motiongrams

A couple of days ago I presented the paper "Motion-sound Interaction Using Sonification based on Motiongrams" at the ACHI 2012 conference in Valencia, Spain. The paper is based on a Jamoma module that I developed more than a year ago, but due to other activities it took a while before I managed to write it up as a paper. See below for the full paper and video examples. The paper: Download paper (PDF, 2 MB). Abstract: The paper presents a method for sonification of human body motion based on motiongrams....

February 3, 2012 · 2 min · 398 words · ARJ

Demonstration videos on using Phidgets electronic kits

I am using the Phidgets electronics kits when teaching sound programming, and have now made two short videos demonstrating some basic principles. First, there is a getting-started video (in Norwegian) on using Phidgets in Max. I have also made a video demonstrating the Phidgets2MIDI application that I developed earlier this year. I am planning to make some videos showing more musically interesting uses of the electronics and software.

April 1, 2011 · 1 min · 70 words · ARJ

Sonification of motiongrams

I have made a new Jamoma module for sonification of motiongrams called jmod.sonifyer~. From a live video input, the program generates a motion image, which is then transformed into a motiongram. The motiongram is used as the source for the sound synthesis, being "read" as if it were a spectrogram. The result is a sonification of the original motion, together with the visualisation in the motiongram. See the demonstration video below: The module is available from the Jamoma source repository, and will probably make it into an official release at some point....
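The module itself is a Max/Jamoma patch, but the core idea of "reading the motiongram as a spectrogram" can be sketched in a few lines of Python/NumPy: treat each column of the motiongram as a magnitude spectrum and resynthesize it by inverse FFT with arbitrary phase. This is only an illustrative sketch, not the actual jmod.sonifyer~ implementation; the function name and parameters are made up.

```python
import numpy as np

def motiongram_to_audio(motiongram, hop=512):
    """Sketch: treat each motiongram column as a magnitude spectrum and
    resynthesize audio by inverse FFT with random phase and overlap-add.
    `motiongram` is assumed to be a 2D array (frequency_bins x frames)
    with values in [0, 1]. Not the actual jmod.sonifyer~ code."""
    n_bins, n_frames = motiongram.shape
    n_fft = 2 * (n_bins - 1)                  # FFT size implied by the bin count
    out = np.zeros(n_frames * hop + n_fft)
    window = np.hanning(n_fft)
    for i in range(n_frames):
        mag = motiongram[:, i]
        phase = np.random.uniform(-np.pi, np.pi, n_bins)   # arbitrary phase
        frame = np.fft.irfft(mag * np.exp(1j * phase), n_fft) * window
        out[i * hop : i * hop + n_fft] += frame            # overlap-add
    return out / (np.abs(out).max() + 1e-9)                # normalize
```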

November 9, 2010 · 1 min · 88 words · ARJ

AudioAnalysis v0.5

I am teaching a course in sound theory this semester, and therefore thought it was time to update a little program I developed several years ago, called SoundAnalysis. While there are many excellent sound analysis programs out there (Sonic Visualiser, Praat, etc.), they all work on pre-recorded sound material. That is certainly the best approach to sound analysis, but it is not ideal in a pedagogical setting where you want to explain things in real time....

October 11, 2010 · 2 min · 277 words · ARJ

Many lines in a text file

I am trying to debug a Max patch that does video analysis. For some reason, many of the exported text files containing the analysis results contain exactly 4314 lines. That is an odd number for a program to stop at, so I am currently going through the patch to figure out what is wrong. The first thing I thought about was the text object, which is used for storing the data and writing it to a text file....

October 11, 2010 · 1 min · 168 words · ARJ

GDIF recording and playback

Kristian Nymoen has updated the Jamoma modules for recording and playing back GDIF data in Max 5. The modules are based on the FTM library (beta 12; betas 13-15 do not work), and can be downloaded here. We have also made three use cases available in the (soon to be expanded) fourMs database: a simple mouse recording, the sound saber, and a short piano example. See the video below for a quick demonstration of how it works:

July 3, 2010 · 1 min · 74 words · ARJ

New motiongram features

Inspired by the work Static no. 12 by Daniel Crooks, which I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice. About motiongrams: A motiongram is a way of displaying motion (e.g. human motion) in the time domain, somewhat similar to how we are used to working with time representations of audio (e....
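For those who prefer text to patches, here is a minimal NumPy sketch of the two ideas described above: a regular motiongram (frame-difference the video and collapse each motion image to a one-pixel-wide column) and the new single-column scanning. It illustrates the concept only; it is not the jmod.motiongram% code, and the function names are made up.

```python
import numpy as np

def motiongram(frames):
    """Sketch of a horizontal motiongram: frame-difference a grayscale video
    and average each motion image across the width, so every frame becomes a
    one-pixel-wide column; stacking the columns over time gives the motiongram.
    `frames` is assumed to have shape (time, height, width)."""
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # motion images
    return motion.mean(axis=2).T                            # (height, time)

def column_scan(frames, column):
    """Sketch of the new single-column option: keep one pixel column of the raw
    frame per time step instead of the averaged motion image."""
    return frames[:, :, column].T                           # (height, time)
```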

July 2, 2010 · 1 min · 171 words · ARJ

Quantity of motion of an arbitrary number of inputs

In video analysis I have been working with what is often referred to as "quantity of motion" (not to be confused with momentum, the product of mass and velocity, p=mv), i.e. the sum of all active pixels in a motion image. In this sense, QoM is 0 if there is no motion, and takes a positive value if there is motion in any direction. Working with various types of sensors and motion capture systems, I see the same need to know how much motion there is in the system, independent of the number of variables and dimensions in the system studied....
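As a rough illustration, here is how the video-based QoM described above could be computed in NumPy, together with a guess at a dimension-independent analogue for sensor or motion capture data. The post is truncated, so the exact generic formula is not shown here; the second function is an assumption, not the method from the post.

```python
import numpy as np

def qom_video(prev_frame, frame, threshold=0.1):
    """Quantity of motion from video, as described above: the number of
    'active' pixels in the motion image (frames assumed grayscale, 0-1).
    Zero when nothing moves, positive for motion in any direction."""
    motion_image = np.abs(frame.astype(float) - prev_frame.astype(float))
    return np.sum(motion_image > threshold)

def qom_generic(samples):
    """Assumed sketch of a dimension-independent analogue for sensor/mocap
    data: mean absolute first difference across all channels, giving one
    value per time step regardless of how many dimensions the system has."""
    samples = np.asarray(samples, dtype=float)   # shape (time, channels)
    return np.abs(np.diff(samples, axis=0)).mean(axis=1)
```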

July 1, 2010 · 2 min · 238 words · ARJ