Difference between videogram and motiongram

For some upcoming blog posts on videograms, I will start by explaining the difference between a motiongram and a videogram. Both are temporal (image) representations of video content (as explained here), and are produced in almost the same way. The difference is that a videogram starts from the regular video image, while a motiongram starts from a motion image. So for a video of my hand like this: we will get this horizontal videogram:...
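
The difference can be sketched in a few lines of plain Python (an illustration only, not the Jamoma implementation; the function names and the row-averaging reduction are my own choices). Both displays collapse each frame to a one-pixel-wide column and stack the columns over time; the motiongram simply runs the same reduction on motion images (thresholded frame differences) instead of raw frames:

```python
def collapse_frame(frame):
    """Reduce a 2D grayscale frame to one column: the mean of each row."""
    return [sum(row) / len(row) for row in frame]

def frame_difference(prev, curr, threshold=10):
    """Motion image: absolute per-pixel difference; small changes
    below the threshold are zeroed as simple noise reduction."""
    return [[abs(c - p) if abs(c - p) >= threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def horizontal_videogram(frames):
    """Stack one collapsed column per raw frame; time runs left to right."""
    cols = [collapse_frame(f) for f in frames]
    # transpose so each output row traces one pixel row over time
    return [list(row) for row in zip(*cols)]

def horizontal_motiongram(frames):
    """Same reduction, but applied to motion images (frame differences)."""
    motion = [frame_difference(a, b) for a, b in zip(frames, frames[1:])]
    return horizontal_videogram(motion)
```

With a static second and third frame, the videogram keeps showing the hand while the motiongram goes dark, which is exactly the difference described above.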

July 13, 2011 · 1 min · 110 words · ARJ

Sonification of motiongrams

I have made a new Jamoma module for sonification of motiongrams, called jmod.sonifyer~. From a live video input, the program generates a motion image, which is in turn transformed into a motiongram. This is then used as the source of the sound synthesis, “read” as if it were a spectrogram. The result is a sonification of the original motion, plus the visualisation in the motiongram. See the demonstration video below: The module is available from the Jamoma source repository, and will probably make it into an official release at some point....
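
One plausible way to "read" an image as a spectrogram is an additive oscillator bank, sketched here in plain Python (all names, parameter values, and the frequency mapping are my own illustration, not taken from jmod.sonifyer~): each motiongram column supplies spectral magnitudes, with rows mapped to sine-oscillator frequencies, so brighter pixels produce louder partials:

```python
import math

def sonify_column(column, duration=0.05, sample_rate=8000,
                  f_min=100.0, f_max=2000.0):
    """Treat one motiongram column as spectral magnitudes: row i drives
    a sine oscillator spaced between f_min and f_max."""
    n = len(column)
    freqs = [f_min + (f_max - f_min) * i / max(n - 1, 1) for i in range(n)]
    num_samples = int(duration * sample_rate)
    samples = []
    for t in range(num_samples):
        s = sum(a * math.sin(2 * math.pi * f * t / sample_rate)
                for a, f in zip(column, freqs))
        samples.append(s / n)  # crude normalisation by partial count
    return samples

def sonify_motiongram(motiongram):
    """Concatenate the sound of each time column, left to right."""
    width = len(motiongram[0])
    out = []
    for x in range(width):
        column = [row[x] for row in motiongram]
        out.extend(sonify_column(column))
    return out
```

A real-time module would of course stream this rather than render it offline, but the mapping idea is the same: no motion, no sound.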

November 9, 2010 · 1 min · 88 words · ARJ

New motiongram features

Inspired by the work Static no. 12 by Daniel Crooks, which I watched at the Sydney Biennale a couple of weeks ago, I have added the option of scanning a single column in the jmod.motiongram% module in Jamoma. Here is a video that shows how this works in practice: About motiongrams A motiongram is a way of displaying motion (e.g. human motion) in the time domain, somewhat similar to how we are used to working with time-representations of audio (e....
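
Single-column scanning can be sketched as a slit-scan in a few lines of Python (an illustration of the idea, not the jmod.motiongram% code): instead of averaging each pixel row across the whole frame, keep just one chosen column per frame and stack the columns over time:

```python
def slit_scan(frames, x):
    """Grab pixel column x from each 2D grayscale frame and stack the
    columns left to right, so time runs along the horizontal axis."""
    strips = [[row[x] for row in frame] for frame in frames]
    # transpose: each output row is one pixel height traced over time
    return [list(r) for r in zip(*strips)]
```

Anything passing through the scanned column leaves a trace, which is what gives the Crooks-like smeared look shown in the video.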

July 2, 2010 · 1 min · 171 words · ARJ

Presenting mocapgrams

Earlier today I gave the presentation “Reduced Displays of Multidimensional Motion Capture Data Sets of Musical Performance” at the ESCOM conference in Jyväskylä, Finland. The presentation gave an overview of different approaches to visualizing music-related movement, including our most recent method: mocapgrams. While motiongrams are reduced displays created from video files, mocapgrams are intended to work in a similar way, but are created from motion capture data. They are conceptually similar, but otherwise quite different in the way they are generated....
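
One plausible reduction of this kind can be sketched in Python (a hypothetical mapping of my own for illustration; the exact method in the presentation may differ): each time frame of marker positions becomes one column of the display, with every marker's (x, y, z) normalised over the whole recording and shown as an (r, g, b) pixel:

```python
def mocapgram(frames):
    """frames: list of time steps, each a list of (x, y, z) marker
    positions. Returns one column per time step, where each marker maps
    to an (r, g, b) pixel via per-axis normalisation over the recording."""
    axes = range(3)
    flat = [pos for frame in frames for pos in frame]
    lo = [min(p[a] for p in flat) for a in axes]
    hi = [max(p[a] for p in flat) for a in axes]
    span = [h - l or 1.0 for l, h in zip(lo, hi)]  # avoid divide-by-zero
    return [[tuple(round(255 * (p[a] - lo[a]) / span[a]) for a in axes)
             for p in frame]
            for frame in frames]
```

The appeal of such a display is the same as for motiongrams: hours of multidimensional data collapse into one image that can be skimmed at a glance.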

August 14, 2009 · 2 min · 216 words · ARJ

Open lab

We have slowly been moving into our new lab spaces over the last few weeks. The official opening of the labs is scheduled for Friday 26 September, but we had a pre-opening “Open lab” for the new music students last week, and here are some of the pictures shot by Anne Cathrine Wesnes during the presentation. Here I am telling the students a little about our new research group, and showing the main room:...

August 26, 2008 · 1 min · 109 words · ARJ

AudioVideoAnalysis

To allow everyone to watch their own synchronised spectrograms and motiongrams, I have made a small application called AudioVideoAnalysis. Download AudioVideoAnalysis for OS X (8MB). It currently has the following features: it draws a spectrogram from any connected microphone, draws a motiongram/videogram from any connected camera, and toggles fullscreen mode with the escape key. Built with Max/MSP by Cycling ‘74 on OS X 10.5. I will probably make a Windows version at some point, but haven’t gotten that far yet....

June 17, 2008 · 1 min · 132 words · ARJ

Motiongrams sync'ed to spectrograms

One of my reasons for developing motiongrams was to have a solution for visualising movement in a way that would be compatible with spectrograms. That would make it possible to study how movement evolves over time in relation to how the audio changes. In my current implementation of motiongrams in Max/MSP/Jitter (and partially in EyesWeb), there has been no way to synchronise them with a spectrogram. The problem was that the built-in spectrogram in Max/MSP was running much faster than the motiongram, so they were out of sync from the start....
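
When the two displays run at different rates, the basic fix is to resample one time axis onto the other. A minimal sketch in Python (my own illustration, not the Max/MSP patch): linearly interpolate the slower display's columns so both cover the same number of time steps:

```python
def resample_columns(columns, target_len):
    """Linearly resample a list of analysis columns (each a list of
    values) so that, e.g., a motiongram and a spectrogram end up with
    the same number of time steps and can be drawn in sync."""
    n = len(columns)
    if target_len == 1:
        return [list(columns[0])]
    out = []
    for j in range(target_len):
        pos = j * (n - 1) / (target_len - 1)  # position on the source axis
        i = int(pos)
        frac = pos - i
        if i + 1 < n:
            out.append([(1 - frac) * a + frac * b
                        for a, b in zip(columns[i], columns[i + 1])])
        else:
            out.append(list(columns[i]))
    return out
```

With both representations on a shared time axis, a gesture and its sound can finally be lined up column for column.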

June 11, 2008 · 2 min · 244 words · ARJ

Sonification of Traveling Landscapes

I just heard a talk called “Real-Time Synaesthetic Sonification of Traveling Landscapes” (PDF) by Tim Pohle and Peter Knees from the Department of Computational Perception (great name!) in Linz. They have made an application that creates music from a moving video camera. The implementation is based on grabbing a one-pixel-wide column from the video, plotting these columns, and sonifying the resulting image. Interestingly enough, the images they get out of this (see below) are very close to the motiongrams and videograms I have been working on....

May 15, 2008 · 1 min · 86 words · ARJ

Motiongrams in EyesWeb!

We had a programming session this morning, and Paolo Coletta implemented a block for creating motiongrams in EyesWeb. It will be available in the new EyesWeb XMI release, which will happen at the end of this week. Great!

February 13, 2008 · 1 min · 38 words · ARJ

Motiongrams

Challenge Traditional keyframe displays of videos are not particularly useful when studying single-shot studio recordings of music-related movements, since they mainly show static postural information and no motion. Using motion images of various kinds helps in visualizing what is going on in the image. Below can be seen (from left): the motion image, with noise reduction, with edge detection, with “trails”, and added to the original image. Making Motiongrams We are used to visualizing audio with spectrograms, and we have been exploring different techniques for visualizing music-related movements in a similar manner....
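
The first steps of that pipeline can be sketched in plain Python (an illustration under my own assumptions, not the Jitter patch; edge detection and overlaying on the original frame are omitted for brevity): frame differencing for the motion image, thresholding for noise reduction, and leaky accumulation for the "trails" effect:

```python
def motion_image(prev, curr):
    """Absolute per-pixel difference between two grayscale frames."""
    return [[abs(c - p) for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def noise_reduce(img, threshold=8):
    """Zero out small differences (simple thresholding)."""
    return [[v if v >= threshold else 0 for v in row] for row in img]

def trails(motion_frames, decay=0.7):
    """Leaky accumulation of motion images: fresh motion is drawn at
    full strength while older motion fades out frame by frame."""
    acc = [[0.0] * len(motion_frames[0][0]) for _ in motion_frames[0]]
    for m in motion_frames:
        acc = [[max(v, decay * a) for v, a in zip(mr, ar)]
               for mr, ar in zip(m, acc)]
    return acc
```

Each of these intermediate images can then be collapsed column-wise over time to produce a motiongram, spectrogram-style.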

November 1, 2006 · 2 min · 373 words · ARJ