Gumstix and PDa

Another post from the Mobile Music Workshop in Vienna. Yesterday I saw a demo of the Audioscape project by Mike Wozniewski (McGill). He was using a Gumstix, a really small computer running a Linux system built with OpenEmbedded. On it he ran PDa (Pure Data anywhere, a fixed-point port of Pure Data), processing sensor data and running audio off the small device.

May 15, 2008 · 1 min · 60 words · ARJ

Optitrack motion capture

I held a guest lecture at the Speech, Music and Hearing group at KTH in Stockholm a couple of weeks ago, and got a tour of the lab afterwards. There I got a demonstration of the Optitrack optical motion capture system, which, compared to other similar systems, is remarkably cheap, starting at $4,999. It obviously has lower accuracy and precision than the larger systems, but then it also costs about a twentieth of the price… Still, a 100 Hz capture rate and millimeter precision are decent for a USB-based system, and the cameras are really portable (roughly 10 x 5 cm each)....

May 12, 2008 · 1 min · 133 words · ARJ

Motion Capture System Using Accelerometers

I came across a student project from Cornell on doing motion capture with accelerometers, based on an Atmel microcontroller. It is a nice overview of many of the challenges faced when working with accelerometers, and the implementation seems to work well.

May 8, 2008 · 1 min · 41 words · ARJ

Softkinetic

Belgian company Softkinetic offers what they call natural interfaces, i.e. interfaces where you don’t have to put on any sensors to interact: Softkinetic operates with a single depth-sensing camera, requires no marker (no gamepad, no wiimote, no special gloves or clothing, no headset - nothing), and works under all lighting conditions and scene settings (at home, in a fitness center, an amusement park, a classroom, a game cafe, an industrial simulation room - anywhere....

May 5, 2008 · 1 min · 114 words · ARJ

Sensing Music-related Actions

The web page for our new research project, Sensing Music-related Actions, is now up and running. This is a joint research project of the Departments of Musicology and Informatics, and has received external funding through the VERDIKT program of the Research Council of Norway. The project runs from July 2008 until July 2011. It will focus on basic issues of sensing and analysing music-related actions, and on creating various prototypes for testing the control possibilities of such actions in enactive devices....

April 24, 2008 · 1 min · 167 words · ARJ

Writing in NeoOffice, dreaming of LaTeX

I am working on a paper for a journal that only accepts RTF documents, and to avoid the problems that can result from converting a LaTeX document into RTF (or possibly from PDF), I decided to use a word processor from the start. For simple word processing I have grown very fond of Bean recently, a lightweight application slightly more advanced than TextEdit. I started out with Bean, but since I had to include endnotes in the document I ended up moving over to NeoOffice instead....

April 8, 2008 · 1 min · 185 words · ARJ

Apple tries to patent gestures

Wired reports that Apple has filed around 200 patent applications related to multitouch and gesture control: Yet it appears that the company is not trying to patent the entire multitouch concept, but rather trying to protect certain uses of it – specifically the methods to interpret gestures, and in some cases, the gestures themselves. It is interesting to see that they mention the interpretation of a gesture. This means that they distinguish between gesture and action, i....

February 25, 2008 · 1 min · 84 words · ARJ

Syncing Movement and Audio using a VST-plugin

I just heard Esteban Maestre from UPF present his project on creating a database of instrumental actions of bowed instruments, for use in the synthesis of score-based material. They have come up with a very interesting solution for recording and synchronising audio with movement data: building a VST plugin that records motion capture data from a Polhemus Liberty together with bow sensing through an Arduino. This makes it possible to load the plugin inside regular audio sequencing software and do the recording from there....
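The synchronisation idea behind such a plugin can be illustrated with a minimal sketch. This is my own hypothetical C++ (not their actual code; the class and member names are assumptions, and a real plugin would hook into the VST SDK's process callback): the host's audio callback acts as the master clock, and incoming motion frames are stamped with the current audio sample position, so both streams end up on a single timeline.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical motion frame: position/orientation from a tracker,
// stamped with a position on the audio sample clock.
struct MotionFrame {
    int64_t sampleTime;
    float pos[3];
    float orient[3];
};

// Sketch of a recorder that would live inside a plugin.
// Audio blocks advance the sample clock; motion frames arriving
// between blocks are stamped with that clock.
class SyncRecorder {
public:
    explicit SyncRecorder(double sampleRate) : sampleRate_(sampleRate) {}

    // Called once per audio block by the host (e.g. blockSize = 512).
    void processAudioBlock(int blockSize) { samplePos_ += blockSize; }

    // Called whenever a motion sample arrives from the tracker thread.
    void pushMotionFrame(MotionFrame f) {
        f.sampleTime = samplePos_;
        frames_.push_back(f);
    }

    // Convert a frame's stamp back to seconds for export/alignment.
    double frameTimeSeconds(const MotionFrame& f) const {
        return static_cast<double>(f.sampleTime) / sampleRate_;
    }

    const std::vector<MotionFrame>& frames() const { return frames_; }

private:
    double sampleRate_;
    int64_t samplePos_ = 0;
    std::vector<MotionFrame> frames_;
};
```

Stamping against the audio sample clock rather than wall-clock time is what makes hosting the recorder inside a sequencer attractive: alignment between audio and movement data comes for free, regardless of when the tracker samples happen to arrive.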

February 14, 2008 · 1 min · 185 words · ARJ

TRIL centre, Emobius and Shimmer

I just heard a presentation by a group of researchers from the TRIL Centre (Technology Research for Independent Living) in Dublin. They have developed Emobius (or EyesWeb Mobius), a set of blocks for various types of biomedical processing, as well as a graphical front-end to the forthcoming EyesWeb XMI. It is fascinating to see how the problems they are working on in applications for older persons are so similar to those we are dealing with in music research....

February 14, 2008 · 1 min · 127 words · ARJ

Open Sound Control

The newly refurbished OSC forum web site has sparked some discussion on the OSC_dev mailing list. One interesting note was a reply from Andy W. Schmeder on how OSC should be spelled out: The short answer is, use “Open Sound Control”. The other form one may encounter is “OpenSound Control”, but we don’t use that anymore. Any additional forms you may encounter are probably unintentional. I have used various versions over the years (including OpenSoundControl), so I take this as an official answer, since Andy works at CNMAT....

January 18, 2008 · 1 min · 93 words · ARJ