A few years ago the EU project TAI-CHI (Tangible Acoustic Interfaces for Computer-Human Interaction) explored using acoustic cues from tapping and scratching on surfaces to control devices. It was interesting to see that they managed to create fairly advanced "touch screens" using only one to three contact microphones attached to the surface.

Here is a nice video by Chris Harrison demonstrating a system he calls Scratch Input, which controls devices by detecting patterns in the acoustic signal from a contact microphone attached to various surfaces:

{.youtube-video}

I am currently planning the Sound programming 1 course in the fall semester, and implementing a system like this seems like a good exercise for the students.
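As a starting point for such an exercise, the core of these systems is detecting tap onsets in the microphone signal. Below is a minimal sketch of one possible approach (not how Scratch Input or TAI-CHI actually work internally): rectify the signal, smooth it into an amplitude envelope, and report times where the envelope crosses a threshold. The function name `detect_taps` and all parameter values are my own choices for illustration, and the example runs on a synthetic signal rather than real microphone input.

```python
import numpy as np

def detect_taps(signal, sr, threshold=0.3, min_gap=0.05):
    """Return onset times (in seconds) where the smoothed amplitude
    envelope first rises above `threshold` * its maximum.

    `min_gap` merges crossings closer together than that many seconds,
    so one tap is not reported several times.
    """
    env = np.abs(signal)
    # Smooth with a ~5 ms moving average to get a rough envelope.
    win = max(1, int(0.005 * sr))
    env = np.convolve(env, np.ones(win) / win, mode="same")
    above = env > threshold * env.max()
    onsets = []
    last = -np.inf
    for i in np.flatnonzero(above):
        t = i / sr
        if t - last >= min_gap:
            onsets.append(t)
        last = t  # always update, so a continuous burst counts once
    return onsets

# Synthetic test signal: three short decaying "taps" at 0.1, 0.4 and 0.5 s.
sr = 8000
sig = np.zeros(sr)  # one second of silence
for onset in (0.1, 0.4, 0.5):
    idx = int(onset * sr)
    n = int(0.02 * sr)  # 20 ms burst
    k = np.arange(n)
    sig[idx:idx + n] += np.exp(-k / (0.003 * sr)) * np.sin(2 * np.pi * 1000 * k / sr)

taps = detect_taps(sig, sr)
print(taps)
```

From the onset times one can go further and classify simple gestures, for instance by looking at the number of taps and the intervals between them, which could make a nice second step in the exercise.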