While debugging some Max/Jitter patches, I noticed that the built-in iSight camera in my aluminum MacBook only returned 15 fps. I had always assumed it was running at 30 fps, but turning on the “unique” attribute in jit.qt.grab showed that 15 fps is what you actually get. Looking around the web, I found a post saying that the camera runs at 30 fps when the lighting is good but drops to 15 fps when the lighting is low. Sure enough, under normal office lighting it gives 15 fps, but pointing the camera at the window brings it up to 30 fps.
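If you want to reproduce the measurement outside of Max, here is a minimal sketch of the same idea (my own illustration, not the Jitter patch): just count how many genuinely new frames the capture API hands back per second, which is effectively what the “unique” attribute on jit.qt.grab exposes. It assumes OpenCV is installed and that device 0 is the built-in camera.

```python
import time
import cv2

# Open the built-in camera (device 0 on most Macs).
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open the camera")

frames = 0
start = time.time()
while time.time() - start < 5.0:    # sample for five seconds
    ok, frame = cap.read()          # blocks until the next frame arrives
    if ok:
        frames += 1

elapsed = time.time() - start
# Expect roughly 15 fps in dim light and roughly 30 fps in bright light.
print(f"{frames / elapsed:.1f} fps delivered")
cap.release()
```

Running it once in normal room light and once with the camera aimed at a window should show the same 15/30 split.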
What I discovered along the way is that the camera actually has a 1024x768-pixel sensor. While the normal API only returns a 640x480 image, apparently all of the pixels are accessible in Quartz Composer. Learning Quartz Composer is somewhere in the middle of my long to-do list, but this would be a good reason to get started.
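In the meantime, a quick way to see what resolution a given capture backend will actually hand back is to request a larger frame and check what arrives. This sketch uses OpenCV purely for illustration (it is not Quartz Composer, and the backend may well cap you at 640x480):

```python
import cv2

cap = cv2.VideoCapture(0)
# Ask for the full sensor resolution; the backend may silently downgrade it.
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1024)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 768)

ok, frame = cap.read()
if ok:
    height, width = frame.shape[:2]
    print(f"requested 1024x768, got {width}x{height}")
cap.release()
```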