Prof. Hsu’s code from the past workshop: http://userwww.sfsu.edu/~whsu/2012.3.2.ACM/
Prof. Hsu’s interface workshop will be this Friday, Nov. 16th 3PM!
This is a great opportunity to discover how code you write can interact with you through a device!
LOCATION: Blakeslee, Thornton Hall 10th floor, 3 PM
In case you haven’t been up there, this is the nice meeting room above the observatory.
Take the Thornton elevator to the 9th floor, walk just outside the elevator area and look
around for a door with a sign by it. Go up the stairs and you’ll be there.
This will be an informal workshop/tutorial on working with interface
devices beyond mouse and keyboard. We’ll use Processing, a Java-based
environment that is easy to learn, but powerful enough to be used
for real applications with graphics and animation.
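If you'd like a head start on the Processing basics, a first sketch usually looks something like the following (a minimal example of my own, not workshop code). Processing calls setup() once at launch and then draw() repeatedly, about 60 times per second, which is where animation comes from:

```processing
// Minimal Processing sketch: a circle that follows the mouse.
void setup() {
  size(400, 400);                   // open a 400x400 pixel window
}

void draw() {
  background(0);                    // clear to black every frame
  fill(255, 120, 0);                // orange fill for shapes
  ellipse(mouseX, mouseY, 40, 40);  // circle at the cursor position
}
```

Paste it into the Processing editor and hit Run; no setup beyond installing Processing itself is needed.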
Tentative list of topics:
- quick overview of Processing basics
- programming graphics/animation in Processing
- OpenSoundControl (OSC) protocol
- using TouchOSC with iPhone/iPod Touch and OSC
- real-time camera input and OpenCV
- Kinect hacking
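Of the topics above, OSC is worth a quick preview: on the wire, an OSC message is just an address-pattern string, a type-tag string, and big-endian arguments, with each string NUL-padded to a 4-byte boundary. Here is a rough sketch of that encoding in plain Java (class and method names are mine; the /1/fader1 address is the kind TouchOSC sends for a fader):

```java
import java.nio.ByteBuffer;

public class OscSketch {
    // Pad a string with NULs out to the next 4-byte boundary,
    // always adding at least one NUL terminator (the OSC rule).
    static byte[] padded(String s) {
        byte[] raw = s.getBytes();
        byte[] out = new byte[(raw.length / 4 + 1) * 4];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    // Encode an OSC message carrying a single float argument.
    static byte[] encode(String address, float value) {
        byte[] addr = padded(address);
        byte[] tags = padded(",f");  // type-tag string: one float
        ByteBuffer buf = ByteBuffer.allocate(addr.length + tags.length + 4);
        buf.put(addr).put(tags).putFloat(value);  // ByteBuffer is big-endian by default
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] msg = encode("/1/fader1", 0.5f);
        System.out.println(msg.length);  // prints 20: 12 (address) + 4 (tags) + 4 (float)
    }
}
```

In practice the oscP5 library handles all of this for you; the point is only that the protocol is simple enough to reason about when debugging.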
IMPORTANT: Please read the following requirements and download/install the corresponding software before coming, so we can start working right away.
Processing uses QuickTime for Java for video. This should work easily on the Mac.
Windows users should read the following:
Note: that whole video setup is kind of a turn-off, I know. I’ll be in the lab starting at 2 at the latest, so if you don’t want to fuss with this stuff on your own, come in and we’ll do it together.
Download OSC for Processing:
Download Daniel Shiffman’s openkinect for Processing library:
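For the OSC library, once it’s installed, receiving a TouchOSC fader in a sketch looks roughly like this. This is a sketch of my own using the oscP5 library (the /1/fader1 address is TouchOSC’s default for the first fader in its simple layout; your layout may differ):

```processing
import oscP5.*;
import netP5.*;

OscP5 oscP5;

void setup() {
  size(400, 400);
  // Listen for incoming OSC messages on UDP port 12000.
  // In TouchOSC, set the host to this machine's IP and the outgoing port to 12000.
  oscP5 = new OscP5(this, 12000);
}

void draw() {
  // Drawing goes here; oscEvent() fires whenever a message arrives.
}

// Called by oscP5 for each incoming OSC message.
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/1/fader1")) {
    float v = msg.get(0).floatValue();  // fader position, 0.0 to 1.0
    println("fader1: " + v);
  }
}
```

If nothing arrives, the usual culprits are the port number not matching or a firewall blocking UDP.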
Also remember to bring a device to test on (perhaps a tablet or smartphone), and a laptop.
Windows users: in order to get video capture working in Processing, you will likely need to install WinVDIG. You can find WinVDIG here. Many people recommend version 1.01; personally, though, only 1.05 works for me. My setup is QuickTime 7 with WinVDIG 1.05.