Project FluidWall


This is a project I have been working on with Austin Hines, another student at the Texas A&M Visualization Lab.

Here's a link to the project's initial draft on Austin's blog.
We've also made this project open source and created a separate blog dedicated to FluidWall.
FluidWall Source: http://code.google.com/p/fluidwall/
FluidWall Blog: http://fluidwall.blogspot.com

The idea was to use the depth information and user tracking from the Kinect to interact with a fluid simulation on the screen. Austin worked on all aspects of the fluid simulation, while I handled exploring and integrating the SDK that would let us talk to the Kinect with the least friction. Using OpenNI along with the NITE middleware, we were able to get clean, simple depth data feeding into our simulation. This let silhouettes of people and objects (whatever the depth sensor returned) interact with the fluid on screen: any movement would stir the fluid, which would dissipate in a bluish-white whirl, as depicted in the pictures of the original idea on Austin's blog (linked above).
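To give a sense of how the depth data drives the simulation, here is a minimal sketch of the kind of loop involved. It assumes the OpenNI 1.x C++ wrapper; the FluidSolver type and its addForce()/addDye() methods are hypothetical stand-ins for the real simulation interface, not the actual FluidWall code (which lives at the Google Code link above).

// Sketch only: pull depth frames with OpenNI 1.x and disturb a hypothetical
// fluid solver wherever the silhouette changed since the previous frame.
#include <XnCppWrapper.h>
#include <vector>

struct FluidSolver {                                   // hypothetical stand-in
    void addForce(int x, int y, float fx, float fy) { /* ... */ }
    void addDye(int x, int y, float r, float g, float b) { /* ... */ }
    void step() { /* ... */ }
};

int main()
{
    xn::Context context;
    xn::DepthGenerator depthGen;
    context.Init();
    depthGen.Create(context);
    context.StartGeneratingAll();

    // Grab one frame first so the resolution is known.
    context.WaitOneUpdateAll(depthGen);
    xn::DepthMetaData md;
    depthGen.GetMetaData(md);
    const int w = md.XRes(), h = md.YRes();

    std::vector<XnDepthPixel> prev(w * h, 0);
    FluidSolver fluid;

    while (true) {
        context.WaitOneUpdateAll(depthGen);
        depthGen.GetMetaData(md);
        const XnDepthPixel* depth = md.Data();          // depth in millimetres

        // Wherever the silhouette moved between frames, push the fluid
        // and drop a little bluish-white dye.
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                int i = y * w + x;
                bool nowOccupied = depth[i] > 0 && depth[i] < 2000;  // ~2 m cutoff (assumed)
                bool wasOccupied = prev[i]  > 0 && prev[i]  < 2000;
                if (nowOccupied != wasOccupied) {
                    fluid.addForce(x, y, 0.0f, -1.0f);
                    fluid.addDye(x, y, 0.7f, 0.8f, 1.0f);
                }
                prev[i] = depth[i];
            }
        }
        fluid.step();
    }
    return 0;
}

The real project, of course, scales the depth map down to the fluid grid resolution and derives velocities rather than a fixed push, but the overall flow is the same: read a frame, find what moved, feed it to the solver.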

Soon we were able to track individual users and send the appropriate data into the fluid simulation, which allowed us to emit a different color for each user.
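Conceptually, per-user colors come almost for free once NITE's user labels are available: every depth pixel carries the ID of the tracked user covering it, so the emitter just looks that ID up in a palette. The sketch below assumes the OpenNI 1.x UserGenerator API; emitDye() and the palette are made-up placeholders, not the actual FluidWall code.

// Sketch only: emit a different dye color per tracked user, using the
// per-pixel label map that NITE exposes through xn::UserGenerator.
#include <XnCppWrapper.h>

// Hypothetical hook into the fluid simulation.
void emitDye(int x, int y, float r, float g, float b);

// Small fixed palette, indexed by user ID.
static const float PALETTE[][3] = {
    {0.7f, 0.8f, 1.0f},   // 0: default bluish-white
    {1.0f, 0.3f, 0.2f},   // user 1: red
    {0.2f, 1.0f, 0.3f},   // user 2: green
    {1.0f, 0.9f, 0.2f},   // user 3: yellow
};

void emitPerUserColors(xn::UserGenerator& userGen)
{
    // The scene meta data holds one label per pixel:
    // 0 = background, otherwise the ID of the user at that pixel.
    xn::SceneMetaData sceneMD;
    userGen.GetUserPixels(0, sceneMD);                 // 0 = labels for all users

    const XnLabel* labels = sceneMD.Data();
    const int w = sceneMD.XRes(), h = sceneMD.YRes();

    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            XnLabel id = labels[y * w + x];
            if (id == 0) continue;                     // background pixel
            const float* c = PALETTE[id % 4];
            emitDye(x, y, c[0], c[1], c[2]);
        }
    }
}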

The final result was quite amazing...

