The furthest I’ve been able to get with this project this month was to create a decent tool for Craig to use. I put together a quick video demo below. It’s not super flashy and is still in an early phase, but it lets us do a ton more, much more quickly, than we could at the start of the month.
So the way this works is that there are two modes, as you’ll see in the upper right-hand corner when you first run it: Record and Read.
When in the default “Record” mode, you’ll see a white dot starting in the center of the screen. When you have the accelerometer/Arduino plugged in, you should see the white dot move around based on the incoming accelerometer data. There is a file in the project folder called “calibration.txt” that stores the calibration values so the Arduino board doesn’t have to be recalibrated on every run. To run the calibration, press ‘c’. Keep the accelerometer as flat and horizontal as you can, and it will average the values over the 10 seconds that it runs. Once done, it saves that data to the text file. If that text file isn’t found, the calibration runs right away.
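The calibration flow above can be sketched roughly like this: average a batch of raw readings while the board sits flat, save the offsets to calibration.txt, and load them back on the next run. The class and method names here are illustrative, not the actual sketch’s.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Minimal sketch of the calibration flow described above.
class Calibration {
    double[] offsets = new double[3]; // x, y, z zero-level offsets

    // Average a batch of raw samples (one double[3] per reading).
    void calibrate(double[][] samples) {
        for (double[] s : samples)
            for (int axis = 0; axis < 3; axis++)
                offsets[axis] += s[axis];
        for (int axis = 0; axis < 3; axis++)
            offsets[axis] /= samples.length;
    }

    void save(Path file) throws IOException {
        Files.write(file, java.util.List.of(
            offsets[0] + " " + offsets[1] + " " + offsets[2]));
    }

    // Returns true if a saved calibration was found and loaded;
    // when it returns false, the caller should calibrate right away.
    boolean load(Path file) throws IOException {
        if (!Files.exists(file)) return false;
        String[] parts = Files.readString(file).trim().split("\\s+");
        for (int axis = 0; axis < 3; axis++)
            offsets[axis] = Double.parseDouble(parts[axis]);
        return true;
    }
}
```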
For a recent motion graphics project that I was working on, I needed a sucking-in vacuum effect where some logos were being pulled into a TV. Not being a motion graphics expert, and jumping into After Effects for the first time in a while, I first searched some forums to see if anyone else had asked the question and gotten answers. People had indeed asked how to get this effect, but not too many useful answers were out there. Fortunately, I was able to put together something I was happy with, and I wanted to show anyone who’s interested how the effect was achieved.
I’ve been able to find some time to work on this month’s project, and the few hours I’ve been able to devote have been pretty fruitful. Who uses the term “fruitful”? Whatever, it’s going well so far.
The steps that I outlined for myself were to:
Combine all of my past sketches into one project
Create some sort of visual feedback of live data
Record that data and create something cool with it
Step 1 was a lot easier than I thought it would be. I sometimes forget how easy Processing is to work with. I had sketches for calibrating the accelerometer, saving the accelerometer data, and reading and displaying the data. I just created a class for each of those and called their update/draw functions when needed, and that was pretty much it. That was only an hour or two of work. But really, that just got me to ground level so that I could actually be productive.
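The structure I ended up with looks something like the sketch below: each of the old standalone sketches becomes a class exposing update/draw, and the main loop just delegates to whichever mode is active. Class names here are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Each former standalone sketch becomes a class with update/draw.
interface Mode {
    void update(double[] sample); // latest accelerometer reading
    void draw();
}

class Recorder implements Mode {
    final List<double[]> samples = new ArrayList<>();
    public void update(double[] sample) { samples.add(sample); }
    public void draw() { /* plot the live dot from the latest sample */ }
}

class Reader implements Mode {
    public void update(double[] sample) { /* ignores live data */ }
    public void draw() { /* replay previously saved samples */ }
}

// The main sketch just forwards each frame to the active mode.
class App {
    Mode active = new Recorder();
    void frame(double[] sample) {
        active.update(sample);
        active.draw();
    }
}
```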
The biggest challenge in working with an accelerometer is understanding what the data you’re getting actually means. Originally, I had it in my mind that the accelerometer was measuring the point in space it was at and the acceleration between two points. That’s sort of true, but after thinking about it and reading up some more, I realized that it’s more about the acceleration along each axis than about the actual points, since that’s arbitrary information. The next part of thinking through what kind of useful data I could get from the accelerometer was turning those seemingly random numbers into useful numbers. For each axis, I was getting numbers in a range of roughly 225 to 435, which doesn’t mean anything as-is. What do those numbers really mean? You have to turn them into something else that makes sense. After going through some Arduino forum posts, I found an equation that converts the raw numbers into a decimal standing for gravitational force (acceleration), based on the voltage and sensitivity of the accelerometer. Or something like that. Luckily, that info was easy to find, and working it in made a huge difference.
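That conversion boils down to: raw ADC reading → voltage → g-force. A rough sketch is below; the reference voltage, zero-g voltage, and sensitivity are typical values I’m assuming for a 3.3 V analog accelerometer (e.g. an ADXL335) read by a 10-bit ADC, so check your accelerometer’s datasheet for the real numbers.

```java
// Rough sketch of the raw-reading-to-g conversion described above.
class GForce {
    static final double V_REF = 5.0;         // Arduino ADC reference voltage
    static final double ZERO_G_VOLTS = 1.65; // output voltage at 0 g (assumed)
    static final double VOLTS_PER_G = 0.33;  // sensitivity from the datasheet (assumed)

    // Convert a raw 10-bit ADC reading (0-1023) into gravitational force in g.
    static double rawToG(int raw) {
        double volts = raw * V_REF / 1023.0;
        return (volts - ZERO_G_VOLTS) / VOLTS_PER_G;
    }
}
```

With these constants, a raw reading around 330 (the middle of the 225–435 range) lands near 0 g, which matches a sensor sitting flat.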
From there, I plotted the x, y, and z forces onto a 3D axis and started turning those numbers into velocity values so that I could draw with the accelerometer.
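One way to turn per-frame acceleration (in g) into a drawing velocity is to integrate it over the frame time and apply a little damping so the cursor doesn’t drift forever. The scale and damping constants below are guesses to tune by eye, not values from the actual sketch.

```java
// Integrate acceleration into a velocity used for drawing.
class Pen {
    double vx, vy, x, y;
    static final double PIXELS_PER_G = 500.0; // assumed scale factor, tune to taste
    static final double DAMPING = 0.95;       // bleed off velocity each frame

    // gx, gy: acceleration in g; dt: frame time in seconds.
    void update(double gx, double gy, double dt) {
        vx = (vx + gx * PIXELS_PER_G * dt) * DAMPING;
        vy = (vy + gy * PIXELS_PER_G * dt) * DAMPING;
        x += vx * dt;
        y += vy * dt;
    }
}
```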
I’m doing something that’s slowing down my Processing sketch like crazy, so that’s the next thing to figure out. Once that’s done, I have to go in and refactor some code and refine my data saving and reading classes. I’ll then pass this sketch to the artist I’m working with, Craig Damrauer, so that he can start playing with it and coming up with some ideas to turn this data into something beautiful.
In my next update, I hope to share some sort of video demo of my progress. Stay tuned.
February 2012 Project
For my February project, I’ve decided to continue with a project that I’d already started but neglected over the last month or two. I don’t want to give everything away, but it’s about taking values from nature using an accelerometer hooked up to an Arduino and using the data to create something visually interesting in Processing. This is actually a collaboration with an artist who’s also super psyched about this project. He came to me with the idea, and I’m here to help him pull it off.
Currently, this project exists as a sketch on an Arduino board and three different Processing sketches that each serve a very specific purpose: one for recording data, one for calibrating the accelerometer, and one for reading and displaying the data. This of course is not an ideal setup, but the Processing environment has been great for splitting up the different functions and figuring out each piece separately.
At the end of the month, I hope to have some cool sketches and hopefully something worth turning into a nice print. Since this is a collaboration, it might take a few months to get to that point, but this should at least be a huge step toward being able to create some beautiful stuff with this data.
To get there, this is what I need to do:
Combine the previous Processing sketches into one project/sketch.
Create an accurate representation of the incoming data from the Arduino so that we better understand the data coming in as well as how it relates to the actual accelerometer movements.
Explore ways to showcase the data in a visually interesting way. I hope this will be a series of experiments. I’ll try to post these on my Tumblr blog.
January Kinect Project – Results
My project for January 2012 is complete for now. The result is below:
The result is virtual pin art, which takes the depth values from a Kinect camera and translates them into a depth that is projected for each pin. This was put together using Cinder and the Kinect CinderBlock, which is the freenect Kinect library configured to work with Cinder.
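The core idea can be sketched simply: sample the depth image on a grid and map each sample to how far its “pin” sticks out. The depth window and pin travel below are placeholder numbers, and the real project does this in Cinder/C++ rather than plain Java.

```java
// Map one Kinect depth reading to a pin extension.
class PinBoard {
    static final double NEAR_MM = 500, FAR_MM = 2000; // usable depth window (assumed)
    static final double MAX_EXTENSION = 100;          // pin travel in scene units (assumed)

    // Nearer objects push the pin out further; readings outside
    // the window are clamped to fully in or fully out.
    static double pinDepth(double depthMm) {
        double t = (FAR_MM - depthMm) / (FAR_MM - NEAR_MM);
        t = Math.max(0, Math.min(1, t));
        return t * MAX_EXTENSION;
    }
}
```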
There were a few things that I learned, some of them super obvious, but will nonetheless help me make better decisions next time around:
– The Kinect has a lot of inconsistencies, especially when it comes to depth data. There are some ways to make things smoother, but you’ll notice that there aren’t too many examples out there that rely on precise Kinect depth values.
– Processing is great for prototyping; Cinder is great for the real thing. Processing helped me figure out what was possible and got me there relatively quickly, but once I had a lot of particles on screen, things quickly began to slow down and make my processor chug. Once I moved into C++ and Cinder, those same processes ran much more smoothly.
– Having side projects is hard to keep up with when you have a 1-year-old at home and projects to do at work. This is obvious, but not so much when you’re thinking about all the things you want to learn and explore. That being said, I’m still committed to learning and experimenting as much as I can once my first two priorities are taken care of. Who needs to watch crappy reality TV anyway?
I don’t consider this a final piece, but more of a really good proof of concept. Since this was my first Kinect, Cinder, and C++ project, a lot of the time went into learning the capabilities and workflow. I’m obviously not a C++ expert and I know I have a lot to learn, but this was a great way to learn. For the second phase of this project, I’d like to bring in some 3D textures and shading to really give it a metal pin art look. I’d also like to smooth out some of the depth map noise. I know I won’t be able to nail that down perfectly, but I’ve read about some methods to smooth it out a bit better within the limitations of the current Kinect’s resolution and camera positioning. The hope is to put this up somewhere as an installation that people can walk up to and interact with, and cycle through to see the imprints of previous visitors as well.
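One common way to tame per-frame depth noise, within those limitations, is an exponential moving average per pixel: each new frame only nudges the smoothed value part of the way toward the new reading. This is a general technique rather than the specific method from the articles I read, and ALPHA is a tuning knob I’m assuming.

```java
// Per-pixel exponential moving average over incoming depth frames.
class DepthSmoother {
    static final double ALPHA = 0.2; // lower = smoother but laggier (assumed)
    final double[] smoothed;
    boolean primed = false;

    DepthSmoother(int pixels) { smoothed = new double[pixels]; }

    // Feed one depth frame (same length as the smoothed buffer).
    void feed(double[] frame) {
        if (!primed) { // seed the buffer with the first frame
            System.arraycopy(frame, 0, smoothed, 0, frame.length);
            primed = true;
            return;
        }
        for (int i = 0; i < frame.length; i++)
            smoothed[i] += ALPHA * (frame[i] - smoothed[i]);
    }
}
```

The trade-off is responsiveness: a low ALPHA hides the Kinect’s flicker but makes fast hand movements lag behind on screen.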
To see some of the work in progress, check out my Tumblr. Once I clean up the code, I’ll post that somewhere too.