Monday, December 12, 2011

Play Fruit Ninja PC version with Kinect

Since we are very close to finishing the final project, I spent a little time writing a mouse control program to play Fruit Ninja, and the cursor runs pretty well. I use a right-hand push movement to simulate pressing the mouse's left button, and a left-arm lift movement to simulate releasing it. Here is a video I made tonight. Enjoy!
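
For anyone curious how the button mapping works: on Windows this can be done with the Win32 SendInput API. The snippet below is just a minimal sketch of the idea, not our actual program; the gesture flags are assumed to come from the skeleton tracker.

    #include <windows.h>

    // Hypothetical gesture flags from the skeleton tracker: press the left
    // mouse button on a right-hand push, release it on a left-arm lift.
    void updateMouseButton(bool rightHandPush, bool leftArmLift, bool &buttonDown)
    {
        INPUT in = {};
        in.type = INPUT_MOUSE;

        if (rightHandPush && !buttonDown) {
            in.mi.dwFlags = MOUSEEVENTF_LEFTDOWN;  // press the left button
            SendInput(1, &in, sizeof(INPUT));
            buttonDown = true;
        } else if (leftArmLift && buttonDown) {
            in.mi.dwFlags = MOUSEEVENTF_LEFTUP;    // release the left button
            SendInput(1, &in, sizeof(INPUT));
            buttonDown = false;
        }
    }

    // The cursor itself can simply follow the tracked hand via SetCursorPos(x, y).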

Thursday, December 8, 2011

Success!

Tonight we had our first successful run of the Kinect teleoperation of our manipulator! We passed the user's direction data into Matlab, opened the M2 microcontroller as a serial device, and could then easily control the arm with the timer outputs. Here's a video of the first voyage of the Kinect Teleoperator.
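
We did the serial side in Matlab, but the idea is simple enough to sketch in C++ too. This is illustrative only; the COM port name and baud rate here are assumptions, and the protocol is just one direction byte per control update.

    #include <windows.h>

    // Illustrative C++ equivalent of our Matlab serial link (port name and
    // baud rate are assumptions): open the M2's virtual COM port and send
    // one direction byte per control update.
    int main()
    {
        HANDLE port = CreateFileA("COM3", GENERIC_WRITE, 0, NULL,
                                  OPEN_EXISTING, 0, NULL);
        if (port == INVALID_HANDLE_VALUE) return 1;

        DCB dcb = {};
        dcb.DCBlength = sizeof(dcb);
        GetCommState(port, &dcb);
        dcb.BaudRate = CBR_9600;
        dcb.ByteSize = 8;
        dcb.Parity   = NOPARITY;
        dcb.StopBits = ONESTOPBIT;
        SetCommState(port, &dcb);

        unsigned char direction = 'U';   // e.g. 'U' = move up
        DWORD written = 0;
        WriteFile(port, &direction, 1, &written, NULL);

        CloseHandle(port);
        return 0;
    }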

Tuesday, December 6, 2011

Communications

To send wireless signals from the Kinect processing unit to the manipulator we decided to use an M2 MaEvArM.  These are very compact microcontrollers with wireless add-ons that have a 16 MHz processor, multiple timer outputs, ADC inputs, and GPIO pins, among a number of other nice features.
We chose this microcontroller so we could control the five servo motors of our manipulator with five independent timer output signals.  The wireless communication is also very simple because this microcontroller was designed for use in many MEAM classes at UPenn.
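
Each servo channel is just a hardware PWM output running at the standard 50 Hz servo frame rate. As a rough sketch of one channel, assuming the M2 is built around an ATmega32U4 at 16 MHz (the pin choice and pulse-width mapping below are assumptions, and the other channels would use the remaining compare outputs):

    #include <avr/io.h>

    // One servo channel: Timer1 in Fast PWM mode with TOP = ICR1 gives a
    // 50 Hz (20 ms) servo frame on OC1A, with 0.5 us resolution at /8 prescale.
    void servo_init(void)
    {
        DDRB |= (1 << PB5);                      // OC1A as output (assumed pin)

        // Fast PWM, TOP = ICR1, non-inverting output on OC1A, prescaler /8
        TCCR1A = (1 << COM1A1) | (1 << WGM11);
        TCCR1B = (1 << WGM13) | (1 << WGM12) | (1 << CS11);

        ICR1  = 39999;                           // 16 MHz / 8 / 40000 = 50 Hz
        OCR1A = 3000;                            // 1.5 ms pulse: servo center
    }

    // Map a commanded angle (0..180 degrees) onto a 1..2 ms pulse width.
    void servo_set_angle(unsigned int degrees)
    {
        OCR1A = 2000 + (unsigned long)degrees * 2000 / 180;
    }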

Tuesday, November 29, 2011

The robotic arm

We've split up into two subteams: the CS team and the EE/ME team.  The CS team will be working on figuring out how to run the Kinect and programming it so we can extract the necessary data.  The EE/ME team wanted to take on the daunting task of building a robotic manipulator from scratch.  I had done something like that before in my undergrad days, and was looking forward to designing one again.

We were given an OWI robotic arm like this.

This robot is driven entirely by DC motors.  The controller in the picture runs each motor at a single speed, bidirectionally.  There is no position control or velocity control whatsoever.

This further convinced us to build our own robotic arm out of servo motors for their ease of position control.  Luckily, Haofang found a robotic arm from his work in the Mod Lab, assembled from a Lynxmotion kit (http://www.lynxmotion.com/c-27-robotic-arms.aspx).  This saved us a lot of design time, but we still needed to build a gripper.


I ended up designing a gripper, based on a gripper design by an old MIT colleague of mine, Carrick Detweiler, that could be easily built from laser-cut ABS and a servo motor.  It looks something like this.


Here's the finished product.

Friday, November 18, 2011

Kinect Development Environment and Getting the Skeleton



We set up the Kinect development environment under Windows 7 with OpenNI, the Kinect driver, and OpenCV 2.3.

After setting up the development environment, we wrote the skeleton-tracking code using OpenNI functions; the picture shows the result. We then realized that the only parts we need are the two arms, so we adjusted the code to track only the arms.
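
For reference, the core of the tracking loop looks roughly like the sketch below. This is a simplified OpenNI 1.x example, not our exact code, and the user-calibration callbacks that OpenNI requires before tracking actually starts are omitted for brevity.

    #include <XnCppWrapper.h>
    #include <cstdio>

    // Simplified OpenNI 1.x sketch: track the upper body and print the right
    // hand's position each frame. Calibration callbacks omitted for brevity.
    int main()
    {
        xn::Context context;
        context.Init();

        xn::UserGenerator userGen;
        userGen.Create(context);
        userGen.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_UPPER);

        context.StartGeneratingAll();

        for (;;) {
            context.WaitAndUpdateAll();

            XnUserID ids[4];
            XnUInt16 n = 4;
            userGen.GetUsers(ids, n);

            for (XnUInt16 i = 0; i < n; ++i) {
                if (!userGen.GetSkeletonCap().IsTracking(ids[i]))
                    continue;

                XnSkeletonJointPosition hand;
                userGen.GetSkeletonCap().GetSkeletonJointPosition(
                    ids[i], XN_SKEL_RIGHT_HAND, hand);

                if (hand.fConfidence > 0.5f)
                    printf("right hand: %.0f %.0f %.0f\n",
                           hand.position.X, hand.position.Y, hand.position.Z);
            }
        }
        return 0;
    }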

Thursday, November 17, 2011

Kinect Virtual Reality Project beginnings

We are a group of four grad students taking an Embedded Systems class here at UPenn.  We are extremely excited to be working with the Microsoft Kinect to develop a type of virtual reality for the user.  Our team consists of Haofang Yuan, Josh Karges, Yixin Jin, and Tao Lei.


We wanted to incorporate some of our mechanical design knowledge into this project as well as the topics we've learned in the class.  There are hundreds of videos online of people doing very interesting projects with the Kinect, and we used a few of them as inspiration. Chris Harrison worked with Microsoft to develop the OmniTouch project, http://www.chrisharrison.net/index.php/Research/OmniTouch. This is an incredible project that allows the user to interact with a projected interface by detecting the position of the user's own hands. Another project involves a guy who hooked up an OWI robotic arm to respond to the motions of an iPhone, http://youtu.be/ttV-gXw3s3U. We thought we'd try to combine these two ideas.


Our initial proposal was to control a robotic arm by touching projected buttons on a nearby surface. We quickly found that the depth camera that Chris Harrison used for the project was specially made for him by PrimeSense, so we decided to alter our objectives.


Our objectives for this project will focus on controlling a robotic manipulator with a Kinect.  We plan on designing and constructing our own custom manipulator with at least 4 degrees of freedom.  To control this manipulator, the user will stand in front of a Kinect and move his or her arms in the air. When the Kinect captures the movement of the person's arms, a computer will process the information from the Kinect and then transmit the control signals wirelessly to the manipulator, which will then respond accordingly.