MP4: Kinects and Motors

Posted: Tuesday, April 2
Due: Tuesday, April 28 (before class, meet in HCIL)

We've now acquired significant experience working with the Arduino hardware prototyping platform. In MP1 and MP2, we designed and implemented new interactive experiences using custom hardware that we built. With MP1, the focus was on creating new input devices and interactions with desktop computers. With MP2, we left the desktop environment altogether and explored embedding computation in low-tech materials (paper, fabric, wood, cardboard). With MP3, we explored reappropriating existing electronic artifacts and remixing them into new interactive forms. In this assignment, we continue the theme of combining the physical and virtual worlds in unique ways, but our approach changes. Enter: the Microsoft Kinect, which combines an IR camera (for depth) and a traditional RGB camera (for visuals) in a single sensor. Microsoft recently donated eight "Kinect for Windows" v2 sensors to the HCIL Hackerspace, so we will try using those. If you experience problems developing for v2, you can fall back to the original sensors.

[Image: KinectsCropped.jpg]
Thanks to Microsoft for donating Kinect v1 and v2 sensors over the past few years.

What To Do

In this assignment, your goal is to use the Microsoft Kinect to create a physical interaction that is digitized, analyzed, and then used to control/actuate something in the real world. Hence, Kinects and Motors. :) You will use the Kinect along with computer vision to translate the physical into the virtual, and the Arduino (or the Raspberry Pi or BeagleBone) to translate the virtual back into the physical.
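
To make the virtual-to-physical half of that pipeline concrete, here is a minimal sketch of the Arduino side of such a bridge. Everything here is an assumption for illustration (a hobby servo on pin 9, a 9600-baud protocol where the PC sends a newline-terminated decimal angle), not a required design:

```cpp
// Arduino side of a hypothetical Kinect -> motor bridge.
// Assumed protocol: the PC app that processes Kinect data writes a servo
// angle (0-180) to the serial port as a decimal string, e.g. "90\n".
#include <Servo.h>

Servo servo;  // hobby servo signal wire on pin 9 (assumed wiring)

void setup() {
  Serial.begin(9600);   // must match the PC application's baud rate
  servo.attach(9);
}

void loop() {
  if (Serial.available() > 0) {
    long angle = Serial.parseInt();    // parse the next decimal integer
    angle = constrain(angle, 0, 180);  // clamp to the servo's safe range
    servo.write((int)angle);
  }
}
```

On the PC side, your Kinect application would map some tracked quantity (e.g., a hand's x position) to 0-180 and write it to the Arduino's serial port.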

Here are some successful examples from previous versions of this course (note: their assignments may have been a bit different from this year's version, and while all of them use motors, not all use cameras/computer vision--you must utilize both in your project):
  • (motor)chestra: using physical gesture to play music mediated by mounted motors on a cabinet
  • Atmosphere: a custom interactive sandbox that traces a person's movement through space
  • Sketch Engine: light painting with the body, lasers, and open shutter photography
  • Polite Bunny: a bunny that follows you in a window using gears and motors
  • Somebody's Watching You: an interactive poster that follows you when you move past it
  • The Force: Uses a Kinect and an IR emitter to control a wireless helicopter using gestures.
  • The Friendly Waving and Dancing Bunny: Uses a Kinect, LEGOs, a stuffed animal, and servo motors to create an interactive animatronic bunny that responds to various gestural interactions (like dancing and waving).
  • Kinect: And Then There Was Light!: Many rooms in AVW turn off the lights to save power if no motion is sensed for ~20 minutes. In this project, the Kinect is used to count the number of people in a room and, if the count > 0, the system activates an arm attached to a servo motor to trigger the room's motion sensor and turn the lights back on.

Here are some cool Kinect hacks that could possibly be remixed, improved upon, or extended with motors--if nothing else, these examples are inspiring. :)
  1. Theo Watson and Emily Gobeille of Design-IO.com used openFrameworks and libFreenect to build an "Interactive Puppet Prototype with XBox Kinect." Here's version 2.0, which is even more impressive!
  2. A Kinect-based touchscreen created by combining depth sensing with projection (link). More on using the Kinect for multitouch (video).
  3. Oliver Kreylos' famous 3D-video capture with a Kinect (more info here).
  4. Many people have tried using the Kinect for 3D scanning (e.g., Kinect Fusion link1 and link2). The resulting models can then be imported into video games or even 3D-printed (e.g., Kinect-Based 3D-scanning, link2, link3).

Note: the Microsoft Kinect also has a microphone array for localizing sound and a speech recognition SDK. You are welcome to experiment with these as well, but this is not required for the assignment.

The Kinect

Kinect for Windows vs. Kinect for XBox

We will be using the Microsoft "Kinect for Windows" sensor (rather than the XBox360 or XBox One Kinect sensor) for this assignment. What's the difference, you ask? Good question. According to the Microsoft Kinect Developer FAQ, "The Kinect for Windows sensor is a fully-tested and supported Kinect experience on Windows with features such as “near mode,” skeletal tracking control, API improvements, and improved USB support across a range of Windows computers and Windows-specific 10’ acoustic models."

So, it would seem that the Windows-based sensor has a special "near mode" to deal with the fact that people will be much closer to the Kinect sensor on a PC/laptop than they would be with an XBox360 Kinect sensor (i.e., when using it in a living room). Again, according to the Kinect Developer FAQ, "'near mode' enables the depth sensor to see objects as close as 40 centimeters and also communicates more information about depth values outside the range than was previously available. There is also improved synchronization between color and depth, mapping depth to color, and a full frame API."

Kinect v1 vs. Kinect v2

Microsoft's MSDN Channel 9 website published a summary of differences between "Kinect for Windows 1" and "Kinect for Windows 2" (see figure below). In summary, the new device has an HD color camera, a higher-resolution depth camera, a larger field of view, and can track six bodies (skeletons) at a time versus two. Read more here. In addition, SDK 1.8 for Kinect v1 supported multiple simultaneous Kinects plugged into the same computer and had functions to control the IR emitter; SDK 2.0 does not have those features (source).
[Figure: Kinect1vs2.png, a comparison of Kinect for Windows v1 and v2 features]

The most important difference for us, perhaps, is that the Kinect for Windows SDK 2.0 requires Windows 8 and a USB 3.0 connection. If you do not have access to Win8, then I suggest using a Kinect v1 for this assignment. I am still investigating whether you can develop for Kinect v2 with a Mac--you are welcome to try it and report back via Piazza.

Interestingly, on April 2, 2015, Microsoft stated that they are no longer selling the "For Windows" version of Kinect (link). You can use the Kinect for XBox One sensor on Windows using an adapter, so Microsoft is just consolidating engineering efforts.

Setting up the Kinect Development Environment

In the past, I have supported both Windows and Mac-based Kinect development for this class. Due to the shutdown of the leading open source framework (OpenNI), I strongly recommend that you use Windows for this assignment. I have two Windows 8 laptops (Dell XPS 15z and ASUS Zenbook Z31) that I am willing to lend out to help, though neither laptop meets the minimum requirements for Kinect v2 development. I have also put together a poll on Piazza inquiring about access to Windows development environments (link).

Setting up Kinect v1 On Windows using Official SDK

The "officially supported" way of setting up your development environment for the Microsoft Kinect on Windows involves the following steps:
  1. Download and install Microsoft Visual Studio. You can get the full version for free from your Microsoft Dreamspark account, from TerpWare, or download Microsoft's free version of Visual Studio, which is called Visual Studio Community.
  2. Download and install the Kinect for Windows SDK v1.8. If you can't get this link to work, here's a local backup.
  3. Download and install the Kinect for Windows Developer Toolkit. If you can't get this link to work, here's a local backup.
  4. Plug the Kinect into your laptop/PC
  5. Now open the Developer Toolkit Browser v1.8 and launch an example Kinect application. Play around with these samples--they all come with code (click 'Install'). It's very likely that you will start with an existing example application and modify it for your project (see the C++ sketch after this list for the rough shape of a from-scratch alternative). This is what we did in class on Tuesday, April 7. You can download the demo code covered in that lecture here.
  6. Optional: You can also install language packs if you so desire (French, German, Italian, Japanese, and Spanish are supported)
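
Most of the Toolkit samples are in C#, but SDK 1.8 also exposes a native C++ API (NuiApi.h). As a rough sketch of what polling the skeleton stream looks like in C++ (link against Kinect10.lib; this is a minimal, hedged example, so consult the SDK documentation for the details):

```cpp
// Minimal Kinect v1 skeleton polling via the native SDK 1.8 API (sketch only).
#include <Windows.h>
#include <NuiApi.h>
#include <cstdio>

int main() {
  // Initialize the Kinect runtime for skeleton tracking only.
  if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON))) return 1;
  NuiSkeletonTrackingEnable(NULL, 0);

  for (int i = 0; i < 300; ++i) {  // poll ~300 frames, then exit
    NUI_SKELETON_FRAME frame = {0};
    if (FAILED(NuiSkeletonGetNextFrame(100, &frame))) continue;  // 100 ms wait
    for (int s = 0; s < NUI_SKELETON_COUNT; ++s) {
      const NUI_SKELETON_DATA& skel = frame.SkeletonData[s];
      if (skel.eTrackingState != NUI_SKELETON_TRACKED) continue;
      // Joint positions are in meters, relative to the sensor.
      Vector4 hand = skel.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];
      printf("right hand: x=%.2f y=%.2f z=%.2f\n", hand.x, hand.y, hand.z);
    }
  }
  NuiShutdown();
  return 0;
}
```

From here, mapping a joint coordinate to a motor command (e.g., over serial to an Arduino, as sketched earlier) is a small step.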

Setting up Kinect v2 on Windows using Official SDK

  1. Download and run the Kinect Configuration Verifier that checks your system and ensures that your OS, drivers, graphics card, Direct X libraries, etc. are all compatible with Kinect v2. Note: I have two relatively new desktop workstations (Dell T1600s from 2012) and two Windows 8 laptops (Dell XPS 15z and ASUS Zenbook Z31--both from 2012) and none of them were sufficient to work with the new Kinect v2s. The two Dell workstations lack USB 3.0 and the two laptops have USB 3.0 but evidently not the correct USB 3.0 controller.
  2. Download and install Microsoft Visual Studio 2013. You can get the full version for free from your Microsoft Dreamspark account, from TerpWare, or download Microsoft's free version of Visual Studio, which is called Visual Studio Community.
  3. Download and install the Kinect for Windows SDK v2.0. If you can't get this link to work, here's a local backup.
  4. Plug the Kinect v2 into your laptop/PC's USB 3.0 port (instructions)
  5. Browse the Kinect v2 Technical Documentation and Tools
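
Once your machine passes the Configuration Verifier, you can also sanity-check the sensor programmatically. Here is a hedged, minimal C++ sketch against SDK 2.0 (link against Kinect20.lib); it only opens the default sensor and reports availability:

```cpp
// Check that the Kinect v2 runtime can see and use your sensor (sketch only).
#include <Windows.h>
#include <Kinect.h>
#include <cstdio>

int main() {
  IKinectSensor* sensor = nullptr;
  if (FAILED(GetDefaultKinectSensor(&sensor)) || !sensor) {
    printf("Kinect v2 runtime not found.\n");
    return 1;
  }
  sensor->Open();
  Sleep(2000);  // availability is reported asynchronously after Open()
  BOOLEAN available = FALSE;
  sensor->get_IsAvailable(&available);
  printf(available ? "Kinect v2 is available.\n"
                   : "Sensor opened but not available--check your USB 3.0 controller.\n");
  sensor->Close();
  sensor->Release();
  return 0;
}
```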

Setting up Kinect v1 On Mac OS X using a Virtual Machine

The newer Kinect SDKs (1.6, 1.7, and 1.8, but not 2.0) work on Windows running in a virtual machine. To get this to work, see: Using Kinect for Windows with a Virtual Machine. You can download a copy of Windows using your Microsoft Dreamspark account. Note: you cannot use the Kinect v2 in a virtual machine.

Setting up Kinect v1 On Mac OS X using OpenNI

In previous versions of this class, OpenNI (Open Natural Interaction) was the recommended approach for Mac-based Kinect development. However, Apple purchased PrimeSense, the main company behind OpenNI, and shut the project down on April 23, 2014. You can still obtain the OpenNI libraries (see below), but the OpenNI website is down, making it rather difficult to pursue this approach (e.g., limited available documentation, forums are gone, etc.).
  1. You can download this local OpenNI 2.2 backup (which I downloaded last year), use the now stale but still available git repo, or try a managed fork. I also found this website, which provides detailed instructions on setting up OpenNI and NiTE using archived zips; however, I cannot vouch for this website and, in general, caution you against downloading and installing software from unknown sources/websites.
  2. With OpenNI installed, you can also install and use the SimpleOpenNI framework for Processing. Because I installed all of this software last year (Spring 2014), I have this working on my Mac if you'd like to see it.

Setting up Kinect v2 on Mac OS X

According to Microsoft, there are two (official) options for developers who have an Intel-based Mac--both require installing Windows 8.1 64-bit on the Mac: (1) install Windows to the Mac's hard drive and use Apple's Boot Camp, or (2) install Windows to an external USB 3.0 drive and use Microsoft's Windows To Go (link). Unofficial, open source support for Kinect v2 is active but still a work in progress (in contrast to v1, for which there is a vast and mature open source ecosystem). Here's a blog post about Kinect v2 Community Projects and OpenKinect's libfreenect2 GitHub repo. You are welcome to try these, but at your own risk.

Kinect Resources

Helpful Links

Books

  • Jared St. Jean (editor of developkinect.com), Kinect Hacks: Tips & Tools for Motion and Pattern Detection, O'Reilly, 2012, Amazon
  • Greg Borenstein, Making Things See: 3D vision with Kinect, Processing, Arduino, and MakerBot, Make:Books, 2012, Amazon, Safari Online

Motors

There are three main types of motors: DC motors, servo motors, and stepper motors (example sketches for each appear under the motor sections below).
  • DC Motors are fast, continuous-rotation motors often used for things that need high RPM, like car wheels and fans.
  • Servo Motors are responsive, high-torque motors but with a limited angle of rotation (e.g., 120-180 degrees). Servo motors require additional circuitry to report positioning information back to your system: though you can precisely position a servo, if that position were blocked by, say, a wall or some other obstruction, your system would not know it without that extra circuitry.
  • Stepper Motors provide precise rotation that is easy to set up and control but are slower than servo motors. While a servo requires additional support circuitry to provide positional feedback to your system, a stepper motor gets this information "for free" by simply counting steps. Thus, stepper motors are used in 3D printers, CNC machines, and other devices that require precise positioning.

HCIL Hackerspace Motors

We have a bunch of random motors in the HCIL Hackerspace, including hobby DC motors and servos as well as motors embedded in built artifacts like the Parrot AR.Drone and the IR RC helicopters.

Motor Tutorials

Feel free to post additional helpful tutorials or forum entries to Piazza and I'll add them to this list.

Motor Controllers

DC Motors
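
As a starting point, here is a minimal Arduino speed-control sketch. The wiring is an assumption for illustration: PWM pin 3 drives an NPN transistor (e.g., a 2N2222) through a ~1k base resistor, with the motor on the transistor's collector and a flyback diode across the motor. The transistor matters because an Arduino pin can only source ~40 mA, far too little for a motor; to also control direction, you would use an H-bridge driver instead.

```cpp
// Ramp a small DC motor's speed up and down via PWM through a transistor.
// Assumed wiring: pin 3 -> 1k resistor -> NPN base; motor between supply
// and collector; flyback diode across the motor terminals.
const int MOTOR_PIN = 3;   // must be a PWM-capable pin

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  for (int duty = 0; duty <= 255; duty += 5) {   // speed up (0-255 duty)
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
  for (int duty = 255; duty >= 0; duty -= 5) {   // slow back down
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
}
```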


Servo Motors
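
For servos, the standard Arduino Servo library handles the pulse timing for you. A minimal sweep sketch, assuming the servo's signal wire on pin 9 and a small servo powered from 5V; note that write() maps 0-180 onto the servo's limited mechanical range discussed above:

```cpp
// Sweep a hobby servo back and forth with the standard Servo library.
#include <Servo.h>

Servo servo;

void setup() {
  servo.attach(9);               // signal wire on pin 9 (assumed wiring)
}

void loop() {
  for (int angle = 0; angle <= 180; angle++) {
    servo.write(angle);
    delay(15);                   // give the servo time to reach each position
  }
  for (int angle = 180; angle >= 0; angle--) {
    servo.write(angle);
    delay(15);
  }
}
```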


Stepper Motors
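
For steppers, the standard Arduino Stepper library sequences the coils for you. A minimal sketch, assuming a 200-step/revolution motor behind a suitable driver board (e.g., an H-bridge or ULN2003) on pins 8-11; adjust the step count and pins for your hardware. Because position comes from counting steps (open loop), no encoder is needed, which is the "for free" positioning mentioned above:

```cpp
// Turn a stepper one revolution each way with the standard Stepper library.
#include <Stepper.h>

const int STEPS_PER_REV = 200;          // typical 1.8-degree/step motor
Stepper stepper(STEPS_PER_REV, 8, 9, 10, 11);

void setup() {
  stepper.setSpeed(60);                 // RPM
}

void loop() {
  stepper.step(STEPS_PER_REV);          // one full revolution one way...
  delay(500);
  stepper.step(-STEPS_PER_REV);         // ...and one back
  delay(500);
}
```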

Tools/Library Usage

As before, you can use whatever developer tools, IDEs, debuggers, libraries, and/or code snippets you find to support turning your ideas into a reality. Of course, you must keep track of and cite any code or libraries you use in your project. You must also include citations to projects that inspired your own. Do not be shy about including as many links as you can that influenced your project's form or function in some way.

Remember to also include citations (with URLs) in your code via comments for all code that you borrowed from or extended from blogs, forums, open source, etc. If I find code that was copied and not appropriately cited, I will consider this a direct violation of the UMD Academic Integrity policy. You will not be penalized for re-using or re-appropriating cool things in this class; you will be penalized for not properly attributing them.
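
For example, a borrowed-then-modified snippet might carry a comment like this (the source line and function here are placeholders, not a real reference):

```cpp
// Servo smoothing adapted from a forum post:
//   Source: <URL of the thread you borrowed from>
//   Changes: renamed variables; tuned the easing constant for our servo.
float smoothAngle(float target, float current) {
  return current + 0.1f * (target - current);  // simple exponential easing
}
```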

Assignment Deliverables

The assignment deliverables are due before lecture begins.

  • Utilize GitHub to store and post your code. This should be publicly viewable and accessible. You are welcome to use any license you like on the code itself (including no license at all--e.g., None). When you use other people's code, you must cite your source--even if it's just a blog post and a small snippet. I believe GitHub provides academic accounts (for additional features, please check the website).

  • Post a Wiki write-up to your own wiki subpage on this wiki (example).

  • Upload a video demoing your submission to YouTube. You should include the link to the YouTube video in your Wikipage. Please take the video creation process seriously--video is one of the best ways to portray the interactivity and sheer awesomeness of your inventions. I hope that you create something you would feel proud to show your friends or family.

  • Presentation/demo. On Tuesday, April 28, we'll have a presentation/demo day. We will dedicate the whole 75 minutes to this (if not more!). It's up to you how you want to present your work--you could do a live demo for the class, play all or part of your video, show slides, or do an interpretive dance. After all presentations are complete, we'll use the remaining time in class to interact with each other's demos.

Assignment Grading and Rubric

Assignments in this class will be graded on novelty, aesthetics, fun, creativity, technical sophistication, and engagement. All assignments (including the project) will be peer-reviewed by everyone in the class, including me. We will rank our favorite projects, and the top two or three teams will receive an award.

In-Class Presentations/Demos

You must fill out this Peer Feedback Form during the presentations/demos and submit it before the end of the day. Do not forget to also fill out this partner evaluation form, due April 30th before class. We will present today in the order listed under Completed Assignments below, starting with Tint Picker.

Completed Assignments

As before, please list your completed assignments below.

0. Project Name as a Heading 2 (linked to wiki write-up)

Team name or student names
A two- or three-sentence description of your artifact

1. Marvin

Sriram Karthik Badam
Marvin is a wall-mounted bot that responds to your actions in a physical space. He performs four different actions based on what you are doing in his space.

2. Ukulele Player

Seokbin Kang, Beth McNany
Ukulele Player lets you play a ukulele by waving your arms, using a Kinect to capture motion, and servo motors to strum and form chords.

3. Kinetic ToPoGo

Majeed & Brian
A robot that mimics the user's body movements and supports some pre-programmed commands.

4. SpiritCharades

Philip & Jonggi
Guess your partner's body gesture from the vibration of motors; the postures of the two people are measured by the Kinect.