15 November 2014

Week 2 of The Build


This week we focused on building a rig to hold the Kinect and projector and continued to work on tracking the paper.

When building our rig, we knew it had to support the combined weight of the Kinect and projector, and the two devices had to be aligned to make tracking easier. Initially, we assumed the Kinect and projector had to be perpendicular to the surface, and our first rig held them that way (no picture available - we forgot to take one). However, that rig was not secure enough for our needs. Through talking to lab assistants and doing our own research, we found that the devices can sit at an angle and still allow tracking and projection. With these constraints, we built the rig shown in the image above: a camera tripod holds both devices, which lets us adjust the angle of projection as well as the height.

Once we have completed tracking the paper with the Kinect, we plan to use Ogre, an open-source 3D graphics engine, to project onto it. To do this, we will model the paper in a 3D scene and place a virtual camera that matches the position of the physical Kinect and projector in the real world. This will allow us to map a texture onto the paper, which is then projected.
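The idea of matching a virtual camera to the physical projector comes down to a pinhole projection: a 3D point on the paper maps to a 2D pixel in the projector's image. A minimal sketch of that mapping, with made-up focal length and image centre rather than our calibrated values:

```python
# Sketch of the projection mapping (illustrative parameters, not our
# calibrated ones): a pinhole model maps a 3D point on the paper, given
# in camera/projector coordinates, to 2D projector pixel coordinates.

def project_point(point, focal, center):
    """Project a 3D point (metres) to pixel coordinates."""
    x, y, z = point
    u = focal * x / z + center[0]
    v = focal * y / z + center[1]
    return u, v

# A paper corner 0.5 m in front of the projector, slightly off-axis.
corner = (0.1, 0.05, 0.5)
u, v = project_point(corner, focal=800.0, center=(640.0, 360.0))
print(u, v)  # pixel where the texture corner should be drawn -> 800.0 440.0
```

In Ogre this projection is what the virtual camera performs for us once its pose and field of view agree with the physical projector's.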

Before we can accurately track the paper, we need to calibrate the Kinect. A research paper from Microsoft Research [1] describes how to perform quick calibration of a Kinect: a checkerboard is shown to the device and a maximum-likelihood framework is used to estimate the camera parameters. The checkerboard pattern is commonly used in colour-camera calibration. Since we have a projector, we project the checkerboard rather than using a physical board.
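At its core, checkerboard calibration chooses the camera parameters that minimise the reprojection error of the detected corners. The toy sketch below (not the actual maximum-likelihood framework from [1], and with a single made-up focal parameter) illustrates that objective: synthetic corners are generated with a known focal length, and a grid search over candidates recovers it.

```python
# Toy illustration of the calibration objective: pick the focal length
# that minimises the squared pixel error when reprojecting known 3D
# checkerboard corners onto their detected 2D positions.

def reprojection_error(focal, corners_3d, corners_2d, center=(320.0, 240.0)):
    """Sum of squared pixel errors for a candidate focal length."""
    err = 0.0
    for (x, y, z), (u, v) in zip(corners_3d, corners_2d):
        pu = focal * x / z + center[0]
        pv = focal * y / z + center[1]
        err += (pu - u) ** 2 + (pv - v) ** 2
    return err

# Synthetic checkerboard corners generated with a "true" focal of 600 px.
true_focal = 600.0
corners_3d = [(i * 0.03, j * 0.03, 1.0) for i in range(3) for j in range(3)]
corners_2d = [(true_focal * x / z + 320.0, true_focal * y / z + 240.0)
              for x, y, z in corners_3d]

# A grid search over candidate focal lengths recovers the true value.
best = min(range(500, 701),
           key=lambda f: reprojection_error(float(f), corners_3d, corners_2d))
print(best)  # -> 600
```

The real calibration estimates many more parameters (principal point, distortion, the depth-to-colour transform) and uses proper optimisation rather than a grid search, but the objective is the same.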

The images below show the projection of the checkerboard pattern and how this pattern is fed into our Kinect software to calibrate the device.
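Generating the pattern itself is straightforward: alternate dark and light squares on a grid. A minimal sketch (board dimensions here are illustrative, not necessarily the ones we project); note that an m x n board of squares yields (m-1) x (n-1) inner corners, which are what corner detectors locate:

```python
# Minimal sketch of the projected checkerboard pattern: alternate
# dark (1) and light (0) squares on a rows x cols grid.

def checkerboard(rows, cols):
    """Return a rows x cols grid of 0/1 values; 1 marks a dark square."""
    return [[(r + c) % 2 for c in range(cols)] for r in range(rows)]

pattern = checkerboard(6, 9)  # a 6x9 board gives 5x8 inner corners
for row in pattern:
    print("".join("#" if square else "." for square in row))
```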



References:

[1] "Zhang, Cha; Zhang, Zhengyou; ",Calibration between depth and color sensors for commodity depth cameras,"Multimedia and Expo (ICME), 2011 IEEE International Conference on",Pages 1-6,2011,IEEE



