On the 19th of February 2009 we had a party to celebrate the new arrangement of our offices.
At the end of November '08, I ordered new desks from my father’s company. It seems trivial to change desks, but in our office it was a revolution. The idea was to set up the lab and make better use of the space.
This is why the magnificent man called dad sent two guys to drill, screw and fix long pieces of wood (410 cm × 90 cm each) onto the walls.
The result is pretty nice, very useful and already paying off.
So earlier this year, in our new lab, David and I had a great time building our first project together. We were very proud of our newborn, and had the honor of presenting it during the party.
So the party became the “multitouch and toys party”, while the brand new desks were reduced to their primary purpose: holding the huge amount of wine bottles we brought.
A lot of people enjoyed the table, and the store window turned into a multitouch screen thanks to our funny homemade IR throwies.
As we mentioned in a previous post, we relied on the techniques discussed on the Nuigroup site to drive our table (see Nuigroup for specifics). They all use computer vision to solve the multitouch problem: the position of the fingers on the surface is tracked by a camera.
A simple webcam can do the trick. However, it needs to be slightly modified to filter out visible light (so as to avoid capturing the projected image). Then, via a process of frame differencing and the aid of various thresholding and image processing filters, you obtain a pure black (0) and white (1) image describing the position of the elements in contact with the surface. This is then used as the basis for the tracking.
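The core of that step can be sketched in a few lines of Python. This is an illustrative toy, not what Tbeta actually runs: real trackers add blur and high-pass filters and work on full camera frames, but underneath it is just background subtraction plus a threshold. The function name and the tiny frames are made up for the example.

```python
# Minimal sketch of frame differencing + thresholding.
# Frames are 8-bit grayscale images stored as lists of rows.

def touch_mask(background, frame, threshold=40):
    """Return a binary image: 1 where the frame differs enough from the
    stored background (a likely finger touch), 0 elsewhere."""
    mask = []
    for bg_row, row in zip(background, frame):
        mask.append([1 if abs(p - b) > threshold else 0
                     for b, p in zip(bg_row, row)])
    return mask

# A 4x4 background and a frame with one bright two-pixel "touch" blob.
background = [[10] * 4 for _ in range(4)]
frame = [row[:] for row in background]
frame[1][1] = frame[1][2] = 200   # IR light scattered by a fingertip

print(touch_mask(background, frame))
# → [[0, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
```

The white blobs in the resulting mask are what the tracker then labels and follows from frame to frame.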
In our case, we used a modified PS3 Eye webcam, which is relatively cheap and offers excellent frame rates (640×480 at 100 fps).
On the software side, we used Tbeta, an open source tracking solution written with the openFrameworks C++ library. Tbeta tracks the elements in contact with the surface and broadcasts their position and ID over UDP using the TUIO protocol.
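To give an idea of what a client receives, here is a simplified sketch of consuming TUIO 2D-cursor messages, assuming the OSC-over-UDP layer has already been decoded into (address, args) tuples. A real client (or a TUIO library) would also handle the socket and OSC parsing; the `apply_bundle` helper and the hand-written bundle below are just for illustration.

```python
# Simplified model of a TUIO client's state update. TUIO bundles for
# fingers use the /tuio/2Dcur profile: "alive" lists live session IDs,
# "set" carries a cursor's position, "fseq" numbers the frame.

def apply_bundle(cursors, messages):
    """Update {session_id: (x, y)} from one decoded TUIO bundle."""
    for address, args in messages:
        if address != "/tuio/2Dcur":
            continue
        cmd = args[0]
        if cmd == "alive":         # drop cursors that are gone
            alive = set(args[1:])
            for sid in list(cursors):
                if sid not in alive:
                    del cursors[sid]
        elif cmd == "set":         # set sid x y X Y m
            sid, x, y = args[1], args[2], args[3]
            cursors[sid] = (x, y)  # coordinates normalized to [0, 1]
    return cursors

cursors = {}
bundle = [("/tuio/2Dcur", ["alive", 7]),
          ("/tuio/2Dcur", ["set", 7, 0.25, 0.75, 0.0, 0.0, 0.0]),
          ("/tuio/2Dcur", ["fseq", 1])]
print(apply_bundle(cursors, bundle))   # → {7: (0.25, 0.75)}
```

Because positions are normalized, any application listening on the UDP port can map them to its own screen resolution, which is what makes the Tbeta-to-application handoff so convenient.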
This shows the Tbeta interface in action. On the left, the source image from the webcam; on the right, the processed B/W image used for tracking.
Well, first post in English. Let’s see how it goes.
David (called “the mighty Brain” below) and I (called “the clumsy Hands” below) tried, and actually managed, to build our own multitouch screen using FTIR technology. We obtained most of the information to get started on Nuigroup. This is a great community, and we wouldn’t have managed without them.
For our first try, we wanted to stay low budget. We repurposed our old tables to build the main box, tried different silicone-based options for the compliant surface, opted for some Rosco projection screen, used a common webcam for the tracking (PS3 Eye cam) and finally bought a relatively cheap DLP projector.
Our aim was to make a quick and dirty Multi(please)Touch from scratch, and learn by making experiments. We now plan to build a nicer V2.
The Hands know how to handle the work when it’s about getting something done.
The Brain knows where to find information and how to write code that works.
We plan to post some articles in the coming weeks to discuss the different aspects of the whole process.