
Touch Glove

6 September 2019
A homemade data glove with motion tracking using a DragonBoard 410c

The aim of this project was to create a gesture input device to use with a laptop. One of the main goals was to be able to grab a window or a virtual pen with two fingers and then move it around.

I realise there are some very good tracking and gesture recognition devices that don't require a special glove or an attached wire, but they tend to be rather expensive. So I decided to make the glove out of things I had at home: all I needed was some wires, a winter glove, and a DragonBoard 410c (a Raspberry Pi or an Arduino would also do, but that's what I had at home).

To make the glove, I threaded bare wires into the fingertips of the glove; these acted as simple switches when the fingers were held together in various gestures. For example, holding the thumb and the index finger together resulted in a holding action on the screen (e.g. holding and dragging a window around). I also fitted a small tracker in contrasting colours onto the glove, which helped the laptop's webcam track the glove's movement.
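On the board, reading the glove boils down to polling a few contact switches on GPIO pins. The sketch below illustrates the idea in Python (the real board code was written in C++, as mentioned later); it assumes the libgpiod v1 Python bindings, and the chip name and pin offsets are placeholders rather than the project's actual wiring.

```python
# A minimal sketch of the polling loop, assuming the libgpiod v1 Python
# bindings. The chip name and pin offsets below are placeholders, not
# the actual wiring of the project.
import time
import gpiod

CHIP = "gpiochip0"        # GPIO controller name (board-specific)
FINGER_PINS = {           # hypothetical line offsets for each finger pad
    "index": 36,
    "middle": 12,
    "ring": 13,
}

chip = gpiod.Chip(CHIP)
lines = {}
for finger, offset in FINGER_PINS.items():
    line = chip.get_line(offset)
    line.request(consumer="touch-glove", type=gpiod.LINE_REQ_DIR_IN)
    lines[finger] = line

while True:
    # A closed circuit (finger pad touching the thumb) reads as 1 here;
    # the real polarity depends on pull-up vs pull-down wiring.
    state = {finger: line.get_value() for finger, line in lines.items()}
    print(state)
    time.sleep(0.02)      # poll at roughly 50 Hz
```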

Front side of the glove
The front side of the glove with visible sensors and tracker target.

The glove was connected to a DragonBoard 410c, which read the state of the glove, encoded it, and sent it to the laptop via a USB cable. I initially tried sending the data via WiFi, but there was too much latency for a seamless experience.
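The encoding itself can be as simple as packing one bit per finger into a byte. Here is a minimal Python sketch of that step using pyserial; the port name and the byte layout are illustrative, not the exact ones from the project.

```python
# A sketch of the encode-and-send step, assuming pyserial. The port name
# and the one-bit-per-finger layout are illustrative choices.
import serial

FINGERS = ["index", "middle", "ring"]

def encode(state):
    """Pack the finger contacts into a single byte, one bit per finger."""
    byte = 0
    for i, finger in enumerate(FINGERS):
        byte |= (state[finger] & 1) << i
    return bytes([byte])

link = serial.Serial("/dev/ttyGS0", 115200)  # hypothetical USB-serial device
link.write(encode({"index": 1, "middle": 0, "ring": 0}))
```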

Glove setup
A glove is connected to the DragonBoard 410c which encodes and sends data to the laptop.

Most of the hard work is done by the software running in the background on the laptop – it listens for the USB data sent from the DragonBoard 410c, tracks the movement of the glove through the webcam, and merges all this data to simulate accurate mouse actions.
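A compressed sketch of that main loop might look like the following; it assumes OpenCV for the colour tracking and pynput for the synthetic mouse events, and the HSV bounds, serial port, and screen size are placeholder values.

```python
# A sketch of the host-side loop: track the coloured target with OpenCV,
# merge in the finger state from the serial link, and drive the mouse
# with pynput. All constants here are placeholders.
import cv2
import numpy as np
import serial
from pynput.mouse import Button, Controller

LOWER = np.array([40, 80, 80])      # hypothetical HSV bounds for the tracker
UPPER = np.array([80, 255, 255])
SCREEN_W, SCREEN_H = 1920, 1080

mouse = Controller()
link = serial.Serial("/dev/ttyUSB0", 115200, timeout=0)  # non-blocking reads
cam = cv2.VideoCapture(0)

pressed = False
while True:
    ok, frame = cam.read()
    if not ok:
        break

    # 1. Find the centroid of the coloured tracker in the webcam frame.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    m = cv2.moments(mask)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        # Mirror x so that moving the hand right moves the cursor right.
        mouse.position = (SCREEN_W * (1 - cx / frame.shape[1]),
                          SCREEN_H * (cy / frame.shape[0]))

    # 2. Merge in the latest finger state arriving over USB.
    data = link.read(link.in_waiting or 1)
    if data:
        index_touching = bool(data[-1] & 0b001)   # bit 0 = index finger
        if index_touching and not pressed:
            mouse.press(Button.left)
        elif not index_touching and pressed:
            mouse.release(Button.left)
        pressed = index_touching
```

Note that press and release fire only on state changes, which is what makes a quick touch-and-release come out as a single clean click.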

The glove supports the following actions:

  • Mouse movement: To move the mouse around the screen, I simply move my hand around in front of the webcam.
  • Mouse click, mouse drag: Touching the index finger and the thumb together simulates pressing the mouse button; moving the fingers apart releases it. Done quickly, this simulates a mouse click. By moving my hand while holding the fingers together, I can hold and drag items around on the screen.
  • Changing mouse speed: Touching the ring finger and the thumb cycles through different mouse speeds – faster for quick movements, slower for precise movements.
  • Switching between windows: Touching the middle finger and the thumb toggles the window-switching mode. While the mode is on, touching the index finger and the thumb opens the last active window, and touching the ring finger and the thumb cycles through all open windows; touching the middle finger and the thumb again turns the mode off (see the sketch after this list).
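To show how the mode toggle can be wired up, here is a small Python sketch of a possible gesture dispatcher; the keystrokes and structure are illustrative rather than taken from the actual project code.

```python
# An illustrative gesture dispatcher with a window-switching mode toggle,
# using pynput to emit Alt+Tab. The mapping is a simplified stand-in for
# the project's real logic.
from pynput.keyboard import Key, Controller

keyboard = Controller()
switching_mode = False

def alt_tab(steps=1):
    """Hold Alt and press Tab `steps` times to move through open windows."""
    with keyboard.pressed(Key.alt):
        for _ in range(steps):
            keyboard.press(Key.tab)
            keyboard.release(Key.tab)

def on_touch(finger):
    """Handle a thumb-to-<finger> touch event from the glove."""
    global switching_mode
    if finger == "middle":
        switching_mode = not switching_mode    # toggle the mode on/off
    elif switching_mode and finger == "index":
        alt_tab(1)                             # jump to the last active window
    elif switching_mode and finger == "ring":
        alt_tab(2)                             # step further through the list
    # outside switching mode, index and ring keep their mouse meanings
```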

You can see all these actions in a demo below:

What I really liked about this project is how I was able to use a different programming language for each component to maximise the performance of the system as a whole. For example, the code on the DragonBoard was written in C++ because it needed to be fast on the DragonBoard's slightly limited resources, and the code for motion tracking was written in Python because I used the OpenCV library, which simplified the process. It was an interesting exercise in making different subprograms work together as one system.

You can find the code and detailed setup instructions on my GitHub.