HandShake
Update, April 2024: I’ve got a ‘proof of concept’ working using some nicely polished off-the-shelf hardware made by Lilygo. A few (many) improvements to make and the odd bit of ‘unintentional functionality’ (bugs) to remove.
Introduction
HandShake is a project to help people who rely on software to create speech to access that technology through hand motion. The technology is not restricted to the hand; it can be used anywhere that an intentional shake can be made, such as a foot.
Many people who are unable to speak use specialist software to create speech, e.g. Sensory Software’s Grid. This software can often be operated using buttons or joysticks and is known as switchable software. Please see a more detailed explanation and a video showing one software package in my post Using the microbit to control switch access software.
Some people are unable to use physical controllers such as buttons and joysticks but are still able to make intentional hand movements. This project aims to detect this movement to operate a standard assistive technology switch. This allows operation of the specialist software or any device that is adapted to operate from an assistive technology switch.
HandShake v3 - using Lilygo hardware
I recently started porting the project to use some nicely polished off-the-shelf devices:
Here’s a photo of the watch on my wrist:

T-Watch running handshake firmware
The two units running the firmware that I wrote for them can be seen below:

Lilygo T-Watch S3 and Lilygo T-Embed
A short YouTube video showing a proof of concept can be seen here.
The watch has an accelerometer that is configured to detect when the watch is shaken above an adjustable threshold. The watch then sends a trigger signal to the T-Embed. The two modules communicate using ESP-NOW, which I can use because both the watch and the T-Embed are powered by the same ESP32-S3 chip. The watch also has a vibration motor which gives a buzz when the trigger is sent.
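To make the idea concrete, here is a minimal sketch of the watch side, written against MicroPython’s espnow module rather than my actual firmware. The peer MAC address, the vibration motor pin and the read_accel_magnitude() helper are placeholders for illustration; the real T-Watch code uses its own accelerometer driver and settings.

```python
# Minimal sketch of the watch side: detect a shake above a threshold and
# send a trigger over ESP-NOW. Illustrative only - pin numbers, the peer
# MAC and the accelerometer helper are assumptions, not the real firmware.
import time
import network
import espnow
from machine import Pin

PEER_MAC = b'\xaa\xbb\xcc\xdd\xee\xff'   # hypothetical MAC of the T-Embed
THRESHOLD = 1.5                          # shake threshold in g, adjustable

sta = network.WLAN(network.STA_IF)       # ESP-NOW needs the WiFi interface active
sta.active(True)

link = espnow.ESPNow()
link.active(True)
link.add_peer(PEER_MAC)

vibration = Pin(4, Pin.OUT)              # hypothetical vibration motor pin

def read_accel_magnitude():
    """Placeholder: replace with the watch's accelerometer driver, return g."""
    raise NotImplementedError

while True:
    if read_accel_magnitude() > THRESHOLD:
        link.send(PEER_MAC, b'TRIGGER')  # signal the T-Embed
        vibration.on()                   # give the wearer a short buzz
        time.sleep_ms(200)
        vibration.off()
        time.sleep_ms(800)               # debounce so one shake gives one trigger
    time.sleep_ms(20)
```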
The amount of shake that is needed to create a trigger is adjustable. The T-Embed has a jog dial around the button. This is used to adjust the trigger threshold. A little scale showing the trigger threshold can be seen on the T-Embed display. The button on the T-Embed has three uses:
- A long press: the T-Embed sends the new trigger threshold to the T-Watch.
- A short press: the unit receives the greatest shake value from the last 5 s from the watch, so you can tune in the right trigger level. Both the trigger threshold and the last shake value received are shown on two scales.
- A double tap: sends a trigger signal to the Joycable to test it out.
The T-Embed receives the trigger signal from the T-Watch and then sends a signal to an assistive technology switch adapter such as the Joycable. This adapter plugs into the USB port of the laptop that is running the speech software. The adapter translates the signal from the T-Embed to appear as a keyboard key press. This keyboard key press is used to control the target software.
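The receiving side can be sketched in the same hedged way: wait for the ESP-NOW trigger packet and pulse an output pin that is wired, through the little adapter board, to the switch input of the Joycable. The pin number, active level and pulse length below are assumptions, not the values used on my board.

```python
# Sketch of the T-Embed side: wait for an ESP-NOW trigger from the watch and
# pulse an output pin to 'press' the switch input of the Joycable adapter.
import time
import network
import espnow
from machine import Pin

sta = network.WLAN(network.STA_IF)
sta.active(True)

link = espnow.ESPNow()
link.active(True)

switch_out = Pin(16, Pin.OUT, value=0)   # hypothetical pin wired to the adapter board

while True:
    host, msg = link.recv()              # blocks until a packet arrives
    if msg == b'TRIGGER':
        switch_out.on()                  # close the 'switch'
        time.sleep_ms(100)               # long enough for the adapter to register
        switch_out.off()
```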
The adapter cable can be seen here:

Adapter cable in the T-Embed
To enable the T-Embed to operate the Joycable I made a little board that plugs directly into the T-Embed. Please see the figure below:

T-Embed showing switch adapter board
The T-Embed has a ring of LEDs around the button. These flash when a trigger is received. It also has a little speaker which allows for an audio jingle to play when a trigger is received. Right now, the unit gives a tasteful ‘quack’ when the trigger is received. My wife pretends to like the sound of quacking coming from the little room where I pretend to work on this project. She has patience.
The advantages of using the Lilygo hardware over my earlier iterations are:
- It looks nice. Why shouldn’t the end user have something that is presentable?
- It’s robust - shake proof.
- Easy to distribute - anybody can buy the hardware and load my code onto it. There is a small board to make for the T-Embed, but this can be ordered for ~$10 from e.g. jlcpcb.com. I’ll post all of the necessary designs.
Once I get the project a bit more polished I’ll try it out and see what needs improving with the feedback I get.
Obvious question - why not use this type of hardware from the start? It didn’t exist. Hats off to companies like Lilygo who take the gamble of making devices aimed at hobbyists and tinkerers with no clear use case. I came across Lilygo devices through a post on the Hackaday blog.
Q2 - Where’s the GitHub with the goodies? I’ll upload this and put a link here. I’ve got the Git repository running locally right now.
HandShake v2 - using the micro:bit
The previous iteration of HandShake used a pair of BBC micro:bits to send triggers to switchable communication software so that speech could be composed. A trigger is sent to the communications device when the user moves a hand above an adjustable threshold of acceleration.
This project was presented at the 2018 Communication Matters conference and appeared in their journal in 2019. Almost all gesture recognition software focuses on the user replicating a predefined gesture, such as a swipe or a sign-language gesture. This is not practical for the students that I work with at Beaumont College, who have cerebral palsy (CP).
The project code is on the GitHub site: https://github.com/mattoppenheim/microbit_hand_shake
Setup instructions are here: setup instructions
I made a video to complement the setup instructions here:
I wrote a Circuit Cellar article detailing the hardware and software, which is online here.
Testing with micro:bit based hardware
One micro:bit is worn on the wrist and a second micro:bit is connected to a laptop or communications device. When a wrist movement is made, the micro:bit on the wrist uses its radio to signal the micro:bit connected to the laptop. The micro:bit connected to the laptop then triggers an event on the laptop through the USB cable connecting the two. The trigger sent from the micro:bit can also be used as a switch to control communications software, such as Smartbox’s Grid software. Thanks to Sensory Software for giving me a licence for this software so that I could test the hardware with it.
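The roles of the two micro:bits can be illustrated with a simplified MicroPython sketch; the full, tested code is in the GitHub repository linked below, and the threshold value here is just an example.

Wrist micro:bit - send a radio message when the shake goes over the threshold:

```python
# Wrist micro:bit: send a radio message when the shake exceeds the threshold.
from microbit import accelerometer, sleep
import radio
import math

THRESHOLD = 1500                     # milli-g, tuned per user
radio.on()

while True:
    x, y, z = accelerometer.get_values()
    magnitude = math.sqrt(x * x + y * y + z * z)
    if magnitude > THRESHOLD:
        radio.send('shake')          # tell the receiver micro:bit
        sleep(500)                   # simple debounce
    sleep(20)
```

Receiver micro:bit - pass the trigger to the laptop over USB serial:

```python
# Receiver micro:bit: relay the radio trigger to the laptop over USB serial.
from microbit import sleep
import radio

radio.on()

while True:
    message = radio.receive()
    if message == 'shake':
        print('shake')               # print() goes over the USB serial link
    sleep(10)
```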
For initial testing, we asked two of the students at Beaumont to operate a light through gesture. Please see the video below.
The advantage of using the micro:bit is that somebody else makes them! Rather than my having to construct my own kludge, somebody else has kindly made a board which has all the widgets I need. The micro:bit boards are around £12 each. You need two for this project. Instructions on how to make your own fashionable wrist holder for a micro:bit from a £3 iPod arm band are here. All of the code and a manual showing how to replicate the system can be found on my GitHub site at: https://github.com/mattoppenheim/microbit_assistive_technology
The system was designed when Grid 2 and Windows 8 were in fashion. To enable the system to trigger Grid 2, we needed to use a Leostick plugged into the communication device, pretending to be a physical keyboard. I could not get Grid 2 to respond to software signals from my script. I needed to get the Leostick to send pretend keystrokes to Grid 2.
With Grid 3 and Windows 10, the Leostick is no longer required: Grid 3 responds to software commands. The online manual and software on GitHub have been changed to reflect this.
The system will still work with the Leostick, it is just an unnecessary extra expense now. It could still be necessary for software that, like Grid 2, does not accept software-generated keystrokes.
The Leostick technology guarantees that the system will work with any software, so I’m happy that I spent the time getting this working.
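For software that does accept software-generated keystrokes, the laptop-side script boils down to something like the sketch below: read the trigger from the receiver micro:bit over USB serial and press a key that Grid 3 is configured to treat as a switch. The port name, baud rate, message text and the choice of pyautogui are illustrative assumptions; the script in the repository may do this differently.

```python
# Minimal laptop-side sketch: turn a serial trigger from the micro:bit into a
# keystroke for the communication software. Port, key and message are examples.
import serial      # pyserial
import pyautogui   # generates the software keystroke

PORT = 'COM3'      # hypothetical; use the port the micro:bit enumerates on
BAUD = 115200      # the micro:bit's default serial baud rate

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors='ignore').strip()
        if line == 'shake':
            pyautogui.press('f1')   # Grid 3 set up to use F1 as the switch key
```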
The rest of this article covers earlier versions of the hardware that I implemented using home-made circuits before the BBC micro:bit was widely available.
Testing with XBee based hardware
The photo below shows preliminary testing at Beaumont by one of the students. The pouch on the student’s wrist contains a microcontroller board and an XBee module for wireless data transmission. An accelerometer is attached to the XBee module. As the student moves, data is collected by the microcontroller and sent to a laptop for analysis. When the student shakes his arm within a comfortable range of ‘shakiness’ a bright LED called a blink1 attached to the laptop on the right flashes. The laptop screen shows the accelerometer data. Eventually we hope to do the processing and gesture recognition within the microcontroller on the student’s wrist and have this signal the student’s communication device directly. The accelerometer data display on the laptop allows me to develop the software to do this. See below for more details on the hardware.
Initial research used the Leap Motion. We found that the space in which participants could interact with the Leap Motion was too limited for our user group. The code developed for recording and matching gestures will be tested with other technologies though, so the time was not wasted.
To continue with this research I moved to using an accelerometer to measure hand or wrist motion. Initially we will process and pattern-match this data on a laptop to recognise a student’s gestures.
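As a rough illustration of that laptop-side processing, the sketch below reads comma-separated x, y, z samples from the serial port that the XBee adapter presents and flags when the shake magnitude passes a threshold. The data format, port name and threshold value are assumptions; the real gesture matching is more involved than a single threshold.

```python
# Rough sketch of the laptop side for the XBee prototype: read x,y,z samples
# from the serial port and flag shakes above a threshold. Format is assumed.
import math
import serial   # pyserial

PORT = '/dev/ttyUSB0'   # hypothetical port for the XBee USB adapter
BAUD = 57600
THRESHOLD = 1.5         # magnitude in g that counts as an intentional shake

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors='ignore').strip()
        try:
            x, y, z = (float(value) for value in line.split(','))
        except ValueError:
            continue                        # skip incomplete or garbled lines
        if math.sqrt(x * x + y * y + z * z) > THRESHOLD:
            print('shake detected')         # here the blink(1) LED is flashed
```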
To display the real-time accelerometer data I wrote a user interface using the Python library PySide. The Python library pyqtgraph is used to display the accelerometer data for the x, y and z axes. The YouTube video below shows a recording of an early version of the interface, with data being displayed from the accelerometer in real time and the sample rate being changed. The display is a lot more sophistimacated (sic) now.
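A stripped-down version of that kind of display can be put together with pyqtgraph as below. This assumes a recent pyqtgraph (for pg.exec()) and fakes the incoming samples with a placeholder read_sample() function; the real interface reads from the serial port and is, as noted, rather more sophisticated.

```python
# Stripped-down real-time plot of x, y, z acceleration using pyqtgraph.
# read_sample() is a placeholder for the real serial data source.
from collections import deque
import random

import pyqtgraph as pg
from pyqtgraph.Qt import QtCore

app = pg.mkQApp("HandShake accelerometer display")
win = pg.GraphicsLayoutWidget(show=True, title="Accelerometer data")
plot = win.addPlot(title="x, y, z acceleration")
plot.addLegend()
curves = {axis: plot.plot(pen=pen, name=axis)
          for axis, pen in (("x", "r"), ("y", "g"), ("z", "b"))}
history = {axis: deque(maxlen=500) for axis in "xyz"}

def read_sample():
    """Placeholder for the real serial read: returns fake x, y, z values in g."""
    return (random.uniform(-0.2, 0.2),
            random.uniform(-0.2, 0.2),
            1.0 + random.uniform(-0.1, 0.1))

def update():
    # Append the newest sample to each trace and redraw
    for axis, value in zip("xyz", read_sample()):
        history[axis].append(value)
        curves[axis].setData(list(history[axis]))

timer = QtCore.QTimer()
timer.timeout.connect(update)
timer.start(20)        # roughly 50 samples per second

if __name__ == "__main__":
    pg.exec()
```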
I will make all of the code available once the project is finished. I use an Assembla repository to store the code and would encourage any programmer to set up a git repository. You might think that you have adequate backups of your code…
HandShake v1 - using the pyboard
The hardware has been through a few iterations. Initially I set up two-way wireless communication with a pyboard connected to a digital accelerometer. The pyboard is programmed with micropython, so the entire tool chain from the hardware to the user interface is python 3. This code is built on the work of other programmers, who kindly put their code online. The pyboard worked but was a bit bulky. I got a slimmed-down system working using an XBee series 1 to directly sample an attached analog accelerometer. This worked well for testing and verified that the idea was feasible. Then I went to a talk about the micro:bit. The killer feature is that these have board-to-board wireless communication as well as an accelerometer and microprocessor. This little board looks like it could be the way to go. Plus I can code using micropython again. I needed to change the default range of sensitivity on the onboard accelerometer. After figuring out how to do this I think that I can implement my ideas with this platform. Why didn’t I use this board to start with? It didn’t exist when I started on this project.
I wrote an article on the Pyboard and micropython for Circuit Cellar. Please find a pdf here.
Here is a picture showing three iterations of hardware. On the left is an Xbee series 1 attached to an accelerometer kludged together on a breadboard. The board is powered by a single AAA battery using a dc-dc converter to pump up the voltage to 3.3v. In the middle is the same hardware, but soldered on to a prototype board. Best to solder everything down when it is being shaken around! On the right is the microbit board, powered by 2 AAA batteries, which is the board I am now using.

XBee series 1 on breadboard, XBee series 1 soldered down, microbit.
HandShake v0 - using what was available many years ago
A number of smart watches, such as the Pebble, which have accelerometers built in, were trialed. These are designed to pair with a smartphone using Bluetooth Low Energy (BLE) and periodically send and receive data, not to constantly stream accelerometer data to a PC, which is what is required for the initial development work. It would have been nice to get something working reliably with this platform, as the students at Beaumont would be quite happy to wear the latest smartwatch. The only smartwatch I found with a stable and reliable link with a PC was Texas Instruments EZ430-Chronos. This comes with its own receiver dongle, so there is no issue in setting up a reliable link between this watch and a PC. However, the data sampling rate is limited.
I found a few Sparkfun WiTilt accelerometer and gyroscope sensor boards in the lab, left over from a long dead project on tracking people indoors. These are well designed boards with both a wired and an old-school Bluetooth interface. Using the Bluetooth interface is a pain, as for each iteration of code the device has to be reconnected. Using the wired serial interface allows for faster iterations of code as there is no re-connection to do each time the software is changed. I got this streaming accelerometer data to my laptop. However, this device is no longer manufactured. I emailed Sparkfun, who said they had no plans to make any more. So I started to look at what we could get off the shelf now. Using hardware that is still in production makes it easy for others to replicate and improve anything that I come up with.
The Pyboard caught my eye. This runs micropython, which allows me to program the board using a version of the Python programming language. As this is the language that I use for my gesture recognition code, I figured this gives me a chance to eventually have all of the pattern matching done on the board. Initially I will take the accelerometer data from the board in real time and process it on a laptop. Having the pattern recognition done on the accelerometer hardware will make for a better device that does not need to be constantly paired with a PC. The board will chug away on its own and, when it recognises a gesture, send out a signal. That’s the plan anyway.
I’d heard of micropython for a while, but I was brought up on the ethos that with firmware ‘if you can’t do it in C, do it in assembler. If you can’t do it in assembler, it is not worth doing’. Then I listened to a podcast on micropython here and figured it was about time I stopped being such a curmudgeon. There are two types of fool. One says ’this is old and therefore good’ and the other says ’this is new and therefore better.’ With hardware design, I get to be both at the same time.
I interfaced the pyboard with an mpu-6050 accelerometer/gyroscope board. If you want to get one of these, look on eBay where you will find these boards for a few pounds. I modified code from this project on Hackaday, which is the site for the discerning electronics enthusiast. I am streaming data from the sensors to my laptop. I need to add some error checking to flag if there are missing data samples and compensate for these, and to check that the sampling rate is correct. Then write some unit tests, to avoid being a hardware design hypocrite.
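For anyone wanting to try the same thing, a minimal read loop for the MPU-6050 from a pyboard looks roughly like this. The register addresses come from the MPU-6050 datasheet; the I2C bus number and wiring are assumptions, and the error checking mentioned above is not included.

```python
# Minimal pyboard sketch: read raw accelerometer values from an MPU-6050 over
# I2C and stream them over the USB serial port as comma-separated g values.
import struct
import pyb
from pyb import I2C

MPU_ADDR = 0x68         # default MPU-6050 I2C address (AD0 pin low)
PWR_MGMT_1 = 0x6B       # power management register
ACCEL_XOUT_H = 0x3B     # start of the 6 accelerometer data bytes

i2c = I2C(1, I2C.MASTER, baudrate=400000)   # I2C bus 1 assumed
i2c.mem_write(0, MPU_ADDR, PWR_MGMT_1)      # wake the device from sleep

while True:
    raw = i2c.mem_read(6, MPU_ADDR, ACCEL_XOUT_H)
    x, y, z = struct.unpack('>hhh', raw)    # big-endian signed 16-bit values
    # Default full-scale range is +/-2 g, which is 16384 counts per g
    print('{:.3f},{:.3f},{:.3f}'.format(x / 16384, y / 16384, z / 16384))
    pyb.delay(20)                           # roughly 50 samples per second
```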
As with any new platform, I encountered the usual World of Pain. I managed to install a micropython package over one of the regular python packages on my laptop. I never did figure out how to fix this. As luck would have it, I had a clonezilla image from the night before, which only took 20 nail biting minutes to load. Matt’s top tip - use clonezilla and use it often!
All of the material on this site and linked resources is covered by the GNU GPL licence v3.