HandShake
Introduction
HandShake is a project to help people who rely on software to create speech gain access to that technology.
Many people who are unable to speak use specialist software to create speech. This software can often be operated using buttons or joysticks and is known as switchable software. For a more detailed explanation, and a video showing one software package, please see my post Using the microbit to control switch access software.
Some people are unable to use physical controllers such as buttons and joysticks, but are still able to make intentional hand movements. HandShake uses a pair of BBC micro:bits to send triggers to switchable communication software so that speech can be composed. A trigger is sent to the communication device when the user moves a hand with an acceleration above an adjustable threshold.
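In outline, the wrist-worn micro:bit only needs to compare the size of each acceleration reading against that threshold and send a radio message when it is exceeded. Here is a minimal sketch of the idea in micro:bit micropython, not the project's actual code; the threshold value, radio group and debounce delay are illustrative choices:

```python
# Sketch of the wrist-worn micro:bit's job: send a radio message
# when acceleration exceeds an adjustable threshold.
from microbit import accelerometer, sleep
import radio
import math

THRESHOLD_MG = 1500    # threshold in milli-g; illustrative value

radio.on()
radio.config(group=7)  # both micro:bits must share the same group

while True:
    x, y, z = accelerometer.get_values()
    magnitude = math.sqrt(x * x + y * y + z * z)
    if magnitude > THRESHOLD_MG:
        radio.send('shake')  # trigger the receiving micro:bit
        sleep(500)           # crude debounce: one shake, one trigger
    sleep(20)
```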
This project was presented at the 2018 Communication Matters conference and will appear in their journal in 2019. Almost all gesture recognition software focuses on the user replicating a predefined gesture, such as a swipe or a sign-language sign. This is not practical for the students that I work with at Beaumont College, who have cerebral palsy (CP).
The project code is on the GitHub site: https://github.com/mattoppenheim/microbit_hand_shake
Setup instructions are here: setup instructions
I made a video to complement the setup instructions here:
I wrote a Circuit Cellar article detailing the hardware and software, which is online here.
Testing with micro:bit based hardware
The latest HandShake system uses the BBC micro:bit. One micro:bit is worn on the wrist and a second micro:bit is connected to a laptop or communications device. When a wrist movement is made, the micro:bit on the wrist uses its radio to signal the micro:bit connected to the laptop, which then triggers an event on the laptop through its USB cable. The trigger can also be used as a switch to control communications software, such as Smartbox’s Grid software. Thanks to Sensory Software for giving me a licence for this software so that I could test the hardware with it.
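For illustration, the receiving micro:bit's job can be sketched like this in micro:bit micropython; this is not the project's code, and the radio group and message text just need to match whatever the wrist-worn sender uses:

```python
# Sketch of the receiver micro:bit plugged into the laptop: relay
# each radio message to the laptop over the USB serial link.
from microbit import display, Image
import radio

radio.on()
radio.config(group=7)  # must match the group used by the sender

while True:
    message = radio.receive()
    if message == 'shake':
        print('shake')           # travels to the laptop over USB serial
        display.show(Image.YES)  # visual confirmation on the receiver
```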
For initial testing, we asked two of the students at Beaumont to operate a light through gesture. Please see the video below.
The advantage of using the micro:bit is that somebody else makes them! Rather than me having to construct my own kludge, somebody else has kindly made a board with all the widgets I need. The micro:bit boards are around £12 each; you need two for this project. Instructions on how to make your own fashionable wrist holder for a micro:bit from a £3 iPod arm band are here. All of the code, and a manual showing how to replicate the system, can be found on my GitHub site at: https://github.com/mattoppenheim/microbit_assistive_technology
The system was designed when Grid 2 and Windows 8 were in fashion. I could not get Grid 2 to respond to software signals from my script, so to trigger it we needed a Leostick plugged into the communication device, pretending to be a physical keyboard and sending pretend keystrokes to Grid 2.
With Grid 3 and Windows 10, the Leostick is no longer required: Grid 3 responds to software commands. The online manual and software on GitHub have been updated to reflect this.
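As a sketch of the laptop side under the Grid 3 setup, a script only needs to watch the receiver micro:bit's serial port and fake a keystroke that Grid 3 is configured to treat as a switch press. This uses the pyserial and pynput libraries rather than the project's actual script, and the port name and the choice of F1 below are my own illustrative assumptions:

```python
# Sketch of a laptop-side trigger: read the receiver micro:bit's
# serial port and fake a keystroke for the communication software.
import serial                  # pyserial
from pynput.keyboard import Controller, Key

PORT = 'COM3'                  # the receiver micro:bit's serial port
keyboard = Controller()

with serial.Serial(PORT, baudrate=115200, timeout=1) as port:
    while True:
        line = port.readline().decode(errors='ignore').strip()
        if line == 'shake':
            keyboard.press(Key.f1)    # Grid 3 sees a switch press
            keyboard.release(Key.f1)
```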
The system will still work with the Leostick; it is just an unnecessary extra expense now. It could still be needed for software that, like Grid 2, does not accept software-generated keystrokes.
Because the Leostick behaves as a physical keyboard, it guarantees that the system will work with any software, so I’m happy that I spent the time getting it working.
The rest of this article covers earlier versions of the hardware, which I implemented using home-made circuits before the BBC micro:bit was widely available.
Testing with XBee based hardware
The photo below shows preliminary testing at Beaumont by one of the students. The pouch on the student’s wrist contains a microcontroller board and an XBee module for wireless data transmission. An accelerometer is attached to the XBee module. As the student moves, data is collected by the microcontroller and sent to a laptop for analysis. When the student shakes his arm within a comfortable range of ‘shakiness’, a bright LED called a blink1, attached to the laptop on the right, flashes. The laptop screen shows the accelerometer data. Eventually we hope to do the processing and gesture recognition within the microcontroller on the student’s wrist, and have this signal the student’s communication device directly. The accelerometer data display on the laptop allows me to develop the software to do this. See below for more details on the hardware.
Initial research used the Leap Motion. We found that the space in which participants could interact with the Leap Motion was too limited for our user group. The code developed for recording and matching gestures will be tested with other technologies though, so the time was not wasted.
To continue with this research I moved to using an accelerometer to measure hand or wrist motion. Initially we will process and pattern-match this data on a laptop to recognise a student’s gestures.
To display the real-time accelerometer data I wrote a user interface using the Python library PySide. The Python library pyqtgraph is used to display the accelerometer data for the x, y and z axes. The YouTube video below shows a recording of an early version of the interface, with data from the accelerometer being displayed in real time and the sample rate being changed. The display is a lot more sophistimacated (sic) now.
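As a cut-down illustration of that kind of display, the sketch below plots three scrolling traces with pyqtgraph. It feeds in random numbers where the real interface reads accelerometer samples from the serial port, and it assumes a recent pyqtgraph release:

```python
# Sketch of a scrolling three-axis plot in the style described above.
from collections import deque
import random

import pyqtgraph as pg
from pyqtgraph.Qt import QtCore

app = pg.mkQApp('HandShake display sketch')
plot = pg.plot(title='accelerometer data')
curves = [plot.plot(pen=pen) for pen in ('r', 'g', 'b')]    # x, y, z traces
buffers = [deque([0] * 200, maxlen=200) for _ in range(3)]  # rolling windows

def update():
    # Replace random.randint with a real sample from the serial port.
    for buffer, curve in zip(buffers, curves):
        buffer.append(random.randint(-2000, 2000))
        curve.setData(list(buffer))

timer = QtCore.QTimer()
timer.timeout.connect(update)
timer.start(50)  # redraw every 50 ms

pg.exec()        # blocks until the window is closed
```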
I will make all of the code available once the project is finished. I use an Assembla repository to store the code and would encourage any programmer to set up a git repository. You might think that you have adequate backups of your code…
Accelerometer hardware
The hardware has been through a few iterations. Initially I set up two-way wireless communication with a pyboard connected to a digital accelerometer. The pyboard is programmed with micropython, so the entire tool chain from the hardware to the user interface is Python 3. This code is built on the work of other programmers, who kindly put their code online. The pyboard worked, but was a bit bulky. I got a slimmed-down system working using an XBee series 1 to directly sample an attached analog accelerometer. This worked well for testing and verified that the idea was feasible.

Then I went to a talk about the micro:bit. The killer feature is that these boards have board-to-board wireless communication, as well as an accelerometer and a microprocessor. This little board looks like it could be the way to go. Plus I can code using micropython again. I needed to change the default sensitivity range of the onboard accelerometer. After figuring out how to do this, I think that I can implement my ideas on this platform. Why didn’t I use this board to start with? It didn’t exist when I started on this project.
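As an illustration of changing the range, the sketch below assumes a micro:bit fitted with the MMA8653 accelerometer at I2C address 0x1D (later board revisions use a different part). The register addresses come from the MMA8653 datasheet, not from the project code:

```python
# Sketch of widening the accelerometer's full-scale range over I2C.
# Assumes an MMA8653 at address 0x1D; register map per its datasheet.
from microbit import i2c

MMA8653_ADDR = 0x1D
CTRL_REG1 = 0x2A     # bit 0: standby/active
XYZ_DATA_CFG = 0x0E  # bits 0-1: full-scale range, 0b10 = +/-8 g

i2c.write(MMA8653_ADDR, bytes([CTRL_REG1, 0x00]))     # enter standby
i2c.write(MMA8653_ADDR, bytes([XYZ_DATA_CFG, 0x02]))  # select +/-8 g
i2c.write(MMA8653_ADDR, bytes([CTRL_REG1, 0x01]))     # back to active
```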
I wrote an article on the Pyboard and micropython for Circuit Cellar. Please find a pdf here.
Here is a picture showing three iterations of hardware. On the left is an XBee series 1 attached to an accelerometer, kludged together on a breadboard. The board is powered by a single AAA battery, using a dc-dc converter to pump the voltage up to 3.3 V. In the middle is the same hardware, but soldered on to a prototype board. Best to solder everything down when it is being shaken around! On the right is the micro:bit board, powered by two AAA batteries, which is the board I am now using.

XBee series 1 on breadboard, XBee series 1 soldered down, micro:bit.
I explored a number of different ideas before the microbit was available. Please see below.
Earlier hardware platforms
A number of smart watches with built-in accelerometers, such as the Pebble, were trialled. These are designed to pair with a smartphone using Bluetooth Low Energy (BLE) and periodically send and receive data, not to constantly stream accelerometer data to a PC, which is what the initial development work requires. It would have been nice to get something working reliably with this platform, as the students at Beaumont would be quite happy to wear the latest smartwatch. The only smartwatch I found with a stable and reliable link to a PC was the Texas Instruments EZ430-Chronos. This comes with its own receiver dongle, so there is no issue in setting up a reliable link between the watch and a PC. However, the data sampling rate is limited.
I found a few Sparkfun WiTilt accelerometer and gyroscope sensor boards in the lab, left over from a long-dead project on tracking people indoors. These are well-designed boards with both a wired and an old-school Bluetooth interface. Using the Bluetooth interface is a pain, as the device has to be reconnected for each iteration of code. Using the wired serial interface allows for faster iterations, as there is no re-connection to do each time the software is changed. I got this streaming accelerometer data to my laptop. However, the device is no longer manufactured; I emailed Sparkfun, who said they had no plans to make any more. So I started to look at what we can get off the shelf now. Using hardware that is still in production allows others to easily replicate and improve anything that I come up with.
The Pyboard caught my eye. This runs micropython, which allows me to program the board using a version of the Python programming language. As this is the language that I use for my gesture recognition code, I figured this gives me a chance to eventually have all of the pattern matching done on the board. Initially I will take the accelerometer data from the board in real time and process it on a laptop. Having the pattern recognition done on the accelerometer hardware will make for a better device that does not need to be constantly paired with a PC. The board will chug away on its own and, when it recognises a gesture, send out a signal. That’s the plan anyway.
I’d known about micropython for a while, but I was brought up on the ethos that, with firmware, ‘if you can’t do it in C, do it in assembler. If you can’t do it in assembler, it is not worth doing’. Then I listened to a podcast on micropython here and figured it was about time I stopped being such a curmudgeon. There are two types of fool. One says ’this is old and therefore good’ and the other says ’this is new and therefore better.’ With hardware design, I get to be both at the same time.
I interfaced the pyboard with an MPU-6050 accelerometer/gyroscope board. If you want one of these, look on eBay, where you will find them for a few pounds. I modified code from this project on Hackaday, which is the site for the discerning electronics enthusiast. I am streaming data from the sensors to my laptop. I need to add some error checking to flag missing data samples and compensate for them, and to check that the sampling rate is correct. Then I need to write some unit tests, to avoid being a hardware-design hypocrite.
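As a sketch of what that streaming loop looks like, the code below reads raw accelerometer samples from an MPU-6050 on the pyboard. The bus number, sample interval and register constants are assumptions based on the MPU-6050 datasheet, not the Hackaday project's code:

```python
# Sketch of streaming raw MPU-6050 accelerometer samples on the pyboard.
import struct
from pyb import I2C, delay

MPU_ADDR = 0x68      # default MPU-6050 I2C address
PWR_MGMT_1 = 0x6B    # power management register
ACCEL_XOUT_H = 0x3B  # first of six accelerometer data registers

i2c = I2C(1, I2C.MASTER)
i2c.mem_write(0, MPU_ADDR, PWR_MGMT_1)  # wake the sensor from sleep

while True:
    raw = i2c.mem_read(6, MPU_ADDR, ACCEL_XOUT_H)
    x, y, z = struct.unpack('>hhh', raw)  # big-endian signed 16-bit
    print(x, y, z)                        # stream to the laptop
    delay(10)                             # roughly 100 samples a second
```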
As with any new platform, I encountered the usual World of Pain. I managed to install a micropython package over one of the regular Python packages on my laptop. I never did figure out how to fix this. As luck would have it, I had a Clonezilla image from the night before, which only took 20 nail-biting minutes to load. Matt’s top tip - use Clonezilla and use it often!
All of the material on this site and linked resources is covered by the GNU GPL licence v3.