Bendlabs single-axis bend sensor

Using the Bendlabs 1-axis bend sensor with an ESP32 board

This post explores using a more tactile and sensitive type of flex sensor than the ones I've used so far. This is part of my ongoing flex sensor assistive technology project.

The flex sensor is the Bendlabs single-axis flex sensor.

In this post I show how I interfaced this sensor with an Unexpected Maker Feather S2 board and started logging data from it. This board uses the ESP32-S2 MCU. The S2 variant has native USB support, which enables the board to appear to a computer as a physical keyboard. This will be useful for controlling communication software as part of the assistive technology project.

A short video showing the test system displaying real-time data can be viewed here:

A photo of the flex sensor from the Bendlabs website is shown below.

Bendlabs 1-axis flex sensor, from the bendlabs website.

A photo showing my grubby hand holding one of the flex sensors is shown below. I had already soldered the wires onto the end connector before taking this photo; the sensor comes without them.

Bendlabs single-axis flex-sensor.

To use these sensors you can either buy a development kit from Bendlabs, which includes the sensor and a board to plug it into, or buy just the sensor and solder on your own wires to connect it to your own board. I bought just the sensor. I need to be able to connect this sensor to my own projects; if I can only get the sensor working with a demo kit, it is not of much use to me.

I soldered some silicone-insulated wires onto the end connector. Silicone insulation makes the wire more flexible than the regular PVC type. This was a little tricky, but not too bad an exercise. I terminated the ends of the wires onto 0.1" header pins. The wires that I soldered onto the flex sensor and the header that they connect with can be seen in the photo below. On the left of the photo are the wires that connect to the Unexpected Maker Feather S2 board. I soldered female headers onto this board. Male or female headers on a development board? I've seen both. I went with female as I could and nobody stopped me.

Bendlabs single-axis flex-sensor with extension wires soldered on to break-out connector.

The pinout for the connector is shown below.

Bendlabs one-axis flex-sensor pinout.

The pinout shows that the signal interface is I2C - these are the SDA and SCL signals. In addition to these, there are power (VCC) and ground (GND). The part is not 5V tolerant. I used the Unexpected Maker Feather S2 3.3V supply to connect with VCC. There are two other signals to deal with. nDRDY is 'not data ready' - inverted logic on the data ready line, so when data is ready the line goes low. nRST is 'not reset', meaning that when this signal goes low, the sensor enters a reset condition. I connected these signals to two sockets on my board: nDRDY goes to pin 7 and nRST goes to pin 3. In the example code provided, nDRDY goes to pin 4, but I don't have a socket for pin 4, so I chose pin 7 and adjusted the software. More details on this below.

The wire colours that I used and their corresponding signals are:
1 black wire - GND ground
2 red wire - Vcc - 1.62-3.6V NOT 5V TOLERANT
3 green wire - nDRDY
4 blue wire - SDA - needs 10K pull up
5 yellow wire - SCL - needs 10K pull up
6 orange wire - nRST

The board I used has built-in 10K pull-up resistors on the dedicated SDA and SCL ports. These pull-up resistors are necessary for I2C communication. I found that out the usual way that I find things out.

The Unexpected Maker Feather S2 board has a Qwiic connector socket which takes care of the power, ground, I2C clock and I2C data lines. I bought a flexible-wire Qwiic connector to attach the Qwiic socket on the board to the relevant signal lines on the flex sensor, using the 0.1" pins to connect the two devices.

I matched the wire colours running from the flex sensor to the header pins with those of the Qwiic connector wires on the opposite side of the header pins.

Unexpected Maker Feather S2 board connected to a Bendlabs single-axis flex-sensor.

Code

Demo code from Bendlabs can be found at:

Bendlabs one-axis flex-sensor Github.

This is written to run on the Arduino platform.

This code is aimed at the SparkFun Pro nRF52840 Mini. Thanks to the hardware abstraction that the Arduino platform provides, the example code compiles for the Unexpected Maker Feather S2 once you install the necessary ESP32 library. Have a search on t'net on how to install the ESP32 library.

The only update that I needed to make to the demo code was to use pin 7 on my board instead of the pin 4 defined in the code for the nDRDY signal. I adjusted the corresponding line of code:

from

#define ADS_INTERRUPT_PIN  (4)

to

#define ADS_INTERRUPT_PIN  (7)

The quick start guide from Bendlabs says to use the C program 'bend_interrupt_demo' through the Arduino platform. I couldn't get this to work. I put my 'scope on the I2C lines and could see activity on the clock line (SCL), but no data on the SDA line. I did get the example program 'bend_polled_demo' to work, with one correction. Hopefully this will be corrected in the Github repository by the time that you read this post.

The line:

 int ret_val = ads_read_polled(&sample, &data_type);

should be

 int ret_val = ads_read_polled(sample, &data_type);

(sample is declared as a float array, so the array name already decays to the float pointer that ads_read_polled expects; &sample is a pointer-to-array, which is the wrong type)

in the function:

void loop() {

  float sample[2];
  uint8_t data_type;

  // Read data from the one axis ads sensor
  int ret_val = ads_read_polled(&sample, &data_type);

After flashing the code to the board, I can see real time angle data plotting on the Serial Plotter in the Arduino IDE. See below for a screenshot.

Arduino serial plotter showing flex sensor data.

What next?

The positives of this flex-sensor are:

  • It is more flexible and tactile than the flex sensor I've used so far.
  • Works for positive and negative deflection.
  • More sensitive at detecting flex.

The negatives are:

  • Difficult to make an extension lead for. The soldering is finicky and there are 6 power and signal lines to contend with.
  • Cost. I'm not sure how much of an issue this is. The Bendlabs sensor costs around £40 at the time of writing. This is about double the cost of the other flex sensor I tested.

Bendlabs look to have a business model where they want to be consultants and customise their technology for each product. What I would like is an off the shelf 'plug and play' sensor and a lead with a socket on the end that the sensor plugs into. Hand soldering the leads onto the sensor is a potential failure mode, as well as being time consuming.

I will contact Bendlabs to see if they have something like this for sale.

Accessing Smartbox Grid 3 using Python and win32gui

Summary

Smartbox's Grid 3 communication software creates two windows containing the words 'Grid 3' in their titles, even though you can only see one. If you are trying to interact with this software using your own program, you need to make sure to access the window that you intend to.

Problem

I wrote some Python code to detect the use of Grid 3 or Tobii's Communicator software for this project, to visually show when somebody who uses eyegaze technology interacts with the software.

This post concentrates on the issue I had with finding the correct window that Grid 3 runs in. Grid 3 runs under Windows.

I use the pywin32 library to access the win32gui library. This library allows me to find which window is running the software that I want to monitor. However, after using this library to find the 'grid 3' window, my code kept on telling me that nothing was changing in the window, when I could clearly see something was. To make matters more confusing, the code seemed to run fine on one machine and not another.

Solution

Please find the parts of the Python script needed to explain my solution below. All of the script is on my GitHub site.

import logging
import time

import win32gui

logging.basicConfig(
    format='%(asctime)s.%(msecs)03d %(message)s',
    level=logging.INFO,
    datefmt='%H:%M:%S')

COM_SOFTWARE = ['grid', 'communicator']
IGNORE = ['grid 3.exe', 'users']

def find_window_handle(com_software=COM_SOFTWARE, ignore=IGNORE):
    ''' Find the window for communication software. '''
    toplist, winlist = [], []

    def _enum_cb(window_handle, results):
        winlist.append((window_handle, win32gui.GetWindowText(window_handle)))

    win32gui.EnumWindows(_enum_cb, toplist)
    for sware in com_software:
        # winlist is a list of tuples (window_id, window title)
        logging.debug('items in ignore: {}'.format([item.lower() for item in ignore]))
        for window_handle, title in winlist:
            #logging.debug('window_handle: {}, title: {}'.format(window_handle, title))
            if sware in title.lower() and not any(x in title.lower() for x in ignore):
                logging.info('found title: {}'.format(title))
                return window_handle
    logging.info('no communications software found for {}'.format(com_software))
    time.sleep(0.5)

The critical debugging line is the commented-out logging.debug call:

logging.debug('window_handle: {}, title: {}'.format(window_handle, title))

When uncommented, and running the logging in debug mode, this listed out two windows that contained 'Grid 3' as part of their title, even though only a single Grid 3 window was visible. Even with just the 'Users' screen up, before launching a grid communication window, the logging.debug line returned two windows containing the name 'Grid 3' in their title:

grid: [(66532, 'GDI+ Window (Grid 3.exe)'), (197532, 'Grid 3 - Users')]

When running one of the Grids (for testing I used the Super Core grid), the software still tells me there are two windows with 'grid' in the title:

grid: [(66532, 'GDI+ Window (Grid 3.exe)'), (263256, 'Grid 3 - Super Core - .CORE')]

For this example, I could take the second window found and be done. However, to be robust, I created an IGNORE list, containing strings that are in the window titles that I do not want to use.

In the code example above, the if statement looks for the correct string in the window title and also checks that none of the strings in the IGNORE list are in the title:

if sware in title.lower() and not any(x in title.lower() for x in ignore):

This only passes the title for the window that I am interested in - the one containing the communication grid.
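The filtering logic can be exercised without win32gui. The sketch below is not part of the original script: pick_window is a made-up stand-in that applies the same title test to a pre-built window list, using the two-window example logged above.

```python
def pick_window(winlist, com_software, ignore):
    """Return the handle of the first window whose title contains one of
    the target strings and none of the ignore strings."""
    for sware in com_software:
        for window_handle, title in winlist:
            lowered = title.lower()
            if sware in lowered and not any(x in lowered for x in ignore):
                return window_handle
    return None

# The two windows Grid 3 creates while running the Super Core grid:
windows = [(66532, 'GDI+ Window (Grid 3.exe)'),
           (263256, 'Grid 3 - Super Core - .CORE')]

print(pick_window(windows, ['grid', 'communicator'], ['grid 3.exe', 'users']))  # 263256
```

The spurious 'GDI+ Window (Grid 3.exe)' window is rejected by the IGNORE list, so only the handle of the visible communication grid is returned.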

Testing

I use a Windows 10 virtual machine running in VirtualBox, with Debian Linux as the host, as I run Linux on my laptop. The virtual machine gives me a static, controlled testing environment containing only the software that I am working on. I also double-test on a standalone Windows 10 PC in case the virtual environment somehow affects the software.

In this case, my script seemed to run well on one system and not another. I now suspect that sometimes the window that I was interested in was the only one generated by Grid 3 and at other times, the extra spurious Grid 3 window was generated as well. This spurious window was then selected by the software.

Communication Matters Conference 2019

I attended the 2019 Communication Matters Conference in Leeds on Monday. I presented a poster on three of my assistive technology projects using the BBC micro:bit and gave a 'lightning talk' - 15 slides, auto-advancing every 25 seconds. I wrote the PowerPoint presentation using LibreOffice on my Linux distro. Just before giving the talk I realised I had not tested it on Windows, which the conference PC was running. Of course. Why use Linux for free when you can pay for Windows? Fortunately the slides displayed and auto-advanced correctly. This was my first time presenting at a conference, so I was a little nervous. I think it all went well. I didn't notice anybody asleep and nothing was thrown.

A number of people showed interest in the poster. I presented my handshake, give-me-a-minute and hand-wave projects. Hopefully this interest will carry over to implementing these projects to help people access communication devices.

If any of these projects are of use for you or somebody you know, please get in touch. Contact details are on my home page.

Standing by my poster with the sunlight glinting gently off my baldy head.
My Mighty Poster. The original is A1.

People laugh when I tell them that I mostly failed over the years that I prodded at these projects. I went down the proverbial rabbit hole trying to implement camera based virtual switches before moving to repurposing simpler off the shelf technology, such as the BBC micro:bit.

I finally realised that trying to implement vision based controllers in the Real World was too much for Little Matty working on my time off from my paid work offshore. I managed to demonstrate the idea of using depth cameras to create virtual controls using head tracking: https://mattoppenheim.com/headbanger-controlling-software-with-head-motion/ before moving on.

I was gratified to see a presentation from a Japanese team that succeeded with depth camera based controllers: http://gesture-interface.jp/en/. The Japanese team have been working on this project for 5 years and have 9 people working on the team, according to the team member I talked with at the conference. So even simple sounding ideas using off the shelf technology can take significant resources and software development to implement in the Real World.

Using the microbit to control switch access software

What is switch access software?

Many disabled people use specialist software to create speech or to interact with the environment (e.g. turn on lights). Some are unable to use keyboards or mice to operate this software, so use a variety of 'switches', such as push buttons. These buttons act like keys on a keyboard, or pretend to be a mouse click.

I took some hand-held video of a switch-accessible software package which enables speech to be created. The software is called Liberator. A big red button was configured as the switch controller. When the button is pressed, a row is highlighted. The highlight scans down the rows. A second click selects a row. The software then scans across the single cells in the selected row. A third click selects that cell and the text for that cell appears in the speech window. Sometimes a cell will lead to a new grid. Once the speech text is composed, a cell can be selected to send the text to a speaker. I tried this out at the Communication Matters conference in Leeds.

Trying out the Liberator switch access software at the Communication Matters Conference.
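The row-then-cell scanning sequence described above can be modelled in a few lines. This is a toy sketch, not how Liberator actually implements scanning; scan_select and the two-by-two grid are made up for illustration.

```python
def scan_select(page, row_steps, col_steps):
    """Toy model of row/column scanning: the highlight steps down
    row_steps rows before a press selects the row, then steps along
    col_steps cells in that row before a second press selects the cell."""
    row = page[row_steps % len(page)]
    return row[col_steps % len(row)]

# A made-up 2x2 grid of cells:
page = [['yes', 'no'],
        ['hello', 'goodbye']]
```

For example, scan_select(page, 1, 0) returns 'hello': one scan step down to the second row, then a press on its first cell.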

Wikipedia has a page (of course) explaining what an assistive technology (AT) switch is here. Copying the one line summary at the start of that page: "A switch is an assistive technology device that replaces the need to use a computer keyboard or a mouse."

The system we developed allows the BBC microbit to pretend to be an AT switch, so that movement sensor devices I make using the microbit can be used to control switch accessible communication software. For instance, Grid 3 by Smartbox.

The picture below shows a typical button switch and a Joybox switch to USB adapter. The adapter enables the button switch to be connected to a USB port. This allows the button to act as if a key is pressed on a keyboard. This simulated keyboard key press then controls software, to e.g. create speech. The standard connector for a switch is the venerable 3.5mm audio plug. The 3.5mm plug is on the end of the cable attached to the button switch. A 3.5mm socket is attached to the USB to switch converter.

My task was to enable a microbit board to connect with a 3.5mm plug and act as a switch, so that the signal would be recognised by the switch to USB adapter. How hard could this be?

Button switch and interface dongle.

How does the switch work? The 3.5mm connector has two contacts inside it. When the switch is operated, these are connected together internally. So the contacts are normally open and are closed when the switch is pressed. How do I recreate this switch electronically?

I used a Grove M281 Optocoupler Relay.

Grove M281 optocoupler relay board. The control pins are on the left, the relay pins are on the right.

This acts as an electronically controlled switch. When the CTR pin on the board goes high, the two connectors with the screws on top are connected; when the CTR pin is low, they are disconnected. The CTR pin can be seen on the left of the photo. There are also connections for ground (GND) and power (VCC). The NC pin is Not Connected.

I could maybe lash up something cheaper using a transistor or two, but for around £6 I had an off the shelf solution that I had tested and running within a day. The microbit connects to the pins on the left of the board in the photo. The 3.5mm plug connects to the screw top terminals on the right of the photo.

The advantage of using an optocoupler is that the microbit is isolated from the communication device that the 3.5mm plug is connected to. My slight worry was creating a ground loop: without isolation, if the microbit is powered from a USB source - say another computer - while also being connected to a communication device that is plugged into the mains, a ground loop could form. The optoisolator prevents this. I don't think this is a likely scenario with the tiny currents involved, but I am working with a vulnerable user group, so am a little more cautious than usual.

The optocoupler relay board is specced at 3.3V, but worked with the 2xAAA battery pack powered microbit at 2.4V. Nobody is more surprised than I am when something I build works!

The photo below shows all of the parts of the system, apart from the switch to USB adapter, shown in the photo at the top of the page. The microbit board slides into a Kitronik edge connector and break out board:

https://www.kitronik.co.uk/5601b-edge-connector-breakout-board-for-bbc-microbit-pre-built.html

microbit board slid inside a Kitronik break out board connected to a solid state relay board connected to a 3.5mm plug. The plug is unscrewed so you can see the two connectors.

The Kitronik break out board allows all the signal pins on the microbit board to be accessed. I used digital pin 16 on the microbit board to connect to the CTR pin on the M281 board, as it allowed for the neatest wiring. The photo below shows the wiring on the right. Pin 16 connects to the yellow wire. Ground is the black wire and the 3V output is connected to the red wire. Ignore the connectors and resistor on the lower left of the photo - these are used for connecting a motion detecting sensor, which I will write up in a different post.

The two screw top terminal connectors on the M281 are connected to the two contacts in the 3.5mm plug. The wires are soldered to the plug and attached to the M281 by screwing down the terminals.

I wrote a small program to toggle pin 16 on the microbit high to simulate the action of the sensor detecting a hand motion, which is the action we would like to use to trigger the switch.

Pin 16 is connected to the solid state relay board using the yellow wire. The red and black wires connect to +3V and ground pins.
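A minimal sketch of such a toggling program is below. This is not the exact code in the hand_wave folder on my GitHub page: the 500 ms on/off period is an assumption, and the import guard (which lets the toggle logic run on a PC as well as under MicroPython on the micro:bit) is added for illustration.

```python
# On the micro:bit this runs under MicroPython; on a PC the import
# fails and only the toggle logic is available.
try:
    from microbit import pin16, sleep
except ImportError:
    pin16 = None

def toggle_levels(n_presses):
    """Digital levels for n simulated switch presses: high then low."""
    return [level for _ in range(n_presses) for level in (1, 0)]

if pin16:
    while True:
        for level in toggle_levels(1):
            pin16.write_digital(level)  # high closes the M281 relay contacts
            sleep(500)                  # assumed 500 ms on/off period
```

Driving pin 16 high closes the relay contacts, which shorts the two contacts of the 3.5mm plug together, exactly as if the button switch had been pressed.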

The 3.5mm plug goes into the socket of the interface dongle shown in the photo at the top of the blog. The dongle plugs into your PC on a USB port. The dongle is recognised as a switch interface using the free software at:

https://thinksmartbox.com/product/switch-driver-6/#

I used Windows 10 to check that everything works as it should. Which it did. Screen shot of the software below. Hurrah.

The final code used for this project can be found on my github page in the hand_wave folder.

Smartbox Switch Driver software used to test the switch.

One final Top Tip is to replace the AAA battery pack that comes with the microbit with one that has a power switch. These are about £4 + £0.75 postage from eBay. The title for the switched battery box I bought is 'Switched battery power box for BBC Micro:Bit 2 x AAA'.

Using Python to detect activity in Sensory Software’s Grid 2

Update: March 2018. This work is being submitted to the Communication Matters conference.

Following on from the eyeBlink post, with the help of Fil at Beaumont, I modified the algorithm I'm using to detect when the Grid 2 or Grid 3 software is being used. The image below shows Sensory Software's Grid 2 software being used to construct a sentence. The new text appears in the white area at the top of the window. Fil suggested that I change the Python script to just monitor this area at the top of the window.

The script now looks for a change in the amount of black text in this area. After the usual software wrangling I think I got it working. The Python script looks at the top 20% of the window and counts the number of black pixels in this area. Every half second it recounts the number of black pixels. If there is a change in the number of black pixels above a threshold, then a trigger is sent to indicate that the Grid software is being actively used. I'm using a threshold of 20 pixels, so there needs to be an increase or decrease of 20 or more black pixels for a change to be detected. This allows you to move your mouse cursor around in the text area at the top of the Grid window without triggering that there has been a change. The activity detection script needs more testing, but preliminary results seem to show it works.

Prior to this, I was monitoring the entire Grid window and looking for a change in the whole window above a threshold. This led to false triggers when cells were selected, but not activated. When a cell is selected, the colour of the cell changes, even when it is not activated to produce text. This change in colour was being detected.
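The counting and thresholding logic can be sketched as below. This is a simplified stand-in for the real script: it works on a plain 2D list of greyscale values rather than a PIL screen grab, and the black cut-off of 50 is an assumed value. The 20 pixel change threshold is the one described above.

```python
def count_black_pixels(rows, top_fraction=0.2, black_level=50):
    """Count near-black pixels in the top fraction of a window capture.
    rows is a 2D list of 8-bit greyscale values, top row first.
    black_level is an assumed cut-off for what counts as black."""
    top = rows[: max(1, int(len(rows) * top_fraction))]
    return sum(1 for row in top for pixel in row if pixel < black_level)

def grid_active(previous_count, current_count, threshold=20):
    """Trigger when the black pixel count changes by 20 or more."""
    return abs(current_count - previous_count) >= threshold
```

Every half second the real script recounts the black pixels in the top 20% of the window and compares the new count with the old one using this kind of threshold test.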

Each time we test the script, we find new ways to break it, leading to some 'try except' exception handling clauses. The script is designed to run on Windows as Grid 2 and Grid 3 only work on this operating system. I use the win32gui library to interact with Windows and the Python Imaging Library, PIL (maintained these days as Pillow), to do the image processing.
 
Sensory Software's Grid 2 Chatterbox grid being used to construct a sentence:
 

Grid 2 communications software by Sensory Software used to create speech.

The latest code and installation details on how to get this running using the BBC microbit to give a flash when the Grid software is being actively used can be found on my github site at:
 
If you have any questions, please ask.

eyeBlink – enabling natural two way conversation with somebody who uses an eyetracker

It can be difficult to tell when a student who uses an eye tracker to operate their communications software is actively using the software. So the temptation is to go and look over their shoulder. The Head Technologist at Beaumont College asked if it is possible to have a light flash to indicate when the communications software is being used. This makes for a more natural two way conversation. You talk to the student and you see the light flashing, so you know that a reply is being composed.

 
After some false starts, I think I have some code that will detect when the software is being used. My script makes an image of the screen each second and looks for a difference. I set a minimum threshold for the difference between the images so that a blinking cursor will not continuously indicate a change.
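The comparison can be sketched like this. It is a simplified model, not the actual script: screen grabs are represented as flat lists of greyscale values, and both threshold values are illustrative assumptions.

```python
def screen_changed(previous, current, pixel_delta=30, min_pixels=20):
    """Compare two screen grabs, given as flat lists of 8-bit grey values.
    A pixel counts as changed if it moved by more than pixel_delta; the
    screen counts as changed only if at least min_pixels pixels changed,
    so a blinking cursor alone will not trigger. Both thresholds are
    assumed values for illustration."""
    changed = sum(1 for a, b in zip(previous, current)
                  if abs(a - b) > pixel_delta)
    return changed >= min_pixels
```

A handful of flickering cursor pixels falls below min_pixels, while new text appearing in the window changes enough pixels to trip the detector.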
 
I use a BBC micro:bit to do the flashing as they are cheap, cheerful and reliable. I reckon to have two of these qualities.
 
Please find a picture of Craig, one of the technologists at Beaumont, testing the prototype of the eyeBlink system using a Tobii eyetracker and Sensory Software's Grid 2 software. You will have to take my word for it that the microbit does indeed flash when the Grid 2 is being used.

Craig testing the eyeBlink hardware at Beaumont College, Lancaster.

CD player for the visually impaired

This blog details a method to allow somebody who is visually impaired to easily listen to their CD collection again. My Mother lost her eyesight through macular degeneration. She has a decent collection of classical music that she built up over a few years, but she can't see well enough to easily use a CD player anymore. On top of that, her mobility is restricted.

I bought her a Roberts Concerto 2 CD player and radio designed for the visually impaired. Please find details and a review here. This is the best that I could find, but it is still fiddly and difficult for somebody without sight to load the CD. It is also quite a bulky device, which makes putting it next to an elderly person awkward: it takes up most of a chair-side table, or the user has to get up and go to where it is placed. That is a barrier to it ever being used when just getting out of a chair is no longer straightforward.

I looked at a few potential solutions. I found a portable CD player on eBay and tried that, but again, it takes up a little too much table-top space and it is fiddly to load. You don't realise how poorly controls are laid out on most devices until you try to explain to somebody without vision how to use them.

I found some lovely projects where custom players are built using tags. Audio books are loaded onto a memory card and played using something like a Raspberry Pi single board computer. An NFC coil reads a tag placed inside the case of an audiobook or CD and the audio is played from the memory card. Here is an example on Hackaday.

I started going down this route. Then I had another think. This would add another device to my mother's chair-side table, and I would have to:

  1. Build it.
  2. Run off all her CDs to memory card.
  3. Show her how to use it.
  4. Maintain it.

My mother uses a Sovereign USB stick player to listen to her talking books and newspapers. This is a well designed player aimed at the visually impaired. It has decent sound quality. The build cost of a custom device would exceed the cost of a Sovereign and for me to think I would match the sound quality is a tad arrogant. One of the design features of this player is that it will remember the place you were last at on the stick. You can even swap sticks and it remembers the last play position on the previous five sticks you played. Mum already has this next to her and knows how to use it. As a side note, there is now a smaller version of this player available called the sonic which I bought to listen to my podcasts with and loan to Mum when she visits.

So I ran off her CDs to MP3 and put each one onto a cheap USB stick. I bought some small key rings and used these to connect the memory sticks to postcards on which I printed the title of the CD. So far I have run off 10 of these. These 10 sticks and labels fit in a little box on her table next to the player. If the idea works, I will run off some more.

I've written this up so that other people in my position have a potential solution to enable others with disability to enjoy their music collections.

Having to pay a couple of dollars for a cheap eBay USB stick for each CD may seem a tad pricey, but compared with the time and cost of building a custom device, I think it is money well invested.

The proof of the pudding is in the eating. Please find a photo of Mum at my house, having dozed off while listening to Aled Jones with the bear she gave me about 40 years ago. The bear's arms are a little saggy now, but we all get a little infirm and need some help as we age.
 

Mum and Bear listening to some music.