Touch buttons interfere with how you want to control devices like white goods? They interfere with your way of life? Smartify them into MQTT messages - it's like a 'fingerbot' for touch buttons.
Build2gether2.0 Challenge entry story
#### Content ####
Problems+Solutions
Background
Software
FFC
##############
### The problems:
The industry loves touch buttons: they are fancy, cheap, durable and easy to clean. Those touch surfaces come with disadvantages, though. It's not possible to feel for them, since touching them immediately executes the interaction. That makes them unsuitable for the vision impaired.
For white goods, touch buttons are often found in kitchens on stoves and fume extractor hoods. The variety of suitable products is therefore severely limited for impaired people.
A similar problem exists for the mobility impaired. Buttons are often out of reach - that's why sticks are used to press buttons that are too far away. Fine motor skills and good aim are needed to hit the wanted button. Additionally, special tips are needed to trigger touch buttons.
## Existing solutions:
There already is a solution for mechanical switches called the "Fingerbot" - it might even be possible to upgrade its "finger" with a suitable cap to press touch buttons, but it has disadvantages: pressing buttons with a motor is energy hungry and needs a lot of space. One motor is needed per button, so buttons lying close to each other cannot be addressed with a fingerbot, nor can a fingerbot control the commonly implemented touch sliders.
Buying devices with integrated "smart" capabilities is not considered a solution, as they are mostly proprietary and may require new acquisitions.
### The solution:
The solution is to emulate touch button presses without tinkering with the original hardware - smartifying 'dumb' touch devices, so to say.
It's possible to use conductive stickers to trigger a touch button by pulling them to ground, hence simulating a finger touch.
This sticker is wired to a small control box containing a microcontroller that emulates the button presses. What triggers the control box and its respective functions should be user defined: either mechanical buttons with haptic feedback should be connectable, and/or automation should be implemented. This way it is even possible to integrate it into smart home solutions.
Sticking conductive stickers onto a touch button has advantages: the device can still be operated via the original touch button and needs no modifications. The sticker is slim, so it is applicable nearly everywhere, and it is easy to clean. Also, it is not energy hungry like a motor, and it makes it possible to control even touch sliders.
To make it clean, the goal is to create our own FFC (Flexible Flat Cable) - more on that later.
### The perspective
Every project needs a perspective. I've thought about devices I could try it on. A friend of mine has a cooktop from IKEA - Pler 8.
So let's create a way to control the 4 plates via the touch buttons of that cooktop with an MCU. And to live up to the project name containing FFC (Flexible Flat Cable), a flex sticker will be created for a clean way to interact.
### Technical background:
Capacitive sensors detect changes in capacitance caused by the proximity or touch of a conductive object. This object can be a finger, which acts as a conductor that alters the electric field, increasing the capacitance detected by the sensor and triggering a touch response. The finger forms the second plate of a capacitor, effectively raising the overall capacitance between the finger and the touch surface of the sensor. Although the human body isn't directly connected to the PCB's ground, it behaves like a virtual ground. This is because the human body, due to its large surface area and ability to hold and distribute electrical charge, influences the sensor's electric field as if it were part of the ground plane, even though it is not physically connected to it.
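In rough parallel-plate terms (a textbook approximation, not a measurement of this stove), the finger adds an extra capacitance in parallel with the pad's idle capacitance:

```latex
C_{\text{sensed}} = C_{\text{pad}} + C_{\text{finger}}, \qquad
C_{\text{finger}} \approx \varepsilon_0 \varepsilon_r \frac{A}{d}
```

where $A$ is the overlap area of the finger and $d$ the distance through the glass. The sensor only needs to notice that $C_{\text{sensed}}$ rose above its baseline - which is exactly what a grounded conductive sticker also does.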
Similarly, sticking a conductive pad onto the touch sensor and pulling it to ground will create the same capacitance increase that the sensor interprets as a touch input.
## Concept / first progress:
Having understood the technical background, it's time for some action:
A wire is soldered onto a piece of aluminum foil. The foil is stuck onto the surface of the touch sensor and the wire is connected to ground, which triggers an input action by adding capacitance to the touch button.
Awesome! So all we need to do is either keep the pad/wire floating for no interaction, or pull it to ground for a simulated button press. Not even a transistor is needed to accomplish that.
But wait... what if a button is pressed by a finger manually, the traditional way? Isn't that mixing up the states we think the cooking top is in?
That means extra hardware besides the MCU, right? ESP32-S3 to the rescue: it has capacitive touch sensing inputs, which we will also need to address in software to keep track of the actual states. Initially I wanted to skip touch sensing and only do touch simulation, neglecting finger input - I mean, a fingerbot can't detect that either, can it?
Let's try - all that's needed is a microcontroller with capacitive touch sensing inputs as hardware.
### The software:
In this project, ESP-IDF is used with an ESP32-S3, as it has capacitive touch sensing inputs - keep that in mind when you want to build for another MCU and wonder why the build fails.
# Set GPIO for touch simulation and sensing:
If a wire with an attached conductive pad is connected to a GPIO, we can use it as a touch sensing input or to actually simulate a touch button press on an underlying touch device.
To have the pad as an output, we configure it to be floating by default and pulled to ground when triggered.
ESP-IDF code:

```c
touch_pad_t touch_pad_num = (touch_pad_t)i;
// Reset the GPIO pin associated with the touch pad
// (on the ESP32-S3, touch pad N maps directly to GPIO N)
gpio_reset_pin((gpio_num_t)touch_pad_num);
// Configure the GPIO pin as output (open-drain)
gpio_config_t io_conf;
io_conf.intr_type = GPIO_INTR_DISABLE;
io_conf.mode = GPIO_MODE_OUTPUT_OD; // open-drain: high impedance (floating) unless driven low
io_conf.pin_bit_mask = (1ULL << touch_pad_num);
io_conf.pull_down_en = 0;
io_conf.pull_up_en = 0;
gpio_config(&io_conf);
```
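With the pad in open-drain mode, a simulated "tap" is just: drive low (grounded = touch), hold, release back to floating. Here is a minimal host-testable sketch of that idea - `simulate_tap` and the injected writer are hypothetical names of mine, not the project's actual functions; on target you would pass a thin wrapper around ESP-IDF's `gpio_set_level()`.

```c
#include <assert.h>

/* Hypothetical sketch: a "tap" pulls the open-drain pad low (grounded =
 * simulated touch), then releases it back to floating. The GPIO writer
 * is injected so the sequence can be checked without hardware. */
typedef void (*pad_write_fn)(int pad, int level);

void simulate_tap(int pad, pad_write_fn write)
{
    write(pad, 0); /* drive low: pad grounded, sensor sees a "touch"       */
    /* ...hold long enough for the device to register the press...         */
    write(pad, 1); /* open-drain high = floating again, "finger" released  */
}

/* host-side mock that records every level written, for testing */
static int log_pad[4], log_level[4], log_n = 0;
static void mock_write(int pad, int level)
{
    log_pad[log_n] = pad;
    log_level[log_n] = level;
    log_n++;
}
```

The indirection is only there for testability; the two-line body is the whole trick.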
No worries, there are functions in the code doing all of this for you.
Now here is the main reason I switched from my initial tryouts with nRF to an ESP32-S3:
The same GPIOs can be used for capacitive touch sensing input, so it's possible to keep track of whether manual finger interactions have been made, preferably with an interrupt. By default the GPIOs are in touch sensing mode, only switching to GPIO output touch simulation on MQTT changes, and then back to input sensing.
Fortunately, there is also an example application for that. We can adapt it to our needs, create a gpio_control.c and reduce it to simple calls to:
touchsense_app_start()
disable_touch_sensing_and_set_touchsim_for_all
reenable_touch_sensing_for_all
disable_touch_sensing_and_set_touchsim
reenable_touch_sensing
(I've really tried to come up with good names, honestly :'D)
The downside of input sensing is that the values have to be calibrated for the FFC, as it sometimes seems to trigger the touch button itself; in output-floating mode it is completely stable, though. So we'll have to see how the FFC behaves with input sensing interference.
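A calibration could boil down to: average a few idle readings as a baseline, then flag a touch when a raw reading exceeds the baseline by a margin. This is a sketch of that idea - the function names and the 15 % margin are my assumptions to tune per pad, not values from the project. (On the ESP32-S3 the raw touch value rises when something couples in.)

```c
#include <assert.h>
#include <stdint.h>
#include <stdbool.h>

/* assumed tuning value, not from the project - tune per pad/FFC */
#define TOUCH_MARGIN_PCT 15

/* average a few idle readings to get the per-pad baseline */
uint32_t calibrate_baseline(const uint32_t *samples, int n)
{
    uint64_t sum = 0;
    for (int i = 0; i < n; i++)
        sum += samples[i];
    return (uint32_t)(sum / n);
}

/* touched when raw exceeds baseline by TOUCH_MARGIN_PCT percent */
bool is_touched(uint32_t raw, uint32_t baseline)
{
    return raw > baseline + baseline * TOUCH_MARGIN_PCT / 100;
}
```

On target, `raw` would come from the ESP-IDF touch pad driver's raw-read call inside the sensing loop or interrupt handler.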
And finally, we create a "profile" for the touch device to be controlled, breaking it down to two functions:
translate_stove_call //disable touch sensing and switch GPIOs to output for touch simulation
touch_to_platestate //get the input of a real finger on the touch device
This keeps the states in the software coherent with the actual device state. The GPIO is in sensing mode by default and is only set to output when a touch simulation has been triggered (via MQTT).
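What a translate_stove_call-style profile could reduce to, in the simplest case: given the tracked plate level and the level requested over MQTT, compute how many simulated "+" or "-" taps to fire. The struct and function below are my illustrative sketch, not the project's actual profile code, and the one-tap-per-step convention is an assumption.

```c
#include <assert.h>

/* Hypothetical sketch: number of simulated taps per plate pad. */
typedef struct {
    int plus_taps;  /* simulated taps on the plate's "+" pad */
    int minus_taps; /* simulated taps on the plate's "-" pad */
} tap_plan_t;

/* Assumes one tap moves the plate exactly one step. */
tap_plan_t plan_taps(int current_level, int target_level)
{
    tap_plan_t plan = {0, 0};
    int diff = target_level - current_level;
    if (diff > 0)
        plan.plus_taps = diff;
    else
        plan.minus_taps = -diff;
    return plan;
}
```

The interesting part in practice is keeping `current_level` honest, which is exactly what the touch_to_platestate sensing path is for.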
# MQTT (+ HA)
Ok, so we can trigger the touch button. But when and how do we want to call that button press action?
For first testing, I added a few mechanical buttons that can be felt for haptic feedback; each mechanical button triggers one touch button. This uses up a lot of GPIOs - which are not available unless multiplexed - and it is impractical. If you actually want mechanical buttons, better take another MCU and connect it wirelessly as a separate device.
We have an ESP32, which is a WiFi MCU, so why not use MQTT for the calls? That way we can even automate it (e.g. via Home Assistant) and get a generated dashboard tile with the help of MQTT discovery messages for HA.
(!) If you want to use the code and not just try it out supervised, keep in mind that this is a simple MQTT client talking to a public test server, unencrypted over TCP. At least change the server URL to your own! I've seen so much production-looking data on those servers... *sigh*
#define CONFIG_BROKER_URL "change to your own!" //"mqtt://broker.emqx.io"
The addressed IKEA cooktop has 4 plates, so at minimum we need 4 sliders for the plates and a separate on/off button.
<Screenshot of HA slider on dashboard> 404 missing (actually forgot to take it)
We can not only subscribe and publish via MQTT - Home Assistant accepts discovery messages, too. This way HA will generate the new device in the dashboard, controllable via slider.
```c
#define VALUE_TOPIC_PREFIX "homeassistant/number/flextouch/"
...
snprintf(state_topic, sizeof(state_topic), "%s%s/state", VALUE_TOPIC_PREFIX, slider_ids[i]);
snprintf(command_topic, sizeof(command_topic), "%s%s/set", VALUE_TOPIC_PREFIX, slider_ids[i]);
snprintf(config_topic, sizeof(config_topic), "%s%s/config", VALUE_TOPIC_PREFIX, slider_ids[i]);
snprintf(discovery_msg, sizeof(discovery_msg),
         "{\"name\": \"%s\","
         "\"state_topic\": \"%s\","
         "\"command_topic\": \"%s\","
         "\"min\": 0,"
         "\"max\": %d,"
         "\"step\": 1,"
         "\"unique_id\": \"%s\","
         "\"device\": {"
         "\"identifiers\": [\"%s\"],"
         "\"name\": \"%s\","
         "\"manufacturer\": \"ESP32\","
         "\"model\": \"Slider\""
         "}}",
         slider_names[i],
         state_topic, command_topic,
         STOVESTEPS,
         slider_ids[i],
         DEVICE_NAME, DEVICE_NAME);
esp_mqtt_client_publish(client, config_topic, discovery_msg, 0, 1, 1);
```
We can loop that nicely to get 4 sliders for Home Assistant, with a structure like
'homeassistant/number/flextouch/flextouch_slider_a/...'
with
'state' - subscribe to see the slider's value
'set' - HA publishes changed slider values here
'config' - discovery message for HA
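Going the other way - routing an incoming MQTT message back to a slider - could look like the sketch below: strip the known prefix, match the slider id, and check the "/set" suffix. The prefix and ids mirror the snippet above; the helper itself is illustrative, not the project's actual handler.

```c
#include <assert.h>
#include <string.h>

#define VALUE_TOPIC_PREFIX "homeassistant/number/flextouch/"

static const char *slider_ids[] = {
    "flextouch_slider_a", "flextouch_slider_b",
    "flextouch_slider_c", "flextouch_slider_d",
};

/* returns slider index 0..3, or -1 if the topic is not a /set command */
int topic_to_slider(const char *topic)
{
    size_t plen = strlen(VALUE_TOPIC_PREFIX);
    if (strncmp(topic, VALUE_TOPIC_PREFIX, plen) != 0)
        return -1;
    topic += plen; /* skip the shared prefix */
    for (int i = 0; i < 4; i++) {
        size_t idlen = strlen(slider_ids[i]);
        if (strncmp(topic, slider_ids[i], idlen) == 0 &&
            strcmp(topic + idlen, "/set") == 0)
            return i;
    }
    return -1;
}
```

In the ESP-IDF MQTT client this would sit in the MQTT_EVENT_DATA branch of the event handler, with `event->topic` as input.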
# Software sum-up
It always starts simple: a GPIO as a touch-simulating output. Then you think about a way to interact with it (MQTT), and before even starting you realize that manual touch inputs with a real finger need to be handled too, to keep the states in sync. Luckily, example code for tons of stuff is already available, including touch sensing and MQTT clients.
And when all that is done, well, then you realize a refactoring is probably at hand... But we want to finally get to the FFC, the fun part!
## FFC custom flex pad:
Finally, the FFC! I mean, it's even in the title and the last topic? But the most esthetic one! :)
As glued aluminum foil is impractical and just ugly, a fancy FFC as an all-in-one sticker is needed to interact with the touch pad nicely. I've created a custom flex cable in KiCad for the IKEA PLER8 HB 18 (402.228.29) induction cooktop.
Just take a halfway decent photo from the top with a ruler in it. We need it to be able to scale the flex cable easily in KiCad (I haven't seen dimensions in the manual - I might have been blind).
OK, now start in KiCad with an FFC connector, which you may need to create in the footprint editor. I had a look at the products offered by Würth, as they have nice datasheets:
I've decided on a 10-pin FFC with 1 mm contact pitch:
https://www.we-online.com/components/products/datasheet/686610050001.pdf
Now create the footprint with the pinout of the FFC connector for our own touch pad.
First, create the layout with the conductive pads (it's copper now, not aluminum anymore - doesn't matter though). We want to stick those surfaces on top of the original touch buttons of the device. Check the dimensions twice, then connect everything up.
I've added nice symbols on the silkscreen, so the non-vision-impaired can still see the functions of the buttons for manual interaction. Then we complete the courtyard layer and get our custom 10-pin FFC:
Not a fan of my design choices? Would round button surfaces have been better? Mh, well, take the KiCad files and feel free to modify and create! :)
# Prospects:
- create PCB for flex connector
- refactoring with generic handling of user-created FFC profiles. It's just a switch-case for that IKEA device right now; address multiple device possibilities.
+ possibility to set the profile and configure it over MQTT
- design touch slider pads in KiCad for other devices
- create separate project with haptic, mechanical hardware buttons connected via MQTT
It was a fun short project, even though I don't own a stove with a touch interface. Thanks to good friends for letting me address theirs!
This is to be seen as a worklog; the actual FFC has not been produced yet, but this is a good starting point for putting it into action. Touch simulation was tested with aluminum foil, PCB wire and a TTP223. I've created a .gif short clip with the latter, but gifs are apparently not supported(?).
Next up: ordering the FFC at a PCB manufacturer (like PCBWay) :) then calibrating the touch sensing.
Oh, and AI - how else am I going to score?...
Thank you for the challenge! I like how it shifts the view on things, especially the inclusive ones, and I hope it may actually be of help. Also thanks for sending me that superbox - it had the right stuff inside in this case; I hadn't known the ESP has capacitive touch sensing built in.
Sorry for the mess, last one sweeps.
PS:
I've just seen again that winning that superbox requires checking the box confirming that at least three of the listed sponsors are used. And one (DFRobot) was not available to me, as I hadn't seen the expiration date of the gift card - a validity period could have been printed on the gift card in the first place. This project literally consists of just two parts. The one MCU I used is from Seeed Studio. I initially started with an nRF from Nordic, but A) it has no touch sense input and B) I didn't want to use Zephyr for some reason after starting. The second part, the FFC flex cable, could surely be ordered from PCBWay; I just haven't had the time to place an order yet - well, I don't even own a touch cooktop anyhow. Still, that makes me suitable for having three sponsors in development for what is basically an MCU-only solution :)
The feedback given was only partially helpful. One point was that buttons can also be state-changing. And something about smartphone-UI control, which this is totally not about: it should enable the vision and mobility impaired to change the way of touch control into something else by smartifying it.