An occupancy sensor is a sensing device that detects and reports the occupancy state of a designated area. It is usually implemented as a motion-detecting device that identifies when a person enters a room so that it can control lights, ventilation, or room temperature; it can also improve security and reduce energy consumption in the monitored area. Different technologies can be used, such as infrared, ultrasonic, microwave, radar, or micro-vibration, to measure heat, sound, or environmental changes caused by humans. The most frequently used technologies are PIR, microwave, and ultrasonic.
Matter is an open-source connectivity standard for smart home and IoT devices. It is based on the Internet Protocol (IP) and runs over Thread and Wi-Fi, while devices using other protocols such as Zigbee or Z-Wave can be integrated via bridges. It aims to improve interoperability and security and allows local control as an option. The standard is royalty-free and supports various device types such as lighting, security, and appliances. Designed to simplify development for smart home product brands and manufacturers while increasing the compatibility of the products for consumers, it works through one or several compatible border routers, avoiding the use of multiple proprietary hubs. Matter is supported by many of the leading players in the smart home market, including Google, Amazon, and Samsung.
This project briefly describes the implementation of a simple Matter-ready (Matter over Thread & Matter over Wi-Fi) occupancy sensor based on the nRF5340 DK / nRF7002 DK and the S2GO RADAR BGT60LTR11, which can be used for home automation purposes.
The implementation essentially follows the comprehensive documentation provided by Nordic Semiconductor or the third-party documentation linked therein (e.g., for configuring the OTBR). The necessary steps are therefore given as references to these documents, and only project-specific additions and deviations are described in detail.
radar-occupancy-sensor w/ nRF5340 DK - Matter over Thread

Components

Nordic Semiconductor nRF5340 DK
The Nordic Semiconductor nRF5340 DK is a hardware development platform used to design and develop application firmware on the nRF5340 System on Chip (SoC).
The key features of the Development Kit (DK) are:
- nRF5340 SoC
- Support for the following wireless protocols: Bluetooth® Low Energy, NFC, 802.15.4, Thread, Zigbee, ANT™, 2.4 GHz proprietary
- Arduino Rev3 compatibility
- 2.4 GHz and NFC antennas
- SWF RF connector for direct RF measurements
- User-programmable LEDs (4) and buttons (4)
- SEGGER J-Link OB programmer/debugger
- Pins for measuring power consumption
- Drag-and-drop Mass Storage Device (MSD) programming
- 1.7 - 5.0 V power supply from USB, external Li-Po battery, or CR2032 coin cell battery
For more information and the full specification, refer to the product page and the documentation.
Infineon S2GO RADAR BGT60LTR11
The Infineon S2GO RADAR BGT60LTR11 is a fully integrated microwave motion sensor that includes Antennas in Package (AIP), built-in motion and direction of motion detector, and a state machine allowing fully autonomous operation of the MMIC without any external microcontroller. It is Infineon’s first Arduino-compatible 60GHz radar sensor for makers, developers, and prototyping. The BGT60LTR11AIP Shield2Go demonstrates the features of the BGT60LTR11AIP MMIC and gives the user a plug-and-play radar solution.
The BGT60LTR11AIP Shield2Go can be used independently as a state machine, which enables operations of the device without any microcontroller, or as a ‘plug-on’ radar sensor (e.g. with Arduino form factor boards or other microcontroller boards) to detect motion and direction of motion by using only two GPIOs. The detection range is configurable from 0.5 m to 7 m.
For more information please refer to the product page and documentation. A good introduction to the features of this sensor, and how to configure and use it, is the Hackster project BGT60LTR11 Radar Shield2Go.
Software

Nordic Semiconductor nRF Connect SDK and Microsoft VS Code
The nRF Connect SDK is a scalable and unified software development kit for building products based on all Nordic Semiconductor’s nRF52, nRF53, nRF70, and nRF91 Series wireless devices. It integrates the Zephyr RTOS, Zephyr applications, protocol stacks, drivers, and HomeKit Accessory Development Kit. The SDK offers developers an extensible framework for building size-optimized software for memory-constrained devices as well as powerful and complex software for more advanced devices and applications. It supports software development of Bluetooth Low Energy, Thread, and Zigbee applications.
The nRF Connect SDK is publicly hosted on GitHub and offers source code management with Git. It supports Visual Studio Code as IDE (nRF Connect for VS Code) but is also IDE agnostic, so other IDEs can be used if preferred.
The nRF Connect SDK contains all needed software, including protocol stacks, for developing Bluetooth Low Energy, Wi-Fi, cellular IoT, Bluetooth mesh, Thread, Zigbee, and Matter products.
For more details refer to the documentation for the nRF Connect SDK and nRF Connect for VS Code.
Development & Testing Environment

Installing nRF Connect SDK and VS Code
Install and set up the VS Code IDE and the nRF Connect SDK by following the instructions in the Get Started section of the nRF Connect SDK documentation or the nRF Connect SDK Fundamentals course on DevAcademy:
- Install nRF Command Line Tools
- Install VS Code
- Install nRF Connect Extension Pack
- Install Toolchain
- Install nRF Connect SDK
Note:
This project has been implemented with version 2.4.2 of the SDK.
Setup test environment
Since this project implements a device communicating via Matter over Thread, a Matter controller and a Thread Border Router are required for testing the implementation. An appropriate testing environment can be set up by following the instructions in the Testing Matter section of the nRF Connect SDK documentation.
For this project, the configuration running the Border Router and the CHIP Tool on two separate devices has been chosen, using the following hardware:
- 1x PC/Laptop (also used for development) with Ubuntu 22.04.3 LTS (Jammy Jellyfish) and
- 1x Raspberry Pi Model 4 with Ubuntu 22.04.3 LTS
In addition:
- (Bluetooth LE device embedded inside PC)
- 1x nRF52840 Dongle - for the Radio Co-Processor (RCP) device
The Raspberry Pi has been setup as Thread Border Router following the instructions at
- Matter over Thread: Configuring Border Router and Linux/macOS controller on separate devices
- Configure the Thread Border Router
- The Thread Border Router page in the nRF Connect SDK documentation
- The guide for the OpenThread Border Router (OTBR)
and equipped with the nRF52840 Dongle as Radio Co-Processor (RCP) device which has been prepared following the instructions for Configuring a radio co-processor.
The CHIP Tool controller has been installed on the development laptop following the instructions to Configure the CHIP Tool for Linux or macOS, using the prebuilt tool package from the Matter nRF Connect releases GitHub page compatible with the used nRF Connect SDK version (v2.4.1).
Implementation - Part I - Connecting the sensor

Since the S2GO RADAR BGT60LTR11 can work autonomously, it is quite easy to detect motion and direction of motion by using only two GPIOs: TD (target detection = motion) and PD (phase detection = direction). Handling these two signals can be implemented without a dedicated driver.
The S2GO RADAR BGT60LTR11 sensor is connected to the nRF5340 DK board as follows:
- 3V3 on the BGT60LTR11 to VDD on the nRF5340 DK (red)
- GND on the BGT60LTR11 to GND on the nRF5340 DK (black)
- TD on the BGT60LTR11 to P0.04 on the nRF5340 DK (yellow)
- PD on the BGT60LTR11 to P0.05 on the nRF5340 DK (orange)
The devicetree overlay file for this setup can be defined as follows:
/ {
	bgt60 {
		compatible = "gpio-keys";
		targetd: d_target {
			gpios = <&gpio0 4 GPIO_ACTIVE_LOW>;
			label = "Target detection";
		};
		phased: d_phase {
			gpios = <&gpio0 5 GPIO_ACTIVE_LOW>;
			label = "Phase detection";
		};
	};

	aliases {
		targetd = &targetd;
		phased = &phased;
	};
};
To simplify handling the sensor, a new C++ class BGT60 has been implemented:
#include <zephyr/kernel.h>
#include <zephyr/device.h>
#include <zephyr/devicetree.h>
#include <zephyr/drivers/gpio.h>
class BGT60
{
public:
	static BGT60 &Instance()
	{
		static BGT60 instance;
		return instance;
	}
	static void Init();
	static bool IsReady();
	static uint64_t GetLastTdTime();
	static bool GetLastPd();
	static bool GetTd();
	static bool GetPd();
};
#define TD_NODE DT_ALIAS(targetd)
#define PD_NODE DT_ALIAS(phased)
static const struct gpio_dt_spec td = GPIO_DT_SPEC_GET(TD_NODE, gpios);
static const struct gpio_dt_spec pd = GPIO_DT_SPEC_GET(PD_NODE, gpios);
static bool ready = false;
static uint64_t last_td_time = 0;
static bool last_pd = false;
static struct gpio_callback td_cb_data;
/* ISR: called on the active edge of TD (motion detected); PD is sampled at the
 * same moment to capture the direction information belonging to this detection */
void td_activated(const struct device *dev, struct gpio_callback *cb, uint32_t pins)
{
	uint64_t now = k_uptime_get();
	bool pd_pin = gpio_pin_get_dt(&pd);

	last_td_time = now;
	last_pd = pd_pin;
}
void BGT60::Init()
{
	int ret;

	printk("BGT60: Init ...\n\r");

	if (!device_is_ready(td.port))
	{
		printk("BGT60: Target detection port (TD) not available ...\n\r");
		return;
	}
	if (!device_is_ready(pd.port))
	{
		printk("BGT60: Phase detection port (PD) not available ...\n\r");
		return;
	}

	ret = gpio_pin_configure_dt(&td, GPIO_INPUT);
	if (ret < 0)
	{
		printk("BGT60: Target detection port (TD) not available for input ...\n\r");
		return;
	}
	ret = gpio_pin_configure_dt(&pd, GPIO_INPUT);
	if (ret < 0)
	{
		printk("BGT60: Phase detection port (PD) not available for input ...\n\r");
		return;
	}

	ret = gpio_pin_interrupt_configure_dt(&td, GPIO_INT_EDGE_TO_ACTIVE);
	if (ret < 0)
	{
		printk("BGT60: Could not configure ISR for TD ...\n\r");
		return;
	}
	gpio_init_callback(&td_cb_data, td_activated, BIT(td.pin));
	gpio_add_callback(td.port, &td_cb_data);

	ready = true;
}
bool BGT60::IsReady()
{
	return ready;
}

uint64_t BGT60::GetLastTdTime()
{
	return last_td_time;
}

bool BGT60::GetLastPd()
{
	return last_pd;
}

bool BGT60::GetTd()
{
	return gpio_pin_get_dt(&td);
}

bool BGT60::GetPd()
{
	return gpio_pin_get_dt(&pd);
}
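Before the class is wired into the Matter application, it can be exercised on its own. The following is a minimal, hypothetical standalone sketch (the main loop and polling interval are illustrative only, not part of the actual application) that polls the sensor state and prints it:

#include <zephyr/kernel.h>
#include "BGT60.h"

int main(void)
{
	BGT60 &sensor = BGT60::Instance();
	sensor.Init();

	if (!sensor.IsReady()) {
		printk("BGT60 not ready\n");
		return -1;
	}

	while (true) {
		/* TD reflects the current motion detection, PD the direction; */
		/* GetLastTdTime() returns the uptime (ms) of the last TD interrupt */
		printk("TD=%d PD=%d last motion at %llu ms\n",
		       (int)sensor.GetTd(), (int)sensor.GetPd(), sensor.GetLastTdTime());
		k_sleep(K_MSEC(500));
	}

	return 0;
}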
Implementation - Part II - Matter application

The implementation of the Matter application is based on the Matter template sample and follows the guide Adding clusters to Matter application, with the modifications and additions described below:
- Copy Matter template sample: Copy the template example application to a new application named occupancy-sensor. Copy the source files for the BGT60 class (BGT60.h and BGT60.cpp) to the sources directory of the occupancy-sensor application, occupancy-sensor/src.
- Edit clusters using the ZAP tool, but with the following deviations from the guide: Create the new endpoint for a Matter Occupancy Sensor (0x0107). Expand the Measurement & Sensing menu and configure the Occupancy Sensing cluster by setting the Server option from the drop-down menu. Go to the Occupancy Sensing cluster configuration and make sure that the Occupancy attribute is enabled.
Add the include for BGT60.h to the file app_task.cpp:
#include <app-common/zap-generated/attributes/Accessors.h>
#include "BGT60.h"
Declare the constants kOccupancyEndpointId and kOccupancyOccupiedToUnoccupiedTransitionTimeMs in app_task.cpp:
namespace
{
constexpr size_t kAppEventQueueSize = 10;
constexpr uint32_t kFactoryResetTriggerTimeout = 6000;
constexpr EndpointId kOccupancyEndpointId = 1;
constexpr uint64_t kOccupancyOccupiedToUnoccupiedTransitionTimeMs = 60 * 1000;
Add the declaration and initialization of the bgt60 sensor handle to the file app_task.cpp:
void StopSensorTimer()
{
	LOG_INF("Stopping sensor timer ...");
	k_timer_stop(&sSensorTimer);
}

BGT60 bgt60;

CHIP_ERROR AppTask::Init()
{
	/* Initialize radar sensor */
	bgt60 = BGT60::Instance();
	bgt60.Init();
	if (bgt60.IsReady())
	{
		LOG_INF("BGT60 initialized: ready.");
	}
	else
	{
		LOG_ERR("BGT60 initialization failed.");
		return CHIP_ERROR_INTERNAL;
	}

	/* Initialize CHIP stack */
	LOG_INF("Init CHIP stack");
Modify the SensorMeasureHandler to set or unset the Occupancy attribute depending on whether a motion has been detected within the pre-defined time interval kOccupancyOccupiedToUnoccupiedTransitionTimeMs:
void AppTask::SensorActivateHandler(const AppEvent &)
{
	StartSensorTimer(2000);
}

void AppTask::SensorDeactivateHandler(const AppEvent &)
{
	StopSensorTimer();
}

void AppTask::SensorMeasureHandler(const AppEvent &)
{
	uint64_t now = k_uptime_get();
	uint64_t motion_time = bgt60.GetLastTdTime();

	if (now - motion_time < kOccupancyOccupiedToUnoccupiedTransitionTimeMs)
	{
		chip::app::Clusters::OccupancySensing::Attributes::Occupancy::Set(kOccupancyEndpointId, chip::app::Clusters::OccupancySensing::OccupancyBitmap::kOccupied);
	}
	else
	{
		chip::app::Clusters::OccupancySensing::Attributes::Occupancy::Set(kOccupancyEndpointId, 0x00);
	}
}
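The timer plumbing referenced above (sSensorTimer, StartSensorTimer) is not repeated here in full; a minimal sketch following the pattern of the Adding clusters to Matter application guide could look as follows. The event type AppEventType::SensorMeasure and the AppTask::Instance().PostEvent() helper are assumptions based on the template sample and may differ in your SDK version:

k_timer sSensorTimer;

void SensorTimerHandler(k_timer * /* unused */)
{
	/* post a SensorMeasure event so that SensorMeasureHandler runs in the application thread */
	AppEvent event;
	event.Type = AppEventType::SensorMeasure;
	event.Handler = AppTask::SensorMeasureHandler;
	AppTask::Instance().PostEvent(event);
}

void StartSensorTimer(uint32_t aTimeoutMs)
{
	LOG_INF("Starting sensor timer (%u ms) ...", aTimeoutMs);
	/* periodic timer: fires every aTimeoutMs until StopSensorTimer() is called */
	k_timer_start(&sSensorTimer, K_MSEC(aTimeoutMs), K_MSEC(aTimeoutMs));
}

/* sSensorTimer is expected to be initialized once, e.g. in AppTask::Init():
 * k_timer_init(&sSensorTimer, &SensorTimerHandler, nullptr);
 */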
- Create a callback for sensor activation and deactivation.
- Add new source file to CMakeLists: Also add the BGT60 class to the build by updating CMakeLists.txt: add the src/BGT60.cpp source file to the app target (see the sketch below).
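A minimal sketch of the relevant part of CMakeLists.txt, assuming the template's existing target_sources(app ...) block (the exact list of existing sources may differ):

target_sources(app PRIVATE
    src/app_task.cpp
    src/main.cpp
    src/BGT60.cpp
)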
Additionally, the device overlay shown above in Part I has to be incorporated into the boards/nrf5340dk_nrf5340_cpuapp.overlay file.
The complete code is available from the repo linked in the Code section (v1.0).
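For reference, a typical build and flash sequence from the application directory could look like this (assuming the nRF Connect toolchain environment is active and the nRF5340 DK is the target):

west build -b nrf5340dk_nrf5340_cpuapp -p always
west flash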
After successfully building the firmware the application should be ready for testing.
Test

Tests have been set up and performed as described in Testing Matter in the nRF Connect SDK documentation with the CHIP Tool, using the Matter over Thread: Configuring Border Router and Linux/macOS controller on separate devices configuration (OTBR on a Raspberry Pi 4 running Ubuntu Linux w/ nRF52840 Dongle as RCP, and the CHIP Tool running on the development PC, also running Ubuntu Linux).
Basic tests performed:
- Commissioning into a Thread network over Bluetooth LE
$ ./chip-tool pairing ble-thread <node_id> hex:<operational_dataset> <pin_code> <discriminator>
as described here;
- Activate the sensor by running the following command on the On/off cluster with the correct node_ID assigned during commissioning, as described here:
./chip-tool onoff on node_ID 1
- Read the measurement several times by checking the value of Occupancy in the Occupancy Sensing cluster:
./chip-tool occupancysensing read occupancy node_ID 1
- Subscribing to the Occupancy attribute in the Occupancy Sensing cluster as described here;
echo occupancysensing subscribe occupancy 5 10 1 1 | ./chip-tool interactive start
- Deactivate the sensor by running the following command on the On/off cluster with the correct node_ID assigned during commissioning:
./chip-tool onoff off node_ID 1
Nordic Semiconductor nRF7002 DK
The nRF7002 DK is a development kit for the nRF7002, a Wi-Fi 6 companion IC that provides seamless connectivity and Wi-Fi-based locationing. It is designed to be used with Nordic’s existing Bluetooth and cellular IoT products, or with non-Nordic host devices. The nRF7002 DK enables low-power Wi-Fi applications and supports Wi-Fi 6 features like OFDMA, Beamforming, and Target Wake Time.
You can learn more about the nRF7002 DK on the product page at https://www.nordicsemi.com/Products/Development-hardware/nRF7002-DK.
Additional Components
- 8 Channel Bi-Directional Logic Level Converter Module
- AMS1117 5V to 3.3V Step-Down Regulator Module
The setup is the same as for the nRF5340 DK, except for the Thread Border Router, which is not required for testing the implementation with Matter over Wi-Fi.
Implementation - Part I - Connecting the sensor

Since the I/Os of the nRF7002 DK board operate at 1.8 V and the S2GO RADAR BGT60LTR11 sensor operates at 3.3 V, a level-shifting component is required to connect the radar sensor to the board. If the nRF7002 DK is operated with a 3.3 V power supply, the radar sensor can be connected directly to it (VIN); otherwise, if the board is operated with a 5 V power supply (e.g., via USB), an additional step-down voltage regulator might be required.
Following the instructions above up to the last step and incorporating the device overlay shown in Part I into the file boards/nrf7002dk_nrf5340_cpuapp.overlay accordingly will create the Matter application for the nRF7002 DK.
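Only the build target changes for the nRF7002 DK; a typical invocation could look like this (again assuming the nRF Connect toolchain environment is active):

west build -b nrf7002dk_nrf5340_cpuapp -p always
west flash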
Enhancement (also applicable for the implementation with the nRF5340 DK)
The time delay before the sensor changes to its unoccupied state after the last detection of movement in the sensed area can be configured via an attribute from one of the configuration attribute sets defined for the Occupancy Sensing cluster.
Since the Matter Application Cluster Specification defines no appropriate sensor type for an occupancy sensor using radar for motion detection, the abbreviation PIR shall be interpreted as Pretty Ingenious Radar in the scope of this project ;)
The enabled attribute(s) in the Occupancy Sensing cluster as configured with the ZAP tool:
The method AppTask::SensorMeasureHandler can then be implemented as follows:
void AppTask::SensorMeasureHandler(const AppEvent &)
{
	uint16_t aOccupiedToUnoccupiedDelay = 0;
	chip::BitMask<chip::app::Clusters::OccupancySensing::OccupancyBitmap> aOccupancy = 0x00;
	uint32_t occupiedToUnoccupiedDelayMs = 0;
	bool occupied = false;
	uint64_t lastMotion = bgt60.GetLastTdTime();
	uint64_t now = k_uptime_get();

	/* retrieve OccupiedToUnoccupiedDelay attribute from OccupancySensing endpoint ... */
	chip::app::Clusters::OccupancySensing::Attributes::PIROccupiedToUnoccupiedDelay::Get(kOccupancyEndpointId, &aOccupiedToUnoccupiedDelay);
	occupiedToUnoccupiedDelayMs = aOccupiedToUnoccupiedDelay * 1000; /* secs to millis */
	/* if occupiedToUnoccupiedDelayMs < timeout for the SensorTimer, adjust value to avoid that a motion detection passes by unnoticed ... */
	occupiedToUnoccupiedDelayMs = occupiedToUnoccupiedDelayMs < kSensorTimerTimeout ? kSensorTimerTimeout : occupiedToUnoccupiedDelayMs;

	/* retrieve Occupancy attribute from OccupancySensing endpoint ... */
	chip::app::Clusters::OccupancySensing::Attributes::Occupancy::Get(kOccupancyEndpointId, &aOccupancy);
	occupied = aOccupancy.HasAny(chip::app::Clusters::OccupancySensing::OccupancyBitmap::kOccupied);

	if (lastMotion + occupiedToUnoccupiedDelayMs >= now)
	{
		LOG_INF("Occupancy: OCCUPIED (reset in %lld ms)", (lastMotion + occupiedToUnoccupiedDelayMs) - now);
		/* occupancy transition from unoccupied to occupied: update Occupancy attribute in OccupancySensing endpoint */
		if (!occupied)
		{
			PlatformMgr().LockChipStack();
			chip::app::Clusters::OccupancySensing::Attributes::Occupancy::Set(kOccupancyEndpointId, chip::app::Clusters::OccupancySensing::OccupancyBitmap::kOccupied);
			PlatformMgr().UnlockChipStack();
		}
	}
	else
	{
		LOG_INF("Occupancy: UNOCCUPIED");
		/* occupancy transition from occupied to unoccupied: update Occupancy attribute in OccupancySensing endpoint */
		if (occupied)
		{
			PlatformMgr().LockChipStack();
			chip::app::Clusters::OccupancySensing::Attributes::Occupancy::Set(kOccupancyEndpointId, 0x00);
			PlatformMgr().UnlockChipStack();
		}
	}
}
The complete code is available from the repo linked in the Code section (v1.2).
Test

Test setup and tests are the same as for the implementation with the nRF5340 DK, except that there is no need to provide a Thread Border Router.
Basic tests performed:
- Commissioning into a Wi-Fi network over Bluetooth LE
$ ./chip-tool pairing ble-wifi <node_id> <ssid> <password> <pin_code> <discriminator>
as described here;
- Activate the sensor by running the following command on the On/off cluster with the correct node_ID assigned during commissioning, as described here:
./chip-tool onoff on node_ID 1
- Read the measurement several times by checking the value of Occupancy in the Occupancy Sensing cluster:
./chip-tool occupancysensing read occupancy node_ID 1
- Change the configuration attribute by writing the new value to the PIROccupiedToUnoccupiedDelay attribute in the Occupancy Sensing cluster:
./chip-tool occupancysensing write piroccupied-to-unoccupied-delay 10 node_ID 1
- Subscribing to the Occupancy attribute in the Occupancy Sensing cluster as described here;
echo occupancysensing subscribe occupancy 5 10 1 1 | ./chip-tool interactive start
- Deactivate the sensor by running the following command on the On/off cluster with the correct node_ID assigned during commissioning:
./chip-tool onoff off node_ID 1
Power Consumption
Power consumption has been measured with a simple "USB meter" inserted in the USB power supply line of the nRF7002 DK. After startup, before commissioning, the current draw is about 42 mA. During commissioning - especially when scanning for Wi-Fi networks - spikes of up to 100 mA were observed. During operation the measurement was about 76 mA (plus roughly 10 mA when the radar sensor detected a target and indicated this with the blue and red LEDs).
Works with Alexa?
Since I had an Amazon Echo 5 2nd Gen at hand, I tried to test some of the provided examples and the occupancy sensing application with Amazon's Alexa.
It was always possible to connect the application - built for both the nRF5340 DK and the nRF7002 DK - to Alexa. Even commissioning a device connected via Matter over Thread through the OTBR worked flawlessly, and, e.g., the light bulb example was recognized as a light and could be used as expected ("Alexa? Turn on the light!").
But when I tried to use a sensor application based on the "Adding clusters to Matter application" guide, the device could still be connected to Alexa and was shown as the correct sensor type, but the measurement was never reflected in Alexa (e.g., the measured temperature was always -1, occupancy was always 0, etc.) even though the correct attribute value could be read via the CHIP Tool.
From the Supported device categories and clusters section in the Alexa developer documentation it seems that only one sensor type is supported at the moment - the contact sensor:
So I tried to let the occupancy sensor application act (also) as a contact sensor by adding an additional endpoint for a Matter Contact Sensor (0x0015) device with settings for the StateValue attribute in the Boolean State cluster:
In the AppTask::SensorMeasureHandler method, just two lines of code have been added to set the StateValue attribute according to the occupancy state (occupied = false / unoccupied = true).
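A sketch of what this addition could look like, reusing the lock/unlock pattern shown above; kContactSensorEndpointId is a hypothetical constant for the added contact sensor endpoint:

PlatformMgr().LockChipStack();
/* StateValue follows the mapping described above: occupied = false / unoccupied = true */
chip::app::Clusters::BooleanState::Attributes::StateValue::Set(kContactSensorEndpointId, !occupied);
PlatformMgr().UnlockChipStack();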
The complete code is available from the repo linked in the Code section (experimental branch).
This modified application is recognized by Alexa as a combination device with a motion sensor and a contact sensor and both sensor readings are reflected correctly by Alexa.
The following clip demonstrates the integration of the occupancy sensor in Alexa Smart Home:
This clip is from another test with just one Matter Contact Sensor endpoint and reversed logic (occupied = true / unoccupied = false):
The complete code is available from the repo linked in the Code section (experimental branch).
So it works with Alexa... somehow...
The device has also been tested briefly (commissioning & basic test) with Home Assistant, open-source software for home automation with support for Matter integration (currently still in beta status).
Test setup:
- Home Assistant Operating System 11.4 on Raspberry Pi 4
- radar-occupancy-sensor Matter application (master branch) running on nRF7002 DK
- Home Assistant App for Android