aziz, hineson, Joshua fleet, Yuri Kozhevnikov
Published

Project AURA: Architectural User-responsive Reactive Assembly

A gesture-controlled model of UoM’s Engineering Building using Pi 5 + ML — wave your hand, and the architecture moves.

Intermediate · Work in progress · 24 hours · 534

Things used in this project

Hardware components

Raspberry Pi 5
The heart of the whole operation
×1
Raspberry Pi Pico
We used a Pi Pico to control the motors, as it was a method our group was familiar with; had we not been time-limited, we could have used the Pi 5 alone.
×1
DC motor (generic)
This motor rotates the entire model.
×1
Servo Module (Generic)
Two servos open the model up to reveal an interior view. We used servos here because they offer more precise control.
×2
Rechargeable Battery, 4.8 V
Any power supply is suitable, provided it has an accompanying voltage-regulation circuit down to ~5 V.
×1
1N4007 – High Voltage, High Current Rated Diode
Protection diode for the Pico's power supply.
×1
Logic Level FET P-Channel
P-channel MOSFET for the DC motor switching circuit.
×1
Resistor 1k ohm
Any value above 220 Ω is suitable for the pull-down resistors, of which three are required for this project.
×1
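The "above 220 Ω" guidance can be sanity-checked with a quick Ohm's-law calculation. This is an illustrative sketch, not part of the project code; it assumes 3.3 V GPIO logic on the Pi 5 and Pico:

```python
# Rough sanity check for the pull-down resistor choice (illustrative values).
V_GPIO = 3.3   # volts, logic-high level on the signal lines
R_MIN = 220    # ohms, the quoted lower bound
R_USED = 1000  # ohms, the 1 kOhm resistors used in this build

def sink_current_ma(v_volts, r_ohms):
    """Current (mA) through a pull-down while the line is driven high."""
    return v_volts / r_ohms * 1000

# At 1 kOhm each line sinks ~3.3 mA when driven high; at the 220 Ohm floor
# it is ~15 mA, which is why staying above 220 Ohm keeps the GPIO load low.
print(f"1 kOhm:  {sink_current_ma(V_GPIO, R_USED):.1f} mA")
print(f"220 Ohm: {sink_current_ma(V_GPIO, R_MIN):.1f} mA")
```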

Software apps and online services

Raspberry Pi Raspbian
gpiozero

Story


Custom parts and enclosures

1:200 architectural model of UoM's Engineering Building

An accurate and artistic representation of UoM's very own Engineering Building, opened in half (not to scale).

Schematics

Pi5-Pico interface

With servo motors, DC motor, MOSFET switch, and protection diode.

Code

Gesture-Controlled Architectural Model Using PoseNet and Raspberry Pi 5

Python
Uses PoseNet for real-time human pose detection to control the movement of the model via gestures, with the outputs triggering GPIO pins on a Raspberry Pi Pico.
from modlib.apps import Annotator
from modlib.devices import AiCamera
from modlib.models.zoo import Posenet
import numpy as np
from math import sqrt
from gpiozero import Servo
from gpiozero import LED
import time

# GPIO pin setup for digital communication with the Raspberry Pi Pico
servoRight = LED(2)
servoLeft = LED(3) 
motorswitch = LED(4) 

# set up and deploy the pre-trained AI model
device = AiCamera()
model = Posenet()
device.deploy(model)

# set up annotation utility to draw pose landmarks
annotator = Annotator()

#start reading video frames
with device as stream:
    for frame in stream:
        # keep only confident detections; the attribute is always
        # frame.detections -- what it contains depends on the deployed model
        detections = frame.detections[frame.detections.confidence > 0.5]

        # guard against frames with no confident person in view
        if len(detections) == 0:
            frame.display()
            continue

        # keypoints[0] holds 34 values (17 keypoints x [y, x]) for the
        # first person detected
        person0 = detections.keypoints[0]
        
        # extract normalised coordinates of hips and wrists for gesture detection
        leftHipx = person0[23]
        leftHipy = person0[22]

        rightWristy = person0[18]
        rightWristx = person0[19]

        leftWristy = person0[16]
        leftWristx = person0[17]

        #initialise gesture flags
        t_pose = False
        leftturn = False
        rightturn = False
        
        #calculate horizontal distance between wrists 
        leftDiffsx = abs(rightWristx - leftWristx)

        # if the wrists are far enough apart, assume a T-pose
        if leftDiffsx > 0.45:
            print("T-pose detected, model opening...")
            t_pose = True
        
        #calculate horizontal distance between left wrist and hip
        leftArmDist = abs(leftWristx - leftHipx)
    
        # if the left arm is stretched out (and no T-pose), flag a left turn
        if leftArmDist > 0.16:
            if t_pose == False:
                print("model rotating left...")
                leftturn = True
                
        # if the right arm is stretched out, and neither a T-pose nor a
        # left turn is active, flag a right turn
        rightArmDist = abs(rightWristx - leftHipx)
        if rightArmDist > 0.16:
            if t_pose == False:
                if leftturn == False:
                    print("model rotating right...")
                    rightturn = True

        # based on the gestures, trigger the respective outputs
        if t_pose:
            motorswitch.on()
        else:
            motorswitch.off()

        if leftturn:
            servoLeft.on()
        else:
            servoLeft.off()

        if rightturn:
            servoRight.on()
        else:
            servoRight.off()
            
            
        # Annotate and display the frame
        annotator.annotate_poses(frame, detections)
        frame.display()
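The gesture thresholds in the loop above are easy to get wrong, so it can help to factor the classification into a pure function that is testable without a camera. This is a sketch of ours, not part of the project code; it mirrors the same thresholds and the same priority order (T-pose, then left turn, then right turn), using normalised x-coordinates:

```python
def classify_gesture(left_wrist_x, right_wrist_x, left_hip_x):
    """Classify a gesture from normalised x-coordinates, mirroring the
    priority used in the main loop: T-pose wins, then one turn at a time."""
    if abs(right_wrist_x - left_wrist_x) > 0.45:
        return "t_pose"
    if abs(left_wrist_x - left_hip_x) > 0.16:
        return "left_turn"
    if abs(right_wrist_x - left_hip_x) > 0.16:
        return "right_turn"
    return "idle"

# Wrists far apart -> T-pose wins even if an arm is also extended
assert classify_gesture(0.2, 0.8, 0.3) == "t_pose"
# Left wrist well away from the left hip, wrists close together
assert classify_gesture(0.6, 0.45, 0.3) == "left_turn"
# Everything near the hip -> no gesture
assert classify_gesture(0.32, 0.33, 0.3) == "idle"
```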

Raspberry Pi Pico servo control

C/C++
Digital communication between the Pi 5 and the Pi Pico, which controls the opening of the model via servos.
#include <Servo.h>

Servo rightServo;
Servo leftServo;
const int ledPin = 25;    // Onboard LED for Pico
const int rightPin = 2;   // GP2 (physical pin 4)
const int leftPin = 3;    // GP3 (physical pin 5)
const int inp1 = 0;       // Right servo control (active HIGH)
const int inp2 = 1;       // Left servo control (active HIGH)

// Servo states (FSM)
bool rightServoState = false;  // false = 0°, true = 90°
bool leftServoState = false;   // false = 0°, true = 90°

// Previous input states for edge detection
bool prevInp1 = LOW;
bool prevInp2 = LOW;

void setup() {
  pinMode(ledPin, OUTPUT);
  rightServo.attach(rightPin);
  leftServo.attach(leftPin);
  pinMode(inp1, INPUT);  // external pull-down resistor, input from Pi 5
  pinMode(inp2, INPUT);  // external pull-down resistor, input from Pi 5
  
  // Initialize
  rightServo.write(0);
  leftServo.write(0);
  digitalWrite(ledPin, LOW);
}

void loop() {
  // Read current inputs
  bool currentInp1 = digitalRead(inp1);
  bool currentInp2 = digitalRead(inp2);

  // Right servo control (trigger on rising edge)
  if (currentInp1 == HIGH && prevInp1 == LOW) {
    rightServoState = !rightServoState; // Toggle state
    
    if (rightServoState) {  // If true, set to 90°
      rightServo.write(90);
      digitalWrite(ledPin, HIGH);
    } else {  // If false, set to 0°
      rightServo.write(0);
      digitalWrite(ledPin, LOW);
    }
  }
  prevInp1 = currentInp1;

  // Left servo control (trigger on rising edge)
  if (currentInp2 == HIGH && prevInp2 == LOW) {
    leftServoState = !leftServoState; // Toggle state
    
    if (leftServoState) {  // If true, set to 90°
      leftServo.write(90);
    } else {  // If false, set to 0°
      leftServo.write(0);
    }
  }
  prevInp2 = currentInp2;

  delay(10); // Minimal delay to prevent busy-waiting
}
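The rising-edge toggle logic above can be modelled and tested off-device. The following is a minimal Python sketch of the same per-servo state machine; the class and names are ours, and the real behaviour lives in the Arduino sketch:

```python
class EdgeToggle:
    """Models the Pico's per-servo FSM: toggle the commanded angle on each
    rising edge of the input line, ignoring level and falling edges."""
    def __init__(self):
        self.prev = False   # previous sampled input (prevInp1 / prevInp2)
        self.open = False   # False -> 0 degrees, True -> 90 degrees

    def sample(self, level):
        """Feed one sampled input level; return the commanded servo angle."""
        if level and not self.prev:   # rising edge detected
            self.open = not self.open
        self.prev = level
        return 90 if self.open else 0

fsm = EdgeToggle()
# A held-high input toggles once on the edge, not continuously
assert [fsm.sample(s) for s in [0, 1, 1, 1, 0]] == [0, 90, 90, 90, 90]
# The next rising edge closes the model again
assert fsm.sample(1) == 0
```

This is why the Pi 5 side can simply hold a GPIO line high for the duration of a gesture: the Pico only reacts to the low-to-high transition.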

Credits

aziz
1 project • 5 followers
hineson
1 project • 4 followers
Joshua fleet
1 project • 4 followers
Yuri Kozhevnikov
1 project • 4 followers
