For the Ci20 Creator challenge my plan was to transform the development board into a full-fledged robotic platform. This was accomplished by designing an add-on board with a number of sensors and input/output interfaces. Coupled with the on-board connectivity options of the development platform, the Creator becomes the ideal platform for robotic applications.
The resulting shield (named Skynet Nucleon) is designed to mate with the 26-pin dual-row male header on the board. The shield augments the platform by giving it situational awareness of its environment.
The idea behind the project is to scan the environment and dispatch sentinels such as drones or robots to accomplish various tasks. This is the essence behind Skynet. It is a distributed computing platform that has the capability to gather input from multiple information feeds. By applying intelligent algorithms to the data it can direct its robotic minions to do its bidding.
In this project I outline the components, methods and programs needed to build the add-on module that powers the robotic platoon's hive mind. The final goal is to build a Skynet platoon, which will consist of a drone and a hive mind. The hive mind, represented by one or more Creator Ci20 boards, will gather sensor data from its environment and dispatch a drone or a robot once an event is detected.
The Nucleon shield is the critical module that gives the Ci20 the ability to sense its environment. Since most drones can be controlled via Bluetooth, commanding them from the Ci20 should be straightforward using hcitool.
For the project I mostly focused on designing the hardware and writing the Python drivers for the sensors.
Hardware design
The hardware for the Nucleon shield consists of a number of sensors and I/O interfaces that turn the Ci20 into a standalone platform for deploying and testing robotic ideas. The shield is equipped with the following modules:
Sound
An electret microphone coupled with an MCP3221A5 analog-to-digital converter. The microphone signal is conditioned by a single-supply op-amp filter and the output is fed to the I2C-controlled ADC. The low sampling rate of the ADC limits the acquired sounds to whistles or other noises of limited bandwidth.
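Reading the ADC boils down to pulling two raw bytes over I2C and combining them into a 12-bit value. The sketch below is only an illustration of that: it assumes the smbus2 package and the MCP3221A5 default address of 0x4D, so the bus number and address may need adjusting for the actual wiring.

```python
# Minimal sketch: read one 12-bit sample from the MCP3221 ADC.
# Assumes the smbus2 package and the A5 variant's default address 0x4D;
# the bus number is an assumption, adjust to match the shield's wiring.
from smbus2 import SMBus, i2c_msg

MCP3221_ADDR = 0x4D  # assumed MCP3221A5 default address

def read_sample(bus_num=1):
    with SMBus(bus_num) as bus:
        msg = i2c_msg.read(MCP3221_ADDR, 2)   # the chip simply streams data bytes
        bus.i2c_rdwr(msg)
        high, low = list(msg)
        return ((high & 0x0F) << 8) | low     # 12-bit conversion result

if __name__ == "__main__":
    print("ADC reading:", read_sample())
```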
Ambient Light
An APDS9300 ambient light sensor controlled via I2C bus 1 is used to quantify the amount of surrounding light in Lux. Based on the acquired data the Ci20 can activate any user programmable task.
Temperature
An MPL3115A2 temperature, barometer and altimeter MEMS sensor, controlled via the I2C bus, gives the robotic platform environmental awareness. The altimeter/barometer data, coupled with the GPS and IMU unit, give the Creator accurate navigational capabilities.
Display
An ST7735 1.8-inch TFT display is used to communicate stats to any engineer willing to repair the robot. It lets the robot show its operational status, and it can also be used as a dashboard or menu display.
Compass and Accelerometer
An FXOS8700 accelerometer and magnetometer MEMS sensor is used to give the robot navigational capabilities.
Real Time Clock and Calendar
The Ci20 already has an RTCC chip, but the board does not expose an accessible battery connector. I therefore added an MCP79400 RTCC, which also integrates 1 Kbit of EEPROM and a small SRAM. The RTCC keeps track of time when power goes off or when the Creator Ci20 cannot synchronize time via NTP.
Non-volatile memory
A robot needs to be able to store sensitive mission data in a special non-volatile memory. A 4 MB NOR flash memory S25L from Spansion (now Cypress Semiconductor) was added to accomplish this.
RGB LED
An RGB LED was added to allow the robot to express its mood. Currently this is unpopulated.
Serial Console
If you want to debug the robot, chances are you'll need access to its serial login console. I added an FT232 serial-to-USB chip, which allows the Creator Ci20 to be operated in a headless manner. Right now I can log in from a serial console whenever I want to debug sensor data.
Touchpads
The capacitive touch button controller and the accompanying three touch pads are used as a human/humanoid machine interface (HMI). Their main purpose is to control small menus.
Separate modules
In addition, a GPS module is attached to the 4 pin Molex connector.
PCB design
The Nucleon shield was designed as a 2-layer board. The KiCad schematic and PCB files can be found in the GitHub repository together with the Python code and BOM.
Currently only the RGB LED is not populated: on the first revision of the PCB the RGB anode pin was connected to ground. This has been corrected in the schematic.
Firmware
The following paragraphs describe the firmware for each of the sensor, I/O and RTCC modules.
TFT display
The TFT display is interfaced via a 3-wire SPI interface. The chip select is attached to CS0 on physical pin 24. On Debian 8 the current SPI kernel module only supports one chip select. After cloning the Python spidev package from its GitHub repository I had to modify the source to target channel spidev32760.0.
The TFT display Python module depends on the spidev package and my own Ci20GPIO module. Since the Ci20GPIO module loads a shared library via ctypes, a precompiled .so file has to be included.
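For reference, a minimal spidev transaction against the display looks roughly like the sketch below. It assumes the patched spidev package exposes the controller as bus 32760, device 0 (i.e. /dev/spidev32760.0), and that the data/command line is driven separately, for example through the Ci20GPIO module.

```python
# Minimal sketch: open the Ci20 SPI controller and push one command byte
# to the ST7735. Assumes the patched spidev exposes /dev/spidev32760.0;
# the clock speed is an arbitrary, assumed value.
import spidev

spi = spidev.SpiDev()
spi.open(32760, 0)          # bus 32760, chip select 0 -> /dev/spidev32760.0
spi.max_speed_hz = 4000000
spi.mode = 0

SWRESET = 0x01              # ST7735 software reset command

# The D/C pin would be driven low for a command before this transfer.
spi.xfer2([SWRESET])

spi.close()
```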
Non-volatile NOR flash memory
There are two options for interfacing the NOR flash module. The first was to modify the existing SPI kernel module to enable the second chip select; the second was to use a software SPI routine. Due to lack of time I implemented the SPI communication in software, which has the disadvantage of being really slow.
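A bit-banged transfer is essentially four GPIO operations per bit. The sketch below only illustrates the idea by reading the flash JEDEC ID (command 0x9F); the pin numbers and the GPIO helper calls are placeholders, not the actual driver from the repository.

```python
# Illustrative software-SPI routine: clock out a command and read back bytes.
# The pin numbers and the gpio helper API are placeholders; the real driver
# in the repository is built on the Ci20GPIO module.
import Ci20GPIO as gpio   # assumption: exposes simple output()/input() calls

CS, SCK, MOSI, MISO = 10, 11, 12, 13   # placeholder pin numbers

def xfer_byte(tx):
    rx = 0
    for i in range(8):
        gpio.output(MOSI, (tx >> (7 - i)) & 1)   # present MSB first
        gpio.output(SCK, 1)
        rx = (rx << 1) | gpio.input(MISO)        # sample on the rising edge
        gpio.output(SCK, 0)
    return rx

def read_jedec_id():
    gpio.output(CS, 0)
    xfer_byte(0x9F)                               # JEDEC "Read ID" command
    ident = [xfer_byte(0x00) for _ in range(3)]   # manufacturer + device ID
    gpio.output(CS, 1)
    return ident
```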
Real Time Clock and Calendar
The RTCC module allows the shield to keep track of time in the event of a power outage. The RTCC is also equipped with a small 1 Kbit embedded EEPROM. The provided Python firmware also lets the user leverage the embedded EEPROM.
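As an outline of what the driver does internally, the time registers are BCD-encoded and can be read in one block. The sketch below assumes the usual MCP7940x RTCC slave address of 0x6F on I2C bus 1; treat it as a rough illustration rather than the repository driver itself.

```python
# Minimal sketch: read the current time from the MCP79400 over I2C.
# Assumes the standard MCP7940x RTCC slave address 0x6F on I2C bus 1.
import smbus

RTCC_ADDR = 0x6F

def bcd_to_dec(value):
    return (value >> 4) * 10 + (value & 0x0F)

def read_time(bus_num=1):
    bus = smbus.SMBus(bus_num)
    regs = bus.read_i2c_block_data(RTCC_ADDR, 0x00, 3)   # seconds, minutes, hours
    seconds = bcd_to_dec(regs[0] & 0x7F)   # mask the oscillator-start bit
    minutes = bcd_to_dec(regs[1] & 0x7F)
    hours   = bcd_to_dec(regs[2] & 0x3F)   # assumes 24-hour mode
    return hours, minutes, seconds

if __name__ == "__main__":
    print("%02d:%02d:%02d" % read_time())
```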
Ambient light sensor
The ambient light sensor uses I2C bus 1. The provided firmware, together with the examples, shows how to use the sensor to detect daylight or low-light conditions. The sensor has an embedded amplifier with a selectable gain, which allows it to sense very low ambient light levels.
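Detecting daylight versus darkness comes down to powering the sensor and comparing the raw channel-0 count against a threshold. The sketch below assumes the APDS9300 sits at the default (floating ADDR_SEL) address 0x39 on I2C bus 1, and leaves out the full lux conversion, which combines both channels as described in the datasheet.

```python
# Minimal sketch: power up the APDS9300 and read the raw channel-0 count.
# Address 0x39 and the threshold below are assumptions for illustration.
import smbus
import time

APDS_ADDR = 0x39
CMD       = 0x80   # command-register select bit
WORD      = 0x20   # 16-bit (word) read protocol

bus = smbus.SMBus(1)
bus.write_byte_data(APDS_ADDR, CMD | 0x00, 0x03)   # CONTROL register: power on
time.sleep(0.5)                                    # allow an integration cycle

ch0 = bus.read_word_data(APDS_ADDR, CMD | WORD | 0x0C)  # visible + IR channel

print("dark" if ch0 < 100 else "daylight", "(raw ch0 = %d)" % ch0)
```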
Accelerometer/magnetometer
This module also interfaces via I2C bus 1. The sensor is very sensitive, so any metallic parts in the vicinity will corrupt the readings; to get accurate data in that case the user has to calibrate out the hard- and soft-iron field effects. The sensor is also equipped with automatic position detection capabilities, as shown in the embedded video below.
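Hard-iron calibration amounts to rotating the board through all orientations, recording the minimum and maximum of each magnetometer axis, and subtracting the mid-point from subsequent readings. The sketch below shows only that arithmetic; read_mag() is a stand-in for the actual FXOS8700 register reads in the repository driver.

```python
# Illustrative hard-iron calibration: collect samples while the board is
# rotated, then use the per-axis mid-points as offsets. read_mag() is a
# placeholder for the real FXOS8700 driver call returning raw x, y, z.
import time

def read_mag():
    raise NotImplementedError("replace with the FXOS8700 driver read")

def calibrate(seconds=20):
    mins = [float("inf")] * 3
    maxs = [float("-inf")] * 3
    end = time.time() + seconds
    while time.time() < end:                 # rotate the board during this loop
        sample = read_mag()
        mins = [min(m, s) for m, s in zip(mins, sample)]
        maxs = [max(m, s) for m, s in zip(maxs, sample)]
        time.sleep(0.05)
    return [(lo + hi) / 2.0 for lo, hi in zip(mins, maxs)]   # hard-iron offsets

def corrected(sample, offsets):
    return [s - o for s, o in zip(sample, offsets)]
```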
Temperature/Altimeter/Barometer
The MEMS sensor communicates with the Ci20 via I2C bus 1. The sensor is very accurate. On top of the temperature measurement it can also report barometric pressure and altitude above sea level. An example of the altitude functionality is shown in the image below.
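Getting an altitude reading is a matter of putting the part in altimeter mode and scaling the 20-bit output. The sketch below assumes the standard MPL3115A2 address of 0x60 on I2C bus 1 and follows the register layout from the datasheet; it is a rough outline, not the repository driver.

```python
# Minimal sketch: configure the MPL3115A2 as an altimeter and read
# altitude (metres) and temperature (deg C). Assumes address 0x60 on I2C bus 1.
import smbus
import time

MPL_ADDR = 0x60

bus = smbus.SMBus(1)
bus.write_byte_data(MPL_ADDR, 0x26, 0xB8)   # CTRL_REG1: altimeter mode, OSR = 128
bus.write_byte_data(MPL_ADDR, 0x13, 0x07)   # PT_DATA_CFG: enable data-ready flags
bus.write_byte_data(MPL_ADDR, 0x26, 0xB9)   # CTRL_REG1: set active
time.sleep(1)

data = bus.read_i2c_block_data(MPL_ADDR, 0x00, 6)   # status + pressure/alt + temp

# 20-bit altitude in Q16.4 format spread over OUT_P_MSB/CSB/LSB
altitude = ((data[1] << 16) | (data[2] << 8) | (data[3] & 0xF0)) / 256.0
# 12-bit temperature in Q8.4 format
temperature = ((data[4] << 8) | (data[5] & 0xF0)) / 256.0

print("Altitude: %.1f m  Temperature: %.1f C" % (altitude, temperature))
```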
Touchpads
I used a capacitive touch IC from Microchip. This chip communicates via the I2C bus and uses three copper pads as touch pads, which sense the capacitance created by the hand. In the embedded video below I coded a simple menu for demonstration purposes.
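Menu navigation is essentially a polling loop over the controller's button-status register. Since the exact part number and register map are not given here, the address and register in the sketch below are placeholders; the real driver in the repository maps each of the three pads to a menu action.

```python
# Illustrative polling loop for a three-pad menu. The I2C address and the
# status register are placeholders; substitute the values for the actual
# Microchip touch controller used on the shield.
import smbus
import time

TOUCH_ADDR = 0x28    # placeholder address
STATUS_REG = 0x03    # placeholder button-status register

ACTIONS = {0: "previous", 1: "select", 2: "next"}

bus = smbus.SMBus(1)
while True:
    state = bus.read_byte_data(TOUCH_ADDR, STATUS_REG)
    for pad, action in ACTIONS.items():
        if state & (1 << pad):
            print("pad %d pressed -> %s" % (pad, action))
    time.sleep(0.1)
```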
Serial interface
The serial-to-USB driver uses the pySerial library. The UART pins are reserved by the getty program, so the user has to apply some fixes first. The serial USB port can then be used to log in to the Ci20 via the serial console, as shown in the image below.
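From the host side, reading the console through the FT232 is only a few lines of pySerial. The sketch below assumes the adapter enumerates as /dev/ttyUSB0 with a 115200-baud console, which may differ on your machine.

```python
# Minimal sketch: read the Ci20 serial console from a host PC through the
# FT232. /dev/ttyUSB0 and 115200 baud are assumptions; adjust as needed.
import serial

console = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
try:
    while True:
        line = console.readline()
        if line:
            print(line.decode(errors="replace"), end="")
finally:
    console.close()
```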
GPS module
The GPS is connected to serial port /dev/ttyS4. The Python code in the repository uses an improved library to parse the GPS data. I previously wrote a thorough tutorial on this module; refer to the project below for more details.
https://www.hackster.io/dhq/gps-on-the-ci20-80e295?ref=user&ref_id=44707&offset=0
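For completeness, a typical read-and-parse loop looks like the sketch below. It assumes a 9600-baud module on /dev/ttyS4 and the pynmea2 package for NMEA parsing; the library and settings used in the linked tutorial may differ.

```python
# Minimal sketch: read NMEA sentences from the GPS on /dev/ttyS4 and print
# position fixes. The 9600-baud rate and pynmea2 library are assumptions.
import serial
import pynmea2

gps = serial.Serial("/dev/ttyS4", 9600, timeout=1)
while True:
    sentence = gps.readline().decode("ascii", errors="replace").strip()
    if sentence.startswith("$GPGGA"):            # GGA carries the position fix
        try:
            msg = pynmea2.parse(sentence)
        except pynmea2.ParseError:
            continue
        print("lat %.5f  lon %.5f  alt %s m" % (msg.latitude, msg.longitude, msg.altitude))
```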
Skynet Platoon node
The last step was writing the main program for the Skynet node. The program takes a reading from all the sensors; if a reading is above a certain threshold, a command is written to the terminal. The command can be piped to any Wi-Fi, Bluetooth or custom low-power radio link to enable the robot.
The idea was to dispatch a drone or a robot based on any environmental changes sensed by the main Skynet program.
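The main loop therefore reduces to read, compare and emit. The sketch below mirrors that structure with placeholder read functions and threshold values, and simply prints the dispatch command so it can be piped to whatever radio link drives the robot.

```python
# Simplified outline of the Skynet node loop: sample each sensor, compare
# against a threshold, and print a dispatch command. The read_sensors()
# body and the threshold values are placeholders for the real drivers.
import time

THRESHOLDS = {
    "sound":    2000,   # raw ADC counts
    "light":    100,    # raw channel-0 counts
    "altitude": 5.0,    # metres of change
}

def read_sensors():
    # placeholders for the actual driver calls
    return {"sound": 0, "light": 0, "altitude": 0.0}

while True:
    readings = read_sensors()
    for name, value in readings.items():
        if value > THRESHOLDS[name]:
            # stdout can be piped to a Wi-Fi, Bluetooth or radio link
            print("DISPATCH drone reason=%s value=%s" % (name, value))
    time.sleep(1)
```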
Future Expansion
The multiple data points generated by the different sensors can be fed to a trained neural network, which would allow the program to autonomously move beyond simple if-then scenarios. Alternatively, one can use IoT frameworks like Flow, IFTTT, etc. to coordinate robot behaviour. From a hardware perspective, my future plans are to include more sensors.
Conclusion
The goal of this contest project was to convert the board into a complete robotic platform. This was accomplished by designing the Skynet Nucleon shield. The hardware, together with the firmware modules, makes it possible to implement all kinds of robotic/IoT projects.