I am a Lecturer at a 4-year university and teach Electrical and Computer Engineering to undergraduates. My senior students have a two-semester senior project, or capstone. The story of this project started with the 2021 EET senior cohort, in which one student, already working part-time for Lockheed Martin, had heard that one of the satellite companies housed on the top floor of our engineering building had a project they wanted to give to a senior class. The project was an educational nano-satellite: a small, fully functional stack of printed circuit boards, sensors, and mechanical components in a small box, except it wasn't made to fly in space. Students would take it apart, perform "acceptance tests" of each subsystem, and then put it back together, thus learning what it takes to build a satellite. Our students spent a few weeks bringing themselves up to speed on this project, but it eventually fell apart because the company got cold feet and didn't give us the full system we were to upgrade. We found ourselves in the middle of a semester with the project dead in the water but all that preparation already done. So, long story short, we restarted the project, deciding to build our own satellite more or less from scratch. The original name was MSUSAT, after our school, but we later rechristened it ColoradoCUBE. The first flight for a CubeSat usually tends to be on a weather balloon, as the latter is orders of magnitude less expensive and more forgiving than a launch to Low-Earth Orbit (LEO). Videos of small cubes rising to 100,000 ft (30 km) in the atmosphere and, after the balloon bursts due to the pressure differential, descending on a parachute back to the ground are very fun to watch, so we wanted to have a camera. One thing led to another, and when the Adaptive Computing Challenge (AAC) 2021 launched in the Fall, we decided we wanted multiple cameras. We had seen a video from a 6-camera balloonsat which captured the shadow of the Moon during the great solar eclipse of 2017, and that got us excited. So, we applied for the Xilinx Kria KV260 with our satellite project.
The timeline of such a project, with busy seniors already looking for jobs and a Lecturer teaching 7 classes, inevitably stretched out well beyond our initial expectations, so many of the sections below describe work that is still in development or planned. That said, two of our students have already been hired, one by United Launch Alliance and the other by the newly spun-off Sierra Space Corporation. Additionally, the project has been adopted by our Engineering department as a flagship project and will carry over to next year, with several students already recruited.
The vision of ColoradoCUBE is to launch to LEO a miniature proof-of-concept for modular on-orbit close-quarters operations relying heavily on computer vision. We are currently waiting on the fabrication of the first iteration of our satellite-bus-stack PCBs, which we are flying on a high-altitude weather balloon on Apr 23. In May, we are going to use the next iteration of our hardware stack to run NASA's open-source Core Flight System space software stack (aka Android for Space) and to host an in-flight penetration-testing hackathon. This Summer and Fall, we will be applying to NASA's CubeSat Launch Initiative.
What follows is a progression of project steps, from simple to complex, most of them still in development. Some are based strictly on geometric and photo-computational techniques, while others are canonical machine learning with training and inference.
Hello, World!
Our project requires that the Kria KV260 be mobile, so the first step is to provide mobile solutions for power and communications. The simplest platform on which to test these solutions is a rotating platform of the kind usually used for displaying jewelry from all sides. A NiMH battery pack is connected to a DC-to-DC voltage regulator to provide 12 V and sufficient current to the KV260 carrier. An ESP8266 WiFi module is connected through an FTDI UART-to-USB chip to the Kria to provide remote (mobile) viewing of what the camera sees. A Raspberry Pi Camera V2 is mounted on a small tripod and rotates with the KV260. This step uses only the provided Smart Camera accelerated application, which is fast enough to detect faces with a rotating camera.
This first step is motivated by the random rotational (tumbling) motion of a CubeSat after deployment.
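The detection itself runs in the Kria's Smart Camera application; for bench-testing the mobile video link independently of it, a plain-OpenCV sketch like the one below can stand in on a host machine. Everything here, from the stream URL to the Haar-cascade detector, is an assumption for illustration, not part of the Smart Camera app.

```python
# Host-side bench-test sketch (assumptions: OpenCV installed, the KV260
# re-streams its camera over RTSP at the hypothetical address below).
import cv2

STREAM_URL = "rtsp://192.168.4.1:554/stream"  # hypothetical KV260 stream address

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # A jewelry stand turns slowly enough that frame-to-frame blur is
    # negligible, so a stock face detector still fires on the rotating view.
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("rotating-platform view", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```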
Status: Completed
Optical Gyro
An optical or optical-flow gyro(scope) is an application which can calculate the rotational motion of a camera about its axis from the camera's video stream. This is an important step toward apps for stabilization of the camera view, 360-degree scene stitching, and "de-spinning". This step requires no additional hardware relative to the previous step.
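A minimal sketch of the idea, assuming a pure yaw rotation, a distant scene, and a placeholder focal length: track corners between consecutive frames with pyramidal Lucas-Kanade flow, then convert the median horizontal displacement into an angular rate.

```python
# Optical-flow "gyro" sketch: estimate yaw rate from the median horizontal
# feature displacement between consecutive frames. Assumptions: rotation
# about the vertical axis dominates, the scene is distant, and FOCAL_PX and
# FPS are placeholder values to be calibrated for the real camera.
import cv2
import numpy as np

FOCAL_PX = 1000.0   # assumed focal length in pixels
FPS = 30.0          # assumed frame rate

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pick trackable corners in the previous frame...
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    if pts is not None:
        # ...and follow them into the current frame with pyramidal LK flow.
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = status.ravel() == 1
        dx = nxt[good, 0, 0] - pts[good, 0, 0]
        if dx.size:
            # Median horizontal shift (pixels/frame) -> yaw rate (rad/s);
            # for small angles, dx ~= FOCAL_PX * dtheta.
            yaw_rate = np.median(dx) / FOCAL_PX * FPS
            print(f"yaw rate ~ {np.degrees(yaw_rate):+.2f} deg/s")
    prev_gray = gray
```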
Status: Under development
Scene de-spinning
The view from the camera on a rotating platform is "de-spun" so that a consistent direction can be chosen and extracted from the 360-degree stitched scene (for example, "look East" would display only the scene arc centered at E and clipped to the display width). Additional fixed-view object recognition and segmentation can easily be added to the pipeline.
Human visual perception has multiple stages of view stabilization, both early and late in the visual pathway, which make human vision robust to all normal human motion, not just rotation.
The optical gyro can provide a "de-spinning" signal, but this can only stabilize the view of a camera rotating in a plane. If we want to stabilize (aka "freeze") the view of a camera that moves with more degrees of freedom, we need to add a 9-DOF IMU module and perform sensor fusion with the camera. Adding a GPS module would also fix the global location of the origin of the view and open the possibility of automatic inspection of outdoor structures based on situational awareness rather than photogrammetry.
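To make the "look East" example concrete, the sketch below cuts a fixed-width arc, centered on a requested compass heading, out of a 360-degree stitched panorama. The panorama layout (North at the left edge, heading increasing to the right) is an assumed convention.

```python
# "Look East" sketch: extract a fixed-width window, centered on a chosen
# compass heading, from a 360-degree stitched panorama. Assumes the view
# window is narrower than the panorama itself.
import numpy as np

def despun_view(panorama: np.ndarray, heading_deg: float,
                view_width_px: int) -> np.ndarray:
    h, w = panorama.shape[:2]
    center = int(round(heading_deg / 360.0 * w)) % w
    # Roll the panorama so the requested heading sits at the window center;
    # np.roll handles the wraparound at the 0/360 seam for free.
    shifted = np.roll(panorama, w // 2 - center, axis=1)
    left = w // 2 - view_width_px // 2
    return shifted[:, left:left + view_width_px]

# e.g. east_view = despun_view(stitched, 90.0, 640)   # "look East"
```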
Status: Planned
Celestial orientation
Using the open-source real-time Stellarium night-sky application, an extension of the scene de-spinning project is a night-time application that achieves orientational and directional awareness, and a frozen view, based solely on the visible stars. In this application, the absolute attitude signal can serve as an error signal for a vision-only directional fix and view freeze.
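A full star-tracker solution would match detected stars against a catalog (which is where Stellarium's data comes in). As a first building block, the hedged sketch below measures only the relative roll between two night-sky frames using log-polar phase correlation, in which a rotation about the optical axis becomes a vertical shift.

```python
# Star-field roll-estimate sketch. Assumptions: stars are the only bright
# features, and rotation about the optical axis dominates the motion.
import cv2
import numpy as np

def roll_between(prev_gray: np.ndarray, curr_gray: np.ndarray) -> float:
    h, w = prev_gray.shape
    center = (w / 2, h / 2)
    radius = min(w, h) / 2
    flags = cv2.INTER_LINEAR + cv2.WARP_POLAR_LOG
    # Map both frames to log-polar coordinates around the image center;
    # rows of the result correspond to angle, so a rotation of the original
    # image becomes a vertical shift here.
    lp_prev = cv2.warpPolar(prev_gray, (w, h), center, radius, flags).astype(np.float32)
    lp_curr = cv2.warpPolar(curr_gray, (w, h), center, radius, flags).astype(np.float32)
    (dx, dy), _ = cv2.phaseCorrelate(lp_prev, lp_curr)
    return dy / h * 360.0   # degrees of roll between the two frames
```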
Status: Under development
Multiple cameras
Scene de-spinning and stabilization done with more than one camera provides a "denser" stream for stitching and thus more high-resolution fixed views.
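As an illustration, here is a minimal sketch (assuming three host-visible cameras with overlapping fields of view and roughly synchronized captures) that stitches one frame from each into a single wide scene using OpenCV's high-level stitcher; the real pipeline would stitch continuously.

```python
# Multi-camera stitching sketch: combine simultaneous frames from several
# cameras into one wide scene. Assumes overlapping fields of view.
import cv2

caps = [cv2.VideoCapture(i) for i in range(3)]  # assumed three cameras
frames = []
for cap in caps:
    ok, frame = cap.read()
    if ok:
        frames.append(frame)

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("stitched_scene.png", pano)
```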
Status: Depends on multi-camera carrier
Camera-LIDAR fusion
Even greater precision of near-space object recognition and segmentation can be achieved if the camera stream is fused with a LIDAR solution. We have a rotating 360-degree LIDAR device which, once fitted with its own mobile power solution, can overlay distance information on the rotationally stitched video scene for objects located within 25 meters. One difficulty with this step is that the rotational speeds of the LIDAR and the platform are very different, so the LIDAR stream needs to be filtered first.
A more fitting setup would be for each of the multiple cameras to have a non-rotating TOF device paired with it. In that case, the camera and TOF sensor can be fused at the frame level.
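A hedged sketch of the overlay idea, assuming the LIDAR sweep has already been filtered down to the platform's rotation rate and that the camera's horizontal field of view is known (the value below is the Raspberry Pi Camera V2's nominal figure):

```python
# LIDAR-on-video overlay sketch: map one 360-degree sweep onto the current
# camera frame by pixel column. Assumptions: LIDAR and camera share a yaw
# reference, and the scan format is (bearing_deg, distance_m) pairs.
import cv2

H_FOV_DEG = 62.2  # assumed horizontal FOV (Raspberry Pi Camera V2 nominal)

def overlay_scan(frame, scan, camera_yaw_deg):
    h, w = frame.shape[:2]
    for bearing, dist in scan:
        # Bearing relative to the camera's current boresight, wrapped to +/-180.
        rel = (bearing - camera_yaw_deg + 180.0) % 360.0 - 180.0
        if abs(rel) > H_FOV_DEG / 2 or dist > 25.0:   # 25 m LIDAR range
            continue
        x = int((rel / H_FOV_DEG + 0.5) * w)
        cv2.line(frame, (x, 0), (x, h - 1), (0, 0, 255), 1)
        cv2.putText(frame, f"{dist:.1f} m", (x + 2, 20),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame
```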
Status: Under development
3-DOF motion with a drone
Having developed mobile power and communication/broadcast solutions for the KV260 allows us to mount the KV260 on a drone, specifically the NXP HoverGames kit from a previous Hackster.io competition. This raises the view-stabilization challenge quite significantly, but the two most important applications of this setup are (1) obstacle avoidance and (2) 3D object reconstruction. These are in fact the primary capabilities we want to develop for our on-orbit CubeSat, so that it can be deployed for automated close-quarters inspection and operation of orbital or free-space structures, as well as rendezvous, docking, and mission extension.
360-degree obstacle avoidance
The newest drone flight controllers come with provisions for 360-degree obstacle avoidance. The multi-camera and multi-sensor capabilities of the Kria K26 enable the capture of full-surround positional and attitude awareness and the delivery of real-time navigation hints to the flight controller.
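One simple way to turn a full-surround scan into a navigation hint is a repulsive potential field. The sketch below (the scan format and safety distance are assumptions) reduces a 360-element range array to a single "steer away" vector that could be forwarded to the flight controller.

```python
# 360-degree avoidance sketch: reduce a full-surround range scan to one
# repulsive vector in the body frame. Pure geometry, no hardware assumed.
import numpy as np

def avoidance_vector(ranges_m: np.ndarray, safe_dist_m: float = 5.0):
    """ranges_m[i] = distance at bearing i degrees (length-360 array).
    Returns (vx, vy): a unit repulsive vector in the body frame,
    or (0, 0) when nothing is inside the safety bubble."""
    bearings = np.radians(np.arange(360))
    # Repulsion grows as obstacles get closer than the safety distance.
    weights = np.clip(safe_dist_m / np.maximum(ranges_m, 0.1) - 1.0, 0.0, None)
    vx = -np.sum(weights * np.cos(bearings))  # push away from obstacles
    vy = -np.sum(weights * np.sin(bearings))
    norm = np.hypot(vx, vy)
    return (vx / norm, vy / norm) if norm > 0 else (0.0, 0.0)
```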
Status: Depends on multi-sensor carrier
3D object reconstruction
In contrast to 360-degree obstacle avoidance, 3D reconstruction and inspection can be done with a drone carrying a single directional sensor array (a multi-spectral camera and, optionally, TOF and radar for ultra-precision).
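As a starting point, and assuming a calibrated camera matrix K and a well-textured target, a classic two-view sketch (ORB matching, essential-matrix pose recovery, triangulation) yields a sparse, scale-ambiguous point cloud from two drone frames.

```python
# Two-view reconstruction sketch: sparse 3D points from two frames of the
# inspected object. Assumptions: calibrated 3x3 camera matrix K, grayscale
# or BGR input images with enough texture for feature matching.
import cv2
import numpy as np

def sparse_points(img1, img2, K):
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])
    # Essential matrix + relative pose between the two drone positions.
    E, inliers = cv2.findEssentialMat(p1, p2, K, cv2.RANSAC, 0.999, 1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
    # Triangulate the correspondences into 3D (up to an unknown scale).
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4 = cv2.triangulatePoints(P1, P2, p1.T, p2.T)
    return (pts4[:3] / pts4[3]).T   # Nx3 point cloud, scale-ambiguous
```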
Status: Under development
Custom carrier
We are planning a 6-camera 1U CubeSat, for which a custom carrier has to be developed. We'll use the supported MIPI CSI protocol; one of our candidate cameras is the Google Coral camera, which is based on it. We are also planning for additional sensors, namely IMU, TOF, and radar (for detection at a distance). The power solution is going to be based on solar cells and four 18650 batteries. A custom thermal solution will also need to be developed and, depending on the shell design, may need a 100,000-ft version (for balloon flights) and a LEO version.
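A back-of-the-envelope power-budget calculation gives a feel for the sizing; every number below is an assumption to be replaced with measured values.

```python
# Rough power budget for the planned carrier. All figures are placeholder
# assumptions, not measurements.
CELL_WH = 3.6 * 2.6          # one 18650: ~3.6 V nominal x ~2.6 Ah
PACK_WH = 4 * CELL_WH        # four cells, ~37 Wh total
LOAD_W = 8.0                 # assumed average draw of K26 + cameras + sensors
SOLAR_W = 2.0                # assumed orbit-average solar input

runtime_h = PACK_WH / max(LOAD_W - SOLAR_W, 1e-6)
print(f"pack: {PACK_WH:.0f} Wh, net load: {LOAD_W - SOLAR_W:.1f} W, "
      f"runtime: {runtime_h:.1f} h")  # ~6 h on these assumed numbers
```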
Status: Planned
First iteration of ColoradoCUBE
Our student team has designed the EPS, battery pack, CDH, and sensor payload boards which will form the core hardware stack for the first 1U CubeSat balloon flight of ColoradoCUBE in April. For simplicity, the Kria-based vision payload will be omitted from this flight but integrated into a 2U CubeSat, which will run NASA's Core Flight System software stack. An adjacent 1U CubeSat will be running a penetration-testing hackathon challenge over WiFi.
Status: In fabrication