In a rush? Here is all you need to know about the lunaFlow project!
Dinoflagellates are marine plankton that emit blue light (bioluminescence) when subjected to strain. They can be observed at night along beaches where waves break against the shore (Figure 1). In the past, these organisms have allowed scientists to track ships in the ocean from space and to develop weather forecasting models. We would like to use their strain-responsive luminescence to measure pressure fields in fluid flows (e.g. the flow around an impeller). We have assembled a team with a mixed background in fluid dynamics, engineering, chemistry and biology to do so. Our aim is to develop a cost-effective (i) incubator to grow the organisms and (ii) multi-camera system to acquire tomographic videos of the emitted light for three-dimensional pressure-field reconstructions.
Measuring pressure fields
Pressure is a fundamental property of a fluid flow. Parcels of fluid exert a force on their surroundings, and the distribution of these forces is interlinked with the movement of the fluid.
Traditionally, pressure is measured at single points within a flow using devices such as Pitot tubes or pressure transducers. While these measurements can achieve high accuracy and temporal resolution, they are often intrusive to the flow and cannot show how pressure is distributed spatially across the flow field.
Current three-dimensional techniques for measuring pressure are indirect, in that the pressure is calculated from measurements of the velocity of seeding tracers in the flow field (known as tomographic PIV). Obtaining pressure fields in this way requires very accurate velocity measurements, which is often hard to achieve. Furthermore, these techniques typically rely on expensive (and dangerous-to-operate) high-power pulsed lasers.
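As a one-line sketch of why the velocity accuracy matters so much: for incompressible flow, the pressure field is commonly recovered from the measured velocities by solving a pressure Poisson equation, obtained by taking the divergence of the momentum equation,

\nabla^2 p = -\rho \, \nabla \cdot \left[ (\mathbf{u} \cdot \nabla)\, \mathbf{u} \right]

whose source term is built from spatial derivatives of the measured velocity field, so any measurement noise propagates directly into the reconstructed pressure.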
Using strain-responsive plankton to measure pressure would provide an alternative to these techniques that circumvents many of the problems associated with them, with the added benefit of being much more cost-effective. These organisms can also be used in combination with UV lights to measure velocities and pressures simultaneously. We plan to develop open-source, easily deployable systems to grow these amazing “bio-pressure-sensors” and a multi-camera system to image them three-dimensionally.
Incubators
The plankton are particularly sensitive to temperature, salinity and oxygen levels, and require a 12-hour cycle of light and dark. Off-the-shelf incubators are not designed with our particular application in mind, so we are limited to a small number of commercially available products that are rather expensive.
Tomographic camera systems
Tomographic camera systems carry a similar burden of excessive cost, often prohibitive to most researchers, which has been choking both their development and their deployment in the field. In recent years, low-cost electronics, flourishing alongside the fields of computer vision and robotics, have made the design of complex acquisition systems much more accessible. We would like to contribute to the community by showcasing a design for an ultra-low-cost 3D imaging system (which could be used in a broad range of disciplines).
Biological systems
Various species of dinoflagellates are bioluminescent, emitting blue light upon physical stimulation. The bioluminescence behaviour is related to the circadian cycle of these organisms, and it only occurs during the night cycle. The two most studied species are Pyrocystis lunula and Lingulodinium polyedra. While they share the general features of their bioluminescence, they vary in their mobility and in their biophysical response to a stress stimulus. As they can be cultured in very similar conditions, we will grow both species and assess their suitability as pressure sensors in the flow visualisation device.
Hardware design goals
Incubator
We are currently assembling our incubator. We will control the temperature within the recommended range of 20-22 °C in two ways. For heating, we will operate resistance elements with PID-controlled heating. For cooling, a thermoelectric Peltier heat pump will cool the air above and around the batches. Tropical aquarium lights will be used for illumination, controlled by a timer switch.
Cameras
We will be using Raspberry Pi camera modules / webcams controlled by multiple Raspberry Pi 3B+ boards. We aim to develop a four-camera system that will allow us to control the cameras simultaneously. In the long run, it should also allow us to embed the camera-acquisition system into an integrated system in which the cameras can be moved in response to the flow being measured.
Ideally, we plan to achieve a system that can record at 1-megapixel resolution (comparable to 720-1080p high definition) at frame rates of 20-30 fps.
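As a rough indication of what that spec looks like on a single Pi, here is a minimal sketch using the picamera Python package (an illustration only; the final acquisition software may differ), recording 720p video at 30 fps with the Pi's hardware H.264 encoder:

import picamera

# Record ten seconds of 1280x720 video at 30 fps to a local file.
with picamera.PiCamera(resolution=(1280, 720), framerate=30) as camera:
    camera.start_recording("test.h264")   # hardware-encoded H.264 video
    camera.wait_recording(10)             # keep recording for 10 seconds
    camera.stop_recording()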
Outcome
We intend to release both the incubator hardware and control-software design and the entirety of the multi-camera acquisition system. The project will lead to a number of benefits. It will substantially increase the accessibility of a technique for measuring a fundamental flow property (pressure). Both the incubator and the imaging system can be used in a wide range of scientific applications.
The project is currently under development - keep up with our progress below.
Growing dinoflagellates
Species
- Pyrocystis lunula (CCAP 1131)
- Lingulodinium polyedra (CCAP 1121)
Sourcing
- Buy from CCAP (https://www.ccap.ac.uk), £50 each
Growth media
- f/2 fertiliser from amazon https://www.amazon.co.uk/Phyto-Plus-fertilizer-phytoplankton-1000ml-x/dp/B00RQUESJG
- L1 stock media from CCAP (£50 for 5L of media)
- Make your own L1 stock media, recipe: https://www.ccap.ac.uk/media/documents/L1.pdf
All of these options include all of the components needed for the dinoflagellates to survive (all the vitamins that they cannot produce, phosphates, nitrates, trace metals etc. – all the carbon is sourced via photosynthesis). All bioluminescent dinoflagellates are marine species, so the media will be prepared in artificial saltwater.
Growing
- CCAP will send 50 ml of liquid culture
- Add it to 500 ml of fresh media; it will take c. 2-3 weeks to grow to a high density of 10,000-15,000 cells/ml
- To quantify the density, take a known volume, put it on a microscope slide and count the cells (a short worked example follows below)
- We can then keep diluting the culture to 10% density every c. 2 weeks (e.g. into 5 L next)
- A good guide for a more DIY setting: https://www.ccap.ac.uk/documents/PyrocystisCulturing.pdf
Grow them in 2 L conical flasks, filled to a maximum of 500 ml.
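A short worked example of the cell-count estimate (the counted number and aliquot volume below are illustrative, not measurements):

# Rough cell-density estimate from a counted aliquot.
cells_counted = 120          # cells counted on the slide (example value)
volume_counted_ml = 0.01     # volume of the aliquot on the slide, in ml (example value)

density = cells_counted / volume_counted_ml
print("%.0f cells/ml" % density)   # 12000 cells/ml, within the 10,000-15,000 target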
Incubator control system build
Main components
- 1 x Raspberry Pi Zero W
- 4x NTC thermistors (Open Smart)
- 3 x heating units (https://uk.rs-online.com/web/p/enclosure-heating-elements/2995950/)
- 1 x circulating fan (https://uk.rs-online.com/web/p/axial-fans/7496950/)
- 1 x lighting strip (310 mm, https://www.allpondsolutions.co.uk/pled/)
- 1 x Timer switch socket (https://uk.rs-online.com/web/p/plug-in-time-switches/8215058/)
- 1 x Light sensor (Open Smart)
- 1 x fridge cooling unit (https://www.banggood.com/12V-6A-DIY-Electronic-Semiconductor-Refrigerator-Radiator-Cooling-Equipment-p-1074404.html)
- 1 x AC/DC 12 V adaptor
- 3 x MOSFETs (FQP30N06L, https://uk.rs-online.com/web/p/mosfets/8075863/)
- 1 x Relay board (https://www.banggood.com/5V-4-Channel-Relay-Module-For-Arduino-PIC-ARM-DSP-AVR-MSP430-Blue-p-87987.html)
- Misc. Wires
- Assorted resistors; 1k, 10k and 20k resistors
- 3 x 2N3904 transistors
Calibration components
- Biomaker board
- Raspberry Pi Zero W
- Thermometer (we used a –10 to 50 deg C thermometer)
- Thermal measurement chamber (ceramic mug with cats on it)
- Hot water and ice
Thermistor calibration
The control system regulates the heating and cooling of our incubator to ensure the dinoflagellates are kept in the right conditions to grow. We have four sensors inside the incubator: one for light and three to measure temperature in different parts of the box, enabling identification of hotspots. As such, it is important that our temperature sensors are well calibrated.
For calibration purposes we connected the 4 NTC thermistors to our Biomaker board and a Raspberry Pi Zero W, and placed the thermistors and a calibration thermometer in a water-filled thermal measurement chamber (i.e. a ceramic mug with cats on it), as shown in Figure 3. For convenience, a shield to plug all the temperature sensors into was constructed. The thermistors were tethered together and placed in the water. Care was taken to ensure the water was well stirred and that the thermistors and thermometer were close together in the water. The temperature of the water was slowly adjusted with ice and hot water while the raw analog data were recorded from the 4 NTC thermistors, using a low-pass filter to smooth the signals. When a touch button on the Biomaker board was pressed, the analog sensor values were sent to the Raspberry Pi via a serial connection (see “Remote monitoring”) and the operator noted down the thermometer temperature in a spreadsheet. To ensure proper data indexing, an incrementing integer was displayed on the TM1637 4-digit display and recorded in both data sets. The XOD sketch for thermistor calibration is here, and the data-logging Python script to run on the Raspberry Pi is here. Note that the use of the 4-digit display precludes the use of an external SD card for data logging, as the D11 pin is used by both devices.
The analog values and the actual temperatures were assembled for all four temperature sensors, and the resistance was calculated given the 10 kΩ resistor on the thermistor board. The measured resistance and actual temperature were fitted to the simplified, first-order Steinhart-Hart (B-parameter) equation, 1/T = 1/T0 + (1/B) ln(R/R0), to derive T0 and B (https://learn.adafruit.com/thermistor/using-a-thermistor). The close agreement between the actual temperature and the (calibrated) measured temperature for sensor 3 is shown in Figure 4.
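For reference, a minimal Python sketch of this conversion is shown below. The divider orientation, ADC range and the constants are assumptions for illustration and should be replaced with your own fitted calibration values:

import math

# Illustrative constants: B and T0 come from the calibration fit; R0 is the
# thermistor's nominal resistance at T0. Replace with your own fitted values.
ADC_MAX = 1023.0      # 10-bit ADC on the Arduino-class Biomaker board
R_FIXED = 10000.0     # 10 kOhm fixed resistor in the divider
R0 = 10000.0          # assumed nominal thermistor resistance at T0
T0 = 298.15           # reference temperature in kelvin (25 C)
B = 3950.0            # assumed B coefficient (typical NTC value)

def adc_to_resistance(raw):
    # Assumes the thermistor is the upper leg of the divider (fixed resistor
    # to ground); swap the formula if your board is wired the other way round.
    return R_FIXED * (ADC_MAX / raw - 1.0)

def resistance_to_celsius(r):
    # Simplified (B-parameter) Steinhart-Hart equation.
    inv_t = 1.0 / T0 + (1.0 / B) * math.log(r / R0)
    return 1.0 / inv_t - 273.15

raw = 512.0                          # example mid-scale reading
r = adc_to_resistance(raw)
print("R = %.0f ohm, T = %.1f C" % (r, resistance_to_celsius(r)))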
Incubator temperature control
Our first prototype of the incubator uses a simple algorithm to change state based on the temperature and to switch on/off the fridge cooling unit, the fridge fan, the heating elements and the internal air-circulating fan. This simple temperature-control algorithm is implemented as a state machine.
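The control logic itself lives in the XOD sketch; a minimal Python sketch of an equivalent state machine, with an assumed set point and dead band (not the exact values used in our sketch), looks like this:

# Bang-bang control with hysteresis: heaters below the band, fridge above it,
# everything off inside it. The circulating fan runs whenever the system is
# actively heating or cooling.
SET_POINT = 21.0      # middle of the recommended 20-22 C range (assumption)
DEAD_BAND = 0.5       # switching hysteresis, in degrees C (assumption)

def next_state(temp_c, state):
    if state == "HEAT" and temp_c < SET_POINT:
        return "HEAT"                      # keep heating up to the set point
    if state == "COOL" and temp_c > SET_POINT:
        return "COOL"                      # keep cooling down to the set point
    if temp_c < SET_POINT - DEAD_BAND:
        return "HEAT"
    if temp_c > SET_POINT + DEAD_BAND:
        return "COOL"
    return "IDLE"

state = "IDLE"
for reading in (19.8, 20.4, 21.2, 22.1, 21.3, 20.9):   # example readings
    state = next_state(reading, state)
    print("%.1f C -> %s" % (reading, state))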
All sensor readings are taken on the Biomaker board, implemented in XOD. A Raspberry Pi is used for internet access and data logging, and receives serial data from the XOD program on the Biomaker board (see “Remote monitoring”). We have chosen to use an electromechanical relay for the first prototype to handle the 12 V and 6 A required by the fridge, but we may change to N-MOSFETs, which can handle up to 30 A and operate silently. The relay activates with a GROUND signal (logical 0), so rather than inverting the control signals in the XOD software, we have used 2N3904 transistors to invert the ON/1/Vcc signal to the required low signal (a “NOT” gate). The 12 V power is supplied by an AC/DC power converter and conveniently broken out with a DC jack adaptor. The heating elements are controlled with N-MOSFETs and are powered by 12 V. There are four indicator LEDs: yellow for the circulating fan, red for the heating elements, and two green LEDs for the fridge relay and fan.
For the first prototype the heating and cooling control is digital – on or off. However, by using MOSFETs with the heating elements, we have built in the option to use a PID controller with a PWM control signal if initial testing reveals that our simple heating system is not sophisticated enough. Some online research suggests that Peltier devices (i.e. our fridge unit) cannot be controlled with a PWM signal. An initial phase of data collection and analysis will indicate whether this is an issue for us, though the biology of the organism is such that being too cold is less of a problem than being too hot.
We have also added a 4DS display unit to the Biomaker board to display the internal temperature and indicate the ON/OFF status of the fridge, fan and heating units. In future iterations, we will develop this interface using a "scope" object to display the time-course of the temperature and add some interactive functionality.
The XOD sketch for the Incubator is here and the Python script to run on the Raspberry Pi is here. The live incubator data feed is on our ThingSpeak channel.
After setting up the Pi for serial communication (see "Remote Monitoring" below), execute the Python script on the Pi, by navigating to the directory containing the script and typing:
$ sudo nohup python Pi_ThingSpeak.py &
The use of 'nohup' ensures the Python script keeps running even when the local session is closed (useful for headless SSH sessions). The '&' character runs the script in the background and returns control to the console. To shut the script down, you will need to find its process ID (type 'top' and find the python process and its ID), then type 'kill ID', where ID is the process ID.
The circuit diagram for the incubator is shown below.
Remote monitoring
The incubator will run 24/7, maintaining the internal temperature and light conditions for optimal growth. So that the team can monitor the status of the incubator remotely, we have connected the Rich UNO R3 to a Raspberry Pi Zero W, which in turn is connected to the internet. The Rich UNO R3 periodically sends sensor readings to the Pi via a software-serial UART on the UNO (using XOD), and these are read over the Pi's hardware UART by a short Python script (Python 2.7). The Python script also posts the incubator data to an IoT ThingSpeak channel. Using ThingSpeak we will also be able to enable automatic alerts, for example if the incubator temperature exceeds a certain value. As the Rich UNO R3 uses 5 V logic and the Pi uses 3.3 V, a simple voltage divider (the 10 kΩ and 20 kΩ resistors, which drop 5 V to roughly 3.3 V) is used on the Pi RX input. The sensor data are also recorded locally on an SD card for analysis.
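For reference, a minimal sketch of the Pi-side logic is shown below; the full script is linked above. The serial message format, the field mapping and the ThingSpeak write key are placeholders, not the values used in our script:

# Read a line of sensor values from the UART and forward them to ThingSpeak.
import time
import serial          # pyserial
import requests

WRITE_API_KEY = "YOUR_THINGSPEAK_WRITE_KEY"   # placeholder
PORT = "/dev/serial0"                          # Pi hardware UART

ser = serial.Serial(PORT, 9600, timeout=2)

while True:
    line = ser.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue
    # assume the XOD sketch sends comma-separated values, e.g. "21.3,20.9,21.1,412"
    values = line.split(",")
    payload = {"api_key": WRITE_API_KEY}
    for i, v in enumerate(values[:8], start=1):   # a ThingSpeak channel has 8 fields
        payload["field%d" % i] = v
    try:
        requests.get("https://api.thingspeak.com/update", params=payload, timeout=10)
    except requests.RequestException:
        pass                                      # keep logging even if the post fails
    time.sleep(20)                                # respect the ThingSpeak update rate limit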
Some configuration of the Raspberry Pi is required to use the serial UART port on the GPIOs (14 and 15). First disable console use of the serial port:
$ sudo raspi-config
Go to Connections --> Serial --> Disable console, Enable hardware serial. Reboot the Pi:
$ sudo reboot
Now we need to add a couple of things:
$ sudo nano /boot/config.txt
Confirm that the line "enable_uart=1" appears in the file (usually at the end). When using the Raspberry Pi Zero W, we need to add the following line to the file and save:
dtoverlay=pi3-miniuart-bt
Reboot the Pi (sudo reboot) and GPIOs 14 and 15 should now be correctly configured.
Control software
Box build
Materials
- (reclaimed) plywood boards, cut to size on a table saw:
> 840 x 540 x 20 mm x2 (base & lid)
> 840 x 400 x 20 mm x2 (long sides)
> 500 x 400 x 20 mm x2 (short sides)
- brass hinges x2
- waterproof paint
- insulation board
- (reclaimed) stainless steel handles x2
- draught excluding double-sided tape
- castor wheels with brakes x4
- wood screws
> No 6 x 5/8" (for castors and hinges) x 28
> No 8 x 2" (for box joints) x 40
> No 4 x 5/8'' (for mounting TotemMaker frame onto plywood) x 16
Tools
- Battery drill
- Drill bits (5mm)
- Wood saw
- Phillips screwdriver (PH2)
- Rasp
- Wood file
Assembly
To improve the cooling, we have designed and tested a water-cooled circuit that uses a higher-power Peltier unit placed outside the incubator. The water cooled by the external Peltier is circulated into the incubator and cools the internal air via an aluminium finned heat exchanger.
Materials list:
- 1 x Polystyrene box
- Black polythene sheeting for blocking light inside the box
- Tape for securing black sheeting to inside of box
- White silicone sealant
Electronics:
- 1 x Rich UNO R3 microcontroller board (Open-Smart, Shenzhen)
- 1 x NTC thermistor (temperature sensor)
- 1 x RGB NeoPixel LED ring
- 1 x Light sensor (checking that lights are on inside box)
- 1 x fridge cooling unit (Peltier device, https://www.banggood.com/12V-6A-DIY-Electronic-Semiconductor-Refrigerator-Radiator-Cooling-Equipment-p-1074404.html)
- 1 x AC/DC 12 V adaptor (used to power 12 V fridge cooling unit)
- 1 x Relay board (https://www.banggood.com/5V-4-Channel-Relay-Module-For-Arduino-PIC-ARM-DSP-AVR-MSP430-Blue-p-87987.html)
- Misc wires
- 1 x AC/DC 5V power supply for the Rich UNO R3 board (i.e. a USB port on an AC plug).
Optional:
- 10k and 20k resistor (optional, for communication with Raspberry Pi for data logging)
- 1 x Raspberry Pi Zero W (optional, for data logging)
- 1 x USB WiFi dongle for Raspberry Pi (optional, for pushing data to ThingSpeak)
- 1 x AC/DC 5V power supply for Raspberry Pi
In addition to the larger incubator, we have also built a smaller version using the same control logic and Peltier cooling module. For this build we have used a large polystyrene box as the incubation chamber, for ease of construction and for polystyrene's favourable thermal insulation properties. In addition, rings of NeoPixel lights and the real-time clock (RTC) are used to accurately control lighting conditions within the box and synchronise the behaviour of the dinoflagellates. The assembly is relatively straightforward: the polystyrene box is mounted onto a wooden board with adhesive. A small hole was cut through the box to let the Peltier module enter with a snug fit, making sure not to leave any gaps and sealing the rubber faceplate on the Peltier module with tape. Small holes for the wires of the temperature sensor, light sensor and NeoPixel lights were made and sealed with silicone sealant. There is some penetration of ambient light into and out of the polystyrene box, so to allow total control of the internal light conditions, the inside of the box was lined with opaque black sheets of plastic (discarded compost bag liners).
Electrical connections
- The AC/DC 12V supply is wired through the relay such that power is only received on the fridge cooling unit when the correct signal is received from the Rich UNO R3 microcontroller.
- The control signal for the fridge cooling unit is connected to the appropriate relay signal and "D5" on the Rich UNO R3.
- The thermistor sensor is powered by 5V from the Rich UNO R3, and the signal is read through "A1" on the Rich UNO R3.
- The ring of NeoPixel lights is powered by 5V (it might need a separate power supply) and the control signal (Din) is connected to "D6" on the Rich UNO R3 board.
- All grounds connections are shared.
- Optional: the Raspberry Pi is connected for communication with the Rich UNO R3 via a voltage divider as shown in Figure 7. In our XOD sketch the Rich UNO R3 UART is connected as RX = D3 and TX = D4 (RX can connect directly to the Raspberry Pi, but TX goes through the voltage divider as shown). Be 100% sure you have correctly identified the UART RX and TX pins of the Raspberry Pi before you power up, as otherwise you might fry your Pi!
The XOD code is a simpler version of the large incubator code, and can be found here.
Imaging System Configuration
We are developing software that allows the user to control an imaging system on different platforms and with different devices.
As a basic requirement, you will need:
- a PC running a Windows 7/8/10 OS or a Debian Linux OS, with 2-4 USB ports (not tested on macOS)
- 2-4 cameras (tested on Logitech C310 Webcams)
We are also developing an image acquisition (AQ) system. This runs on multiple Raspberry Pi (hereafter rpi) boards which are controlled from a PC and communicate over a private local area network through an Ethernet network switch.
Using multiple devices that run independently substantially increases the bandwidth (i.e. the resolution and acquisition rate) beyond what can be achieved with a single machine.
To acquire with multiple cameras on a single machine, jump to interfacing.
For more information on how to develop a multi-device imaging system, keep reading.
Hardware
The image AQ system requires:
- 2-4 raspberry pi 3B+ boards with 8GB+ Micro SD cards
- a network switch with 5+ ports
- 3-5 cat5/cat6 cables
alongside the PC and webcams. You can also substitute the webcams with rpi camera modules.
Connectivity
Here we outline the steps required to set up the imaging system. These include:
- setting up rpi boards to work with MATLAB
- setting up a private network
- setting up an MQTT Protocol configuration
Setting up Raspberry Pi boards
The imaging system runs on a combination of MATLAB (for app interfacing, see interfacing) and Python (for rpi interfacing) scripts.
First, you will need to configure the Raspberry Pi boards to interface with MATLAB following the instructions at:
> https://uk.mathworks.com/hardware-support/raspberry-pi-matlab.html
We recommend cloning the MATLAB Raspbian image onto a new Micro SD card. Run the add-on manager in MATLAB to set up your rpi board. To connect to eduroam, choose a WPA2 Enterprise connection and fill in your details using your network access token (for University of Cambridge students, https://tokens.csx.cam.ac.uk/). You will need an internet connection to download the required software to set up the private network.
Creating a private network
Here, we will look at how to create a private network over an ethernet network switch between the rpis and a PC.
We will configure one of the rpis as a uDHCP server (micro Dynamic Host Configuration Protocol) and the other rpis and the PC as clients.
Proceed with caution while setting up network connections and ask for help if you are unsure. We do not want to be liable for anything going wrong in your home or department!
Setting up server:
1. First, open a terminal on your server rpi and install udhcpd
$ sudo apt-get update
$ sudo apt-get install udhcpd
2. Second, you will need to determine the name and MAC address of your server rpi's Ethernet port (i.e. the unique identifier of that network interface). Run
$ ifconfig
On older versions of Raspbian, the name of the port is by default eth0 (case 1). On newer versions of Raspbian, the port name is given as enxXXXXXXXXXXXX (case 2, where the capital Xs are the numbers & letters of your MAC address without the colons). You can also find your Ethernet port's MAC address, shown after
ether XX:XX:XX:XX:XX:XX
3. Next, you will need to set up a static IP address for your server rpi on this private network. Open and edit:
$ sudo nano /etc/network/interfaces
and add to the bottom (for case 1)
auto eth0
iface eth0 inet static
address YYY.YYY.YY.1/24
or (for case 2)
auto enxXXXXXXXXXXXX
iface enxXXXXXXXXXXXX inet static
address YYY.YYY.YY.1/24
where YYY.YYY.YY.1 is the IP address you want to assign to the rpi server (for example, 192.168.10.1). For simplicity, we allocated the number 1 to the first device we added to the network. Save and exit.
>> Do not choose an IP address that is likely to clash with existing ones in your network! <<
4. Next, you will need to enable the DHCP server. Open and edit
$ sudo nano /etc/default/udhcpd
and change the status from the default "no" to a "yes"
DHCPD_ENABLED="yes"
Save and exit.
5. Next, you will need to edit the configuration file for the network. Open and edit
$ sudo nano /etc/udhcpd.conf
and comment out everything except the following lines (the interface line should name your local Ethernet port: enxXXXXXXXXXXXX, or eth0 for case 1)
interface enxXXXXXXXXXXXX
max_leases 0
option subnet 255.255.255.0
option domain local
option lease 864000
At the end of the file, also add the MAC addresses of the other devices on the network (represented below by Zs) and the IP addresses you will allocate to them (i.e. the clients).
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.2
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.3
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.4
static_lease ZZ:ZZ:ZZ:ZZ:ZZ:ZZ YYY.YYY.YY.5
Save and exit. You can come back to this step later if you do not know the MAC addresses yet.
6. Finally, you will want to enable and restart the server by running
$ sudo systemctl enable udhcpd
$ sudo systemctl restart udhcpd
The server should now be running.
Setting up clients (rpis):
Repeat steps 1-4 from setting up server. For step 3, allocate an IP Address with the appropriate device number (e.g. YYY.YYY.YY.2 and so on).
Setting up clients (PC, Windows):
1. Simply find your MAC address in a cmd prompt by running
> ipconfig /all
and identifying it under
Ethernet Adaptor Ethernet:
Physical Address ....... ZZ:ZZ:ZZ:ZZ:ZZ:ZZ
New: a GUI to create a DHCP private network on a Windows PC is available from http://www.dhcpserver.de/cms/running_the_server/ (with thanks to Uwe A. Ruttkamp).
Setting up an MQTT Protocol
The imaging system communicates over the MQTT protocol. This is a simple communication protocol in which devices can publish messages to a server (referred to as a broker) and subscribe to messages sent to that broker. More information on how MQTT works is available at:
> https://uk.mathworks.com/help/thingspeak/mqtt-basics.html
Download the MATLAB support package for MQTT from the MATLAB File Exchange:
> https://uk.mathworks.com/matlabcentral/fileexchange/64303-mqtt-in-matlab
MQTT communication requires a server to act as the broker. We are running our communication through a mosquitto broker. You will need to install and run mosquitto on the machine that acts as the broker.
To install mosquitto on a PC visit:
> https://mosquitto.org/download/
and follow the installation instructions.
Alternatively, you could use one of the rpis as the broker and install mosquitto on it by running:
$ sudo apt install -y mosquitto mosquitto-clients
in a terminal. You will need to run mosquitto on your broker machine to enable communications while running the imaging system.
To run it on a Windows PC, first add mosquitto to your environment variables. Then run mosquitto in the background with
> START /B mosquitto
New: add mosquitto to your system environment variables so that it can be launched from the lunaFlow App.
To run on the rpi, run mosquitto in background (as a daemon, -d) by running
$ mosquitto -d
You can find examples of how to use MQTT in Python (using the paho-mqtt module) at:
> https://pypi.org/project/paho-mqtt/
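As a minimal illustration of the publish/subscribe pattern (the broker address and topic names below are placeholders, not the ones used by the app):

# Subscribe to a command topic and publish an acknowledgement with paho-mqtt.
import paho.mqtt.client as mqtt

BROKER = "192.168.10.1"              # IP of the machine running mosquitto (placeholder)
COMMAND_TOPIC = "lunaflow/acquire"   # placeholder topic names
STATUS_TOPIC = "lunaflow/status"

def on_connect(client, userdata, flags, rc):
    print("connected, rc=%s" % rc)
    client.subscribe(COMMAND_TOPIC)

def on_message(client, userdata, msg):
    print("%s: %s" % (msg.topic, msg.payload.decode()))
    # e.g. start a camera capture here, then report back to the broker
    client.publish(STATUS_TOPIC, "capture started")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)     # default MQTT port
client.loop_forever()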
To connect your devices you will have to find their IP addresses by running ipconfig (in a Windows command prompt) or ifconfig (in a Linux terminal).
For file transfer, the imaging system sends files over SSH. You will have to install and enable ssh on your rpis by running
$ sudo apt-get update
$ sudo apt install openssh-server
$ sudo systemctl enable ssh
in the terminal.
>> Remember to change your rpi passwords from default! <<
To change password, run
$ sudo raspi-config
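If you want to test the SSH file transfer from Python outside the app, a minimal sketch with the paramiko package looks like the following; the host, credentials and paths are placeholders, and the app itself manages transfers from MATLAB:

# Pull a captured frame from a client rpi over SFTP.
import paramiko

HOST = "192.168.10.2"            # client rpi (placeholder)
USER = "pi"
PASSWORD = "your-new-password"   # change from the default!

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(HOST, username=USER, password=PASSWORD)

sftp = ssh.open_sftp()
sftp.get("/home/pi/captures/frame_0001.jpg", "frame_0001.jpg")  # remote -> local
sftp.close()
ssh.close()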
Following the setup you can test whether acquisition via the rpis works using the camFlowAq app (see below).
Interfacing
A bespoke app has been developed by the lunaFlow team to help users acquire images from multiple cameras simultaneously.
The app has been designed to work with any device connected to the computer (for example, multiple webcams connected via USB). A remote AQ mode is available to use the app over an MQTT connection established over a local area network (see connectivity).
The app runs in MATLAB (tested on R2019) or as a standalone. It is available for download from the following GitHub link:
> https://github.com/FrancescoCiriello/lunaFlow
While running on MATLAB, the following toolboxes are needed:
- image acquisition toolbox
- image processing toolbox
- computer vision toolbox
You will also have to install the hardware support package for your particular camera interface. Run
supportPackageInstaller
in the MATLAB command window. For webcams, install the OS Generic Video Interface package.
The camFlowAqApp also works with scientific cameras (tested on a JAI Spark 5000M USB camera with a GenICam interface)! You will have to add your camera's software development kit (SDK) to your environment variables.
Add the app to your MATLAB search path (Home > Environment > Search Path) and run the app from the command window with
camFlowAqApp
By default, the camFlowAqApp runs on devices connected directly to your machine.
To access remote device acquisition, click on the File > Remote Aq Mode tab.
You will have to perform the setup instructions outlined in connectivity.
More documentation and examples can be found in the README file. The lunaFlow suite is released under an MIT open-source licence.
New: add VLC to your system environment variables to run the video views.
The app can be launched without MATLAB using the standalone executable, by installing the MATLAB R2019b Runtime.
A new mode has been added to the lunaFlow App for use with Jetson boards. Jetson boards should allow for future developments into live tomography (work in progress).
Imaging System Build
Single computer system
To run the single computer system, simply connect the webcams via USB. After configuring MATLAB for image acquisition (see connectivity), you can launch the app and start acquiring with multiple cameras simultaneously.
Multi-computer system
The multi-computer system works with rpi boards and a network switch. You will need CAT5/6 cables to connect the rpis and the PC to the switch, and power supplies to power the devices. Rpi camera modules connect via the camera (CSI) connector (make sure the camera interface is enabled on your rpi) and webcams connect via USB.
An example imaging system is shown below.
Assembly
The imaging system is mounted onto a custom-built TotemMaker kit.
Visit the TotemMaker website for instructions on how to assemble connections:
> https://totemmaker.net/wiki
We used M2 screws and nuts to mount the rpi camera modules and nylon screws to secure the boards and the network switch to the chassis.
New: we have designed a 3D-printable holder to keep the boards and network switch in place.
The assembled system is shown below.
Download link: https://a360.co/34DtPMF
Using the lunaFlow Apps
Local mode
1. Choose the device interface, format and ID to use in the camera tab. Press the load camera button to connect to the device for acquisition. You can use the quickload button to connect to multiple devices in one click. Use the reset button to restart the camera connection to MATLAB.
2. The preview tab can be used to start a livestream. It will display images from devices whose checkbox (top right) is ticked.
3. Adjust the camera settings interactively in the control panel.
4. Acquire images in the acquire panel. Acquisition is triggered simultaneously on all active cameras (those whose checkbox is ticked). The acquire background button saves a mean image from a short video for background subtraction. Choose your project folder to save your images.
Remote access mode
1. To start the mosquitto channel, press the mosquitto launch icon in the start panel.
2. Log in with your Raspberry Pi credentials.
3. Establish an SSH connection for file transfer and reboot controls.
4. Launch your PC as an MQTT broker under the MQTT protocol settings panel.
5. Acquire synchronised images or videos using the PiCameraControl panel.
6. Calibrate the camera extrinsics manually (guided checkerboard acquisition) or automatically (based on SURF feature detection).
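For anyone curious what the checkerboard-based extrinsic calibration amounts to, a minimal Python/OpenCV sketch of the idea is shown below; the app itself performs this step in MATLAB, and the pattern size, square size, intrinsics and file name here are placeholders:

# Detect a checkerboard in one image and recover the camera pose relative to it.
import cv2
import numpy as np

PATTERN = (9, 6)            # inner corners of the checkerboard (assumption)
SQUARE_MM = 25.0            # square size in mm (assumption)

# 3D coordinates of the corners in the board's own frame
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

img = cv2.imread("cam1_checkerboard.png", cv2.IMREAD_GRAYSCALE)
found, corners = cv2.findChessboardCorners(img, PATTERN)
if found:
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    # K and dist would come from a prior intrinsic calibration (placeholders here)
    K = np.eye(3)
    dist = np.zeros(5)
    ok, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
    print("rotation:", rvec.ravel(), "translation:", tvec.ravel())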