The Zybo Z7 comes with a bunch of user-controllable LEDs, push buttons and switches, all of which are accessible via the Zynq's GPIO controller. However, only one LED and two buttons are directly connected to the processing system (PS); the remaining ones are connected to the programmable logic (PL). In this project, I will therefore not only show how to control the Zynq's GPIO but also how to interact with the PL using GPIO and MMIO.
Note that I'm assuming you know how to run Genode on the Zybo Z7. If not, please have a look at my Getting started with the Zybo Z7 project.
Architecture

The figure below illustrates the general architecture for pin-level access in Genode. At the bottom resides the core/init component, which possesses ultimate authority and is free from access policy. It merely hands authority over all device resources to the platform driver via its IRQ and IO_MEM services. The latter provides the Platform service, via which particular components can gain authority over individual devices. In this scenario, the pin driver gains authority over the GPIO controller. Similarly, it provides a Pin_control, Pin_state and IRQ service to pass on write, read and notification authority over individual pins to particular components. In this example, I will implement an MIO demo component that controls the PS-accessible GPIO pins for button 4, button 5 and LED 4 of the Zybo Z7 board.
Genode 21.11 introduced the Pin I/O session interfaces that have first been implemented by the pin driver for the A64 SoC. Using the A64 pin driver as a blueprint, I followed suit and implemented a pin driver for the Zynq SoC. Thanks to the slick groundwork, I was able to re-use most of the code and focus on the SoC-specific parts.
With the pin driver at hand, I am already able to control the PS-accessible LED and buttons. Hence, I wrote a tiny MIO demo component: it shall turn the LED on when button 4 is pressed and off when button 5 is pressed. It also generates a state report that reflects the LED state.
The first step when writing a component is to create a target.mk file. The convention is to place the source code of a component in a sub-directory of src/app, src/server, src/drivers or src/test, depending on the type of component. In my clone of the genode-zynq repository, I thus created the file src/app/zybo_gpio_demo/mio/target.mk with the following content. Note that, since I'm going to implement multiple related components, I'm bundling all of them in the common sub-directory zybo_gpio_demo.
TARGET := zybo_gpio_demo_mio
SRC_CC := main.cc
LIBS := base
The target.mk file contains a few declarations that are interpreted by Genode's build system when compiling the component. The first line specifies the name of the target binary. The second line states which source files shall be passed to the C++ compiler. In the third line, I define library dependencies (base denotes Genode's base API).
Next, I created the main.cc file. Let's start with the first few lines.
/* Genode includes */
#include <base/component.h>
#include <irq_session/connection.h>
#include <pin_state_session/connection.h>
#include <pin_control_session/connection.h>
#include <os/reporter.h>
namespace Demo {
using namespace Genode;
struct Main;
}
/* [...] see below */
void Component::construct(Genode::Env &env)
{
static Demo::Main main(env);
}
We must include a few header files: The base/component.h header is required for implementing a native Genode component. The next three includes are required for the aforementioned Pin I/O session interfaces implemented by the pin driver. Last, os/reporter.h is needed for generating the state report mentioned above.
I further declared a Main object in a separate Demo namespace. A native Genode component does not have a main() function as an entry point, as we are used to from POSIX programs. Instead, the entry point is the Component::construct() method, in which I simply instantiate a Demo::Main object. Now, let's have a look at its implementation.
struct Demo::Main
{
/* members */
Env &_env;
Pin_state::Connection _btn4 { _env, "Btn4" };
Pin_state::Connection _btn5 { _env, "Btn5" };
Pin_control::Connection _led4 { _env, "Led4" };
Irq_connection _irq4 { _env, "Btn4" };
Irq_connection _irq5 { _env, "Btn5" };
Signal_handler<Main> _irq_handler {
_env.ep(), *this, &Main::_handle_irq };
Expanding_reporter _reporter { _env, "state", "state" };
/* methods */
void _update_state(bool on)
{
_reporter.generate([&] (Genode::Xml_generator & xml) {
xml.attribute("value", on ? "yes" : "no");
});
_led4.state(on);
}
void _handle_irq()
{
_irq4.ack_irq();
_irq5.ack_irq();
if (_btn4.state())
_update_state(true);
else if (_btn5.state())
_update_state(false);
}
/* constructor */
Main(Env &env) : _env(env)
{
_update_state(false);
_irq4.sigh(_irq_handler);
_irq5.sigh(_irq_handler);
_irq4.ack_irq();
_irq5.ack_irq();
}
};
Members: First of all, the Main object holds a reference to Genode::Env, as this is required for the construction of other members. For access to the buttons' input pins, I'm using Pin_state::Connection objects. The string passed to the constructor is called the session label and is evaluated by the pin driver to apply a matching access-control policy (details below). Similarly, for access to the LED's output pin, I'm using a Pin_control::Connection object. Moreover, since I want to be notified of button state changes, I added an Irq_connection object for each button. These are accompanied by a Signal_handler object that is registered at the IRQ connections. This signal handler will call the _handle_irq() method. Last, I'm using an Expanding_reporter for the state report. The Expanding_reporter takes two strings as constructor arguments. The first string specifies the name of the top-level XML node (reports in Genode are typically XML-formatted) whereas the second string defines the session label, which can be seen as the name of the report.
Methods: The _update_state() method is a helper for setting the state of the LED and updating the component's state report. The _handle_irq() method is called by the signal handler whenever an IRQ occurred for either of the buttons. It acknowledges the IRQs and evaluates the button states to update the LED state accordingly.
Constructor: The constructor is quite straightforward. It sets the initial LED state, registers the signal handler at both IRQ connections and acknowledges any pending IRQs.
The next ingredient we need is a run script. In Genode, a run script defines a certain scenario to be executed. More precisely, it contains information about the components to be built and how they are composed to create the particular scenario. A run script may also evaluate the (serial) output of the running system to check for errors. Since run scripts reside in the run/ directory, I created the file run/zybo_gpio_demo.run with the following content in my clone of the genode-zynq repository.
create_boot_directory
import_from_depot [depot_user]/src/[base_src] \
[depot_user]/src/init \
[depot_user]/src/report_rom \
[depot_user]/src/zynq_platform_drv \
[depot_user]/src/zynq_pin_drv \
[depot_user]/raw/[board]-devices
build { app/zybo_gpio_demo }
install_config {
<config>
<parent-provides>
<service name="LOG"/>
<service name="PD"/>
<service name="CPU"/>
<service name="ROM"/>
<service name="IO_MEM"/>
<service name="IRQ"/>
</parent-provides>
<default caps="200"/>
<start name="report_rom">
<resource name="RAM" quantum="1M"/>
<provides>
<service name="Report"/>
<service name="ROM"/>
</provides>
<route>
<service name="ROM"> <parent/> </service>
<service name="CPU"> <parent/> </service>
<service name="PD"> <parent/> </service>
<service name="LOG"> <parent/> </service>
</route>
<config verbose="yes"/>
</start>
<start name="platform_drv" managing_system="yes">
<binary name="zynq_platform_drv"/>
<resource name="RAM" quantum="1M"/>
<provides><service name="Platform"/></provides>
<config>
<policy label="zynq_pin_drv -> ">
<device name="gpio0"/>
</policy>
</config>
<route>
<any-service> <parent/> </any-service>
</route>
</start>
<start name="zynq_pin_drv">
<resource name="RAM" quantum="1M"/>
<provides>
<service name="Pin_state"/>
<service name="Pin_control"/>
<service name="IRQ"/>
</provides>
<route>
<service name="ROM"> <parent/> </service>
<service name="CPU"> <parent/> </service>
<service name="PD"> <parent/> </service>
<service name="LOG"> <parent/> </service>
<service name="Platform">
<child name="platform_drv"/>
</service>
</route>
<config>
<in name="Btn4" bank="1" index="18" irq="rising"/>
<in name="Btn5" bank="1" index="19" irq="rising"/>
<out name="Led4" bank="0" index="7" default="on"/>
<policy label_prefix="zybo_gpio_demo_mio -> Btn4" pin="Btn4"/>
<policy label_prefix="zybo_gpio_demo_mio -> Btn5" pin="Btn5"/>
<policy label_prefix="zybo_gpio_demo_mio -> Led4" pin="Led4"/>
</config>
</start>
<start name="zybo_gpio_demo_mio">
<resource name="RAM" quantum="1M"/>
<route>
<service name="Pin_control">
<child name="zynq_pin_drv"/>
</service>
<service name="Pin_state">
<child name="zynq_pin_drv"/>
</service>
<service name="IRQ">
<child name="zynq_pin_drv"/>
</service>
<service name="Report">
<child name="report_rom"/>
</service>
<service name="ROM"> <parent/> </service>
<service name="CPU"> <parent/> </service>
<service name="PD"> <parent/> </service>
<service name="LOG"> <parent/> </service>
</route>
<config/>
</start>
</config>
}
build_boot_image { zybo_gpio_demo_mio }
run_genode_until forever
I don't want to go into all the details here. The run script specifies which archives to import from the depot (Genode's package management) and which components to build from the source tree. It also installs a configuration for the top-level init component. Please refer to the System configuration chapter of the Genode manual for a detailed explanation of init's configuration. What we can identify without in-depth knowledge, though, is that the scenario consists of four components: report_rom, platform_drv, zynq_pin_drv and zybo_gpio_demo_mio.
report_rom: This component provides a ROM and a Report service. It allows components to post their reports, which are made accessible via ROM connections. It thereby implements a single-writer, multiple-readers scheme.
platform_drv: This is a driver component that takes ultimate control over the hardware peripherals (e.g. MMIO devices). According to its configuration, it hands control over individual devices to other components via the provided Platform service. The platform driver requires access to a board-specific devices ROM, which I imported from the depot. Its configuration contains label-based policies that assign devices (as defined by the devices ROM) to the corresponding Platform sessions. Here, the zynq_pin_drv is granted access to the gpio0 device.
zynq_pin_drv: This is the pin driver. Its configuration specifies two input pins for the buttons and one output pin for the LED. Looking at the Zybo Z7 manual, I identified that LED 4 is connected to MIO pin 7 and that buttons 4 and 5 are connected to MIO pins 50 and 51, respectively. Knowing that GPIO bank 0 covers MIO pins 0 to 31 whereas bank 1 covers pins 32 to 53, I ended up with the posted configuration. Similar to the platform driver, access to individual pins is governed by label-based policies.
zybo_gpio_demo_mio: This is the demo component that I implemented above. Note that I routed its Pin_state, Pin_control and IRQ session requests to the pin driver. The Report session request is routed to the report_rom component.
The scenario is built and run as follows:
build/arm_v7a$ make run/zybo_gpio_demo BOARD=zynq_zybo_z7 KERNEL=hw
With this setup, I am able to switch an LED on/off using two push buttons. On the serial console, I could further witness the following output:
[init -> report_rom] report 'zybo_gpio_demo_mio -> state'
[init -> report_rom] <state value="no"/>
[init -> report_rom] report 'zybo_gpio_demo_mio -> state'
[init -> report_rom] <state value="yes"/>
Not spectacular, but still satisfying to see the pin driver at work. In a next step, I created a custom bitstream for the FPGA to control the switches, buttons and LEDs connected to the PL.
Creating a custom bitstream

Implementing a complex custom design for the programmable logic can get fiddly, especially if you are new to the world of FPGAs. My last practice with Xilinx FPGAs was about a decade ago, and the tooling has changed quite a bit since then. I therefore started with a very simple design to check whether I got the basic setup right.
As a prerequisite, I had to install Vivado ML Standard in a separate Ubuntu VM. I was struck by the amount of disk space it took (~60GB) and therefore had to increase the VM size accordingly.
As a starting point, I followed this tutorial. Since it already includes detailed step-by-step instructions for Vivado, I’ll rather stick to a brief summary of the individual steps in this article.
In Vivado, I opened a new project and created a block design using the Flow Navigator on the left-hand side. In the block design, I added the Zynq processing system as an IP (intellectual property) core. After adding an IP core, Vivado usually presents the option to run connection automation to connect all obvious signals. Running the automation connected the DDR and FIXED_IO interfaces of the IP core to the corresponding (automatically created) external ports.
At this point, I was curious whether my bare minimum, yet useless, design was basically complete and correct, and therefore hit the "Validate Design" button.
Oops! Apparently, because I skipped adding an AXI IP core as suggested by the aforementioned tutorial, the connection automation was not able to decide how to connect the AXI clock signal. By looking at the other steps in the tutorial, I concluded that the AXI clock signal should be connected to the FCLK_CLK0 interface of the IP core.
Having fixed this, I generated the HDL wrapper (right-click on the block design). Unfortunately, my Vivado installation always got stuck during this procedure while "Initializing the Language Server". Changing the syntax checking in Tool → Settings → Tool Settings → Text Editor → Syntax Checking from "Sigasi" to "Vivado" solved this issue for me after restarting Vivado.
At this point, I started deviating from the aforementioned tutorial. Instead of instantiating an AXI_GPIO IP core (which would require support by the pin driver), I want to interact with the PL using the SoC’s GPIO controller. This is possible because banks 2 and 3 of the GPIO controller are connected to the PL via the EMIO interface. The EMIO signals only need to be routed to the correct pins of the FPGA.
By double-clicking on the Zynq PS IP core, I enabled GPIO EMIO on the Peripheral I/O Pins card. Moreover, on the MIO Configuration card, I set the EMIO GPIO width to 12 (for 4 switches, 4 LEDs and 4 buttons). By doing this, the IP core gained a GPIO_0 interface. In order to make this an external signal, I selected the name and right-clicked to choose Make external from the context menu. This created an external interface named GPIO_0_0 connected to the IP core. Since the GPIO signals are tri-state, the external signal is named gpio_0_0_tri_io by Vivado. I double-checked this naming scheme by looking into the HDL wrapper (after regenerating it).
Last, I added a Xilinx Design Constraints (XDC) file to tie the gpio_0_0_tri_io signals to those device pins that are actually wired to the switches/LEDs/buttons. Fortunately, Digilent provides master XDC files for their boards, in which one only needs to uncomment individual lines and insert the corresponding signal names. I added the Zybo-Z7-Master file via the Add sources dialogue, uncommented the lines for the LEDs, buttons and switches, and inserted the signal names gpio_0_0_tri_io[0] to gpio_0_0_tri_io[11]. You can find more detailed instructions in the aforementioned tutorial.
When generating the bitstream, I noticed that Vivado eats up a lot of RAM. Since I'm running the tool in a separate Ubuntu VM, this caused inexplicable build errors at times. Adding another GB of RAM to the VM and reducing the number of jobs to 1 did the trick for me. Eventually, I was able to export the resulting bitstream file via File → Export → Export Bitstream File.
Loading a bitstream at boot-up

Since an FPGA uses volatile memory to store its programming, it must be re-programmed after each power cycle; the bitstream file contains the necessary (device-specific) information. The easiest way is to let the boot loader take care of loading the bitstream. Xilinx's FSBL as well as u-boot provide support for this. When compiled with the corresponding configuration options, u-boot lets you load a bitstream with its fpga command. There are two relevant sub-commands: load and loadb. The former expects a raw bitstream (.bin) as, e.g., acquired by read-back. The latter expects a .bit file as exported by Vivado, which, in contrast to the raw bitstream, has a different byte order and includes a file header.
In order to simplify bitstream loading at boot-up, I added two commands to u-boot's default environment that check for an fpga.bin resp. fpga.bit file and, if present, execute the corresponding fpga command before booting into Genode. Furthermore, you can populate the SD card image with a bitstream by adding the following line to your etc/build.conf:
RUN_OPT_zybo += --image-uboot-bitstream "/path/to/bitstream.bit"
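For illustration, such environment commands might look roughly like the following u-boot script fragment. This is a hedged sketch, not the actual environment shipped with the genode-zynq u-boot configuration: the variable names and the load address are made up, only the fatload and fpga commands themselves are standard u-boot.

```
# illustrative u-boot environment entries (names and load address are made up)
fpga_bit=if fatload mmc 0 0x1000000 fpga.bit; then fpga loadb 0 0x1000000 ${filesize}; fi
fpga_bin=if fatload mmc 0 0x1000000 fpga.bin; then fpga load 0 0x1000000 ${filesize}; fi
```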
Testing the bitstream

For a test drive of the bitstream, I implemented a zybo_gpio_demo_sw component, which takes control of the switches and of the LEDs placed next to each switch. The code is pretty straightforward; you can find it in the genode-zynq repository.
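Its per-switch logic can be sketched as follows. This is a simplified stand-alone model rather than the actual component: the Switch and Led stubs stand in for the Pin_state::Connection and Pin_control::Connection sessions labeled Sw0..Sw3 and Led0..Led3 in the pin-driver policy below.

```cpp
#include <array>

/* Stand-ins for the Pin_state/Pin_control sessions; the real component
 * uses Pin_state::Connection and Pin_control::Connection instead. */
struct Switch { bool on = false; bool state() const { return on; } };
struct Led    { bool on = false; void state(bool v) { on = v; } };

struct Sw_demo
{
	std::array<Switch, 4> sw  { };
	std::array<Led, 4>    led { };

	/* called from the IRQ signal handler on every switch edge:
	 * mirror each switch onto the LED placed next to it */
	void handle_irqs()
	{
		for (unsigned i = 0; i < 4; i++)
			led[i].state(sw[i].state());
	}
};
```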
I also added the following start node to the run/zybo_gpio_demo.run script:
<start name="zybo_gpio_demo_sw">
<resource name="RAM" quantum="1M"/>
<route>
<service name="Pin_control"><child name="zynq_pin_drv"/> </service>
<service name="Pin_state"> <child name="zynq_pin_drv"/> </service>
<service name="IRQ"> <child name="zynq_pin_drv"/> </service>
<service name="ROM"> <parent/> </service>
<service name="CPU"> <parent/> </service>
<service name="PD"> <parent/> </service>
<service name="LOG"> <parent/> </service>
</route>
<config/>
</start>
Furthermore, and more interestingly, I modified the pin-driver configuration to add the corresponding policies:
<config>
<!-- zybo_gpio_demo_mio pins -->
<in name="Btn4" bank="1" index="18" irq="rising"/>
<in name="Btn5" bank="1" index="19" irq="rising"/>
<out name="Led4" bank="0" index="7" default="on"/>
<policy label="zybo_gpio_demo_mio -> Btn4" pin="Btn4"/>
<policy label="zybo_gpio_demo_mio -> Btn5" pin="Btn5"/>
<policy label="zybo_gpio_demo_mio -> Led4" pin="Led4"/>
<!-- zybo_gpio_demo_sw pins -->
<in name="Sw0" bank="2" index="0" irq="edges"/>
<in name="Sw1" bank="2" index="1" irq="edges"/>
<in name="Sw2" bank="2" index="2" irq="edges"/>
<in name="Sw3" bank="2" index="3" irq="edges"/>
<out name="Led0" bank="2" index="8" default="off"/>
<out name="Led1" bank="2" index="9" default="off"/>
<out name="Led2" bank="2" index="10" default="off"/>
<out name="Led3" bank="2" index="11" default="off"/>
<policy label="zybo_gpio_demo_sw -> Sw0" pin="Sw0"/>
<policy label="zybo_gpio_demo_sw -> Sw1" pin="Sw1"/>
<policy label="zybo_gpio_demo_sw -> Sw2" pin="Sw2"/>
<policy label="zybo_gpio_demo_sw -> Sw3" pin="Sw3"/>
<policy label="zybo_gpio_demo_sw -> Led0" pin="Led0"/>
<policy label="zybo_gpio_demo_sw -> Led1" pin="Led1"/>
<policy label="zybo_gpio_demo_sw -> Led2" pin="Led2"/>
<policy label="zybo_gpio_demo_sw -> Led3" pin="Led3"/>
</config>
Since I want to trigger an interrupt on both rising and falling edges, I used irq="edges" for the switch input pins.
Up to this point, I have intentionally neglected the existence of two (one) RGB LEDs on the Zybo Z7-20 (Z7-10). Each of these is actually composed of a green, a blue and a red LED. In principle, these LEDs can be controlled similarly to the other LEDs. However, since their brightness needs to be regulated by pulse-width modulation (PWM), it is much wiser to implement a custom IP core for this job. This is covered by part 2 of this story, which also includes instructions for reproducing the complete GPIO demo with my pre-built bitstreams.