Having grown up with Star Wars, it holds a special place in my heart. Over the last couple of years I have been able to combine my enjoyment of Star Wars with making/crafting by creating costumes with two volunteering groups: the 501st Legion and the Rebel Legion. Expanding on that, I decided to try my hand at making a GNK (Gonk) power droid prop for fun and to maybe display at smaller events, like local comic cons.
In addition to being a display piece, I wanted to build in functionality that would make the prop at least a little useful while it's just sitting around at home. To make it more practical, I added support for Google Assistant and power outlets, since the GNK droid in Star Wars is essentially a walking portable battery. I also wanted to learn some of the new features that can run on Android Things, so I made a companion app using the Nearby Connections 2.0 API.
This tutorial will cover:
- How I put together the display piece
- The Android Things electronics setup
- Blinking an LED
- Adding the Assistant SDK to an Android Things device
- Playing custom sounds over the AIY Voice Hat for the Raspberry Pi
- Connecting to a companion app using the Nearby Connections 2.0 API
- Rebooting an Android Things device
It's worth noting that the embedded assistant used in this tutorial is already a little out of date. It is currently using AlphaV1, but AlphaV2 came out while I was building this project, and has support for device traits. I'm in the middle of learning how to use traits and v2 right now, and will either update this post or create/link to a new one with an updated EmbeddedAssistant class later. Luckily, it's all software, so updating later isn't a big deal :)
Creating the Base Prop

The first thing we will want to do is create the general shape of our display prop that will house the electronics. For reference, this is an example of a gonk from the game Star Wars: Battlefront
and another one from the original trilogy movies
One of the fun things about making props like the gonk is that you have a lot of creative freedom when assembling them. For mine, I decided to stick with a 'less is more' mentality by skipping the side indents and keeping things a lot simpler in regards to external decorations, since you can always add more later. I also went with a Star Wars Imperial theme for booth display purposes, though you may want to color and decorate your prop in a way that fits your own creativity. As we work through this part of the tutorial, think of it as a general guideline and an example of what worked for me, and then build on that to do something even more awesome :)
To start, I picked up two storage bins that could be placed on top of each other to create the body. I went with the kind that have ridges on the sides in order to give the prop a bit more texture, similar to these. I didn't use the lids, but you could cut out greeblies from them to attach to your own prop. You will also want to remove any handles attached to the bins. After you have your bins, you'll notice that gonk droids have an indented square for their 'face'. I had a lot of luck cutting out a section from the front of one of the bins with a box cutter, and then placing a square metal pan (I found mine at the dollar store, you don't need to buy a good one) in the cutout section and using a hot glue gun to secure it in place.
Remember the rule of measure twice, cut once. You can always cut out more, but you can't easily fill in space if you cut out too much. You'll notice some rough edges in other pictures of my project, as I learned this lesson first hand, but it's not the end of the world. Have fun with it, accidents happen :) If you do cut too far and want to fix it, Bondo works wonders for smoothly filling in gaps, though a bit of hot glue also works, and painting later covers up a lot more than you'd expect.
Next, you'll want to sand down both bins. They will almost certainly have raised logos on the bottom that you will want to flatten, and sanding all exterior surfaces of the bins will help paint stick to the surface.
After sanding, wash down all of the exterior surfaces with soapy water to remove any oils and dust that could keep paint from adhering well. You can now paint the bins whenever you're ready. I used a medium gray color, though I've seen others use brown or green for their own versions. This is just the first coat, so don't worry about perfection - you're going to be cutting, drilling and attaching additional things to this prop. You'll want to keep a decent distance and lightly coat the bins with spray paint to avoid drip marks.
The feet can be one of the more complicated parts to build, depending on the tools that you have available. I was lucky enough to have a bandsaw available, but this can also be done with hand tools with a lot more time. I used a piece of 5.5 inch x 4 inch wood to cut into shape, though I've heard of others using polystyrene sheets cut and glued together. There was also a great pattern that I found online for measurements on cuts from someone in the Mandalorian Mercs Costume Club that is free to use, which helped a ton in getting these built. I did have to adjust the numbers a bit to fit the piece of wood, and I didn't add the side decorations or hole in the middle.
After your feet are cut, they should look similar to this (again, they probably won't be perfect, and that's OK):
With medium to fine grade sandpaper, sand the feet so that they are smooth and you can avoid any extra splinters that may have popped up while cutting. Painting the feet can be a bit more tricky. Wood is porous, so it tends to absorb paint and leave wood lines. While I did multiple coats of paint to (mostly) get around this, a friend suggested spraying them with clear coat as a primer, then spraying on your normal paint. I haven't personally tried this, but if anyone wants to give it a shot and comment below, it'd be nice to know :). For the paint, I went with a can of hammered metal so that they had a more textured look.
When you've finished applying multiple coats to the feet and you're happy with how they look, it's time to attach the feet to the bottom bin of the body with legs. You will need two pieces of 2"x4" wood roughly a foot long, a piece of 2x4 long enough to lay across the bottom of a bin, some drain piping, angle brackets and screws (I used two different sizes of drywall screws that I had left over from projects around the house - I want to say they were 3 1/2" and 5 1/2", or somewhere around there).
Side note before proceeding: because I attached the feet to the legs with angle brackets and screws, I had to leave the piping a little shorter than the length of the legs. If I were to do it again, I would have put screws through the bottom of the feet so that the piping could cover the entire area. It doesn't look too bad as it is, but I notice it as the maker and figured I'd mention it for anyone that wants to do better :p
Before you start attaching everything, you can try standing everything up to get spacing figured out.
Once you know how spacing will work, you can place the piece of wood that goes into the bottom bin into place,
and attach the angle brackets to the legs
After the brackets are attached to the legs, you can attach them to the bin. I got lucky in that the bin I picked up had a pattern that lent itself well to the placement of the legs.
Hold the drainage piping next to the legs to get an idea of how far you want to cut. Using a box cutter here worked great. I ended up cutting about an inch further so I could reach the screws going into the foot, as mentioned above, so this is an area you could definitely improve in the making process.
Finally, attach the feet via angle brackets and screws (or an improved method). At this point, your prop should look like this when you put on the top bin. You'll notice drill holes in the face - we'll get to that later. I did things a bit out of an optimal order as I figured out what I was doing :)
Let's wrap up the general shape by attaching the top bin to the rest of the prop. Because I wanted easy access to the electronics, as well as to be able to use the prop as a storage space, I attached a door hinge to both bins to keep them in place.
Once the basic frame is put together, it's time to add the electronics. This project uses a Raspberry Pi 3B flashed with Android Things and an AIY Voice Kit with additional LEDs.
To start, drill out the holes that you'll use for your LEDs in the face. You should use a drill bit that matches your LEDs in size - in my case it was 5mm. I planned to use five in a column, though you can do any pattern that you want.
Next, attach your Raspberry Pi inside of your prop. I mounted mine to the top wall near the face using spacers. Make sure you leave enough space on all sides for plugging in a power cord, ethernet/USB, and HDMI for debugging if you ever need to access a display with your prop. I found setting it up so that the pins are aligned with the middle of the prop works best for exposing the USB and ethernet ports.
The AIY Voice Kit is where things get a little more complicated. The kit comes with multiple holes for attaching expansion headers, and we will need to use two of the GPIO pin sets on the board for this tutorial.
I soldered header pins to Servo0 - Servo5 with the long end facing up. Once you attach the button, microphone and speaker (official documentation on setup here), your AIY kit should look similar to this (with or without the Raspberry Pi attached, and the speaker ground wire actually plugged in :p)
You will need to cut holes with a box cutter into your top bin fairly close to your Raspberry Pi for mounting the button and speaker. You will also want to drill two small holes for the microphone in the top of the bin where it will be mounted so that it can pick up a user's voice.
Although we will get into the code later for adding the Assistant SDK, this video will give you a general idea of what the outside looked like on mine.
Now it's time to add the LEDs. I strongly recommend setting them up on a breadboard first so that you know they work, are bright enough, and don't immediately blow out. I used three RGB fading LEDs, one yellow that will turn on and off over a set interval, and one blue that will flash when using the Google Assistant. I attached current limiting resistors (1kΩ: brown-black-red-gold) to each LED and ran them with a shared ground, put the three color fading LEDs into a parallel circuit, and planned to give the blue and yellow LEDs their own line back to the GPIO pins on the Pi for controlling their state.
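If you want to double-check your own resistor sizing, the usual formula is R = (V_supply - V_forward) / I. Here is a quick plain-Java sanity check; the 2 V forward drop and 3 mA target below are illustrative assumptions, not measured values from my LEDs:

```java
public class LedResistor {
    public static void main(String[] args) {
        double supplyVolts = 5.0;   // 5v line from the Pi hat
        double forwardVolts = 2.0;  // typical LED forward drop (assumption)
        double targetAmps = 0.003;  // ~3 mA for a dim indicator (assumption)
        double ohms = (supplyVolts - forwardVolts) / targetAmps;
        System.out.println(ohms + " ohms"); // 1000.0 ohms -> a 1k resistor fits
    }
}
```

Plug in your own LED's datasheet values; brighter LEDs at higher currents will want a smaller resistor.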
Once you know how you will set up your LEDs, hit the metal face of your prop with another coat of paint, as you won't be able to easily paint the face again after this point. After painting, place the LEDs into the holes you drilled earlier and secure them into place with hot glue (which, coincidentally, is a great insulator). After you have soldered on your resistors and ground/power wires, as mentioned above, I recommend capping the wires off with banana connectors so that you can disconnect from your Raspberry Pi easily.
I connected these wires to the Raspberry Pi hat by taking two sets of 3-pin female headers and soldering wires to the bottom with the complementary type of banana connector. The wires for the blue and yellow LEDs went onto pin 1 of their respective rows, the middle pin of the top row is 5v out for driving the color changing LEDs, and the last pin in the top row was the common ground. I then attached this header to Servo4 and Servo5 on the AIY Voice Hat.
In the end, your setup should look something like this (though this picture was before wrapping up extra areas with electrical tape, and the soldering is really bad :p)
Once you're done attaching everything, you're all set with the electronics hardware. There is one additional step that I did later in the process that you would probably want to do now: adding power. I purchased a fairly cheap USB and outlet power strip,
then cut a hole in the back of the prop (because I knew it wasn't going to look very pretty)
and attached the outlet with angle brackets and a lot of hot glue to hold it into place.
I also ran a longer extension cable with two power outlets on it into the prop so that I could power the Raspberry Pi and the outlet.
After you've finished attaching everything, paint the body of the prop one last time to cover any scratches or new screws/nuts/bolts that were added. Be sure to pull the microphone away from its mounted position (which is why I taped it) and reattach it once the paint has dried.
Project Setup

This project uses Android Things on the Raspberry Pi. Rather than going into detail on how to set up Android Things in this tutorial, I have written a separate guide that goes over using the flashing tool and connecting to a wireless network. Once your device is flashed, you can connect to your Raspberry Pi by opening a terminal and running the command ./adb connect x.x.x.x, where the second argument is the wireless IP address for your device.
Once your device is set up, it's time to create your project in Android Studio. You will need two modules: an Android Things module, and a Phone/Tablet module for the companion app that will be added later. Accept the default settings for Empty Activities on both.
Blinking an LED

To start the project off, we'll keep things simple by blinking an LED. Open MainActivity.java in the things module. For this tutorial, I plugged the yellow LED that will blink every second into Servo4's data pin, which translates to BCM12. We can save that value at the top of the class, along with the Gpio member that will drive the LED. You will also need to create and initialize a Handler for the runnable that will blink the LED.
private Gpio mBlinker;
private String LED_BLINKER = "BCM12";
private Handler mHandler = new Handler();
Under the onCreate() method, add the following code to access the PeripheralManagerService and initialize the Gpio object.
try {
    PeripheralManagerService service = new PeripheralManagerService();
    mBlinker = service.openGpio(LED_BLINKER);
    mBlinker.setDirection(Gpio.DIRECTION_OUT_INITIALLY_LOW);
    mHandler.post(mBlinkRunnable);
} catch( IOException e ) {}
Finally, you will need to create the Runnable that drives the blinking LED.
private Runnable mBlinkRunnable = new Runnable() {
    @Override
    public void run() {
        try {
            mBlinker.setValue(!mBlinker.getValue());
            mHandler.postDelayed(mBlinkRunnable, 1000);
        } catch (IOException e) {}
    }
};
Now when you install your app on the Raspberry Pi, you should see an LED blink on and off every second. The final step is to properly tear down your components in onDestroy().
if( mBlinker != null ) {
    try {
        mBlinker.close();
    } catch( IOException e ) {}
}
mHandler.removeCallbacks(mBlinkRunnable);
Adding Google Assistant

The Google Assistant lets you add voice functionality to a device for asking questions and getting general information about topics including your schedule, traffic, weather, and reminders. All code for this section will occur in the things module of your app.
Before we can get into writing code, we will first need to generate credentials for accessing the Assistant SDK. You can find documentation for creating your credentials and enabling the Google Assistant API in the official sample here, so go ahead and get that set up before moving back to your Android app. The credentials.json file that gets generated during this process will go into the res/raw directory of the things module.
After your credentials are created with Google, you will need to declare some permissions for your app. Open the AndroidManifest.xml file and add the following lines within the manifest tag, but before the application tag.
<!-- Necessary for Google Assistant -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="com.google.android.things.permission.MANAGE_AUDIO_DRIVERS" />
It's worth noting that you will need to restart your device after installing the app with these permissions in order for them to be granted.
Next you will need to copy the gRPC module into your app for communicating with the Assistant SDK. This gets a little tricky, so the best place to get it is from the Google Assistant Android Things sample app, which can be found here. You will then need to update your settings.gradle file to reflect the new module.
include ':app', ':grpc', ':mobile'
After updating settings.gradle, add the module as a dependency by including the following line in the things module's build.gradle file, along with Google's Voice Hat driver,
implementation project(':grpc')
implementation 'com.google.android.things.contrib:driver-voicehat:0.6'
and include protobuf as a dependency in your project level build.gradle file.
classpath "com.google.protobuf:protobuf-gradle-plugin:0.8.2"
At this point your project should sync and compile. Your modules on the side of Android Studio may look like this:
Next, let's include the oauth2 library in our project by opening the things module's build.gradle file and adding the following under the dependencies node:
compile('com.google.auth:google-auth-library-oauth2-http:0.6.0') {
    exclude group: 'org.apache.httpcomponents', module: 'httpclient'
}
You may run into conflicts here if your project has the Espresso dependency with an error message similar to this:
Warning:Conflict with dependency 'com.google.code.findbugs:jsr305' in project ':things'. Resolved versions for app (1.3.9) and test app (2.0.1) differ. See http://g.co/androidstudio/app-test-app-conflict for details.
If so, just remove the Espresso dependency from build.gradle.
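Alternatively, if you'd rather keep Espresso, forcing a single jsr305 version should also resolve the clash. This is a sketch using Gradle's resolutionStrategy in the things module's build.gradle (the version shown is the one from the error message above; I haven't verified this approach on this exact project):

```groovy
// Force one jsr305 version across the app and test configurations
configurations.all {
    resolutionStrategy.force 'com.google.code.findbugs:jsr305:2.0.1'
}
```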
After you have synced your project, create a new class named Credentials.java to access your credentials.
public class Credentials {
    static UserCredentials fromResource(Context context, int resourceId)
            throws IOException, JSONException {
        InputStream is = context.getResources().openRawResource(resourceId);
        byte[] bytes = new byte[is.available()];
        is.read(bytes);
        JSONObject json = new JSONObject(new String(bytes, "UTF-8"));
        return new UserCredentials(
                json.getString("client_id"),
                json.getString("client_secret"),
                json.getString("refresh_token"));
    }
}
This will pull the credentials from your credentials.json file when they are needed by your app.
Next we will create the EmbeddedAssistant.java file, which is a helper class for interfacing with the Assistant SDK, and will be taken directly from the Google Assistant sample. You can find the file here.
After creating your EmbeddedAssistant class, head back into MainActivity. You will need to create the following static values at the top of the class
private static final int BUTTON_DEBOUNCE_DELAY_MS = 20;
private static final String PREF_CURRENT_VOLUME = "current_volume";
private static final int SAMPLE_RATE = 16000;
private static final int DEFAULT_VOLUME = 100;
as well as these four member variables
private Max98357A mDac;
private EmbeddedAssistant mEmbeddedAssistant;
private AudioDeviceInfo audioInputDevice = null;
private AudioDeviceInfo audioOutputDevice = null;
The DAC (Digital-to-Analog-Converter) is built into the AIY Voice Hat with the driver available in the Voice Hat driver libraries that we included earlier in build.gradle, and is used to control audio input and output from your device. The EmbeddedAssistant object is how we will interact with the EmbeddedAssistant helper class, and the AudioDeviceInfo objects will keep track of which hardware devices are operating as a speaker and microphone (through the AIY Voice Hat).
Finally, add the information needed to blink the Assistant LED (in my prop, the blue LED at the bottom of the column of LEDs) when interacting with the Google Assistant, as well as the button that will trigger the Google Assistant. BCM24 is the data pin for Servo5 on the AIY Voice Hat, which is where I connected the LED that is driven by interacting with voice operations.
private String LED_PIN = "BCM24";
private Button mButton;
private Gpio mLed;
Once your member values are created, go into onCreate() and initialize your AudioDeviceInfo objects. We will use the TYPE_BUS here to use the built in I2S bus on the Voice Hat. You can find more information about AudioDeviceInfo types here.
audioInputDevice = findAudioDevice(AudioManager.GET_DEVICES_INPUTS, AudioDeviceInfo.TYPE_BUS);
audioOutputDevice = findAudioDevice(AudioManager.GET_DEVICES_OUTPUTS, AudioDeviceInfo.TYPE_BUS);
In the above, findAudioDevice() is a helper method in MainActivity that we need to add:
private AudioDeviceInfo findAudioDevice(int deviceFlag, int deviceType) {
    AudioManager manager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    AudioDeviceInfo[] adis = manager.getDevices(deviceFlag);
    for (AudioDeviceInfo adi : adis) {
        if (adi.getType() == deviceType) {
            return adi;
        }
    }
    return null;
}
Next, initialize all of the hardware needed to use the Google Assistant like so
try {
    mDac = VoiceHat.openDac();
    mDac.setSdMode(Max98357A.SD_MODE_SHUTDOWN);
    PeripheralManagerService pioService = new PeripheralManagerService();
    mLed = pioService.openGpio(LED_PIN);
    mButton = VoiceHat.openButton();
    mButtonLight = VoiceHat.openLed();
    mButtonLight.setValue(false);
    mButton.setDebounceDelay(BUTTON_DEBOUNCE_DELAY_MS);
    mButton.setOnButtonEventListener(this);
    mLed.setDirection(Gpio.DIRECTION_OUT_INITIALLY_HIGH);
    mLed.setActiveType(Gpio.ACTIVE_HIGH);
} catch (IOException e) {
    return;
}
After the hardware is initialized, you can retrieve the default volume for the Assistant and your credentials
SharedPreferences preferences = PreferenceManager.getDefaultSharedPreferences(this);
int initVolume = preferences.getInt(PREF_CURRENT_VOLUME, DEFAULT_VOLUME);
UserCredentials userCredentials = null;
try {
    userCredentials =
            EmbeddedAssistant.generateCredentials(this, R.raw.credentials);
} catch (IOException | JSONException e) {}
To wrap up onCreate(), use the EmbeddedAssistant Builder object to create and configure the Assistant helper and all of the callbacks that will be used for changing device volume and updating our LED.
mEmbeddedAssistant = new EmbeddedAssistant.Builder()
        .setCredentials(userCredentials)
        .setAudioInputDevice(audioInputDevice)
        .setAudioOutputDevice(audioOutputDevice)
        .setAudioSampleRate(SAMPLE_RATE)
        .setAudioVolume(initVolume)
        .setRequestCallback(new EmbeddedAssistant.RequestCallback() {})
        .setConversationCallback(new ConversationCallback() {
            @Override
            public void onResponseStarted() {
                if (mDac != null) {
                    try {
                        mDac.setSdMode(Max98357A.SD_MODE_LEFT);
                        mLed.setValue(false);
                    } catch (IOException e) {}
                }
            }

            @Override
            public void onResponseFinished() {
                if (mDac != null) {
                    try {
                        mDac.setSdMode(Max98357A.SD_MODE_SHUTDOWN);
                        mLed.setValue(true);
                    } catch (IOException e) {}
                }
            }

            @Override
            public void onConversationEvent(EventType eventType) {}

            @Override
            public void onAudioSample(ByteBuffer audioSample) {
                try {
                    mLed.setValue(!mLed.getValue());
                } catch (IOException e) {}
            }

            @Override
            public void onConversationError(Status error) {}

            @Override
            public void onError(Throwable throwable) {}

            @Override
            public void onVolumeChanged(int percentage) {
                Editor editor = PreferenceManager
                        .getDefaultSharedPreferences(MainActivity.this)
                        .edit();
                editor.putInt(PREF_CURRENT_VOLUME, percentage);
                editor.apply();
            }

            @Override
            public void onConversationFinished() {}
        })
        .build();
mEmbeddedAssistant.connect();
Once onCreate() is fleshed out, you will need to implement the OnButtonEventListener interface in your class by updating your class declaration line with the interface
implements Button.OnButtonEventListener
This listener will simply turn the Assistant LED on/off, depending on the state. The button acting as a trigger for the Assistant SDK is handled by the VoiceHat driver.
@Override
public void onButtonEvent(Button button, boolean pressed) {
    try {
        if (mLed != null) {
            mLed.setValue(!pressed);
        }
    } catch (IOException e) {}
}
Finally, you will need to properly tear down all of the hardware used in onDestroy().
if (mLed != null) {
    try {
        mLed.close();
    } catch (IOException e) {}
    mLed = null;
}
if (mButton != null) {
    try {
        mButton.close();
    } catch (IOException e) {}
    mButton = null;
}
if (mDac != null) {
    try {
        mDac.close();
    } catch (IOException e) {}
    mDac = null;
}
mEmbeddedAssistant.destroy();
At this point your prop should look something like this
If it does, great job! You're over the largest hump of starting a project: building out the hardware and getting the Assistant SDK added. From this point on, it's all extra details and 'nice-to-have' features. Let's dive into the next portion :)
Nearby Connections 2.0 and Companion Android App

While the Assistant requires a stable wireless connection, that isn't always going to be available at other locations, such as conventions. To get around this, I decided to create a companion app that lets me send commands to the device. The three commands I decided to initially support are
- Rebooting the device
- Playing specific sounds from a limited pool of locally stored files
- Looping sounds in random intervals from a limited pool of locally stored files
In order to get the device communicating with the companion app, I decided to use the Nearby Connections 2.0 API from Google Play Services, as it came out somewhat recently and I haven't had a chance to play with it yet. In this scenario, the prop will act as an advertiser, announcing itself to other devices that would like to connect to it, and the mobile app will be a discoverer, which looks for and connects to advertisers.
To start using the Nearby Connections API, you will need to include it as a dependency in both the things and mobile modules' build.gradle files. You will probably want to adjust the version from 11.6.2 to whatever the latest version is that runs on your devices - 11.6.2 was just the latest at the time of writing.
compile 'com.google.android.gms:play-services-nearby:11.6.2'
After including the dependency and syncing in Android Studio, you will need to open both AndroidManifest.xml files and include all necessary permissions. Remember that your device will need to reboot after installing in order for the permissions to be granted.
<!-- Necessary for Nearby API -->
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
Nearby Connections: Advertiser

Once the permissions have been set, we will create two new classes in the things module: DroidCommands.java and CommandParser.java. CommandParser will take a String and attempt to figure out what operation should be performed,
public class CommandParser {
    private DroidCommands mListener;
    private String COMMAND_REBOOT = "reboot";
    private String COMMAND_BACKGROUND_LOOP = "loop";
    private String COMMAND_PLAY_SOUND = "playsound";

    public CommandParser(DroidCommands listener) {
        mListener = listener;
    }

    public void parse(String payload) {
        String[] arguments = payload.split(" ", 2);
        if( arguments[0].equalsIgnoreCase(COMMAND_REBOOT) ) {
            mListener.onRestartCommand();
        } else if( arguments[0].equalsIgnoreCase(COMMAND_BACKGROUND_LOOP) && arguments.length > 1 ) {
            if( arguments[1].equalsIgnoreCase("on") ) {
                mListener.onBackgroundSoundToggle(true);
            } else if( arguments[1].equalsIgnoreCase("off") ) {
                mListener.onBackgroundSoundToggle(false);
            }
        } else if( arguments[0].equalsIgnoreCase(COMMAND_PLAY_SOUND) && arguments.length > 1 ) {
            // Only accept whole numbers, since the value is parsed as an int
            if( arguments[1].matches("\\d+") ) {
                mListener.onPlaySound(Integer.valueOf(arguments[1]));
            }
        }
    }
}
and DroidCommands is an interface for communicating between MainActivity and CommandParser.
public interface DroidCommands {
    void onRestartCommand();
    void onBackgroundSoundToggle(boolean enabled);
    void onPlaySound(int sound);
}
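As a quick sanity check on the command grammar, here is a self-contained, plain-Java sketch of the same dispatch logic. It has no Android dependencies; the recorded strings mirror the DroidCommands method names, but everything here is local to the demo:

```java
import java.util.ArrayList;
import java.util.List;

public class CommandDemo {
    // Mirrors CommandParser.parse(), but records the would-be callback
    // invocations in a list instead of calling a DroidCommands listener.
    static List<String> dispatch(String payload) {
        List<String> calls = new ArrayList<>();
        String[] arguments = payload.split(" ", 2);
        if (arguments[0].equalsIgnoreCase("reboot")) {
            calls.add("onRestartCommand");
        } else if (arguments[0].equalsIgnoreCase("loop") && arguments.length > 1) {
            calls.add("onBackgroundSoundToggle:" + arguments[1].equalsIgnoreCase("on"));
        } else if (arguments[0].equalsIgnoreCase("playsound") && arguments.length > 1
                && arguments[1].matches("\\d+")) {
            calls.add("onPlaySound:" + Integer.valueOf(arguments[1]));
        }
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(dispatch("reboot"));        // [onRestartCommand]
        System.out.println(dispatch("loop on"));       // [onBackgroundSoundToggle:true]
        System.out.println(dispatch("playsound 3"));   // [onPlaySound:3]
        System.out.println(dispatch("playsound abc")); // []
    }
}
```

Running something like this on your desktop first makes it easy to tweak the grammar before pushing to the Pi.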
Next, open MainActivity in the things module and add the following interfaces to the class declaration.
GoogleApiClient.ConnectionCallbacks, GoogleApiClient.OnConnectionFailedListener, DroidCommands
The first two interfaces define three methods that you will need to stub out. We will not use them in this class, but they are available for expanding out your project with other useful classes in Google Play Services.
@Override
public void onConnected(@Nullable Bundle bundle) {}
@Override
public void onConnectionSuspended(int i) {}
@Override
public void onConnectionFailed(@NonNull ConnectionResult connectionResult) {}
DroidCommands adds three more methods that will perform specific operations on your Android Things device. We will return to those after we cover how the advertiser connects with another device.
You will need to initialize the following member values at the top of the class to support the Nearby Connections API, as well as the additional functionality that will be handled by the Android Things device.
private static final String SERVICE_ID = "UNIQUE_SERVICE_ID";
private int buttonPressCounter = 0;
private Gpio mButtonLight;
private Handler mConsoleTimerHandler = new Handler();
private Handler mHandler = new Handler();
private Handler mLoopingSoundHandler = new Handler();
private GoogleApiClient mGoogleApiClient;
private String endpoint;
private CommandParser mParser;
private WifiManager wifiManager;
In onCreate(), initialize and connect to the Google API Client like so
mGoogleApiClient = new GoogleApiClient.Builder(this, this, this)
        .addApi(Nearby.CONNECTIONS_API)
        .enableAutoManage(this, this)
        .build();
You will also want to add the following code to onCreate() to check and (if off) enable WiFi on device boot. This section is required because the Nearby Connections API will automatically turn off standard wireless in order to advertise itself, and if the device restarts in the middle of advertising, it will reboot with the device's wireless disabled.
wifiManager = (WifiManager) getSystemService(Context.WIFI_SERVICE);
wifiManager.setWifiEnabled(true);
Finally, in onCreate(), create a new CommandParser object that will be used later in this section.
mParser = new CommandParser(this);
In order to enter console mode with the droid, we need a way to trigger starting advertisement on the Nearby Connections API. I decided to use the Google Assistant button on top of my prop to trigger entering console mode if the button is pressed three to six times in five seconds. We can do this by keeping track of the number of times the button is pressed in onButtonEvent() and maintaining a timer in a Runnable that will reset the counter once it has fired. In addition, we can check if the device is connected to a wireless network or not and play a sound rather than attempting to connect to the Assistant SDK from this method.
if (pressed) {
    if( buttonPressCounter == 0 ) {
        mConsoleTimerHandler.postDelayed(mConsoleModeRunnable, 5000);
    }
    buttonPressCounter++;
    if( wifiManager.isWifiEnabled() ) {
        mEmbeddedAssistant.startConversation();
    } else {
        playSound(R.raw.droid_gonk_03);
    }
}
We will go into the playSound() method in a later section.
mConsoleModeRunnable will check the button counter and start advertising through the Nearby Connections API if it matches our requirements.
private Runnable mConsoleModeRunnable = new Runnable() {
    @Override
    public void run() {
        if( buttonPressCounter >= 3 && buttonPressCounter <= 6) {
            startAdvertising();
        }
        buttonPressCounter = 0;
    }
};
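To make the windowed-count behavior easier to reason about (and to test off-device), the same idea can be sketched in plain Java without a Handler. The class below is purely illustrative; windowExpired() plays the role of mConsoleModeRunnable firing after the 5-second delay:

```java
public class PressWindowDemo {
    // Counts button presses during one timing window.
    static class PressWindow {
        private int presses = 0;

        void press() { presses++; }

        // Called when the window timer fires: returns whether console mode
        // should start (3 to 6 presses), then resets the counter.
        boolean windowExpired() {
            boolean enterConsole = presses >= 3 && presses <= 6;
            presses = 0;
            return enterConsole;
        }
    }

    public static void main(String[] args) {
        PressWindow w = new PressWindow();
        for (int i = 0; i < 4; i++) w.press();
        System.out.println(w.windowExpired()); // true: 4 presses is in [3, 6]
        w.press();
        System.out.println(w.windowExpired()); // false: only 1 press this window
    }
}
```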
startAdvertising() will, as the name implies, start advertising over the API. The service ID helps both devices know if they should be able to connect, the ConnectionLifecycleCallback object (which we will define next) handles callbacks during the connection lifecycle, and the connection strategy that we are using, P2P_STAR, tells the API that we are using a 1-to-N topology, where any number of clients may connect to our Android Things device.
private void startAdvertising() {
    Nearby.Connections.startAdvertising(
            mGoogleApiClient,
            "Droid",
            SERVICE_ID,
            mConnectionLifecycleCallback,
            new AdvertisingOptions(Strategy.P2P_STAR));
}
The ConnectionLifecycleCallback object has three methods that must be defined: onConnectionInitiated(), onConnectionResult(), and onDisconnected(). onConnectionInitiated() is where the majority of our work occurs: when a client attempts to connect to the prop, we save the endpoint ID and attempt to accept the connection. If the connection succeeds, we light up the button on top of the prop.
When the client disconnects from the prop, we will turn the button light off and stop advertising over the API.
private final ConnectionLifecycleCallback mConnectionLifecycleCallback =
new ConnectionLifecycleCallback() {
@Override
public void onConnectionInitiated(String endpointId, ConnectionInfo connectionInfo) {
endpoint = endpointId;
Nearby.Connections.acceptConnection(mGoogleApiClient, endpointId, mPayloadCallback)
.setResultCallback(new ResultCallback<com.google.android.gms.common.api.Status>() {
@Override
public void onResult(@NonNull com.google.android.gms.common.api.Status status) {
if( status.isSuccess() ) {
try {
mButtonLight.setValue(true);
} catch (IOException e) {
}
}
}
});
Nearby.Connections.stopAdvertising(mGoogleApiClient);
}
@Override
public void onConnectionResult(String endpointId, ConnectionResolution result) {}
@Override
public void onDisconnected(String endpointId) {
try {
mButtonLight.setValue(false);
} catch( IOException e ) {}
}
};
You may have noticed that we also use a PayloadCallback object when accepting the connection. This object receives data sent from a client to the prop. The onPayloadReceived() method is where we take that payload, convert it to a String, and pass it to the CommandParser.
private PayloadCallback mPayloadCallback = new PayloadCallback() {
@Override
public void onPayloadReceived(String s, Payload payload) {
mParser.parse(new String(payload.asBytes()));
}
@Override
public void onPayloadTransferUpdate(String s, PayloadTransferUpdate payloadTransferUpdate) {}
};
Finally, make sure that you properly tear down the LED in the button when onDestroy() is called.
if( mButtonLight != null ) {
try {
mButtonLight.close();
} catch( IOException e ) {}
mButtonLight = null;
}
Now that your prop can advertise, it's time to dive into how to connect a client to the droid in order to send commands.
Nearby Connections: Discoverer
In an earlier section you should have added the Nearby Connections API dependency and permissions to the mobile client codebase. If not, now is the time to do so. Next, update the activity_main.xml layout file with the following code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.ptrprograms.droidconsole.MainActivity">
<ListView
android:id="@+id/list"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_above="@+id/input_container"/>
<LinearLayout
android:id="@+id/input_container"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true">
<EditText
android:id="@+id/edittext"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_weight="4"/>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_weight="1"
android:text="Send"
android:onClick="sendPayload"/>
</LinearLayout>
</RelativeLayout>
The above layout consists of a ListView that will show sent commands and received responses, and an EditText field with a Button for sending commands to the prop.
Going into the MainActivity class, implement the two Google Play Services callback interfaces that were also used in the advertiser.
implements GoogleApiClient.ConnectionCallbacks, GoogleApiClient.OnConnectionFailedListener
The only method from these interfaces that you will need to update is onConnected(), which will call a new helper method, startDiscovery() (defined later), once the app has connected to Google Play Services.
@Override
public void onConnected(@Nullable Bundle bundle) {
startDiscovery();
}
Next, add the following items to the top of the class:
private static final String SERVICE_ID = "UNIQUE_SERVICE_ID";
private GoogleApiClient mGoogleApiClient;
private String mEndpoint;
private ArrayAdapter<String> mAdapter;
private EditText mEditText;
private ListView mListView;
Similar to the advertising device, you will need a PayloadCallback and a ConnectionLifecycleCallback. When onPayloadReceived() is triggered, you will add the received text to the ListView, and when onConnectionInitiated() is called, you will call acceptConnection() just like the advertiser, but will also stop attempting to discover other devices.
private PayloadCallback mPayloadCallback = new PayloadCallback() {
@Override
public void onPayloadReceived(String s, Payload payload) {
addText(new String(payload.asBytes()));
}
@Override
public void onPayloadTransferUpdate(String s, PayloadTransferUpdate payloadTransferUpdate) {}
};
private final ConnectionLifecycleCallback mConnectionLifecycleCallback =
new ConnectionLifecycleCallback() {
@Override
public void onConnectionInitiated(String endpointId, ConnectionInfo connectionInfo) {
Nearby.Connections.acceptConnection(mGoogleApiClient, endpointId, mPayloadCallback);
mEndpoint = endpointId;
Nearby.Connections.stopDiscovery(mGoogleApiClient);
}
@Override
public void onConnectionResult(String endpointId, ConnectionResolution result) {}
@Override
public void onDisconnected(String endpointId) {}
};
In addition to the above callbacks, you will also need an EndpointDiscoveryCallback, which will check any discovered endpoints to ensure that the service ID matches before attempting to initiate a connection.
private final EndpointDiscoveryCallback mEndpointDiscoveryCallback =
new EndpointDiscoveryCallback() {
@Override
public void onEndpointFound(
String endpointId, DiscoveredEndpointInfo discoveredEndpointInfo) {
if( discoveredEndpointInfo.getServiceId().equalsIgnoreCase(SERVICE_ID)) {
Nearby.Connections.requestConnection(
mGoogleApiClient,
"Droid",
endpointId,
mConnectionLifecycleCallback);
}
}
@Override
public void onEndpointLost(String endpointId) {
addText("Disconnected");
}
};
The onCreate() method is fairly straightforward. It initializes the views and adapter used in our app and connects to Google Play Services.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initViews();
mGoogleApiClient = new GoogleApiClient.Builder(this, this, this)
.addApi(Nearby.CONNECTIONS_API)
.enableAutoManage(this, this)
.build();
}
The last thing to do is define the helper methods used by this class. The first, initViews(), creates references to the views in our XML file and sets them up.
private void initViews() {
mEditText = (EditText) findViewById(R.id.edittext);
mListView = (ListView) findViewById(R.id.list);
mAdapter = new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1, android.R.id.text1);
mListView.setAdapter(mAdapter);
}
sendPayload() takes the contents of the EditText, sends it to the advertiser, clears the view, and adds the text to the ListView.
public void sendPayload(View v) {
String text = mEditText.getText().toString();
addText(text);
mEditText.setText("");
Nearby.Connections.sendPayload(mGoogleApiClient, mEndpoint, Payload.fromBytes(text.getBytes()));
}
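One subtle point: text.getBytes() and new String(bytes) both use the platform's default charset. Android defaults to UTF-8 on both ends, so this works, but making the charset explicit is cheap insurance if either side of the code ever runs elsewhere. A small sketch (the PayloadCodec helper name is my own, not part of the original code):

```java
import java.nio.charset.StandardCharsets;

// Encodes commands with an explicit charset before wrapping them in a Payload,
// and decodes them the same way on the other side, so both ends always agree
// regardless of the platform's default charset.
public class PayloadCodec {
    public static byte[] encode(String text) {
        return text.getBytes(StandardCharsets.UTF_8);
    }

    public static String decode(byte[] bytes) {
        return new String(bytes, StandardCharsets.UTF_8);
    }
}
```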
addText() takes a String and adds it to the ListView's ArrayAdapter.
private void addText(String text) {
mAdapter.add(text);
mAdapter.notifyDataSetChanged();
mListView.post(new Runnable() {
@Override
public void run() {
mListView.setSelection(mListView.getCount() - 1);
}
});
}
Finally, startDiscovery() works very similarly to startAdvertising() on the Android Things app.
private void startDiscovery() {
Nearby.Connections.startDiscovery(
mGoogleApiClient,
SERVICE_ID,
mEndpointDiscoveryCallback,
new DiscoveryOptions(Strategy.P2P_STAR));
}
When this app is run and connected to the prop, you should be able to send the commands reboot, playsound #, and loop on/off. The screen of your app will look similar to this
Rebooting is the most common way to reset a device's state. While you could unplug and replug the device, there's a good chance that the outlet won't always be in an easy-to-reach location. To get around this, the console app can send the reboot command to restart the device. Android Things makes this easy with the DeviceManager class. When the command is received, the following callback in the things module's MainActivity class is triggered, which responds to the mobile client and then reboots the device.
@Override
public void onRestartCommand() {
Nearby.Connections.sendPayload(mGoogleApiClient, endpoint, Payload.fromBytes("Rebooting...".getBytes()));
DeviceManager deviceManager = new DeviceManager();
deviceManager.reboot();
}
In addition, if you want to reboot the device without connecting through an app, you can update mConsoleModeRunnable to reboot if the Assistant button is pressed seven or more times in five seconds.
private Runnable mConsoleModeRunnable = new Runnable() {
@Override
public void run() {
if( buttonPressCounter >= 3 && buttonPressCounter <= 6) {
startAdvertising();
} else if( buttonPressCounter >= 7 ) {
mLoopingSoundHandler.removeCallbacks(mLoopingNoiseRunnable);
DeviceManager manager = new DeviceManager();
manager.reboot();
}
buttonPressCounter = 0;
}
};
Playing Sounds
To understand how sounds can be played through the AIY Voice Hat, it helps to know that Assistant responses are actually small audio clips that are sent back to the device and played for the user. Using the same mechanism, we can create a new method in the EmbeddedAssistant class named playSound() that accepts an InputStream and plays its contents for the user.
public void playSound(InputStream inputStream) throws IOException {
AudioTrack audioTrack = new AudioTrack.Builder()
.setAudioFormat(mAudioOutputFormat)
.setBufferSizeInBytes(mAudioOutputBufferSize)
.setTransferMode(AudioTrack.MODE_STREAM)
.build();
if (mAudioOutputDevice != null) {
audioTrack.setPreferredDevice(mAudioOutputDevice);
}
audioTrack.setVolume(AudioTrack.getMaxVolume() * mVolume / 100.0f);
audioTrack.play();
ByteBuffer buffer = ByteBuffer.wrap(IoUtils.toByteArray(inputStream));
audioTrack.write(buffer, buffer.remaining(),
AudioTrack.WRITE_BLOCKING);
audioTrack.stop();
audioTrack.release();
}
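playSound() relies on IoUtils.toByteArray() from the Assistant sample's utility code to drain the stream into a byte array. If your project doesn't include that helper, an equivalent plain-Java version is easy to write:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamUtils {
    // Reads an InputStream to completion and returns its contents as a byte
    // array, equivalent to the IoUtils.toByteArray() call used in playSound().
    public static byte[] toByteArray(InputStream inputStream) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int read;
        while ((read = inputStream.read(chunk)) != -1) {
            out.write(chunk, 0, read);
        }
        return out.toByteArray();
    }
}
```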
Back in MainActivity, we can create another playSound() method that accepts a resource id and generates the InputStream that is sent to the EmbeddedAssistant class.
private void playSound(int sound) {
InputStream inputStream = getResources().openRawResource(sound);
try {
mDac.setSdMode(Max98357A.SD_MODE_LEFT);
mEmbeddedAssistant.playSound(inputStream);
} catch( IOException e ) {}
}
You'll notice that the DAC's mode has to be set before playing the sound; this lets the device know that the AIY Voice Hat's audio bus is in use.
Now that you know how to play a sound, you can update the onPlaySound() callback method to look up a sound by the argument that was passed to it and play it, while also responding to the client.
@Override
public void onPlaySound(int sound) {
Nearby.Connections.sendPayload(mGoogleApiClient, endpoint, Payload.fromBytes("Attempting to play sound...".getBytes()));
playSoundByIndex(sound);
}
private void playSoundByIndex(int index) {
switch( index ) {
case 1: {
playSound(R.raw.droid_gonk_01);
break;
}
case 2: {
playSound(R.raw.droid_gonk_02);
break;
}
case 3: {
playSound(R.raw.droid_gonk_03);
break;
}
case 4: {
playSound(R.raw.droid_gonk_04);
break;
}
case 5: {
playSound(R.raw.droid_gonk_die_01);
break;
}
case 6: {
playSound(R.raw.droid_gonk_die_02);
break;
}
}
}
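The switch above works, but the same mapping can also be kept in a single array, which makes the one-based indexing explicit and keeps the sound list in one place. In this sketch the R.raw resource ids are swapped for plain strings so it can run outside Android; on the device the array would hold ints like R.raw.droid_gonk_01:

```java
public class SoundTable {
    // Sound names stand in for the R.raw resource ids used on the device.
    private static final String[] SOUNDS = {
            "droid_gonk_01", "droid_gonk_02", "droid_gonk_03",
            "droid_gonk_04", "droid_gonk_die_01", "droid_gonk_die_02"
    };

    // Returns the sound for a 1-based index, or null for out-of-range values,
    // mirroring how playSoundByIndex() silently ignores unknown indices.
    public static String lookup(int index) {
        if (index < 1 || index > SOUNDS.length) {
            return null;
        }
        return SOUNDS[index - 1];
    }
}
```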
Looping Sounds/Demo Mode
The final part of the console commands is the ability to put the prop into a demo mode. This mode loops through the droid's sounds without any additional interaction. The logic here is a combination of playing sounds and blinking a single LED.
private Handler mLoopingSoundHandler = new Handler();
private Runnable mLoopingNoiseRunnable = new Runnable() {
@Override
public void run() {
playSoundByIndex(new Random().nextInt(5) + 1);
mLoopingSoundHandler.postDelayed(mLoopingNoiseRunnable, 5000);
}
};
@Override
public void onBackgroundSoundToggle(boolean enabled) {
if( enabled ) {
mLoopingSoundHandler.post(mLoopingNoiseRunnable);
} else {
mLoopingSoundHandler.removeCallbacks(mLoopingNoiseRunnable);
}
Nearby.Connections.sendPayload(mGoogleApiClient, endpoint, Payload.fromBytes("Success".getBytes()));
}
Finishing the Prop
With all of the coding finished, it's time to wrap up how your prop looks. To make the two bins blend together better, you can wrap the middle with a rubber garage door seal. I attached the seal to my prop with more screws driven into the top bin. Be sure to attach the seal to only one bin, as you will want to be able to keep opening and closing the prop.
Finally, take some time and find some greeblies (decorations) to glue to your new droid prop. I have a somewhat inexpensive 3D printer at home (a Prusa i3 clone) that I used to print off some rocker switches from a TIE pilot chest box prop and C3PO bicep decorations to attach to the face after painting, though you can find various things that would look great at a local hardware store. I also printed off some code discs from the TIE pilot set to cover the screws in the middle seal, though you can also use common screw caps.
For the finishing touches, I printed off a droid restraining bolt to attach to the front,
added a cover for the speaker, and put on some large Imperial decals that I picked up from a friend who owns a vinyl cutter.
Have some fun with this step and really make the project your own. After all of the hard work that goes into making it, it'll feel great to have a cool prop around the house that can also have a practical function.