Genymotion and the Android Emulator won't play nice with each other, so I had to use a physical device in my recording. The quality is not great - you'll have to turn the sound up. Sorry.
This is an overview of my final workflow. All communication between the wear device and the mobile device goes through the Listener Services and is passed via messages; each message is routed to the appropriate Activity/Service based on its message flag.
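Concretely, the routing can be sketched as a WearableListenerService that switches on the message path. This is just a sketch: the paths here are made up for the example, and the class names follow the components described below rather than my actual source.

```java
import android.content.Intent;
import com.google.android.gms.wearable.MessageEvent;
import com.google.android.gms.wearable.WearableListenerService;

public class MobileListenerService extends WearableListenerService {

    // Hypothetical message flags; the real values aren't shown here.
    private static final String PATH_PHOTO_REQUEST = "/photo_request";
    private static final String PATH_STOP = "/stop";

    @Override
    public void onMessageReceived(MessageEvent messageEvent) {
        // Route each incoming message to the right Activity/Service
        // based on its path (the "message flag").
        String path = messageEvent.getPath();
        if (PATH_PHOTO_REQUEST.equals(path)) {
            Intent intent = new Intent(this, PhotoActivity.class);
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            startActivity(intent);
        } else if (PATH_STOP.equals(path)) {
            Intent intent = new Intent(this, MainActivity.class);
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            startActivity(intent);
        }
    }
}
```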
The mobile device has a Main Activity screen that displays previous earthquakes. There are two Quake Services: one retrieves data about stored quakes and one listens for new quakes, both using the USGS API. Both Quake Services send their intents to the Location Service, which calculates distance using the Location API and then sends intents to the Map Activity and the Listener Service. The Map Activity displays the quake's location on the map. When the Listener Service receives a message back from the wear device requesting a photo, it passes an intent to the Photo Activity, which displays a photo from the Flickr API using the city and state as tags. The photo's metadata is then passed back to the Listener Service as an intent. When a stop request is received, the application returns to the Main Activity display.
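The distance step is the easiest piece to show in isolation; the Location API does the geodesic math for you. A minimal sketch (the class and method names wrapping it are my own for the example, not from the project):

```java
import android.location.Location;

public final class QuakeDistance {

    /** Returns the distance in kilometres between the user and the epicentre. */
    public static double distanceToQuakeKm(Location userLocation,
                                           double quakeLat, double quakeLon) {
        float[] results = new float[1];
        // Android's Location API writes the distance in metres into results[0].
        Location.distanceBetween(userLocation.getLatitude(),
                                 userLocation.getLongitude(),
                                 quakeLat, quakeLon, results);
        return results[0] / 1000.0;
    }
}
```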
The wear device has a Main Activity that displays "waiting for an earthquake"; all other displays are handled by the Quake Activity. When a quake comes in, the Listener Service sends intents to the Quake Activity and the Vibrate Service. When the user shakes the device, the Sensor Service sends a photo request through the Listener Service. Metadata from the photo is sent back through the Listener Service and displayed by the Quake Activity. This service stays on until the user taps out of the Quake Activity and back to the Main Activity.
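The shake gesture itself is standard accelerometer work. A rough sketch of the kind of detection the Sensor Service does, with an assumed threshold and callback name rather than my actual values:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class ShakeDetector implements SensorEventListener {

    // Illustrative threshold, in units of Earth gravity.
    private static final float SHAKE_THRESHOLD_G = 2.5f;

    public interface Listener {
        void onShake(); // e.g. ask the mobile for a photo
    }

    private final Listener listener;

    public ShakeDetector(Listener listener) {
        this.listener = listener;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Normalise each accelerometer axis by gravity and compare the
        // overall magnitude against the shake threshold.
        float gX = event.values[0] / SensorManager.GRAVITY_EARTH;
        float gY = event.values[1] / SensorManager.GRAVITY_EARTH;
        float gZ = event.values[2] / SensorManager.GRAVITY_EARTH;
        double gForce = Math.sqrt(gX * gX + gY * gY + gZ * gZ);
        if (gForce > SHAKE_THRESHOLD_G) {
            listener.onShake();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // No-op.
    }
}
```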