ActFirst is an app that alerts medically-trained bystanders to nearby emergencies and assists in first-response procedures. Users can instantly find and respond to alerts in their vicinity. Furthermore, ActFirst provides assistive tools and quick references for emergency procedures to further improve the quality of emergency treatment.
The Design:
Brainstorm:
During the brainstorming process, we compiled a list of every app we could think of that could utilize some feature of the smartwatch. Through a series of blind votes and discussions, we narrowed this list down from 50+ items to 5, then to 3, and finally to one.
In the end, we decided on developing ActFirst, an application that alerts medically-trained users to nearby emergencies in order to provide quick and efficient first response. We chose this app because we believed it had the potential to positively impact society. Also, because emergencies are time-sensitive by nature, we realized we could take full advantage of the smartwatch's greatest perks: its efficiency and convenience.
The above sketch was the very first sketch of how we envisioned our application. Evidently, the idea was still vague at this point, so the next step was to conduct some contextual inquiry to gain more insight into how we should approach it (see the section “User Studies” below for more information). From our interviews, we learned that slow response times are a very real problem, especially in rural areas, and that every minute lost can significantly reduce a patient's chance of survival. With this information, we realized our app had a very real problem to solve. We also gained a better idea of what a licensed medical professional would want in an app like ActFirst (patient information, navigation, timeliness), as well as what a trained but inexperienced user would want to see (reference books and procedural assistance). With this in mind, we were finally able to begin designing the features and interaction flow of our application.
Design Sketches/Wireframes
After conducting our contextual inquiry, we decided our app would perform several primary tasks. It would:
- Alert the user
- Help the user navigate to the scene
- Provide the user with procedural assistance
- Provide medical references
- Display AED locations on the map
We developed a fairly linear interaction flow, which began with accepting a notification, followed by opening up navigation to reach the scene, and finally using the app's assistive tools to perform first aid. After creating a rough sketch of our interaction flow, we moved on to further developing the application's tasks by creating and drawing out scenarios. This way, we could get a better idea of how our application might be used. At this point, we felt confident enough in our interaction flow to move on to a moderate-fidelity mock-up. While creating the mock-up, we decided it would be useful to add an estimated time of ambulance arrival, so that users would know what course of action to take during the rescue.
However, we realized our mock-ups did not follow certain watch conventions (wearable notifications did not work the way we assumed, we forgot that swiping back closes an app, and notifications on the watch look nothing like the notification in the picture above), so we further revised and embellished this design in our high-fidelity Framer prototype, this time making sure we adhered to watch conventions.
At this point, we felt like we were approaching our final iterations of the design cycle. Our prototype was ready to undergo usability testing (for more detailed results, see section “User Studies”, below).
The user tests revealed quite a few problems with our design. Users were confused about, or were unable to discover, certain features of our app (e.g. our CPR mode was very poorly labeled and not intuitive), and they also expressed concern that a short estimated time of ambulance arrival might discourage users from responding at all. We also realized our application did not have an aesthetically unifying theme. So, taking all the feedback into account, we iterated over the design cycle yet again.
This led us to our current and final design. We took out the ambulance ETA, redesigned our CPR mode to be more intuitive, redesigned the skin on both the mobile and watch sides to be more attractive and minimal, and added a “do not disturb” option that is easily accessible from both the watch and the phone. Finally, we made ending an emergency require pressing the end-emergency button twice, as a confirmation. The following is our final design:
We were only able to find two applications on the market that are similar to ActFirst: PulsePoint and GoodSAM. Like ActFirst, both apps notify their users of nearby emergencies, and both target medically qualified good Samaritans. However, both PulsePoint and GoodSAM have critical flaws that ActFirst solves.
GoodSAM: One major problem with GoodSAM is that it relies on individual observers to report emergencies to its system. This can introduce additional latency in verifying and routing reports, which can be costly when every minute counts. It also only works if the observer happens to be a GoodSAM user. ActFirst solves this problem by hooking up directly to dispatch, which means alerts will always arrive in a timely fashion.
PulsePoint: One critical flaw of PulsePoint is that it does not provide the user with navigational tools, only the address. This means users of PulsePoint would need to manually enter the location into a separate maps app, which wastes crucial time. ActFirst solves this by making navigation a priority: it provides GPS guidance to the scene.
Both: ActFirst improves upon both GoodSAM and PulsePoint by offering a smartwatch extension; with a smartwatch, our users are less likely to miss notifications and are able to respond more quickly, which is crucial because every second matters in an emergency. ActFirst also provides real-time CPR assistance, ensuring that even less experienced users can accurately and confidently provide first aid.
Personas
Although we chose a narrow group of target users, we realized they can still vary greatly in their expertise and experience in first response, which can significantly affect performance. So, when defining our personas, we prioritized variance in medical knowledge, training, experience, and the ability to perform in stressful situations. We found that two personas were enough to help us understand how our users might use our application:
Persona #1: Tony is a medical professional who is both well-trained and well-versed in first response, and is used to handling high-stress situations. He is always able to provide first aid with precision and confidence.
Persona #2: Jay is a college student who has been certified in both CPR and first aid. However, he does not work in the medical field and has never had a chance to practice his training. Although he is willing to help someone in need, he is not confident in his ability to provide quality aid under pressure, and has never experienced the stresses of an emergency.
Understanding how Tony, an experienced professional, would use ActFirst helped solidify the bare essentials that our app needed to save a life (navigation, information, timeliness, convenience). However, after evaluating Jay, who is trained but lacks practical experience, we realized training does not necessarily translate to preparedness. It is possible that users like Jay would need some sort of assistance or reassurance, in case they happen to blank out under the pressure. This is why we included quick procedure guides on the phone, as well as real-time CPR assistance on the watch.
Scenarios
To better understand how Tony and Jay would use ActFirst, we developed a scenario.
Task 1, accepting to help: A car accident occurs, and one of the drivers is unconscious at the scene. A bystander calls 911, but the paramedics will not arrive for another 9 minutes. Jay is enjoying his coffee two blocks away from the scene. Tony just got off work three blocks away. Both are unaware of what has just transpired. Jay and Tony each receive a buzz on their watches, notifying them of the situation. They both accept and quickly rush to the scene.
Task 2, navigating to the scene: Jay and Tony have just been separately notified of a person nearby suffering from cardiac arrest, and have quickly accepted to help. However, they are unsure of where the accident occurred, so they take out their phones, which automatically bring up a map of the area along with the patient's location. On their way there, their phones provide audio navigation so they do not have to keep looking at the screen. Additionally, Jay, who has not practiced in a while, skims through the CPR procedures listed under the “Guidebook” tab on the phone to refresh his CPR training so that he is ready to perform at the scene. Tony and Jay successfully arrive on the scene.
Task 3, CPR assistance: After assessing the patient, Tony starts performing CPR on the patient. Being a professional, Tony is confident in his ability to provide reliable aid, and so does not need to use our CPR assistance feature. After some time, Jay takes over to let Tony rest. However, Jay is not confident in his ability to provide accurate aid, so to make sure he stays on the correct compression rhythm, he taps the “begin CPR” option on his watch. The watch begins to vibrate at 100 BPM, the recommended pace of compression. Every 30 compressions, the watch reminds Jay to breathe air into the patient. Jay, who is not used to such emergency situations, is thankful for the assistance provided by the watch. Jay and Tony continue to take turns supporting the patient until the paramedics arrive.
User Studies
In order to keep the scope of our target users narrow, we decided that our users must be certified in some form of first aid in order to use the application. So, we went ahead and talked to people who fit this profile.
Contextual Inquiry
For our contextual inquiry, we tried to talk to people with varying degrees of medical expertise and experience, hoping that we would gain different perspectives and insights from each person.
This was exactly what happened. We talked to three people: a licensed EMT (T), a nurse and certified lifeguard (M), and a person who is CPR- and first-aid certified but is not a medical professional (J).
- From talking to T and M, who have had a lot of first-hand experience with first response, we learned that there are often situations where medical equipment is necessary to provide proper care. Also, being able to properly assess the situation is critical to choosing the right course of action.
- Since J is not a medical professional, her responses gave us much insight into how certified non-professionals who are not used to the pressures of an emergency might actually perform, and how our app could help alleviate some of those issues. It turns out that, even with training, our less experienced users may still blank out when faced with a high-stress situation.
Aggregating their feedback, we decided to go for a minimalist design, such that all important features are easily and quickly accessible. We also decided to include a tab that provides all the details dispatch has on the patient and the situation, a separate tab for quick procedural references, and procedural assistance to reassure inexperienced users, as well as to guarantee quality aid can be provided.
This gave us enough information to begin our next design iteration, as well as to create our high-fidelity Framer prototypes.
Usability Testing:
Once we had our framer prototype up and running, we were ready to begin user testing. We had two users test our prototype:
- Participant 1: A licensed EMT working for an ambulance company. We selected this user because we wanted an opinion from an expert. Since he is the most knowledgeable participant we could find, he would be able to point out any extraneous information or features in our app.
- Participant 2: Although he is trained in the use of AEDs and CPR, he has never had to apply his training in a real-life situation. We selected this user because we thought his behaviors during the experiment would allow us to gauge, very generally, how our less experienced users might use the application, and what features they may find to be helpful.
For the tests, we created a scenario for our participants to re-enact. The scenario started with an alert of an emergency, and the facilitator helped guide the participant through it, providing context and the location of the emergency. The participant then made his way to the scene, with assistance from our prototype, and began performing first aid. After the test, we discussed with the participants their thoughts, impressions, and reasons for their actions.
The usability tests turned out to be incredibly insightful. By inviting users who had no part in the design process to test our application, we were able to get a lot of good information about our app's usability without it being skewed by our own biases as designers.
We realized our CPR mode was not intuitive at all, and that too much information can make a user less inclined to respond. For example, if the ambulance is only going to arrive 1 minute later than the user, the user isn't motivated to go, even when statistics show that they could increase the probability of survival by 10% (in the case of cardiac arrest).
So we made the following changes:
- Improve CPR mode by including more intuitive labelling/icons.
- Get rid of the ambulance arrival time.
- Eliminate extraneous information in our notification.
- Redesign our skins to adhere to a more unified theme.
Our final design focused on simplicity. We wanted to make sure that all of our features are not only easily discoverable, but also quick and easy to access. We removed as many unnecessary and extraneous items and as much text as we could, so that the more important information and features can be easily seen in high-stress situations.
Accept Notification (watch)
Mobile:
Navigation (Mobile): Automatically opens Google Maps when the user hits Navigate in the app.
CPR Guidebook and Patient Information tabs provide responders with additional information:
Menu and CPR Assistance (Watch)
The user can end an emergency from the watch. However, the user must press the button twice, as a confirmation to make sure it was not a misclick. The CPR counter vibrates and counts up to 30 compressions, halts for 5 seconds to remind the user to give breaths, and then repeats the process.
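To illustrate, the double-press confirmation only needs a flag that resets after a short window. The sketch below is our own illustration rather than the exact code in the app; the 3-second window, class name, and helper method are assumptions:

```java
import android.app.Activity;
import android.os.Handler;
import android.os.Looper;
import android.view.View;
import android.widget.Toast;

// Hypothetical watch-side activity showing only the two-press confirmation logic.
public class EndEmergencyActivity extends Activity {
    private boolean awaitingConfirmation = false;
    private final Handler handler = new Handler(Looper.getMainLooper());

    // Wired to the end-emergency button's onClick.
    public void onEndEmergencyPressed(View v) {
        if (awaitingConfirmation) {
            finishEmergency(); // second press within the window: really end it
        } else {
            awaitingConfirmation = true;
            Toast.makeText(this, "Press again to confirm", Toast.LENGTH_SHORT).show();
            // If no second press arrives within 3 seconds, treat it as a misclick.
            handler.postDelayed(() -> awaitingConfirmation = false, 3000);
        }
    }

    private void finishEmergency() {
        // Notify the phone side that the emergency is over, then close this screen.
        finish();
    }
}
```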
Notifications: User can turn off notifications indefinitely during inconvenient times.
Coding the app was, for the most part, a relatively smooth process. However, we ran into a few problems that took us a while to figure out.
Message passing between phone and watch: Our largest struggle was passing messages between a watch activity and a mobile fragment. The problem was that there was very little documentation on Android Wear, and a lot of the sample code we found online did not work the way we wanted it to for one reason or another. Since we were not able to find a way for a mobile fragment and a watch activity to talk to each other directly, we ended up creating a listener service on both the mobile and wear sides, as well as additional activities to route messages to and from the fragments.
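To make that concrete, here is a minimal sketch of the routing pattern, written against the Play Services Wearable MessageClient API (our original code may have used the older GoogleApiClient-based MessageApi, but the structure is the same). The message path, broadcast action, and class name are placeholders rather than the exact values in our project:

```java
import android.content.Context;
import android.content.Intent;
import androidx.localbroadcastmanager.content.LocalBroadcastManager;
import com.google.android.gms.wearable.MessageEvent;
import com.google.android.gms.wearable.Node;
import com.google.android.gms.wearable.Wearable;
import com.google.android.gms.wearable.WearableListenerService;

// Runs on the phone; a mirror-image service runs on the watch. Incoming messages are
// re-broadcast locally so the host activity (and its fragments) can pick them up.
public class MobileListenerService extends WearableListenerService {
    public static final String ACTION_ALERT = "com.example.actfirst.NEW_ALERT"; // placeholder
    private static final String ALERT_PATH = "/actfirst/alert";                 // placeholder

    @Override
    public void onMessageReceived(MessageEvent event) {
        if (ALERT_PATH.equals(event.getPath())) {
            Intent intent = new Intent(ACTION_ALERT).putExtra("payload", event.getData());
            LocalBroadcastManager.getInstance(this).sendBroadcast(intent);
        }
    }

    // Sends a message to every connected node (watch or phone) on the same path.
    public static void sendToConnectedNodes(Context context, byte[] payload) {
        Wearable.getNodeClient(context).getConnectedNodes().addOnSuccessListener(nodes -> {
            for (Node node : nodes) {
                Wearable.getMessageClient(context)
                        .sendMessage(node.getId(), ALERT_PATH, payload);
            }
        });
    }
}
```

The service also has to be registered in the manifest with the com.google.android.gms.wearable.MESSAGE_RECEIVED intent filter, and an identical service on the wear module handles traffic in the other direction.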
CPR Counter: We were able to make the watch increment, vibrate, and reset at specified intervals by creating a Handler object and posting its Runnable with a delay. However, whenever we stopped the counter, the Runnable would still continue to run, and removing its callbacks at each iteration did not work. We eventually realized we had to override onStop() and kill the loop by removing all the callbacks there.
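A stripped-down version of that pattern looks roughly like the following; the 600 ms interval corresponds to about 100 BPM, and the class name and exact values here are illustrative rather than our production code:

```java
import android.app.Activity;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.os.Vibrator;

// Simplified CPR counter: vibrate once per compression, pause after every 30th
// compression to prompt a rescue breath, and stop cleanly when the activity stops.
public class CprCounterActivity extends Activity {
    private static final long COMPRESSION_INTERVAL_MS = 600; // ~100 BPM
    private static final long BREATH_PAUSE_MS = 5000;
    private static final int COMPRESSIONS_PER_CYCLE = 30;

    private final Handler handler = new Handler(Looper.getMainLooper());
    private Vibrator vibrator;
    private int count = 0;

    private final Runnable tick = new Runnable() {
        @Override
        public void run() {
            count++;
            vibrator.vibrate(100); // short pulse per compression (needs the VIBRATE permission)
            if (count >= COMPRESSIONS_PER_CYCLE) {
                count = 0;
                handler.postDelayed(this, BREATH_PAUSE_MS);        // pause to prompt a breath
            } else {
                handler.postDelayed(this, COMPRESSION_INTERVAL_MS);
            }
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        vibrator = (Vibrator) getSystemService(VIBRATOR_SERVICE);
        handler.postDelayed(tick, COMPRESSION_INTERVAL_MS);
    }

    @Override
    protected void onStop() {
        super.onStop();
        // Removing the callbacks here, rather than inside the runnable, is what
        // finally stopped the counter from firing after the screen was left.
        handler.removeCallbacks(tick);
    }
}
```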
Google Navigation: Our initial idea was to allow Google Maps navigation within the fragment. However, we were only able to pull up a map of the location within the fragment. After much research, we realized that in order to enable navigation, we had to launch the Google Maps application itself, so that's what we did. Our maps tab includes a map of the location, with an additional navigation button that launches Google Maps with the desired coordinates automatically filled in.
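The hand-off itself is just an intent using the documented google.navigation URI scheme, roughly like the sketch below (the helper class and method names are our own, for illustration):

```java
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Helper that launches turn-by-turn navigation in the Google Maps app.
// Intended to be called with an Activity context from the maps tab's button handler.
public final class NavigationLauncher {
    public static void navigateTo(Context context, double latitude, double longitude) {
        Uri navUri = Uri.parse("google.navigation:q=" + latitude + "," + longitude);
        Intent intent = new Intent(Intent.ACTION_VIEW, navUri);
        intent.setPackage("com.google.android.apps.maps"); // hand the request to Google Maps
        if (intent.resolveActivity(context.getPackageManager()) != null) {
            context.startActivity(intent);
        }
    }
}
```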
The Final Product (Video):
Project Summary
For this project, we wanted to see what a smartwatch could do for the medical field. The result was ActFirst, a first-response application that can leverage the smartwatch’s greatest advantages (its timeliness, efficiency, and convenience) in order to bring trained personnel to the scene as fast as possible.
After speaking with both medical professionals and trained non-professionals, and going through numerous design iterations, we succeeded in designing an app that has the potential to respond to emergencies more quickly than our current first-response system.
Slow emergency response times have always been a problem, as there is no such thing as “too fast” in emergencies. By developing ActFirst, we have shown that there is a feasible solution to this problem that is well within our reach.