My team was assigned the challenge of designing and developing an Android application that would deliver an integrated multi-device experience across the phone and the watch. Can the Android watch enable valuable use cases beyond simple message notifications?
Norman Zhong, Jon Lai, William Xu, Joey Pereira
User Research - Contextual Inquiry, Personas, Scenarios, Task Analysis, Storyboard
Interaction Design - Mobile+Watch Wireframe, Mobile Interactive Prototype
Usability Testing - Think-Aloud
Front-End Development - Android Mobile, Android Watch
The challenge was to design and develop a multi-device application that would utilize both Android mobile and watch. After brainstorming over 50 ideas, my team decided to create a crowdsourced emergency medical response app. We first developed personas, scenarios, and task analyses based on contextual inquiries with three target users with varying levels of expertise in emergency medical response. This work helped us decide that the main functionalities would be alerting users to medical emergencies, assisting navigation, pacing CPR rhythm, and providing procedural references.
Next we started prototyping the watch and mobile interfaces, paying special attention to how the two devices would interact to create an integrated user experience. We went from lo-fi to hi-fi, improving the design through usability testing. Finally, with the design finalized, I developed the functional Android app in collaboration with my teammates. This project was an interesting exploration of the bleeding edge of wearable technology, and yes, blood was shed while searching through the virtually non-existent Android Wear documentation, but it was also fun to get a taste of the multi-device user experience of the future.
During the brainstorming process, we compiled a list of every possible app we could think of that would benefit from multi-device interactions. Through a series of envision exercises, comparative analyses, and blind-voting, we narrowed down from 50+ items to five, then to three, and finally to one. In the end, we decided to develop ActFirst, a medical emergency and assistance app that would help to save lives.
According to the Sudden Cardiac Arrest Foundation, for every minute a cardiac arrest patient goes without CPR and defibrillation, the victim’s chance of survival decreases by 7-10%. The American Heart Association states that immediate CPR after cardiac arrest can double someone’s chance of survival, but only 32% of victims receive effective CPR from a bystander. In Alameda County, EMS response time is 10 minutes on average and can occasionally exceed 15 minutes. These facts highlight the importance of involving medically trained bystanders in timely medical response, since they can quickly arrive on the scene and provide basic help before the paramedics arrive.
Our app, ActFirst, would alert medically-trained bystanders of nearby emergencies and assist in their navigation and first-response procedures.
ActFirst would operate on both Android watch and phone, with the watch interface focusing on real-time notifications and quick access, and the phone interface focusing on more detailed information. Here is our platform justification.
I drew a storyboard to illustrate our vision of the app.
This app is geared towards the good Samaritans who have received at least some form of medical training and are capable of providing medical aid. They can range from medically licensed professionals (e.g. doctors, nurses, paramedics, etc.) to average citizens who are CPR-certified. Depending on the emergency, nearby users with the appropriate certifications will be notified.
For our contextual inquiry, we talked to people with varying degrees of medical expertise and experience, hoping to gain different perspectives and insights from each person. We talked to three people: a licensed EMT (T), a nurse and certified lifeguard (M), and a college student who was CPR- and first-aid certified but not a medical professional (J).
From talking to T and M, who had a lot of first-hand experience with first response, we learned that there are often situations where medical equipment is necessary to provide proper care. Also, being able to properly assess the situation is critical to choosing the right course of action.
Since J was not a medical professional, her responses gave much insight regarding how certified non-professionals who are not used to the pressures of an emergency might perform under such pressures, and how our app could help to alleviate some of those issues. It turns out that, even with training, our less experienced users may still blank out when faced with a high-stress situation.
Aggregating their feedback, we decided to go for a minimalist design that would give quick access to important features. We also decided to include a tab that would provide all the details dispatch had on the patient and the situation, and another tab for quick review of procedural references to reassure inexperienced users.
From the contextual inquiry, we realized that a user's level of expertise and experience in first response can significantly affect their performance. Therefore, when creating our personas, we prioritized variance in medical knowledge, training, experience, and ability to perform in stressful situations. We found two personas were enough to help us understand how our users might use our application:
Tony is a medical professional who is both well-trained and well-versed in first response, and is used to handling high-stress situations. He is able to provide first aid with precision and confidence.
Jay is a college student who has been certified in both CPR and first aid. However, she does not work in the medical field and has never had a chance to practice her training. Although she is willing to help someone in need, she is not confident in her ability to provide quality aid under pressure, and has never experienced the stresses of an emergency.
Understanding how Tony, an experienced professional, would use ActFirst helped solidify the bare essentials our app needed to save a life (navigation, information, timeliness, convenience). However, after evaluating Jay, who is trained but lacks practical experience, we realized training does not necessarily translate to preparedness. Users like Jay may need some form of assistance or reassurance in case they blank out under pressure. This led us to include quick procedure guides on the phone, as well as real-time CPR assistance on the watch.
Our story begins with Tony. Tony has ActFirst on his smartphone and Moto 360 smartwatch. As Tony goes about his day, his phone checks for nearby emergencies; when it finds one, it pushes an alert notification to his watch, and the Moto 360 lights up and buzzes.
"Cardiac Arrest, 3 min ago. 0.4 miles away." Tony immediately accepts on his watch, and Google Navigation opens on his phone, already directing him to the scene of the emergency. When Tony arrives at the address, he sees someone lying on the ground, unresponsive. He immediately rushes into action.
On his smartwatch, he taps Patient Info to see whether there is any important information before he begins administering CPR. The patient info opens on his phone, available to view at any time. Meanwhile, he taps the CPR section on the watch and is immediately brought to a pacer that buzzes at the correct compression rate and keeps track of the number of compressions.
The patient regains consciousness, and the paramedics have arrived! Relieved, Tony taps to end the emergency on his watch. Tony goes about his day, a modest everyday hero.
To better understand how Tony and Jay would use ActFirst, we developed three tasks a user would perform on ActFirst.
Task 1: accepting to help
A car accident occurs, and one of the drivers is unconscious at the scene. A bystander calls 911, but the paramedics will not arrive for another 9 minutes. Jay is enjoying her coffee two blocks from the scene; Tony just got off work three blocks away. Both are unaware of what has just transpired. Jay and Tony each receive a buzz notification on their watch, notifying them of the situation. They both accept and quickly rush to the scene.
Task 2: navigating to the scene
Jay and Tony were just separately notified of a person nearby suffering from cardiac arrest, and quickly accepted to help. However, they are unsure of where the accident occurred, so they take out their phones, which automatically bring up a map of the area along with the patient’s location. On their way there, their phones provide audio navigation so they do not have to look back at the screen. Additionally, Jay, who has not practiced in a while, skims through the CPR procedures listed under the “Guidebook” tab on the phone to refresh her training so that she is ready to perform at the scene.
Tony and Jay successfully arrive on the scene.
Task 3: CPR assistance
Tony starts performing CPR on the patient. Being a professional, Tony is confident in his ability to provide reliable aid, so he does not need our CPR assistance feature. After some time, Jay takes over so Tony can rest. However, Jay is not confident in her ability to provide accurate aid, so to make sure she stays on the correct compression rhythm, she taps the “begin CPR” option on her watch. The watch begins to vibrate at 100 BPM, the recommended compression pace. Every 30 compressions, the watch reminds Jay to breathe air into the patient. Jay, who is not used to such emergency situations, is thankful for the assistance the watch provides. Jay and Tony continue to take turns supporting the patient until the paramedics arrive.
After conducting user research, we decided our app would perform the following primary tasks:
Alert the user
Help the user navigate to the scene
Provide the user with procedural assistance
Provide medical references
Display automated external defibrillator (AED) locations on map
We progressed from wireframes to interactive prototypes.
We developed a linear interaction flow: accept the alert, open navigation to reach the scene, and use the app’s assistive tools to perform first aid. Referencing the scenarios and tasks from our user research, we made sketches of the interaction flow and then created lo-fi mock-ups. While creating the mock-ups, we thought it would be useful to display the estimated time of ambulance arrival so that users would know what course of action to take during the rescue, so we included that feature as well.
We revised the wireframes to comply with Android watch interaction conventions, and simplified the phone app features to ensure quick access to the essentials. Then we created an interactive watch prototype using Framer.js, and an interactive phone prototype using InVision.
- Notification -
- Navigation -
- Procedure Assistance -
Once we had our interactive prototypes up and running, we were ready to begin user testing. We had two users test our prototypes:
Participant 1: A licensed EMT working for an ambulance company. We selected this user because we wanted an opinion from an expert. Since he was the most knowledgeable participant we could find, he would be able to point out any extraneous information or features in our app.
Participant 2: A college student who had received AED and CPR training but had never applied it in a real-life situation. We selected this user because we thought his behavior during the experiment would let us gauge how our less experienced users might use the application and which features they might find helpful.
For the tests, we created a scenario for our participants to re-enact. The scenario started with an alert of an emergency, and the facilitator helped guide the user through the scenario, providing context and location of the emergency. The participant then made his way to the scene, with assistance from our prototype, and began performing first aid. After the test, we discussed with the participants their thoughts, impressions, and reasons for their actions.
The usability tests turned out to be very insightful. We realized our CPR mode was not intuitive to use, and that too much information could make a user less inclined to join the rescue. For example, if the ambulance will arrive only 1 minute after the user, the user isn’t motivated to go, even though the statistics show they could increase the probability of survival by 10% (in the case of cardiac arrest).
So we made the following changes:
Improved CPR mode with more intuitive labeling and icons. Upon feedback that our CPR pacer was confusing, we added brief instructions to clarify its usage: users previously could not tell whether the counter counted down or up, so we now explicitly state that it counts up.
Removed the ambulance arrival time.
Eliminated extraneous information in our notification.
Redesigned the application skins to adhere to a more unified theme.
Polished the way to end emergency mode. We originally had no way to end emergency mode on the watch, leaving our testers confused about what to do after an emergency was over. Now the user can easily end emergency mode on both the watch and phone.
Eliminated half-baked watch features that we felt weren’t well executed, like blood-loss mode. Removing them let us focus on executing our core features well.
All the design changes aimed to increase the speed of using the app and to motivate users to respond to the emergency. The illustration below highlights some important changes.
Our final design focused on simplicity. We stripped out as many unnecessary and extraneous items and as much text as we could, so that the most important information and features could be accessed easily in high-stress situations. Below is the final interaction design for accomplishing the tasks discussed in the user research section.
Task 1: Accept Notification
Task 2: Navigate to Scene
Google Maps opens automatically on the phone when the user hits Navigate in the app.
First-Aid Guidebook and Patient Information tabs provide responders with additional information that they can review on their way to the scene.
Task 3: CPR Assistance
The user can end an emergency from the watch; however, they must press the button twice to confirm it was not a misclick. The CPR counter counts and vibrates up to 30 compressions, halts for 5 seconds to remind the user to give breaths, and then repeats the process.
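The pacer rhythm above (100 BPM, a 5-second breath pause after every 30 compressions) can be modeled as a simple timing schedule. This is a minimal, Android-free sketch; the class and method names are illustrative, not from the actual app.

```java
// Model of the CPR pacer rhythm: buzz at 100 BPM, pause 5 seconds
// after each set of 30 compressions, then repeat.
class CprPacer {
    static final int BPM = 100;               // recommended compression rate
    static final int SET_SIZE = 30;           // compressions before a breath pause
    static final long BREATH_PAUSE_MS = 5000; // pause to remind the user to breathe

    // Milliseconds between two compression buzzes at 100 BPM.
    static long intervalMs() {
        return 60_000L / BPM;                 // 600 ms
    }

    // Time offset (ms) of the nth buzz (0-indexed), accounting for
    // the breath pause after each completed set of 30.
    static long buzzOffsetMs(int n) {
        int completedSets = n / SET_SIZE;
        return n * intervalMs() + completedSets * BREATH_PAUSE_MS;
    }
}
```

On the watch, each computed offset would correspond to one short vibration of the device.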
Task 4: Manage Notifications
User can turn off notifications during inconvenient times.
Now that we had finalized the design, we moved on to the second phase of the project - application development.
To guide the system design, we first created a flowchart outlining the technical steps necessary for interaction screens to transition from one to another. The flowchart, shown below, placed watch and mobile screens on either side of a data layer and explicitly specified which screens needed to communicate across it.
Skeleton Android App
Skeleton programming facilitates a top-down design approach, where a partially functional system with complete high-level structures is designed and coded, and this system is then progressively expanded to fulfill the requirements of the project.
For our skeleton Android app, using the Android SDK, we made ImageButtons that filled each screen with a screenshot, then linked the screens together using the appropriate event handlers (e.g. onClick, onTouch, onLongClick). In the end, our skeleton demonstrated the complete interaction flow, just without the underlying implementation.
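The skeleton idea reduces to a graph of screens, where tapping a full-screen image transitions to the next screen. Here is a sketch of that flow modeled as a plain map rather than Android Activities; the screen names are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Skeleton-app flow: each screen is a full-screen image, and a tap
// (the onClick handler) transitions to the linked screen.
class SkeletonFlow {
    static final Map<String, String> onClickTransitions = new HashMap<>();
    static {
        onClickTransitions.put("notification", "accept");
        onClickTransitions.put("accept", "navigation");
        onClickTransitions.put("navigation", "cpr_pacer");
    }

    // Returns the screen shown after tapping the current one, or the
    // current screen if it has no outgoing transition.
    static String tap(String screen) {
        return onClickTransitions.getOrDefault(screen, screen);
    }
}
```

In the real skeleton, each map entry corresponds to an ImageButton whose onClick starts the next Activity.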
Below are the skeleton app demos.
Coding the app was, for the most part, a relatively smooth process. However, there were a few problems that we ran into which took us a while to figure out.
Message passing between phone and watch: Our largest struggle was passing messages between a watch activity and a mobile fragment. There was very little documentation on Android Wear, and much of the sample code we found online did not work the way we needed. Since we could not find a way for a mobile fragment and a watch activity to communicate directly, we ended up creating a listener service on both the mobile and wear sides, plus additional activities to route messages to and from the fragments.
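The routing pattern we landed on can be sketched as a single listener that receives every message and dispatches it by path to whichever screen registered for it, mirroring how a WearableListenerService's onMessageReceived hands off by message path. This is an Android-free model; the paths and handler wiring are illustrative.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// One listener service receives all messages and routes them by path,
// so fragments never talk to the watch directly.
class MessageRouter {
    private final Map<String, Consumer<String>> handlers = new HashMap<>();

    // A fragment (or activity) registers interest in one message path.
    void register(String path, Consumer<String> handler) {
        handlers.put(path, handler);
    }

    // Called for every incoming message; dispatches by path,
    // ignoring paths nothing registered for.
    void onMessageReceived(String path, String payload) {
        Consumer<String> h = handlers.get(path);
        if (h != null) h.accept(payload);
    }
}
```

In the app, the registered handlers would forward the payload into the fragment via an Intent or a local broadcast.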
CPR Counter: We made the watch increment, vibrate, and reset at specified intervals by creating a Handler and posting its Runnable with a delay. However, whenever we stopped the counter, the Runnable would continue to run, and removing the Runnable’s callbacks at each iteration did not work. In the end, we realized we had to override onStop() and kill the process by removing all the callbacks there.
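The bug came from the Runnable re-posting itself: until its pending callback is removed from the Handler's queue, it keeps running. Below is a minimal model of that loop, with a synchronous queue standing in for Android's Handler; names are illustrative.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Self-rescheduling counter: the tick Runnable re-posts itself each
// run (like Handler.postDelayed), so it only stops once its pending
// callback is removed -- the equivalent of removeCallbacks in onStop().
class CprCounter {
    int count = 0;
    final Queue<Runnable> pending = new ArrayDeque<>(); // stand-in for the Handler queue

    final Runnable tick = new Runnable() {
        public void run() {
            count++;
            pending.add(this); // re-post itself for the next interval
        }
    };

    void start() { pending.add(tick); }

    // The fix: remove the pending callback, as in onStop().
    void stop() { pending.remove(tick); }

    // Drain up to n pending callbacks (simulates time passing).
    void runFor(int n) {
        for (int i = 0; i < n && !pending.isEmpty(); i++) {
            pending.poll().run();
        }
    }
}
```

Without the stop() call the counter keeps incrementing indefinitely, which is exactly the behavior we saw on the watch.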
Google Navigation: Our initial idea was to embed Google Maps navigation within the fragment. However, we were only able to pull up a map of the location there. After much research, we realized that to enable turn-by-turn navigation we had to launch the Google Maps application itself, so that’s what we did. Our maps tab includes a map of the location, with a navigation button that launches Google Maps with the desired coordinates pre-filled.
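The hand-off works by building a navigation URI and firing it in a view Intent; the `google.navigation:q=lat,lng` scheme launches Google Maps directly in turn-by-turn mode. Only the URI construction is shown here (no Android classes); the class name is illustrative.

```java
import java.util.Locale;

// Builds the URI the navigation button fires; on Android it would be
// wrapped as: new Intent(Intent.ACTION_VIEW, Uri.parse(uri)).
class NavUri {
    static String forCoordinates(double lat, double lng) {
        // Locale.US keeps the decimal separator a '.' regardless of device locale.
        return String.format(Locale.US, "google.navigation:q=%f,%f", lat, lng);
    }
}
```

Launching this Intent leaves the app, which is why our maps tab keeps its own embedded map for a quick look before committing to navigation.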