IoT plays an increasingly important role in people's lives, and I was excited to explore its UX aspect. In this class project, I embarked on a UX research effort to evaluate and improve the usability of an early-stage IoT application called OpenBAS.

Class Project


Robert Davis, April Dawn Kester


User Research - Interview, Diary Study, Heuristic Evaluation, User Testing, Card Sorting 

Design & Prototyping - Participatory Design, Balsamiq Prototype



Project Overview

The challenge was to evaluate the web UI usability of a building HVAC automation system called OpenBAS.

We started off with interviews and a diary study to understand how our target users – building managers – currently used the UI. The diary study was especially informative: a building manager logged all the HVAC tasks she performed over a week, along with the types of interfaces used. It turned out that she performed most of the tasks using physical controls, bypassing the web UI of interest.

We proceeded to conduct heuristic evaluations and usability studies to find out what was holding users back from using the web UI. We discovered many usability issues; most notably, the UI showed too much granular detailed HVAC data that building managers did not care about, and these data cluttered the interface and made the user experience intimidating and confusing.

I went on to conduct card sorting and participatory design to understand users’ mental model. These studies revealed feature priorities and groupings; I learned that there were only a handful of features that users actually needed from the web UI, that temperature adjustment was the most important feature, and that users valued actionable insights over rich data. Our research findings were well-received by the OpenBAS team and were incorporated into its future UI designs.






Project Background


I learned about the OpenBAS project through a CS course I took on IoT development. When I approached the OpenBAS team about doing UX research for their project, they gladly brought me and my teammates on board.



OpenBAS was an early-stage open-source building automation platform that provided tools to integrate isolated sensor networks in commercial buildings. It was created by EECS Professor David E. Culler and his team at the University of California, Berkeley.

Commercial buildings have huge sensor networks, currently composed of temperature sensors, electrical sub-meters, fans, air dampers, light relays, pumps, and valves. With the proliferation of more devices – security, electrical, appliances, entertainment, and so on, collectively referred to as the "Internet of Things" – the number of sensors a building manager needs to monitor is increasing at an unprecedented rate. However, most of these sensor devices do not talk to one another. Even though modern networked devices offer web service APIs, these APIs differ even across models from the same vendor. OpenBAS was developed to enable compatibility and interoperability between sensor devices in commercial buildings.

Integrating sensor networks would not only make building management an easier task for building managers, but also make smart energy savings possible, as better decisions can be made with access to more building data.

In its current beta version, OpenBAS focused on integrating the HVAC (heating, ventilation, air conditioning) and lighting systems. It consisted of an open-source IT architecture, plug-and-play control hardware, and a web-based user interface. Through this interface, building managers could monitor and adjust HVAC device settings remotely, all in one place.

Target Users

OpenBAS' primary target users were building managers of small and medium-sized commercial buildings equipped with multiple sensor devices. One-fifth of U.S. energy use occurs in commercial buildings, more than half of which are under 50,000 square feet. OpenBAS hoped to deliver a readily adoptable tool to bring new energy efficiency to smaller commercial buildings.

There were currently two deployments of the system, at locations P and C. C was the focus of our usability assessment: a medium-sized (7,500 sqft) office situated on the second floor of a historic (1935) building in downtown Berkeley. The office had 5 HVAC zones and was equipped with lighting control and environmental sensors (carbon dioxide, etc.).

Research Objectives

  1. Understand the nature of office managers' daily work

    • Interview

    • Diary Study

  2. Evaluate what is working/not working with the existing OpenBAS UI

    • Heuristic Evaluation

    • User Testing

  3. Explore better UI solutions

    • Card Sorting

    • Participatory Design

We worked on our three research objectives roughly in the order listed above; findings from an earlier objective informed the work on later ones. I worked on the first two objectives with my teammates, and on the last objective individually.




Generative Research


 Understand the nature of office managers' daily work


User Interviews

To understand how an office manager used OpenBAS, we needed to first understand what their jobs were like. We decided to interview office managers to find out how they conducted their daily work.

Recruiting and interviews

Recruiting our target users turned out to be a challenge. Based on our (time-constrained) search, there were very few small or medium-sized commercial buildings equipped with multiple sensor devices around the Berkeley area at the time. Although putting sensor devices in buildings was a macro trend, the technology was still at the early-adoption stage. In the end, six individuals agreed to talk to us:

Two current and former OpenBAS users: The OpenBAS system was deployed for pilot testing in two commercial buildings in Berkeley. Through these deployments, we got in contact with two office managers who had experience using the OpenBAS interface.

Two proxy users: We got in contact with a manager of a building not equipped with sensor devices, and a building energy-saving consultant who advised on sensor devices for commercial buildings. Even though the proxy users did not fit our exact target user group, they had similar experiences or relevant knowledge that would help us better understand how our users conducted their daily work.

Two expert users: We also talked to two members of the OpenBAS development team.

Input from the current/former users and proxy users was used to formulate insights into user habits, needs, and pain points. Input from the expert users was used only for project background and context. In lieu of a formal interview guide, we developed a set of questions prior to each interview based on the context of the interviewee.



These interviews helped us understand the nature of an office manager's work. Talking to the expert users also helped familiarize us with the core functionalities of the OpenBAS system.

Office managers in small and medium-sized buildings were often tasked with many responsibilities. Controlling the HVAC and lighting systems was a low priority in the context of their everyday jobs. They felt existing manual methods of controlling these systems adequately performed the required functions. However, interviewees indicated that the concept of a digital dashboard that could monitor and control all systems would be appealing, if the digital interface were as easy to manipulate as the existing manual solutions.


Thermostats we saw during interviews:


Diary Study

To gain a better understanding of an office manager’s daily building management activities, I decided to conduct a diary study with our user J. I would have preferred a sample size of five people or more for the study, but given our recruiting difficulties, I decided a diary study with a single participant would still provide us with valuable insights. 


Goal: to better understand what the office manager currently did to maintain the office:

  • What building management tasks does she perform?

  • What are the pain points?

  • How much of this is accomplished through openBAS?

  • How much is through traditional interfaces?

Duration: one week (4/10/15 - 4/17/15)
Logging medium: paper log sheets that I gave to the manager at the beginning of the study. (Even though electronic media would have let us view the manager's entries in real time, I decided against them: an electronic medium would take longer for the manager to retrieve than a paper log by her desk, which might decrease her logging accuracy.)


The manager logged 12 entries in total. Among them, 10 were for turning on/off the lights in the office hallway. One entry was for turning up the heat in one building zone, and another one was for turning down the temperature in the conference room. All tasks were performed on the physical interfaces (switches for lights, thermostats for temperature).

I derived three observations about the manager’s building management activities:

  • The manager did not perform many building management tasks over the week.

  • The manager opted for the physical interface to perform tasks that the web interface was capable of.

  • The lighting control tasks were very quick (“took a second”), but the AC adjustments took longer (“took ~5 minutes”).





Summative Research


Evaluate what is working/not working with the existing OpenBAS UI


Heuristic Evaluation

Now that we had a better understanding of an office manager's work, we moved on to evaluate the current user interface. We thought heuristic evaluation would be a good method to start with, as it would let us quickly spot major UI issues while familiarizing ourselves with the OpenBAS web interface in the process.


The Existing OpenBAS web UI



  • Visibility of system status: Inconsistent and confusing system response for selected operations

  • Match between system and the real world: Information conveyed to be read by a software engineer, not by an end user

  • User control and freedom: With no menu hierarchy, easy to get lost after clicking on links

  • Consistency and standard: Different icons, text and colors used throughout the UI for different operations

  • Error prevention / Help users recognize, diagnose, and recover from errors: Can delete crucial components with no warning and no easy recovery

  • Recognition rather than recall: Key operations could be reached in multiple inconsistent ways, forcing users to remember where they had found them

  • Flexibility and efficiency of use / Aesthetic and minimalist design: Many nested screens and use of links

  • Help and documentation: Very little if any help features or documentation

In summary, users were uncomfortable exploring the interface. The lack of error prevention and recovery, combined with scant documentation, left users uncertain about which operations would damage the system. The nested screens and links, with poor navigation tools, made it easy to get lost and difficult to remember where one had previously found a key operation.

User Testing

The second method we chose for evaluating the current UI was user testing.


We contacted the two OpenBAS users that we interviewed earlier:

  • User 1 (J): novice, rarely interacted with the product

  • User 2 (A): more experienced, had used the product on a daily basis over a period of several months

Both users were office managers who fit the project's target user set. The varying levels of expertise of our two users allowed us to assess whether certain tasks became easier to perform with increased exposure to OpenBAS.


Based on our user interviews, I developed a set of five tasks that we presented to each user at the beginning of the test. For each task, we asked the user to narrate her thought process as she worked to complete it, then to rate the overall difficulty of the task upon completion.

  • Task 1: The office is thinking about making every Wednesday a work-from-home day. Adjust the thermostat schedule so that the building doesn't heat/cool on Wednesdays from now on.

  • Task 2: Somebody in the office just complained that the northeast corner office was too cold. We want to increase the temperature of that office.

  • Task 3: Someone mentioned that they noticed the lights were on in the office late at night over the past couple of weeks. We want to verify whether that was the case.

  • Task 4: We installed a new thermostat a few months ago but we're not sure if it's actually working. To find out, we want to check the date and time that the thermostat last reported its status to the openBAS system.

  • Task 5: Let's go to the display tab of the website. How would you design it if we were to start from scratch? What features would you want to see?

Following the usability test, we conducted a brief interview to better understand the user’s experience with the product and how it did or did not align with her needs as an office manager. 


Our usability tests confirmed many of the findings from our earlier heuristic evaluation. First, the UI occasionally hung for periods of up to two minutes, preventing the user from performing any actions; this issue came up several times in both usability tests. Second, the UI was unintuitive: the users had difficulty navigating the interface, could not easily distinguish interactive elements from static ones, and did not understand the purposes of many of the features. Third, the UI showed too much granular HVAC data that building managers did not care about; these data cluttered the interface and made the user experience intimidating and confusing.





Formative Research

Explore better UI solutions

Card Sorting

The above studies helped us understand the nature of office managers' work and revealed usability issues in the existing UI. At this point we had mostly fulfilled the requirements of this class project. However, as a UX designer, I wanted to go one step further and create a better UI design alternative for OpenBAS. Hence, I embarked on the third phase of our research project: exploring better UI solutions. This time I worked individually.

Our previous studies showed that one major issue with the current design was that it had too many features, which ended up confusing the user. To explore better UI solutions, the first thing I wanted to find out was which primary features an office manager would need. To answer this question, I did a card-sorting session with user J.


I designed a two-tier card system. Separating cards into two tiers would help make the task easier and more manageable for the participant, likely resulting in better data and a more comprehensive understanding of the mental model. Tier-1 was the overall feature categories: 

  • Control Temperature

  • Control Lighting

  • Monitor Energy Usage

  • Set Building Schedules

  • View when a device was last detected

And under each category, there was a list of tier-2 features:


Control Temperature

  • Select a thermostat on a clickable location map

  • Select a thermostat from a list of thermostat locations

  • Set current temperature

  • View heat/cool setpoint

  • Adjust heat/cool setpoint

  • Make temporary temperature change

  • Set the length a temporary change remains in effect

  • Plot temperature change history

Set Building Schedules

  • Edit regular schedule

  • Edit exception dates

  • Monitor Device Connection Status


Control Lighting

  • Select a lighting zone on a clickable location map

  • Select a light zone from a list of lighting zone locations

  • Turn on/off lights in an entire zone

  • Turn on/off lights in an entire room

  • Turn on/off an individual light

Monitor Energy Usage:

  • View monthly energy target

  • View real-time energy consumption data

  • View energy consumption history

  • View comparison between current consumption and monthly target

View when a device was last detected

  • View if a device is active/inactive


I generated this collection as a quasi-exhaustive list that covered features available on the current interface, and additional ones brought up during interviews and studies. To cover unlisted categories and features, I also prepared ample blank cards for impromptu writing during the study.


During the study, I first instructed the manager to sort the tier-1 categories in order of preference, then asked her to sort the tier-2 features under each tier-1 category. In the process, the manager suggested a number of additional features, which I added to the cards. At the end of the session, I had a clear list of preferences.



I made the following observations:

  • Temperature control was the most important feature, followed by lighting control.

  • Monitoring energy usage was desirable but challenging to present in an understandable, actionable way for a building manager (in fact, the manager didn't like any of the tier-2 features we provided). This seemed to be an area that should be automated, either fully or to the extent that actionable items could be provided to the manager.

  • “Set building schedule” module was important but did not need to appear on the home page.

  • “Monitor device connection status” module should be integrated with temperature and lighting controls.

  • Some important features were currently overlooked:

    • Label devices in a locatable way

    • Inform users which devices were controllable by OpenBAS

    • Allow users to sync thermostat dates in an easy and reliable way

Participatory Design (Aha Moment!)


Referencing results from the card sorting, I felt ready to put designs on paper. Since OpenBAS was a new system with a limited number of early adopters, I thought using a participatory design approach to involve an early adopter in the design process would help me better address user needs. J agreed to help out. 

To prepare for the participatory design session, I pre-selected a list of important features from the card sorting, created a collection of pencil-drawn design wireframes, and separated them into moveable modules. During the design session, I showed J each design module and asked for her critique. Then I asked her to arrange the modules into a layout she would like.

Findings, and epiphany 

The following design challenges surfaced during the session:

  • Crowding caused by the large amount of data to show and parameters to adjust (e.g. zone name, current status (heating/cooling/idle), current temperature, connection status, set target temperature, set time in effect, show confirmation).

  • Absence of proper naming system for locating lights (needed for controlling individual lights from the web interface).

After the session, I experimented with various design changes in an effort to make the interface less cluttered while maintaining the granularity of data and controls. No matter what I tried, I could not get it right. Then, while sketching yet another iteration, an idea hit me. I thought back to some comments J made during the participatory design session and had a sudden realization:


When I asked how she usually determined what temperature adjustment to make when someone complained, J said “usually a couple degrees initially, and if someone complains again, adjust one more degree.” If this is how the manager adjusts the temperature, it seems a computer algorithm could do equally well, and most likely better.

When I asked how she managed the lights, J said, "other than turning on/off the hallway lights (the only lights in the building without sensors and not fully automated), I never touch the lights." When I then asked if she would use the lighting zone control, she said "No."


J's comments made me realize that granular control of the system may not be desirable to office managers. Instead, they might want to issue a simple command and let the system decide the precise parameters. This realization led me to the following findings:

  • Temporary temperature adjustment needs to be better automated. Implement an algorithm that automatically adjusts temperature when an adjustment request is received, and develop an interface that handles adjustment requests.

  • The user interface should be secondary to automation. Users should only need the interface to supply supplementary information that assists automation, and to make manual adjustments when automation fails to deliver.
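As an aside, J's rule of thumb maps almost directly onto code. Below is a minimal, hedged sketch of such a complaint-driven setpoint heuristic; the function name, step sizes, and complaint labels are all hypothetical illustrations, not part of OpenBAS:

```python
# Hypothetical sketch of J's rule of thumb as a complaint-driven
# setpoint heuristic. Names and step sizes are illustrative only.

FIRST_BUMP = 2.0   # "usually a couple degrees initially"
REPEAT_BUMP = 1.0  # "if someone complains again, adjust one more degree"

def adjust_setpoint(setpoint, complaint, prior_complaints):
    """Return a new setpoint for a zone, given a 'too hot'/'too cold'
    complaint and the number of unresolved prior complaints."""
    step = FIRST_BUMP if prior_complaints == 0 else REPEAT_BUMP
    if complaint == "too cold":
        return setpoint + step
    if complaint == "too hot":
        return setpoint - step
    return setpoint

# First complaint bumps 2 degrees, a repeat complaint bumps 1 more:
s = adjust_setpoint(70.0, "too cold", prior_complaints=0)  # 72.0
s = adjust_setpoint(s, "too cold", prior_complaints=1)     # 73.0
```

A sketch like this is enough to show why a streamlined "adjustment request" interface could replace manual thermostat fiddling for the common case.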

These realizations finally clarified my design direction, and I designed a new interface mock-up soon after.

Proof-of-Concept Mock-Up

To help the OpenBAS team envision what an improved interface could look like, I designed a proof-of-concept mock-up that addressed the issues listed above. Since it was a user research deliverable, I did not spend much time polishing and testing this mock-up; its main purpose was not to be a final design, but rather to illustrate the research findings and inspire new designs. I did, however, show the mock-up to J, and she thought it was a major improvement over the current design.





Our recommendations are based on the research findings explained in the above section. Due to the limited scope of our research (6 interviews, 3 heuristic studies, 2 usability studies, 1 diary study, 1 card sorting, 1 participatory design), we think more research and studies will be necessary to further validate them. With this understanding in mind, here are our most important recommendations (full Finding Summary available here):


Issue 1

Temporary temperature adjustment is difficult and time-consuming with the current product design. (see: diary study, interview, usability study, participatory design).


  • Develop a more streamlined interface for handling adjustment requests.

  • Implement an algorithm that automatically adjusts temperature when an adjustment request is received.


Issue 2 

Date/time syncing issues between openBAS and connected wall thermostat units. (see: interview, card sorting).


  • Fix any back-end issues that prevent accurate syncing.

  • Enable users to view on the web interface the current date and time setting for each thermostat, and make changes when necessary.


Issue 3

The current interface contains excessive features and data that the typical office manager does not need. (see: interview, usability study, heuristic evaluation).


  • Develop separate interfaces for (1) technicians and (2) office managers.

  • The interface for technicians can retain most of the current features.

  • The interface for the office manager will be simpler, designed as a supplementary tool to automation. Most of the time, the manager will not need the interface at all, as openBAS automatically takes care of daily operation. The interface is used to (1) submit temporary adjustment requests, (2) adjust schedules, and (3) take manual control when automatic operation is not working. The interface will also send push alerts to the manager (e.g. via email) if a malfunction is spotted (e.g. a device disconnects).

  • Please see the Interactive Prototype (explained in next section) for the proposed redesign.
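The push-alert idea above could start out as a simple staleness check on each device's last status report. A minimal sketch, with a hypothetical function name, threshold, and device names (none of these are part of OpenBAS):

```python
# Hypothetical sketch of the proposed malfunction alert: flag any
# device whose last status report is older than a threshold.
from datetime import datetime, timedelta

STALE_AFTER = timedelta(minutes=30)  # illustrative threshold

def stale_devices(last_reports, now):
    """last_reports: dict mapping device name -> datetime of its last
    report. Returns the names of devices that look disconnected."""
    return [name for name, ts in last_reports.items()
            if now - ts > STALE_AFTER]

now = datetime(2015, 4, 17, 12, 0)
reports = {
    "thermostat-ne": datetime(2015, 4, 17, 11, 50),  # reported recently
    "thermostat-conf": datetime(2015, 4, 16, 9, 0),  # silent for a day
}
# stale_devices(reports, now) -> ["thermostat-conf"]
```

Any device returned by such a check would trigger the email alert, sparing the manager from polling "last detected" timestamps by hand (the pain point surfaced in Task 4 of the usability test).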


Issue 4

For a typical office manager, the existing energy consumption data visualization interface is hard to interpret or translate to action. Moreover, energy consumption is not a high-priority issue in the context of the day to day job function of an office manager. (see: interviews, usability study).


  • Automate most energy consumption decisions without office manager input.

  • If human intervention is needed, give office managers actionable items rather than raw energy consumption data.

Final Report

I presented our research findings and design recommendations to the OpenBAS team. Our work was well-received by the team. Going forward, OpenBAS plans to incorporate the findings and design into their new product design.

I want to thank the OpenBAS team and our users, who generously took time out of their busy schedules to help us. I also want to thank my INFO214 course instructor, Steven Fadden. His insightful guidance and encouragement made this project a fun and rewarding experience.