Watson ER

Master's Program

The team for this project was composed of a multi-talented group of Interaction Design graduate students. Our goal was to imagine what the emergency room of the future could look like by understanding the needs left unmet today.


Our greatest takeaways from our pain-point research were the lack of feedback about waiting times, the extended waits for patients who are not in critical condition, and the inability of hospitals to turn away any patient before they receive a doctor's diagnosis.


Our team conducted research with nurses, doctors, other healthcare professionals, regular hospital visitors, and healthcare designers. We decided to focus our project on emergency rooms and to consider ways to help them work better within the complicated world of the healthcare industry.

Prototype 2

Watson ER

The team's core concept was to use AI to help nurses perform their job, rather than to replace them. We found that this approach could speed up the process and give nurses and doctors more tools, while at the same time providing better service for patients by reducing waiting times and offering clear feedback throughout the ER experience.

The Space

We decided to create a space with open counters in place of the usual front desk. This way, the nurse standing behind the counters can greet every new patient and decide whether they are facing a life-threatening situation and need to be seen by a doctor immediately, or whether they should interact with Dr. Watson, the AI agent. The nurse can also easily move through the area and help any patient in need of assistance.


The testing was a great opportunity to demonstrate and observe the patient's journey through our experience. Having users test the screens with a high-fidelity physical prototype of the environment was helpful. We could also demonstrate how the nurses would interact with patients at different levels. We acted out how a nurse would greet an incoming patient, and how the AI agent signals an emergency: by a direct message to the nurse's device, as well as by a visual light signal at the counter. We came away with significant insights into how interactions work best between the nurse, the patient, and the AI.

Acuity Levels

We broke the experience down into three use cases based on acuity levels. The first, appendix problems, may seem like a low priority, but the AI would be able to triage it as high priority. The second, heart problems, may seem like a high priority but was diagnosed as anxiety by the AI doctor; the system would show the user that an urgent care clinic would have a shorter wait time, without denying them care. This was a challenge to design, because hospitals may not deny service to any patient who walks into an emergency room. The third, neck pain, was determined to be a low-priority case.
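For illustration only, the three-tier logic above can be sketched as a simple lookup. The symptom names, acuity labels, and routing strings here are assumptions made for this sketch; the actual Watson ER concept relied on an AI diagnosis, not a fixed table.

```python
# Hypothetical sketch of the three acuity-level use cases described above.
# All symptom names and routes are illustrative assumptions.

def triage(symptom: str) -> dict:
    """Map a reported symptom to an acuity level and a suggested route."""
    cases = {
        "appendix pain": {"acuity": "high", "route": "see doctor immediately"},
        "heart problems": {"acuity": "low", "route": "suggest urgent care (shorter wait); ER still available"},
        "neck pain": {"acuity": "low", "route": "ER waiting queue"},
    }
    # Unknown symptoms default to nurse review, since the ER cannot
    # turn a patient away before a diagnosis.
    return cases.get(symptom, {"acuity": "unknown", "route": "nurse review"})

print(triage("appendix pain")["acuity"])  # high
```

Note the default branch: because hospitals may not deny service, anything the system cannot classify falls back to the nurse rather than being turned away.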