Auditing an in-store medicine dispensing system

Overview

Columbus is an internal colleague tool used by Boots' pharmacies to dispense and manage repeat medical prescriptions. The team responsible for developing Columbus commissioned the UX team to perform an internal UX audit of the system.

Goal

Based on some initial assumptions — that the Columbus system is outdated and difficult to use — our team's goal was to conduct an extensive review and identify current pain points, so that the development team could better understand where to focus their efforts to improve the Columbus system.

Solution

Our methodology for this project was to visit five high-output pharmacy stores across Nottinghamshire and observe colleagues using the Columbus system. We interviewed users as they performed certain tasks, including dispensing, retrievals, and stock management, and asked follow-up questions about their experience, habits, and pain points.

Results

Our observations and UX analysis enabled the team to highlight a number of key areas for improvement and provide focused recommendations to the development team for improving the overall experience and usability of the Columbus system. After an impact vs effort exercise, we added a prioritised list of suggested changes to the project backlog.

My role

Product design lead

Methods

Observational research, in-depth interviews, data analysis, system audit, heuristic review, ideation, brainstorming, prototyping, roadmapping, impact vs effort mapping, presentation.

Process

Planning

Following a request to perform a system analysis and colleague experience study, I began the project with a detailed project plan. This included a task list, milestone points, estimated effort, and stakeholder playbacks. This enabled our team to focus on each task at hand and to communicate accurately with our stakeholders.

Our plan included in-store observations, heuristic analysis, screen flow mapping, UI enhancement exploration, recommendations, and prioritisation.

Project plan

Discovery

The methodology we used for this study included five in-person, in-store observation sessions lasting up to four hours at each location. Over multiple days, we observed pharmacy colleagues performing their daily tasks and any interactions involving the Columbus system. We watched users work through various tasks, such as due now, due date, EPS dispensing, retrievals, and stock management. During the sessions we asked questions about the overall experience, UI design, usability, functionality, and system performance.

Following our store visits, I collaborated with a research partner to analyse our observations and findings. Our in-store observation research found that, overall, users of Columbus find it easy to learn and use, with many commenting that it was well liked in terms of its purpose and usability. The most significant problem identified across all stores was system stability, with users mentioning frequent crashing and downtime. Alongside this, some minor usability issues were also identified, including the number of pop-ups and the ease of switching between modes.

Post observation research analysis

In addition to the in-store observation sessions, I organised multiple demo sessions with the operations team. This enabled us to better understand the screen flows a user progresses through for certain tasks and scenarios.

Screen flow mapping

Building on these research tasks, I also conducted a heuristic review of the current user interface against principles such as visibility of system status, match between system and the real world, and user control and freedom. This step enabled us to better understand the correlation between off-screen tasks and on-screen user interactions, and how they relate to usability best practice.
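For teams wanting to run a similar review, the heuristic findings can be captured as a simple structured log. The sketch below is a hypothetical illustration: the heuristic names follow Nielsen's standard set, but the screens, severity scores, and notes are invented placeholders, not our actual audit data.

```python
# Hypothetical sketch of a heuristic review log.
# Screens, severities, and notes are illustrative only.
from dataclasses import dataclass

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
]

@dataclass
class Finding:
    screen: str      # which UI screen the issue was observed on
    heuristic: str   # which heuristic it relates to
    severity: int    # 0 (cosmetic) to 4 (usability catastrophe)
    note: str

findings = [
    Finding("Dispensing queue", "Visibility of system status", 3,
            "No feedback while the system recovers from downtime"),
    Finding("Mode switcher", "User control and freedom", 2,
            "Switching modes requires dismissing several pop-ups"),
]

# Surface the most severe issues first for the playback deck.
for f in sorted(findings, key=lambda f: -f.severity):
    print(f"[{f.severity}] {f.screen}: {f.note} ({f.heuristic})")
```

Logging each finding against a named heuristic makes it easy to group issues by theme when presenting back to stakeholders.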

Heuristic UI analysis exercise
Current UI screens

Ideation

Included in our project plan was the task of exploring what a new user interface would look like when applied to the Columbus system. I pulled inspiration from the Boots design system library and referenced similar external platform interface patterns. This helped our conversations with stakeholders by better illustrating a view of the future and the power of modernised design.

Impact vs effort mapping and prioritisation

At the conclusion of the research project, my research partner and I delivered a final playback of our findings to our stakeholders. I also facilitated an impact vs effort exercise to help stakeholders weigh the technical feasibility of each recommendation against the impact it would have and the effort needed to implement it. The work resulted in a research-backed list of prioritised recommendations.
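To illustrate how an impact vs effort exercise turns into a prioritised backlog, the sketch below scores each recommendation on two axes and sorts the results. The recommendations echo themes from our findings, but the numeric scores and quadrant thresholds are hypothetical placeholders, not the actual workshop output.

```python
# Hypothetical impact vs effort prioritisation sketch.
# Scores (1-5) are illustrative placeholders, not real workshop data.
RECOMMENDATIONS = [
    # (recommendation, impact, effort)
    ("Improve system stability and reduce crashes", 5, 4),
    ("Reduce the number of pop-ups", 3, 2),
    ("Simplify switching between modes", 3, 3),
    ("Apply modernised design system styling", 4, 5),
]

def quadrant(impact: int, effort: int) -> str:
    """Classify a recommendation into a standard 2x2 quadrant."""
    if impact >= 3 and effort < 3:
        return "quick win"
    if impact >= 3:
        return "major project"
    if effort < 3:
        return "fill-in"
    return "thankless task"

def prioritise(items):
    """Rank by impact (high first), breaking ties on effort (low first)."""
    ranked = sorted(items, key=lambda r: (-r[1], r[2]))
    return [(name, quadrant(impact, effort)) for name, impact, effort in ranked]

for name, q in prioritise(RECOMMENDATIONS):
    print(f"{q:>14}: {name}")
```

Sorting by impact first keeps the backlog ordered by value, while the quadrant labels help stakeholders spot quick wins at a glance.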

Impact vs Effort exercise
List of prioritised recommendations