Eliminating Blockers
Problem Statement
Ad-hoc, high-priority items were regularly dominating my UX team's scheduled meetings as we worked through blockers. I recognized the need for a general, regularly scheduled meeting dedicated to these ad-hoc items, one that gave the team a safe opportunity to work through blockers together.
[Image: Personas for Shop Talk]
Solution
I reviewed common names for recurring meetings - coffee talk, stand-up, block meeting - but determined that my team needed more flexibility than these titles allowed. I ended up scheduling a "Shop Talk." This title framed the meeting in a way that invited my team to discuss anything related to our work. I reviewed team calendars and selected a midweek time when my team would be deep enough in their work to have blockers worth sharing. For my team of five, I determined that one hour was a good use of our time.
Metrics for Success
This solution is successful when the Shop Talk is productive, insightful, and engaging:
Team members with blockers have a designated time to bring up these issues and leave with specific next steps
Team management has a pulse on team dynamics
All members have an opportunity to discuss and engage in the content presented
Scoring a SUS
Problem Statement
The System Usability Scale (SUS) is a repeatable method for benchmarking usability, administered via a 10-question survey. Calculating an average SUS score across multiple respondents is time-consuming and error-prone, and some survey tools do not export the data in a form that makes the calculation easy. My team was spending a lot of time manually scoring each respondent's SUS and then averaging the results. Calculating this benchmark needed to be quick, automated, and built with available spreadsheet tools.
Solution
I built a spreadsheet tool that takes in two kinds of data - the total number of respondents and the number of respondents who picked each ranking for each question in the SUS. In other words: how many people filled in the SUS, and how many picked each response option? The spreadsheet tool then converts this data into an average SUS score.
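The conversion the spreadsheet performs follows the standard SUS scoring rule: odd-numbered questions contribute (response − 1) points, even-numbered questions contribute (5 − response), and the summed contributions are multiplied by 2.5. A minimal Python sketch of the same calculation, assuming the counts arrive as a 10×5 grid (the function name and input layout here are illustrative, not the spreadsheet's actual structure):

```python
def average_sus(counts):
    """Average SUS score from aggregated response counts.

    counts[q][r] = number of respondents who picked ranking r+1 (1..5)
    for question q+1 (1..10). Assumes every respondent answered
    every question.
    """
    n = sum(counts[0])  # total respondents, taken from question 1's counts
    total = 0.0
    for q, row in enumerate(counts, start=1):
        for r, count in enumerate(row, start=1):
            # Odd questions score (response - 1); even questions score (5 - response)
            contribution = (r - 1) if q % 2 == 1 else (5 - r)
            total += count * contribution
    return total * 2.5 / n
```

Because SUS scoring is linear in the responses, averaging the aggregated counts this way gives exactly the same result as scoring each respondent individually and averaging, which is why counts per response option are sufficient input.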
Metrics for Success
This solution is successful when scoring a SUS is accurate, efficient, and accessible:
It accurately produces the average SUS score from all respondents
It takes less time than manually scoring each respondent's SUS and averaging the results
It uses available spreadsheet tools (Excel, Google Sheets)
The People, Tools, Environment Model
Systems Approach to Problem-Solving
Problem Statement
I observed people outside of my UX team focusing on solutions that required a lot of effort from the users involved. These solutions often created a major cognitive burden for our users. I needed a simple framework to explain to non-UXers how to consider solutions that would create a smaller cognitive burden for users.
Solution
I shared the People, Tools, Environment model with my colleagues. This model is useful because it helps break down any problem into the specific People, Tools, and Environments involved. Breaking down problems in this way helps to identify how the Tools and Environment can be leveraged to improve the users' experience. For example, rather than creating a training module for the People to learn to use a feature in the product, we can modify the Tool, such as by employing more intuitive user controls, or the Environment, by reorganizing the front-end architecture of the software.
Metrics for Success
This solution is successful when my colleagues are equipped to break problems down into the People, Tools, and Environments involved:
The People, including the users, software trainers, and configuration specialists
The Tools, including notifications, settings, and guardrails
The Environments, including the whole software system and the physical environments