November 7, 2023
By Erin Davis and Alix Dorfman
Asking usability test questions
As someone interested in human factors engineering, you have likely learned that there is an art to asking great questions. An important aspect of leading a usability test is learning how to customize your questions to the situation, rather than always asking questions exactly as posed in the moderator’s script. The tips below describe considerations for asking effective, unbiased usability test questions.
How to ask questions during a usability test
- Be concise. Avoid overcomplicating your test questions and instead focus on asking short questions. Similarly, ask questions one at a time, rather than burdening a participant with multi-part questions.
- Ask why. Ask the participant to explain “why” they think or do things a certain way. This line of questioning helps ensure you obtain the depth of feedback needed to perform an adequate analysis of usability test data.
- Ask unbiased questions. Ask open-ended and unbiased questions that do not lead the participant’s response. For instance, ask “What are your impressions of this text?” rather than, “Does this text make sense?”
- Ask for clarification if needed. If the participant is not providing a clear answer to a question, ask the question again in a different way. Similarly, if you are struggling to understand what the participant is getting at, do not put words in the participant’s mouth, but rather ask follow-up questions to encourage them to clarify. That said, you can confirm your understanding by reiterating what you heard if you hedge appropriately (e.g., “What I’m hearing is that you had some difficulty finding the button because of its location on the screen’s bottom left. Is that correct?”).
- Respond to participant answers appropriately. It is reasonable to acknowledge your participant’s response to a question by saying “OK” or nodding your head. However, you do not need to repeat the response back to them if clarification isn’t needed. Additionally, do not comment on the quality or accuracy of the response, such as saying “great” or “that’s right,” respectively. Instead, thanking them for the feedback is sufficient. Finally, give your participant time to consider and respond. Do not assume they don’t have an answer or do not understand if they don’t respond immediately.
- Ask, don’t interrogate. When moderating a usability test, make debriefs conversational and soften your questions to reduce the chance that participants feel they are being interrogated. This helps ensure the participant doesn’t feel they are to blame for any mistakes, which facilitates a more productive root cause debrief. A phrase like “Why do you think that happened?” is much better than “Why did you do it incorrectly?” If you notice the participant getting defensive or blaming themselves, pause and remind them that you are focused on evaluating the device, not them, and that you want to understand what about the device might have led to the error.
- Give enough context during the debrief. When debriefing to identify root causes, ensure you provide enough context and “set the scene” clearly for the participant. Here’s a good example of setting the scene: “Now let’s talk about when you delivered a dose during the first task. After you cleaned the skin with an alcohol wipe, I noticed you wiped the skin with a cotton swab. Why did you take that step?”
- Recognize that design recommendations are not root causes. For instance, knowing the participant would prefer that the warning include red text does not tell you why the participant misunderstood or overlooked the warning. Accordingly, avoid asking “what could make it better” during HF validation-style tests, for which your goal is to identify root causes, not design changes. That said, you can ask for design recommendations as a last resort if the participant is not providing any reasonable reported root causes. If you do, strive to ask why the participant made that recommendation to get a second chance at uncovering the actual root cause. If you take this approach during a session with observers, it’s good practice to explain during your post-session debrief why you asked for a design recommendation (i.e., it was a final attempt to extract a root cause). This way, you can proactively explain that while it’s not common practice during an HF validation test, you had a good reason for “breaking the rules.”
- Use transitional phrases. Most usability test moderators have several “transitional phrases” in their toolbox that they use routinely when leading a test. A good transition will help confirm the participant’s understanding and move the session along to the next activity without inducing bias, artifact, or confusion. Sample transitional phrases include asking a participant, “Have you completed the task?” at the end of a use scenario, or saying, “I’m going to catch up on my notes and then we’ll move on to the next task.”
- Review background questions efficiently. Move through the background/recruiting grid questions relatively quickly. You don’t need to re-interview the participant (i.e., ask the same questions again) if the recruiter’s input is clear. Rather, you can quickly confirm the information is accurate (e.g., “It says here that you’re 65, have experience using NovoLog, and use a pen-injector 3 times a day – is that still the case?”). This leaves more time for important questions during the evaluation activities.
Hopefully, these tips empower you to ask excellent questions and lead usability test sessions with confidence. Being a savvy interviewer will help you uncover accurate, valuable insights from your test participants, which ultimately is the goal of any usability test.
Leading usability tests with Emergo by UL
If you are interested in learning more on this subject, check out our webinar, as well as our eLearning training courses available on OPUS (our HFE digital platform), or books such as Moderating Usability Tests, Medical Device Use Error: Root Cause Analysis, and Usability Testing of Medical Devices.
Erin Davis is Associate Research Director and Alix Dorfman is Managing Human Factors Specialist at Emergo by UL.