
Even More Difficult Usability Testing Participants

Four years ago, in 2017, I published an article in UXmatters, Wrangling Difficult Usability Testing Participants, giving advice on how to handle ten types of difficult usability testing participants. The ten types were:

  • Bad Fits to the User Profile
  • Untalkative Participants
  • Overly Talkative Participants
  • Participants Who Ramble Off Topic
  • Inarticulate Participants
  • Participants Who Struggle to Think Aloud
  • Participants Who Have No Opinions
  • Uncritical Participants
  • Participants Who Blame Themselves
  • Uncooperative Participants

Four years later, I felt I had encountered enough new types of participants to write a part two, Wrangling Difficult Usability Testing Participants, Part 2. These include:

  • Happy Clickers
  • Talkers, Not Doers
  • Givers of Facts, Not Opinions
  • Representatives of the Business
  • Participants Who Take Prototypes Too Literally
  • Professional Research Participants
  • Uncomfortable, Nervous Participants
  • Participants Who Are Too Relaxed
  • Harassers

Of course, most participants are just regular people who are trying to do their best in the unusual situation of participating in a usability test. It’s up to you as the researcher to help them understand what you need them to do.

Image by Rinaldo Wurglitsch under Creative Commons License

Better UX Recommendations

Findings and recommendations spreadsheet

As UX researchers, we tend to spend more time explaining our findings than crafting our recommendations. Yet, however well we explain the findings and recommendations in person, there comes a time when we’re not present, and the people who have to implement the recommended changes must rely on the written recommendations and what they remember from our explanation. So it’s very important to ensure that your UX recommendations are understandable, concise, specific, believable, authoritative, actionable, feasible, flexible, prioritized, and easy to review. I provide advice on how to write better recommendations in my latest article on UXmatters:

Providing Better UX Recommendations

Remember Clippy?

Clippy, the Microsoft Office assistant

In my latest article on UXmatters, Five Degrees of User Assistance, I bring up a character that people love to hate – Clippy, of course! Although I do have sort of a soft spot for the little guy, he is a great example of unwanted user assistance.

Poor Clippy! It really wasn’t his fault; he came along at a time when computers were too stupid to accurately predict when people needed help. Programmed to pop up when certain events occurred and enthusiastically offer his assistance, he instead came across as an unwanted interruption and annoyance.

Today, as technology becomes increasingly intelligent, computers are smart enough to provide more appropriate and more accurate user assistance. In my latest article, I describe these five levels of user assistance (a rough code sketch follows the list):

  • Passively providing online Help content. Here’s help if you need it.
  • Asking if the user needs help. Can I help you?
  • Proactively offering suggestions that users can accept or ignore. Is this what you want, or do you want to correct this?
  • Alerting the user that it’s going to take an action automatically, unless the user says not to. I’m going to do this, unless you tell me not to.
  • Automatically taking an action for the user, without asking for permission. I’ve got this for you. Don’t worry about it.
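To make this progression concrete, here’s a minimal, hypothetical sketch in TypeScript of how a system might represent these five degrees and choose among them based on how confident it is about the user’s intent. The enum, the chooseAssistance function, and the confidence thresholds are illustrative assumptions of mine, not anything defined in the article.

// Hypothetical sketch: the five degrees of user assistance as an escalating scale.
enum AssistanceLevel {
  PassiveHelp,        // Here's help if you need it.
  OfferHelp,          // Can I help you?
  Suggest,            // Is this what you want, or do you want to correct this?
  ActUnlessDeclined,  // I'm going to do this, unless you tell me not to.
  ActAutomatically,   // I've got this for you. Don't worry about it.
}

// Illustrative policy: the more confident the system is that it has correctly
// predicted the user's intent, the higher the degree of assistance it takes on.
// The thresholds are made up for the example.
function chooseAssistance(confidence: number): AssistanceLevel {
  if (confidence > 0.95) return AssistanceLevel.ActAutomatically;
  if (confidence > 0.85) return AssistanceLevel.ActUnlessDeclined;
  if (confidence > 0.6) return AssistanceLevel.Suggest;
  if (confidence > 0.3) return AssistanceLevel.OfferHelp;
  return AssistanceLevel.PassiveHelp;
}

console.log(AssistanceLevel[chooseAssistance(0.9)]); // "ActUnlessDeclined"

The point of the sketch is that a system should climb this ladder only as its confidence in predicting what the user needs grows; Clippy’s trouble was that he offered assistance on weak signals, when the appropriate level was the bottom of the ladder.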


Check it out at UXmatters: Five Degrees of User Assistance

Image source: Clippy, created by J. Albert Bowden II and licensed under CC BY 2.0

Paper Prototyping: Is It Still Worth It?

In my latest UXmatters article, I compare the latest prototyping tools to paper prototyping. Paper has long had the advantage of allowing designers to quickly and easily create early prototypes that look unfinished and encourage users to provide honest criticism. However, the latest prototyping tools have caught up to, and in some cases surpassed, paper in making it quick and easy to create prototypes without any coding.

So, do paper prototypes still beat these new prototyping tools? That’s what I explore in my latest article, Prototyping: Paper Versus Digital.

Image credit: Samuel Mann

Testing Your Own Designs

Usability testing session

Today I published an article in UXmatters, Testing Your Own Designs. It’s often been said that you shouldn’t conduct usability testing on your own designs because you may be too biased, too defensive, or too close to the design to be an impartial facilitator. Although that may be the ideal, UX designers often don’t have a choice. They may be the only person available to test the design, so if they don’t test it, no one will. So in this article, I provide advice for those times when you have to test your own design, as well as for when someone else tests it.

I was hesitant to write this article because it’s a topic that many others have written about, but I felt that, as someone who has been on all sides of the issue, I had something to add. Here are some other good articles about this topic:

Testing Your Own Designs: Bad Idea? and Testing Your Own Designs Redux by Paul Sherman

Should Designers and Developers Do Usability? by Jakob Nielsen

Because Nobody’s Baby Is Ugly … Should Designers Test Their Own Stuff? by Cathy Carr at Bunnyfoot

This One Goes to 11

I just published an article on UXmatters, 10 User Research Myths and Misconceptions. It addresses common misunderstandings about user research that I’ve encountered over the years.

Here’s a bonus outtake from the article, Myth 11…

Myth 11: Field Research Is Better Than Usability Testing

On the other end of the spectrum from those who don’t understand the difference between user research and usability testing are the user-research elitists who think up-front, generative research methods are far superior to usability testing. In this view, field studies take researchers out of the lab to observe people in their natural environments performing their usual activities, while usability testing takes place in the sterile, artificial environment of a usability lab and asks people to perform a limited set of artificial tasks. Instead of learning about people and what they really do, usability testing provides only the limited value of learning whether people can perform your artificial tasks.

The Truth: Both Field Research and Usability Testing Have Their Places

Field studies and usability testing are two different methods used for different, but equally important, purposes. Field studies provide information to inform design, while usability testing evaluates a design. You have to draw interpretations and conclusions from the user research and apply them to a design. Even after very thorough user research, you’re never completely sure that what you’ve designed will work well for users. Usability testing is the evaluation that either confirms your decisions or points you to refinements. Both user research and usability testing are important and necessary. There’s no reason we can’t appreciate the value of both methods.

Analysis Is Cool

Affinity diagram

Analyzing the data is the most interesting part of user research. That’s where you see the trends, spot insights, and draw conclusions. It’s where all the work comes together and you get the answers to your questions.

Why, then, did I publish an article in UXmatters – Analysis Isn’t Cool? Because all too often, clients, management, and project stakeholders underestimate the analysis phase and just want to get to the answers. People like to say that they did user research, but they don’t want to spend the time analyzing the data. They like the deliverables, whether they read them or not, but they don’t want to invest in the analysis that produces those deliverables.

In this article, I discuss what analysis involves, methods for individual and group analysis, and ways to speed up the analysis process.


Photo by Josh Evnin on Flickr

Suspicious Minds

office scene

In previous research projects, there have been several times when participants were suspicious of our motives. This tends to happen when you’re doing research with a group of employees to understand their work processes, observing how they do their jobs rather than studying an existing system.

When there’s a direct connection to an application they use, people tend to feel less suspicious. They can see that you’re trying to understand how well the application works and where it can be improved. Suspicion is especially likely at a company that has had poor experiences with reorganizations, layoffs, and offshoring. There, people tend to see us as just another group of consultants coming in to study how they do their work, looking for what can be improved or who can be eliminated.

So how do you reassure people in these difficult situations about your true purpose? I recently wrote an article about this at UXmatters – Winning Over Wary Participants. Check it out, and if you have additional tips for making people feel more comfortable in these situations, feel free to leave a comment.


Image: Jake Sutton

Are Consent Forms Always Necessary?

A consent form

Are consent forms always necessary? We’re told that consent forms are an indispensable part of ethical user research. Consent forms are the vehicle for giving and getting informed consent – they inform participants of what the study will entail, and they allow participants to indicate consent with a signature and date.

Yet consent forms can conflict with the informal, friendly rapport that we try to establish with participants. Anything you present for people to sign immediately looks like a legal document or liability waiver. It puts them on guard.

That’s ironic, because consent forms are the opposite of legal waivers. Legal documents are created to protect the interests of the companies that create them, while consent forms are created to protect the rights of the people signing them. Yet most participants assume they’re signing a typical legal waiver.

Consent forms seem acceptable in more formal user research situations, such as usability testing and focus groups, but they seem odd and even off-putting when used in more informal situations. I’ve found them to be especially awkward when doing field studies at people’s offices. You strive to set up an informal situation, such as asking someone to show you how they create reports or asking them to try out a new design for an expense report application. But when you show up with a consent form for them to sign, it shatters the informal, comfortable rapport you tried so hard to establish. I’ve had people react to consent forms in this kind of situation with, “Hey! I thought we were just talking here.” How many times in the course of your work-life have you had someone show up to a meeting with a legal document for you to sign?

So I say use your judgment. When a consent form feels like it would be overly formal, don’t use it (unless your legal department requires it). Instead, get informed consent informally by email. “Inform” with your email describing what will take place, and get “consent” from their reply email agreeing to participate. At the start of the session, you can inform them again with a summary of what you’ll be doing. They will then give consent by continuing to participate in the session.

A good guideline is how comfortable or uncomfortable you feel when giving participants the consent form. If you feel uncomfortable, you’re probably breaking a group norm. So you should find a more acceptable way of getting informed consent.

Effectively Communicating User Research Findings

I presented at UXPA 2013 today on Effectively Communicating User Research Findings.

This is why I’ve been too busy to blog lately. I’ve been working non-stop on this presentation and on my UXmatters article published this week, Creating Better UX Research Videos: http://www.uxmatters.com/mt/archives/2013/07/creating-better-ux-research-videos.php

So check either of these out.