
Tips for Delivering Bad News to Clients

Ugly software interface

Your Baby is Ugly!

That’s the title of the article I just published on UXmatters, in which I give advice on how to soften the blow of delivering bad news to clients. Let’s face it: when we perform an expert evaluation, usability testing, or user research on an existing product, most of what we find is problems with that product. Clients don’t pay us to tell them how great their products are. If they’ve hired us, it’s to find problems that can be fixed. But there are ways to make it easier to deliver bad news. In this article, I provide the following advice:

  • Get the stakeholders to admit that it’s ugly first
  • Get everyone to buy into your research methods upfront
  • Encourage stakeholders to observe the research
  • Blame the bad news on the participants
  • Back up your findings with metrics
  • Present recordings and quotations
  • Don’t beat your audience over the head
  • Emphasize your expertise
  • Back up your findings with examples of best practices
  • Show your stakeholders they’re not alone
  • Position it as providing recommendations, not pointing out problems
  • Mention the positive aspects too
  • Deliver your findings in person
  • Prioritize the problems they should solve
  • Provide a plan for addressing the problems

You can find more details about this advice in my latest article, Your Baby is Ugly.


Difficult Usability Testing Participants

Usability testing session

A key skill you need for usability testing is the ability to work well with a wide variety of people. You meet all kinds of people as usability testing participants. Over time, you get used to adjusting your approach to different personalities and characteristics. Most people are easy to deal with. However, some present challenges.

In my latest UXmatters article, “Wrangling Difficult Usability Testing Participants,” I discuss ten types of challenging participants and how to best adjust your interaction with them to get the best testing experience.

Remember Clippy?

Clippy, the Microsoft Office assistant

In my latest article on UXmatters, Five Degrees of User Assistance, I bring up a character that people love to hate – Clippy, of course! Although I do have sort of a soft spot for the little guy, he is a great example of unwanted user assistance.

Poor Clippy! It really wasn’t his fault; he came along at a time when computers were too stupid to accurately predict when people needed help. Programmed to pop up and enthusiastically offer his assistance whenever certain events occurred, he instead came across as an unwanted interruption and annoyance.

Today, as technology becomes increasingly intelligent, computers are smart enough to provide more appropriate and more accurate user assistance. In my latest article I describe these five levels of user assistance:

  • Passively providing online Help content. Here’s help if you need it.
  • Asking if the user needs help. Can I help you?
  • Proactively offering suggestions that users can accept or ignore. Is this what you want, or do you want to correct this?
  • Alerting the user that it’s going to take an action automatically, unless the user says not to. I’m going to do this, unless you tell me not to.
  • Automatically taking an action for the user, without asking for permission. I’ve got this for you. Don’t worry about it.
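
To make these degrees a bit more concrete, here is a minimal TypeScript sketch of my own (not from the article): it models the five levels as an enum and picks one based on a hypothetical confidence score for the user’s intent. The threshold values are arbitrary assumptions, purely for illustration.

```typescript
// Five degrees of user assistance, from least to most proactive.
enum AssistanceLevel {
  PassiveHelp,      // "Here's help if you need it."
  OfferHelp,        // "Can I help you?"
  Suggest,          // "Is this what you want, or do you want to correct this?"
  NotifyThenAct,    // "I'm going to do this, unless you tell me not to."
  ActAutomatically, // "I've got this for you. Don't worry about it."
}

// Hypothetical rule: the more confident the system is about the user's intent,
// the more proactive the assistance it chooses. Thresholds are made up.
function chooseAssistance(confidence: number): AssistanceLevel {
  if (confidence > 0.95) return AssistanceLevel.ActAutomatically;
  if (confidence > 0.8)  return AssistanceLevel.NotifyThenAct;
  if (confidence > 0.6)  return AssistanceLevel.Suggest;
  if (confidence > 0.3)  return AssistanceLevel.OfferHelp;
  return AssistanceLevel.PassiveHelp;
}

// Example: a system that is 90% sure of the user's intent notifies before acting.
console.log(AssistanceLevel[chooseAssistance(0.9)]); // "NotifyThenAct"
```

The point of the sketch is simply that the appropriate degree of assistance should scale with how well the system actually understands what the user is trying to do, which is exactly where Clippy fell short.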


Check it out at UXmatters: Five Degrees of User Assistance

Image source: Clippy, created by J. Albert Bowden II and licensed under CC BY 2.0

Paper Prototyping: Is It Still Worth It?

In my latest UXmatters article, I compare the latest prototyping tools to paper prototyping. Paper has long had the advantage of letting designers quickly and easily create early prototypes that look unfinished and encourage users to provide honest criticism. However, the latest prototyping tools have caught up to, and in some cases surpassed, paper in making it quick and easy to create prototypes without any coding.

So, do the advantages of paper prototyping still outweigh those of the new digital tools? That’s what I explore in my latest article, Prototyping: Paper Versus Digital.

Image credit: Samuel Mann

Tips on Comparative Usability Testing

Usability Testing Session

I just published an article in UXmatters, Conducting Qualitative, Comparative Usability Testing. It’s about conducting usability testing with two or more designs early in the design process, to get better information and user feedback before settling too soon on one particular design.

When participants are able to experience multiple designs, they can provide better feedback. As a result, you can gain greater insight into the elements of each design that work well and those that cause problems.

Testing Your Own Designs

Usability testing session

Today I published an article in UXmatters, Testing Your Own Designs. It’s often been said that you shouldn’t conduct usability testing on your own designs, because you may be too biased, too defensive, or too close to the design to be an impartial facilitator. Although that may be the ideal, UX designers often don’t have a choice. They may be the only person available to test the design, so if they don’t test it, no one will. So in this article, I provide advice both for those times when you have to test your own design and for when someone else tests your design.

I was hesitant to write this article, because many others have already written about this topic, but I felt that, as someone who has been on all sides of the issue, I had something to add. Here are some other good articles on the topic:

Testing Your Own Designs: Bad Idea? and Testing Your Own Designs Redux by Paul Sherman

Should Designers and Developers Do Usability? by Jakob Nielsen

Because Nobody’s Baby Is Ugly … Should Designers Test Their Own Stuff? by Cathy Carr at Bunnyfoot

It’s Only Usability Testing, What Could Go Wrong?

Usability Testing Observation Room

I published a new article in UXmatters this week, “What Could Possibly Go Wrong? The Biggest Mistakes in Usability Testing.”

This article came out of thinking about all of the mistakes I’ve made and problems I’ve encountered over my last 16 years of conducting usability testing. I think it’s good to look back and reflect on the lessons you’ve learned. This article is jam-packed with advice learned the hard way.

Usability testing is the most highly structured user research method. Compared to field studies and interviews, its tasks and questions are carefully planned, and you usually stick pretty close to the discussion guide. That also makes it the most repetitive method: you see the same types of people performing the same tasks and answering the same questions over and over again.

After you get some experience, you can begin to think of usability testing as routine and pretty easy. At a former company, it was the first task that we gave to new researchers, just out of college. It seemed like the easiest method to learn. That may be true, but there are still all kinds of mistakes that can occur. This article discusses the main problems and how to avoid them.

Photo by Blue Oxen Associates on Flickr