
Jared Spool’s EDUI 2009 Keynote

(Originally published in 2009, and still completely relevant.)

Tricks and techniques work better for improving website usability than structured methodologies do, Jared Spool (CEO, User Interface Engineering) said at the EDUI conference last week. Spool says that there is no evidence that methodologies result in quality designs, and that design can’t be institutionalized to conform to usability templates. He equates methodology with dogma, defined as “an unquestioned faith independent of any supporting evidence,” and cites the TSA’s airport screening procedures as a prime example of methodology’s lack of value. His research found that:

  • The best teams didn’t have a methodology or dogma they followed
  • The struggling teams often tried following a methodology, without success
  • The best teams all focused on increasing each team member’s stock of tricks and techniques
  • They were constantly exploring new tricks and techniques for their toolbox
  • Struggling teams had limited tricks and techniques

He recommends that usability development teams focus on developing great techniques and tricks and avoid getting dragged down by methodologies. Techniques, he says, are things that you can master through practice, building blocks that can be applied to any methodology.

How the best teams develop great design

Vision, feedback and culture are the most important variables in the way the best teams develop great design:

  • Vision: Can everyone on the team describe the experience of using your design five years from now? Spool says that five years is an optimal time period because the team’s thinking is not constrained by current legacy (i.e., technology limitations), freeing them to focus on the aspiration of the design. Everyone on the team should have the exact same vision. Focusing the vision gives the team the chance to know where they’re going. They can deal with how to get there later on in the process.
  • Feedback: In the last six weeks, have you spent more than two hours watching someone use either your design or a competitor’s? Everyone should answer “yes” to this question. You don’t know what’s happening with your design until you watch people use it. Spool points out that great chefs go out into the dining room and ask people about the food.
  • Culture: In the last six weeks, have you rewarded a team member for creating a major design failure? Spool says that we only learn by making mistakes. He advocates making a huge investment in celebrating failure, mentioning that Scott Cook, founder of Intuit (Quicken and QuickBooks), gives a big party for someone who has “screwed up big” that celebrates all the things they learned. Spool asserts that organizations that are risk averse “produce crap.”

Spool says that user testing can be as speedy as the “Five-Second Page Test” that he uses. Have the test participant look at a page for five seconds (not the home page, but one with a specific purpose, such as providing customer support information or making a donation). Then ask the participant to write down what they remember about the page and whether they would do business with the organization. The test is particularly useful when a page is too cluttered or confusing, and identifies whether pages quickly communicate their purpose.
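If you want to run the five-second exposure on screen rather than with printouts, the mechanics are easy to automate. Below is a minimal sketch in Python/Tkinter–my own illustration, not a tool Spool provides–assuming a screenshot of the target page saved as page.png (a hypothetical filename). It shows the page for exactly five seconds, then swaps in the two recall prompts.

```python
import tkinter as tk

DISPLAY_MS = 5_000  # show the page for exactly five seconds

root = tk.Tk()
root.title("Five-Second Page Test")

# Hypothetical filename: a screenshot of the page under test.
# Tk 8.6+ reads PNG natively.
page = tk.PhotoImage(file="page.png")
page_label = tk.Label(root, image=page)
page_label.pack()

def show_recall_prompt() -> None:
    """Hide the page and ask the recall questions Spool describes."""
    page_label.destroy()
    tk.Label(root, text="What do you remember about the page?").pack(anchor="w")
    tk.Text(root, height=6, width=60).pack(padx=8, pady=4)
    tk.Label(root, text="Would you do business with this organization? Why?").pack(anchor="w")
    tk.Text(root, height=4, width=60).pack(padx=8, pady=4)

root.after(DISPLAY_MS, show_recall_prompt)
root.mainloop()
```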

On the other hand, Spool says that paper prototype testing is especially useful when the design is in flux. The team can participate in the study at a point where they can make changes before going into technical implementation. Paper Prototyping, by Carolyn Snyder, is a recommended resource.

What a couple of Jared Spool’s paper prototypes look like

Virtual seminars and many other free resources are available on Spool’s company website and on his blog, Brainsparks.

Anatomy of a Usability Testing Plan: Dana Chisnell at EDUI 2009 (Part Three)

(Originally published in 2009 and still highly relevant.)

In her all-day session at EDUI 2009, “Usability Testing Without the Scary,” Dana Chisnell laid out the entire process for usability testing. Breaking the process down into small parts that are easy to understand, Dana showed us how to be comfortable with the process, rather than intimidated. Here’s what she shared about creating the plan for usability testing.

Someone who will try the design, somewhere to test and something to study.

As Dana puts it, the essence of testing is “sit next to someone and watch them do stuff.” Pretty straightforward. You just have to know what you want them to try out, and how to watch and ask questions so you can learn what they are thinking as they work with your design and functionality.

To support great site design, each phase of the development process should be informed by input from users. It’s also essential to have multidisciplinary teams working on development (e.g., marketing, IT, management).

Although it’s highly useful to look at best practices, guidelines and conventions for usability, they aren’t enough to guarantee usability and accessibility. The nuances of implementation can conflict with best practices, and best practices can even conflict with one another, cancel each other out, or magnify certain issues. “Informed designs come from data,” says Dana.

The goals of usability testing should be to identify problems in design that lead to

  • misinformation
  • incomplete transactions
  • need for support from administration, management or staff

What tests and measurements to do when

  • To map out what the design should do, observe and listen to users through user testing, focus groups, participatory design, surveys, heuristic evaluations and benchmark-setting.
  • To figure out how the design should work, use participatory design, paper prototyping, walk-throughs, usability testing and heuristic evaluation.
  • To determine whether the design actually does what you want it to do, conduct usability tests and heuristic evaluations, run follow-up studies and compare results against your benchmarks.

In the early part of the process, do things that help you learn, that are exploratory and formative. In the middle, do things that help you assess and summarize. And at the end, validate and verify when you’re close to launch but have enough time to incorporate changes.

The User Testing Plan

It’s important to create a test plan. It serves as your blueprint for testing, acts as a communication vehicle, clarifies needed resources, provides a focal point for each test, and lays out milestones. It should include the items below (a rough skeleton in code follows the list):

  • Goals and objectives**
  • Research questions**
  • Participant characteristics**
  • Description of method**
  • List of tasks**
  • Description of test environment
  • Description of what the moderator will do
  • List of the data you’ll collect
  • Description of how the results will be reported

**If your time is limited, focus on the starred items.
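For teams that like to keep the skeleton of the plan somewhere reusable, here is a minimal sketch of the checklist as a Python structure. This is my own structuring of Dana’s list, not a template she distributes, and the names UsabilityTestPlan and missing_essentials are hypothetical. The starred essentials come first, so a time-pressed team can fill in just those and see what is still blank.

```python
from dataclasses import dataclass, field

@dataclass
class UsabilityTestPlan:
    # Starred items: focus here if time is limited.
    goals_and_objectives: str = ""
    research_questions: list[str] = field(default_factory=list)
    participant_characteristics: str = ""
    method_description: str = ""
    tasks: list[str] = field(default_factory=list)
    # Remaining items, worth adding when time allows.
    test_environment: str = ""
    moderator_role: str = ""
    data_to_collect: list[str] = field(default_factory=list)
    reporting_plan: str = ""

    def missing_essentials(self) -> list[str]:
        """Name any starred item that is still blank."""
        essentials = {
            "goals_and_objectives": self.goals_and_objectives,
            "research_questions": self.research_questions,
            "participant_characteristics": self.participant_characteristics,
            "method_description": self.method_description,
            "tasks": self.tasks,
        }
        return [name for name, value in essentials.items() if not value]
```

For example, UsabilityTestPlan(goals_and_objectives="Assess the donation flow").missing_essentials() returns the four starred items still to be filled in.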

Next: Selecting participants

EDUI 2009: Dana Chisnell on Taking the “Scary” Out of User Testing (Part Two)

(Originally published in 2009 and still highly relevant.)

In her EDUI 2009 workshop, Usability Testing Without the Scary, Dana Chisnell discussed how best to support great site design. User testing is essential, even if you only have time to spend one hour with one person. If one person has a problem, it’s legitimate to assume that others do too, and to make design changes based on that data. It’s also important to have management buy-in for testing–they must be able to see the benefits of doing the research and implementing what is learned from the tests. Multidisciplinary teams also support implementation of great design, provided they speak one another’s languages and understand one another’s skills. Finally, it’s essential to be willing to learn as you go and make changes accordingly. (Note that these criteria are completely in alignment with Jared Spool’s research showing that vision, feedback and culture are the most important elements of developing great design.)

Great designs are:

  • Useful–What’s valuable to your user about having this information or this tool?
  • Efficient–Users can accomplish their goal quickly (that means you have to know what your users’ goals are).
  • Learnable–It should be easy for users to apply what they’ve learned about how to use other sites to using your site (your site should conform to usability conventions as much as possible).
  • Satisfying–Your site should be one that people are happy to use, one that seems to be easy and doesn’t take too much time.
  • Accessible–Implementing accessibility guidelines actually makes your site easier for everyone to use.

You should identify design problems that lead to misinformation, incomplete transactions, and/or the need for support from administrators or staff.

What kind of assessment to use to answer fundamental site design questions

Dana segments analysis into three areas and defines certain types of assessments to use for each area.

What should the site design do? When you want to answer this question, do usability testing, conduct focus groups, have users participate in the design process, do surveys, conduct heuristic evaluations and set benchmarks.

How should the site design work? To answer this question, use participatory design, paper prototyping, walk-throughs, usability testing and heuristic evaluation.

Does the site design do what we want it to do? Answer this via usability testing, heuristic evaluation, follow-up studies and comparison to benchmarks.

Next: Dana’s pointers on putting together a test plan

EDUI 2009: Dana Chisnell Shows How to Do Usability Testing “Without the Scary” (Part One)

(Originally published in 2009 and still highly relevant.)

Dana Chisnell presented an all-day session at the EDUI 2009 Conference, held in Charlottesville September 21 & 22, 2009. She founded Usability Works and does usability research, user interface design, and technical communication consulting and development. She is co-author of the Handbook of Usability Testing. The focus of the workshop was to show how eminently doable user testing can be, if you let go of the idea that you have to do it “by the book.” Classic, by-the-book user testing can seem intimidating and scary. Instead, Dana spent the day showing how to identify and carry out the essentials, making it easier to test on an ongoing basis. Her approach takes into account that there usually isn’t a lot of time to test. The following is a list of all the elements “by the book,” and I’ve noted which ones Dana says are essential:

  • Develop a test plan–ESSENTIAL
  • Choose a testing environment–Don’t test in the lab–go to where the users are
  • Find and select participants–ESSENTIAL
  • Prepare test materials
  • Conduct the sessions–ESSENTIAL
  • Debrief with participants and observers–ESSENTIAL (Dana says to debrief only with observers)
  • Analyze data and observations
  • Create findings and recommendations

User testing works best when you’re early in the design process (that’s where testing using paper prototypes comes in very handy). What you need is someone who will try the design; somewhere to test (preferably where the user typically would use your site, such as at home, the office, or in a dorm room); something to study. Neat and simple. And you’ll learn things about your website interface that you would never have seen on your own.

For example, in conducting user tests of the University of Pennsylvania website, Dana learned that users were looking for admissions “dates.” But they couldn’t find the information because the University used a different term: “timetable.” User testing provided the context in which they learned about this very simple labeling problem. Your site has to accommodate the way your users think and work, and you can’t be certain about that unless you actually watch them use your site. You must learn whether the user grasps the conceptual design of your site and, if not, what the user’s concept is.

Pointers for moderating user tests

There’s tremendous value in sitting next to someone and watching them do something. Dana recommends saying to the tester things like “tell me what you’re doing–think aloud… tell me what you’re thinking…” When the tester is quiet, be careful about when you ask a question–sometimes they’re quiet because they’re working on a problem, and you may not want to interrupt that process. At the end of the test, review the experience with the tester: walk through what happened, step by step. Ask, “How did that go? What was confusing or frustrating?” Dana conducted a demonstration, working with an attendee. I found the moderator questions she asked to be very helpful:

  • How would you describe the information you’re looking for?
  • How far do you think you are from getting it?
  • Are you warm or cool?
  • You were hovering there–tell me what you were thinking.
  • What question do you have about the site right now?
  • What do you think the site is about?
  • What one main thing should be improved?

It’s essential for the moderator to talk with the tester and create a situation that feels natural to him or her. As moderator, you should be neutral and objective, but also friendly. Remember that this is not a psychology experiment; it’s a test of user-centered design.

NEXT: What Dana says about design.