Usability testing


Draft


Introduction

Usability tests (also called user tests or diagnostic evaluations) measure whether users can get a task done, e.g. finding information, signing up, or buying something. Such tests are conducted with a small set of representative users; for an e-learning platform, for instance, one would select students and teachers.

Typically, participants have to solve a few tasks in a session, e.g. six tasks in an hour.

According to usabilityfirst.com, “tasks should represent the most common user goals (e.g. recovering a lost password) and/or the most important conversion goals from the website or application owner’s perspective. [..] On a website or web application, a conversion is any action taken by a user that satisfies the website owner’s business goals. Common examples include signing up for an email newsletter, making a purchase, or viewing an important web page.” (retrieved March 13 2011).

There are different ways of having participants express what they are thinking and doing. According to Running a Usability Test (http://www.usability.gov/):

  • Concurrent Think Aloud (CTA) is used to understand participants’ thoughts as they interact with a product by having them think aloud while they work. The goal is to encourage participants to keep a running stream of consciousness as they work.
  • In Retrospective Think Aloud (RTA), the moderator asks participants to retrace their steps when the session is complete. Often participants watch a video replay of their actions, which may or may not contain eye-gaze patterns.
  • Concurrent Probing (CP) requires that, as participants work on tasks, the researcher asks follow-up questions when they say something interesting or do something unique.
  • Retrospective Probing (RP) requires waiting until the session is complete and then asking questions about the participant’s thoughts and actions. Researchers often use RP in conjunction with other methods: as the participant makes comments or takes actions, the researcher takes notes and follows up with additional questions at the end of the session.

User actions are recorded in various ways: for example, an expert may observe the user and enter summary data (such as "missed") into a logging application. In more sophisticated setups, users are videotaped from two angles, screen activity is recorded, and so on.
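
As a purely illustrative example, the following Python sketch shows one way such summary data could be logged. The file name, column names and outcome codes are assumptions made up for this sketch, not part of any particular tool.

  # Hypothetical sketch: append per-task usability observations to a CSV file.
  # File name, column names and outcome codes are illustrative assumptions.
  import csv
  import os
  from datetime import datetime

  FIELDS = ["timestamp", "participant", "task", "outcome", "note"]

  def log_observation(participant, task, outcome, note="", path="usability_log.csv"):
      """Append one observation; outcome could be e.g. 'success', 'missed' or 'assisted'."""
      new_file = not os.path.exists(path) or os.path.getsize(path) == 0
      with open(path, "a", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=FIELDS)
          if new_file:
              writer.writeheader()  # write the header only once, for a fresh file
          writer.writerow({
              "timestamp": datetime.now().isoformat(timespec="seconds"),
              "participant": participant,
              "task": task,
              "outcome": outcome,
              "note": note,
          })

  # Example: participant P3 did not manage to recover a lost password.
  log_observation("P3", "recover lost password", "missed", "did not find the link")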

Method

Synopsis of a low-cost test

  • Sit next to the participant and read out a task.
  • Do not help the participant; just observe and give non-committal feedback such as "go on" or "thank you".
  • If you don't work with a real usability lab setup (video-taping, observers, etc.), write down important events, i.e. critical incidents and successes; a small tallying sketch follows this list.
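
Once the sessions are over, those notes have to be tallied per task. Below is a minimal sketch, assuming observations were recorded with the hypothetical log_observation() function shown earlier (or any CSV with "task" and "outcome" columns); it computes a success rate per task.

  # Hypothetical sketch: summarize logged observations into per-task success rates.
  # Assumes a CSV with "task" and "outcome" columns, as in the logging sketch above.
  import csv
  from collections import defaultdict

  def success_rates(path="usability_log.csv"):
      successes = defaultdict(int)   # successful attempts per task
      attempts = defaultdict(int)    # all attempts per task
      with open(path, newline="") as f:
          for row in csv.DictReader(f):
              attempts[row["task"]] += 1
              if row["outcome"] == "success":
                  successes[row["task"]] += 1
      return {task: successes[task] / attempts[task] for task in attempts}

  for task, rate in success_rates().items():
      print(f"{task}: {rate:.0%} of attempts succeeded")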

Thinking aloud

  • To learn something about the user's mental model and decision-making processes, ask users to "think aloud", i.e. to verbalize what they are thinking and doing as they work.
  • This requires recording screen activity together with audio.

Other methods

For more methods, see: Usability Evaluation Methods - Testing at usabilityhome.com

Observer effects

According to A little known factor that could have a big effect on your next usability test (retrieved March 2014), a blog post by David Travis based on research by Jacobsen et al. (1998) and Hertzum et al. (2014), the main observer effect in usability testing is that experts only seem to spot about 50% of usability problems when analysing usability test videos.

In other words, the main problem is not the interaction between observer and user, but the interpretation of the data.
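
To make this concrete, imagine two evaluators who analyse the same session videos and independently list the problems they spot. The sketch below (with invented problem identifiers, purely for illustration) compares each evaluator's findings against the pooled set of reported problems.

  # Hypothetical sketch: the evaluator effect expressed as set overlap.
  # Problem identifiers are invented for illustration only.
  evaluator_a = {"label unclear", "signup button hidden", "error message cryptic"}
  evaluator_b = {"signup button hidden", "error message cryptic",
                 "search results empty", "font too small"}

  all_problems = evaluator_a | evaluator_b   # pooled set of reported problems
  shared = evaluator_a & evaluator_b         # problems both evaluators spotted

  for name, found in (("A", evaluator_a), ("B", evaluator_b)):
      share = len(found) / len(all_problems)
      print(f"Evaluator {name} reported {len(found)} of {len(all_problems)} problems ({share:.0%})")
  print(f"Both evaluators agreed on {len(shared)} of {len(all_problems)} problems")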

Links

Introductions and tutorials
  • Usability Testing, a short introduction at usabilityfirst.com (http://www.usabilityfirst.com/usability-methods/usability-testing/)
  • Running a Usability Test at usability.gov (http://www.usability.gov/how-to-and-tools/methods/running-usability-tests.html)
  • Diagnostic evaluation at usabilitynet.org (http://www.usabilitynet.org/tools/diagnostic.htm)
  • My place or yours? How to decide where to run your next usability test by David Travis, May 6, 2013. Quote: “The most common types of usability test are remote usability tests, corporate lab-based tests, contextual usability tests and rented facility tests. What are the relative strengths and weaknesses of these different approaches to usability testing and how should you choose between them?”
Standards
Bibliographies

For popular standard works, see the essential Interaction design, user experience and usability bibliography.

  • Jarrett, Caroline (2004). Better Reports: How To Communicate The Results Of Usability Testing. Proceedings of the STC 51st Annual Conference, Society for Technical Communication, Baltimore, MD, May 9-12, 2004.
  • Butler, K.; Wichansky, A.; Laskowski, S. J.; Morse, E. L.; Scholtz, J. C. (2003). The Common Industry Format: A Way for Vendors and Customers to Talk About Software Usability. Computer-Human Interaction Conference, Bath, England, September 8-12, 2003.