Usability and user experience surveys

Introduction

According to Perlman (2009), “Questionnaires have long been used to evaluate user interfaces (Root & Draper, 1983). Questionnaires have also long been used in electronic form (Perlman, 1985). For a handful of questionnaires specifically designed to assess aspects of usability, the validity and/or reliability have been established, including some in the [table below].” (retrieved 20:57, 14 March 2011 (CET))

See also: learning surveys

List of web usability questionnaires

We have not (yet) found any questionnaires designed specifically for web usability, but the generic usability survey instruments presented below can be adapted to specific websites. Often it is good enough to replace the word "system" with "web site"; as an example, see the adapted SUS presented below.
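
As a minimal illustration of this word-for-word adaptation, the sketch below rewrites two generic SUS items (reproduced further down) for a website. The snippet is our own illustration, not part of any published instrument.

  # Minimal sketch of the adaptation described above: rewording a
  # generic instrument for a website by replacing the word "system".
  # The two example items are from the SUS, reproduced further below.

  generic_items = [
      "I think that I would like to use this system frequently",
      "I found the system unnecessarily complex",
  ]
  web_items = [item.replace("system", "website") for item in generic_items]
  print(web_items)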

List of usability and user experience questionnaires

User Interface Usability Evaluation with Web-Based Questionnaires

Author: Gary Perlman (2009)

Available through the User Interface Usability Evaluation with Web-Based Questionnaires page, either as an online interface or as a set of Perl scripts that you can install on your own server (also available as an online service at hcibib.org).

The package provides a customizable web-based Perl CGI script that lets you administer several "standard" user interface evaluation questionnaires and collect the data. The questionnaires can be applied to web sites, but also to other software.

Online service: http://hcibib.org/perlman/question.cgi. It will send results by email.

Before clicking the link above or the links below, you should go to the original page at hcibib.org, scroll down, and configure the questionnaire, i.e.:

  • customize the system name, administrator email, etc.
  • customize the rating scale (number of points, labels, ...)
  • customize the number of open-ended positive/negative comments requested
  • select the questionnaire

For reference, we reproduce the table from the original page below.

Acronym | Instrument | Reference | Institution | Example
QUIS | Questionnaire for User Interface Satisfaction | Chin et al., 1988 | Maryland | 27 questions
PUEU | Perceived Usefulness and Ease of Use | Davis, 1989 | IBM | 12 questions
NAU | Nielsen's Attributes of Usability | Nielsen, 1993 | Bellcore | 5 attributes
NHE | Nielsen's Heuristic Evaluation | Nielsen, 1993 | Bellcore | 10 heuristics
CSUQ | Computer System Usability Questionnaire | Lewis, 1995 | IBM | 19 questions
ASQ | After Scenario Questionnaire | Lewis, 1995 | IBM | 3 questions
PHUE | Practical Heuristics for Usability Evaluation | Perlman, 1997 | OSU | 13 heuristics
PUTQ | Purdue Usability Testing Questionnaire | Lin et al., 1997 | Purdue | 100 questions
USE | USE Questionnaire | Lund, 2001 | Sapient | 30 questions


This page seems to be the best starting point for exploring well-known web-based usability evaluation questionnaires.

Purdue Usability Testing Questionnaire (PUTQ)

Author: Lin, Han X., Choong, Yee-Yin, & Salvendy, Gavriel (1997). A proposed index of usability: A method for comparing the relative usability of different software systems. Behaviour and Information Technology, 16(4/5), 267-278.

The questionnaire is available through http://hcibib.org. Both the questionnaire and the answer sheets may be reproduced without permission, provided the copyright notice is reproduced.

Measuring Usability with the USE Questionnaire

Author: Arnold M. Lund, Measuring Usability with the USE Questionnaire, STC Usability SIG Newsletter, originally published in the October 2001 issue (Vol. 8, No. 2).

Available: Measuring Usability with the USE Questionnaire

The questionnaire was developed over time, starting out with a large pool of items. “The questionnaires were constructed as seven-point Likert rating scales. Users were asked to rate agreement with the statements, ranging from strongly disagree to strongly agree. Various forms of the questionnaires were used to evaluate user attitudes towards a variety of consumer products. Factor analyses following each study suggested that users were evaluating the products primarily using three dimensions, Usefulness, Satisfaction, and Ease of Use.”

The items use a seven-point Likert rating scale, e.g. from -3 (totally disagree) to +3 (totally agree). A small scoring sketch follows the item list below.

Usefulness
It helps me be more effective.
It helps me be more productive.
It is useful.
It gives me more control over the activities in my life.
It makes the things I want to accomplish easier to get done.
It saves me time when I use it.
It meets my needs.
It does everything I would expect it to do.

Ease of Use
It is easy to use.
It is simple to use.
It is user friendly.
It requires the fewest steps possible to accomplish what I want to do with it.
It is flexible.
Using it is effortless.
I can use it without written instructions.
I don't notice any inconsistencies as I use it.
Both occasional and regular users would like it.
I can recover from mistakes quickly and easily.
I can use it successfully every time.

Ease of Learning
I learned to use it quickly.
I easily remember how to use it.
It is easy to learn to use it.
I quickly became skillful with it.

Satisfaction
I am satisfied with it.
I would recommend it to a friend.
It is fun to use.
It works the way I want it to work.
It is wonderful.
I feel I need to have it.
It is pleasant to use.
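
As an illustration of how responses on the -3 to +3 scale can be aggregated, here is a minimal sketch that averages each subscale per respondent. The item grouping follows the list above; the data layout and function name are our own, not part of Lund's published instrument.

  # Minimal sketch: averaging USE responses per subscale. Ratings are
  # assumed to be integers on the seven-point scale described above,
  # from -3 (totally disagree) to +3 (totally agree), in item order.

  SUBSCALES = {
      "Usefulness": range(0, 8),          # items 1-8
      "Ease of Use": range(8, 19),        # items 9-19
      "Ease of Learning": range(19, 23),  # items 20-23
      "Satisfaction": range(23, 30),      # items 24-30
  }

  def use_subscale_means(responses):
      """Return the mean rating per subscale for one respondent
      (`responses` is a list of 30 integers in [-3, +3])."""
      if len(responses) != 30:
          raise ValueError("expected 30 item ratings")
      return {name: sum(responses[i] for i in idx) / len(idx)
              for name, idx in SUBSCALES.items()}

  # Example: a respondent who mildly agrees with every statement.
  print(use_subscale_means([1] * 30))  # -> 1.0 on every subscale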

System Usability Scale - SUS

One of the most popular questionnaires is the SUS, which is short and does seem to yield reliable results across sample sizes (Tullis and Stetson, 2004). A scoring sketch follows the item lists below.

The System Usability Scale (SUS) includes 10 items, each with a five-point response scale (strongly disagree to strongly agree):

  1. I think that I would like to use this system frequently
  2. I found the system unnecessarily complex
  3. I thought the system was easy to use
  4. I think that I would need the support of a technical person to be able to use this system
  5. I found the various functions in this system were well integrated
  6. I thought there was too much inconsistency in this system
  7. I would imagine that most people would learn to use this system very quickly
  8. I found the system very cumbersome to use
  9. I felt very confident using the system
  10. I needed to learn a lot of things before I could get going with this system

Adapted for websites, this gives:

  1. I think that I would like to use this website frequently
  2. I found the website unnecessarily complex
  3. I thought the website was easy to use
  4. I think that I would need the support of a technical person to be able to use this website
  5. I found the various functions in this website were well integrated
  6. I thought there was too much inconsistency in this website
  7. I would imagine that most people would learn to use this website very quickly
  8. I found the website very cumbersome to use
  9. I felt very confident using the website
  10. I needed to learn a lot of things before I could get going with this website
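
SUS results are usually reported as a single score between 0 and 100: each response is coded 1-5, odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5. The Python sketch below implements this standard procedure; the code itself is just an illustration.

  # Sketch of the standard SUS scoring procedure: odd items contribute
  # (response - 1), even items contribute (5 - response), and the sum
  # is scaled by 2.5 to give a score between 0 and 100.

  def sus_score(responses):
      """`responses`: 10 integers in [1, 5] (strongly disagree ..
      strongly agree), in the order of the items above."""
      if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
          raise ValueError("expected 10 ratings between 1 and 5")
      total = sum(
          (r - 1) if i % 2 == 0 else (5 - r)  # 0-based even index = odd-numbered item
          for i, r in enumerate(responses)
      )
      return total * 2.5

  # Example: a fairly positive respondent.
  print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0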

TAM Satisfaction Questionnaire

The Technology Acceptance Model (TAM) was created by Davis (1989). The first six items measure perceived usefulness and the remaining six perceived ease of use; together, the two constructs are meant to explain the use of a technology. Several small wording variants exist. The items below were taken from Davis (1989).

  1. Using [.....] in my job would enable me to accomplish tasks more quickly.
  2. Using [.....] would improve my job performance.
  3. Using [.....] in my job would increase my productivity.
  4. Using [.....] would enhance my effectiveness on the job.
  5. Using [.....] would make it easier to do my job.
  6. I would find [.....] useful in my job.
  7. Learning to operate [.....] would be easy for me.
  8. I would find it easy to get [.....] to do what I want it to do.
  9. My interaction with [.....] would be clear and understandable.
  10. I would find [.....] to be flexible to interact with.
  11. It would be easy for me to become skillful at using [.....].
  12. I would find [.....] easy to use.

Responses use a seven-point likely - unlikely scale: extremely - quite - slightly - neither - slightly - quite - extremely.
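
If responses are collected as verbal labels, they must be mapped to numbers before the two subscales can be averaged. The sketch below assumes the common 7-to-1 coding (extremely likely = 7); this numeric coding and the function are our own illustration, not something prescribed by Davis (1989).

  # Illustrative sketch: coding TAM responses and averaging the two
  # subscales. The 7..1 coding (extremely likely = 7) is a common
  # convention, not part of the instrument itself.

  SCALE = {
      "extremely likely": 7, "quite likely": 6, "slightly likely": 5,
      "neither": 4,
      "slightly unlikely": 3, "quite unlikely": 2, "extremely unlikely": 1,
  }

  def tam_scores(labels):
      """`labels`: 12 response labels in item order (items 1-6 =
      perceived usefulness, items 7-12 = perceived ease of use)."""
      values = [SCALE[label] for label in labels]
      return {
          "perceived usefulness": sum(values[:6]) / 6,
          "perceived ease of use": sum(values[6:]) / 6,
      }

  # Example: a respondent answering "quite likely" throughout.
  print(tam_scores(["quite likely"] * 12))  # -> 6.0 on both subscales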

Fun questionnaire

This questionnaire was used in Afke Donker's PhD thesis, Human factors in educational software for young children, Vrije Universiteit, Netherlands, http://hdl.handle.net/1871/9782.

  1. Do you work with the program without someone telling you to?
  2. Would you like to work with the program when other children can decide for themselves what to do?
  3. Do you think it is boring to work with the program?
  4. When you started working with the program, did you want to continue working with it?
  5. Do you think your friends would like the program?
  6. Do you think the program is childish?
  7. Is the program too difficult to play with?
  8. When you have worked with the program once, does it remain fun?
  9. Do you enjoy yourself when you are working with the program?
  10. Does the program contain many surprises?
  11. Would you like to work with the program more often?
  12. Do you perform well on the exercises in the program?
  13. Would you like to have the program at home?
  14. Do you make many mistakes while you are working with the program?

Geneva Appraisal Questionnaire (GAQ)

The Geneva Appraisal Questionnaire (GAQ) has been developed by the members of the Geneva Emotion Research Group on the basis of Klaus R. Scherer's Component Process Model of Emotion (CPM). Its purpose is to assess, as much as is possible through recall and verbal report, the results of an individual's appraisal process in the case of a specific emotional episode.

WEBLEI

“This instrument is designed to capture students' perception of web based learning environment. Apart from demographics and background information sections, there are four core aspects in the instrument. The first three aspects are adapted from Tobin's (1998) work on Connecting Communities Learning (CCL) and the final aspect focuses on information structure and the design aspect of the web based material. Each of these aspects is explained in the following section.” (Chang, 1999; retrieved March 2014)

(1) WEBLEI Scale I: Emancipatory activities

  1. I can access the learning activities at times convenient to me.
  2. The online material is available at locations suitable for me.
  3. I can use time saved in travelling and on campus class attendance for study and other commitments.
  4. I am allowed to work at my own pace to achieve learning objectives.
  5. I decide how much I want to learn in a given period.
  6. I decide when I want to learn.

(2) WEBLEI Scale II: Co-participatory activities

  1. The flexibility allows me to meet my learning goals.
  2. The flexibility allows me to explore my own areas of interest.
  3. I am encouraged to explore concepts beyond my regular web based lessons.
  4. The asynchronous nature of the interactions enables me to reflect and respond when I had formulated an appropriate response.
  5. This mode of learning enables me to interact with other students and the tutor asynchronously.
  6. I communicate with other students in this subject electronically (via email, fax, bulletin boards, chat line).
  7. I have the autonomy to ask my tutor what I do not understand.
  8. The tutor responds promptly to my queries.
  9. The tutor addresses my queries adequately.
  10. The tutor sends me comprehensive feedback on my assignment.
  11. I have the autonomy to ask other students what I do not understand.
  12. Other students respond promptly to my queries.
  13. In this learning environment, I have to be self-disciplined in order to learn.
  14. I regularly participate in self-evaluations.
  15. I regularly participate in peer-evaluations.
  16. It is easy to organise a group for a project.
  17. It is easy to work collaboratively with other students involved in a group project.

(3) WEBLEI Scale III: Qualia

  1. I felt a sense of satisfaction and achievement about this learning environment.
  2. I enjoy learning in this environment.
  3. I could learn more in this environment.
  4. The technology resources enhance learning.
  5. I was supported by positive attitude from my peers.
  6. I was able to access the materials without much difficulty.
  7. I had no difficulty using the technology.
  8. I am confident in using the technology.
  9. I have no problems going through the materials on my own.
  10. I was in control of my progress as I moved through the material.
  11. It was easy to move about in the material.
  12. The web based learning environment held my interest throughout my course of study.
  13. I felt a sense of boredom towards the end of my course of study.
  14. I felt isolated towards the end of my course of study.

(4) WEBLEI Scale IV: Information structure and design activities

  1. The learning objectives are clearly stated in each lesson.
  2. The scope of the lesson is clearly stated.
  3. The organisation of each lesson is easy to follow.
  4. The structure keeps me focused on what is to be learned.
  5. Expectations of assignments are clearly stated in my subject.
  6. Activities are planned carefully.
  7. The subject content is appropriate for delivery on the Web.
  8. There is a logical sequence of presentation of the subject content.
  9. The presentation of the subject content is clear.
  10. The quiz in the web based materials enhances my learning process.
  11. The material shows evidence of originality and creativity in the visual design and layout.
  12. The graphics used in the material are appropriate.
  13. The colours used in the material are appropriate.
  14. The multimedia technology (eg animation, graphics, sound, video) contributes to the affective appeal of the material.
  15. The links provided in the material are clearly visible and logical.
  16. The links provided are relevant and appropriate to the document.
  17. The links provided are reliable, i.e. no inactive links.
  18. The 'Help' system included in the material is context sensitive.
  19. The web based learning approach can substitute traditional classroom approach.
  20. The web based learning approach can be used to supplement traditional classroom approach.

Bibliography

  • Chang, V. (1999). Evaluating the effectiveness of online learning using a new web based learning instrument. Proceedings Western Australian Institute for Educational Research Forum 1999. http://www.waier.org.au/forums/1999/chang.html
  • Chandra, V., & Fisher, D. L. (2009). Students' perceptions of a blended web-based learning environment. Learning Environments Research, 12(1), 31-44.
  • Chandra, V., Fisher, D., & Chang, V. S. (2011). Investigating higher education and secondary school web-based learning environments using the WEBLEI. In T. Le & Q. Le (Eds.), Technologies for enhancing pedagogy, engagement and empowerment in education: Creating learning-friendly environments (pp. 93-104). Information Science Reference, IGI Global.
  • Chang, V., & Fisher, D. (2003). The validation and application of a new learning environment instrument for online learning in higher education. In Technology-rich learning environments: A future perspective (pp. 1-18).
  • Tobin, K. (1998). Qualitative perceptions of learning environments on the World Wide Web. In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science education (pp. 139-162). Kluwer Academic Publishers.
  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.
  • Root, R. W., & Draper, S. (1983). Questionnaires as a software evaluation tool. Proceedings of ACM CHI'83 Conference on Human Factors in Computing Systems, 83-87.
  • Perlman, G. (1985). Electronic surveys. Behavior Research Methods, Instruments, & Computers, 17(2), 203-205.
  • Tullis, T., & Albert, B. (2008). Measuring the user experience: Collecting, analyzing, and presenting usability metrics. Morgan Kaufmann. ISBN 0-12-373558-0.