Usability and user experience surveys
Introduction
According to Perlman (2009), “Questionnaires have long been used to evaluate user interfaces (Root & Draper, 1983). Questionnaires have also long been used in electronic form (Perlman, 1985). For a handful of questionnaires specifically designed to assess aspects of usability, the validity and/or reliability have been established, including some in the [table below].” (retrieved 20:57, 14 March 2011 (CET))
See also: learning surveys
List of web usability questionnaires
We have not (yet) found any questionnaires designed specifically for web usability; however, the generic usability survey instruments presented below can be adapted to specific websites. Often it is good enough to replace the word "system" with "web site"; as an example, see the SUS presented below.
List of usability and user experience questionnaires
User Interface Usability Evaluation with Web-Based Questionnaires
Author: Gary Perlman (2009)
Available through the User Interface Usability Evaluation with Web-Based Questionnaires page, either as an online interface or as a set of Perl scripts that you can install on your own server (also available as an online service at hcibib.org).
The Perl CGI script generates a customizable web form that lets you administer a few "standard" user interface evaluation questionnaires and collect the resulting data. The questionnaires can be applied to web sites, but also to other software.
Online service: http://hcibib.org/perlman/question.cgi. It will send results by email.
Before clicking the link above or the links below, you should go to the original page at hcibib, scroll down and configure the questionnaire, i.e.:
- customize the system name, administrator email, etc.
- customize the rating scale (number of points, labels, ...)
- customize the number of open-ended positive/negative comments requested
- select the questionnaire
For your information, we reproduce the table from the original page below; the reference and example links for each instrument are available on the original page.
| Acronym | Instrument | Institution |
|---|---|---|
| QUIS | Questionnaire for User Interface Satisfaction | Maryland |
| PUEU | Perceived Usefulness and Ease of Use | IBM |
| NAU | Nielsen's Attributes of Usability | Bellcore |
| NHE | Nielsen's Heuristic Evaluation | Bellcore |
| CSUQ | Computer System Usability Questionnaire | IBM |
| ASQ | After Scenario Questionnaire | IBM |
| PHUE | Practical Heuristics for Usability Evaluation | OSU |
| PUTQ | Purdue Usability Testing Questionnaire | Purdue |
| USE | USE Questionnaire | Sapient |
This page seems to be the best starting point for exploring well-known web-based usability evaluation questionnaires.
Purdue Usability Testing Questionnaire (PUTQ)
Authors: Lin, Han X.; Choong, Yee-Yin; and Salvendy, Gavriel (1997). A Proposed Index of Usability: A Method for Comparing the Relative Usability of Different Software Systems. Behaviour and Information Technology, 16(4/5), 267-278.
The item list is available through http://hcibib.org. Both the questionnaire and answer sheets may be reproduced without permission, provided that the copyright notice is reproduced.
Measuring Usability with the USE Questionnaire
Author: Arnold M. Lund, Measuring Usability with the USE Questionnaire, STC Usability SIG Newsletter, originally published in the October 2001 issue (Vol. 8, No. 2).
Available: Measuring Usability with the USE Questionnaire
The questionnaire was developed over time, starting from a large pool of items. “The questionnaires were constructed as seven-point Likert rating scales. Users were asked to rate agreement with the statements, ranging from strongly disagree to strongly agree. Various forms of the questionnaires were used to evaluate user attitudes towards a variety of consumer products. Factor analyses following each study suggested that users were evaluating the products primarily using three dimensions, Usefulness, Satisfaction, and Ease of Use.”
The items are rated on a seven-point Likert scale, e.g. from -3 (strongly disagree) to +3 (strongly agree). In practice, each dimension is often summarized by the mean of its items; a short sketch follows the item list below.
- Usefulness
- It helps me be more effective.
- It helps me be more productive.
- It is useful.
- It gives me more control over the activities in my life.
- It makes the things I want to accomplish easier to get done.
- It saves me time when I use it.
- It meets my needs.
- It does everything I would expect it to do.
- Ease of Use
- It is easy to use.
- It is simple to use.
- It is user friendly.
- It requires the fewest steps possible to accomplish what I want to do with it.
- It is flexible.
- Using it is effortless.
- I can use it without written instructions.
- I don't notice any inconsistencies as I use it.
- Both occasional and regular users would like it.
- I can recover from mistakes quickly and easily.
- I can use it successfully every time.
- Ease of Learning
- I learned to use it quickly.
- I easily remember how to use it.
- It is easy to learn to use it.
- I quickly became skillful with it.
- Satisfaction
- I am satisfied with it.
- I would recommend it to a friend.
- It is fun to use.
- It works the way I want it to work.
- It is wonderful.
- I feel I need to have it.
- It is pleasant to use.
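As noted above, each USE dimension is often summarized by the mean of its items. A minimal sketch in Python, assuming the -3 to +3 coding mentioned above and the item grouping from the lists (the function and constant names are our own, not part of the instrument):

```python
from statistics import mean

# 0-based item indices per USE dimension, following the item lists above.
USE_DIMENSIONS = {
    "Usefulness": range(0, 8),          # 8 items
    "Ease of Use": range(8, 19),        # 11 items
    "Ease of Learning": range(19, 23),  # 4 items
    "Satisfaction": range(23, 30),      # 7 items
}

def use_dimension_scores(responses):
    """Average the 30 USE item responses (coded -3..+3) within each dimension."""
    if len(responses) != 30:
        raise ValueError("Expected 30 item responses")
    return {name: mean(responses[i] for i in indices)
            for name, indices in USE_DIMENSIONS.items()}
```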
System Usability Scale - SUS
One of the most popular questionnaires is the SUS, which is short and seems to yield reliable results across a range of sample sizes (Tullis and Stetson, 2004).
The System Usability Scale (SUS) includes 10 items, each rated on a five-point scale from strongly disagree to strongly agree (a scoring sketch follows the website-adapted version below):
- I think that I would like to use this system frequently
- I found the system unnecessarily complex
- I thought the system was easy to use
- I think that I would need the support of a technical person to be able to use this system
- I found the various functions in this system were well integrated
- I thought there was too much inconsistency in this system
- I would imagine that most people would learn to use this system very quickly
- I found the system very cumbersome to use
- I felt very confident using the system
- I needed to learn a lot of things before I could get going with this system
Adapted for websites, this gives:
- I think that I would like to use this website frequently
- I found the website unnecessarily complex
- I thought the website was easy to use
- I think that I would need the support of a technical person to be able to use this website
- I found the various functions in this website were well integrated
- I thought there was too much inconsistency in this website
- I would imagine that most people would learn to use this website very quickly
- I found the website very cumbersome to use
- I felt very confident using the website
- I needed to learn a lot of things before I could get going with this website
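To compute the usual 0-100 SUS score, each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered (negatively worded) item contributes 5 minus its response, and the sum of the contributions is multiplied by 2.5. A minimal sketch in Python (the function name and input format are our own choices):

```python
def sus_score(responses):
    """Compute the SUS score (0-100) from ten responses on a 1-5 scale.

    `responses` lists the ten answers in item order (item 1 first);
    odd items are positively worded, even items negatively worded.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: a fairly positive respondent
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```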
TAM (Technology Acceptance Model) questionnaire
The Technology Acceptance Model (TAM) was created by Davis (1989). The first six items measure perceived usefulness and the remaining six perceived ease of use; together, these two constructs are meant to explain the use of a technology. Several small variants of this original, simple version exist, differing mainly in wording. The items below are taken from Davis (1989).
- Using [.....] in my job would enable me to accomplish tasks more quickly.
- Using [.....] would improve my job performance.
- Using [.....] in my job would increase my productivity.
- Using [.....] would enhance my effectiveness on the job.
- Using [.....] would make it easier to do my job.
- I would find [.....] useful in my job.
- Learning to operate [.....] would be easy for me.
- I would find it easy to get [.....] to do what I want it to do.
- My interaction with [.....] would be clear and understandable.
- I would find [.....] to be flexible to interact with.
- It would be easy for me to become skillful at using [.....].
- I would find [.....] easy to use.
Response items use a 7-point likely - unlikely scale: extremely - quite - slightly - neither - slightly - quite - extremely
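For analysis, the verbal anchors are typically coded numerically (e.g. 1 = extremely unlikely up to 7 = extremely likely) and each construct is summarized by the mean of its items. A minimal sketch in Python under these assumptions (the coding and names below are our own, not prescribed by Davis, 1989):

```python
from statistics import mean

# One possible numeric coding of the seven verbal anchors (an assumption).
SCALE = {
    "extremely unlikely": 1, "quite unlikely": 2, "slightly unlikely": 3,
    "neither": 4,
    "slightly likely": 5, "quite likely": 6, "extremely likely": 7,
}

def tam_subscale_means(answers):
    """`answers`: 12 verbal responses, the six usefulness items first,
    then the six ease-of-use items, in the order listed above."""
    coded = [SCALE[a] for a in answers]
    return {"perceived usefulness": mean(coded[:6]),
            "perceived ease of use": mean(coded[6:])}
```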
More complex models also exist, e.g. UTAUT below.
UTAUT
The Unified Theory of Acceptance and Use of Technology (UTAUT) was created by Venkatesh et al. (2003). As the word "unified" suggests, it “integrates eight theories of technology adoption and provides a comprehensive view of the factors affecting users’ adoption behavior. The UTAUT model consisted of four main constructs – performance expectancy, effort expectancy, social influence, and facilitating conditions – and four moderating variables: gender, age, experience, and voluntariness of use.” (Kang, Im & Hong, 2017: Abstract)
Fun questionnaire
This questionnaire was used in Afke Donker, Human factors in educational software for young children, PhD thesis, Vrije Universiteit, Netherlands http://hdl.handle.net/1871/9782
- Do you work with the program without someone telling you to?
- Would you like to work with the program when other children can decide for themselves what to do?
- Do you think it is boring to work with the program?
- When you started working with the program, did you want to continue working with it?
- Do you think your friends would like the program?
- Do you think the program is childish?
- Is the program too difficult to play with?
- When you have worked with the program once, does it remain fun?
- Do you enjoy yourself when you are working with the program?
- Does the program contain many surprises?
- Would you like to work with the program more often?
- Do you perform well on the exercises in the program?
- Would you like to have the program at home?
- Do you make many mistakes while you are working with the program?
Geneva Appraisal Questionnaire (GAQ)
The Geneva Appraisal Questionnaire (GAQ) has been developed by the members of the Geneva Emotion Research Group on the basis of Klaus R. Scherer's Component Process Model of Emotion (CPM). Its purpose is to assess, as much as is possible through recall and verbal report, the results of an individual's appraisal process in the case of a specific emotional episode.
- Download from Geneva Emotion Research Group.
WEBLEI
“This instrument is designed to capture students' perception of web based learning environment. Apart from demographics and background information sections, there are four core aspects in the instrument. The first three aspects are adapted from Tobin's (1998) work on Connecting Communities Learning (CCL) and the final aspect focuses on information structure and the design aspect of the web based material. Each of these aspects is explained in the following section.” (Chang, 1999; retrieved March 2014)
(1) WEBLEI Scale I: Emancipatory activities
- I can access the learning activities at times convenient to me.
- The online material is available at locations suitable for me.
- I can use time saved in travelling and on campus class attendance for study and other commitments.
- I am allowed to work at my own pace to achieve learning objectives.
- I decide how much I want to learn in a given period.
- I decide when I want to learn.
(2) WEBLEI Scale II: Co-participatory activities
- The flexibility allows me to meet my learning goals.
- The flexibility allows me to explore my own areas of interest.
- I am encouraged to explore concepts beyond my regular web based lessons.
- The asynchronous nature of the interactions enables me to reflect and respond when I have formulated an appropriate response.
- This mode of learning enables me to interact with other students and the tutor asynchronously.
- I communicate with other students in this subject electronically (via email, fax, bulletin boards, chat line).
- I have the autonomy to ask my tutor what I do not understand.
- The tutor responds promptly to my queries.
- The tutor addresses my queries adequately.
- The tutor sends me comprehensive feedback on my assignment.
- I have the autonomy to ask other students what I do not understand.
- Other students respond promptly to my queries.
- In this learning environment, I have to be self-disciplined in order to learn.
- I regularly participate in self-evaluations.
- I regularly participate in peer-evaluations.
- It is easy to organise a group for a project.
- It is easy to work collaboratively with other students involved in a group project.
(3) WEBLEI Scale III: Qualia
- I felt a sense of satisfaction and achievement about this learning environment.
- I enjoy learning in this environment.
- I could learn more in this environment.
- The technology resources enhance learning.
- I was supported by positive attitude from my peers.
- I was able to access the materials without much difficulty.
- I had no difficulty using the technology.
- I am confident in using the technology.
- I have no problems going through the materials on my own.
- I was in control of my progress as I moved through the material.
- It was easy to move about in the material.
- The web based learning environment held my interest throughout my course of study.
- I felt a sense of boredom towards the end of my course of study.
- I felt isolated towards the end of my course of study.
(4) WEBLEI Scale IV: Information structure and design activities
- The learning objectives are clearly stated in each lesson.
- The scope of the lesson is clearly stated.
- The organisation of each lesson is easy to follow.
- The structure keeps me focused on what is to be learned.
- Expectations of assignments are clearly stated in my subject.
- Activities are planned carefully.
- The subject content is appropriate for delivery on the Web.
- There is a logical sequence of presentation of the subject content.
- The presentation of the subject content is clear.
- The quiz in the web based materials enhances my learning process.
- The material shows evidence of originality and creativity in the visual design and layout.
- The graphics used in the material are appropriate.
- The colours used in the material are appropriate.
- The multimedia technology (eg animation, graphics, sound, video) contributes to the affective appeal of the material.
- The links provided in the material are clearly visible and logical.
- The links provided are relevant and appropriate to the document.
- The links provided are reliable, i.e. there are no inactive links.
- The 'Help' system included in the material is context sensitive.
- The web based learning approach can substitute traditional classroom approach.
- The web based learning approach can be used to supplement traditional classroom approach.
Links
- Questionnaires in Usability Engineering, A List of Frequently Asked Questions (3rd Ed.), Compiled by: Jurek Kirakowski, 2000.
- Surveys (Online), at Usability.gov
- Should you use 5 or 7 point scales?, by Jeff Sauro, August 25, 2010
Bibliography
- Chandra, V., & Fisher, D. L. (2009). Students' perceptions of a blended web-based learning environment. Learning Environments Research, 12(1), 31-44.
- Chandra, V., Fisher, D., & Chang, V. S. (2011). Investigating higher education and secondary school web-based learning environments using the WEBLEI. In T. Le & Q. Le (Eds.), Technologies for enhancing pedagogy, engagement and empowerment in education: Creating learning-friendly environments (pp. 93-104). USA: Information Science Reference, IGI Global.
- Chang, V. (1999). Evaluating the effectiveness of online learning using a new web-based learning instrument. Proceedings Western Australian Institute for Educational Research Forum 1999. http://www.waier.org.au/forums/1999/chang.html
- Chang, V., & Fisher, D. (2003). The validation and application of a new learning environment instrument for online learning in higher education. In Technology-rich learning environments: A future perspective (pp. 1-18).
- Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.
- Kang, M. S., Im, I., & Hong, S. (2017). Testing robustness of UTAUT model: An invariance analysis. Journal of Global Information Management (JGIM), 25(3). http://www.igi-global.com/article/testing-robustness-of-utaut-model/181
- Perlman, G. (1985). Electronic surveys. Behavior Research Methods, Instruments, and Computers, 17(2), 203-205.
- Root, R. W., & Draper, S. (1983). Questionnaires as a software evaluation tool. Proceedings of the ACM CHI'83 Conference on Human Factors in Computing Systems, 83-87.
- Simeonova, B., Bogolyubov, P., Blagov, E., & Kharabsheh, R. (2014). Cross-cultural validation of UTAUT: The case of university VLEs in Jordan, Russia and the UK. Electronic Journal of Knowledge Management, 12(1), 25-34.
- Tobin, K. (1998). Qualitative perceptions of learning environments on the world wide web. In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science education (pp. 139-162). United Kingdom: Kluwer Academic Publishers.
- Tullis, T., & Albert, B. (2008). Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics (p. 317). Morgan Kaufmann. ISBN 0-12-373558-0.
- Tullis, T. S., & Stetson, J. N. (2004). A comparison of questionnaires for assessing website usability. Proceedings of the Usability Professionals Association (UPA) Conference.
- Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.