Rubric for eLearning Tool Evaluation

The educational technology and digital learning wiki

Latest revision as of 09:18, 7 September 2020

Introduction

The Rubric for E-Learning Tool Evaluation was created by Lauren M. Anstey & Gavan P.L. Watson (2018) [1] of the Centre for Teaching and Learning, Western University. It is made available under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

“The Rubric for E-Learning Tool Evaluation offers educators a framework, with criteria and levels of achievement, to assess the suitability of an e-learning tool for their learners' needs and for their own learning outcomes and classroom context.” [1]

The rubric includes eight categories. “Each category has a specific set of characteristics, or criteria, against which e-learning tools are evaluated, and each criterion is assessed against three standards: works well, minor concerns, or serious concerns. Finally, the rubric offers individual descriptions of the qualities an e-learning tool must have to achieve a standard.” [1]:

  1. Functionality: Does the tool serve its intended purpose well?
  2. Accessibility, broadly defined as in Universal Design for Instruction
  3. Technical: considers the basic technologies needed to make a tool work
  4. Mobile Design: How does the tool work on mobile devices?
  5. Privacy, Data Protection, and Rights
  6. Social Presence, as defined in the Community of inquiry model
  7. Teaching Presence, as defined in the Community of inquiry model
  8. Cognitive Presence, as defined in the Community of inquiry model

See also: the SECTIONS model, briefly described in the educational technology article.

The Rubric for eLearning Tool Evaluation

The text below, as well as the rubric itself, has been reproduced from the GitHub archive "acciptrid / Rubric-for-E-Learning-Tool-Evaluation" with only very minor modifications (e.g. a title removed).

This rubric has been designed for instructors and staff as a formative tool to evaluate eLearning tools in higher education. eLearning tools are defined as any digital technology, mediated through the use of a computing device, deliberately selected to support student learning. The rubric supports a multi-dimensional evaluation of functional, technical, and pedagogical aspects of eLearning Tools.

Not all rubric criteria are necessarily applicable to all eLearning tools, and those using the rubric are encouraged to mark irrelevant criteria as "not applicable". The rubric does not identify a discrete threshold that an eLearning tool needs to cross before it should be used; rather, it is a formative instrument intended to offer insight into the relative strengths and weaknesses of an eLearning tool, as evaluated against a set of criteria.
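As a sketch of how such a formative evaluation might be recorded and summarized, the snippet below tallies ratings across the rubric's three standards plus "not applicable". The standard names and the category/criterion labels come from the rubric itself; the `summarize` helper and the sample ratings are illustrative assumptions, not part of the published rubric.

```python
from collections import Counter

# The rubric's three standards, plus the "not applicable" option.
STANDARDS = {"works well", "minor concerns", "serious concerns", "not applicable"}

def summarize(ratings):
    """Tally a {(category, criterion): standard} evaluation.

    Illustrative helper (not from the published rubric): it counts how
    many criteria fall under each standard, giving the overview of
    relative strengths and weaknesses that the rubric aims for.
    """
    for standard in ratings.values():
        if standard not in STANDARDS:
            raise ValueError(f"unknown standard: {standard!r}")
    return Counter(ratings.values())

# Hypothetical partial evaluation of some eLearning tool (sample data).
ratings = {
    ("Functionality", "Scale"): "works well",
    ("Functionality", "Ease of Use"): "minor concerns",
    ("Accessibility", "Accessibility Standards"): "serious concerns",
    ("Mobile Design", "Offline Access"): "not applicable",
}

print(summarize(ratings))
```

Because the rubric is formative, such a tally is only an overview: "not applicable" criteria are counted separately rather than scored, and no pass/fail threshold is computed.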

Rubric for eLearning Tool Evaluation (Anstey and Watson, 2018). Each criterion is rated against three standards (Works Well, Minor Concerns, Serious Concerns) or marked Not Applicable.

Functionality

  Scale
    Works well: The tool can be scaled to accommodate any size class, with the flexibility to create smaller sub-groups or communities of practice.
    Minor concerns: The tool can be scaled to accommodate any size class but lacks the flexibility to create smaller sub-groups or communities of practice.
    Serious concerns: The tool is restricted to a limited number of users and cannot be scaled.

  Ease of Use
    Works well: The tool has a user-friendly interface, and it is easy for instructors and students to become skillful with it in a personalized and intuitive manner.
    Minor concerns: The tool has an interface that may be confusing to either instructor or learner; there is limited opportunity for personalization.
    Serious concerns: The interface is not user-friendly for either the instructor or learner; it is cumbersome, unintuitive, rigid, and inflexible.

  Tech Support / Help Availability
    Works well: Campus-based technical support and/or help documentation is readily available and aids users in troubleshooting tasks or solving problems; or the tool provider offers a robust support platform.
    Minor concerns: Technical support and help documentation is available but limited, incomplete, or not user-friendly.
    Serious concerns: Technical support and help documentation is not available.

  Hypermediality
    Works well: The tool allows users to communicate through different channels (audio, visual, textual) and allows for non-sequential, flexible/adaptive engagement with material.
    Minor concerns: The tool allows users to communicate through different channels (audio, visual, textual) but is limited in its ability to provide non-sequential, flexible/adaptive engagement with material.
    Serious concerns: The tool is restrictive in terms of the communication channels employed (audio, visual, textual) and presents information sequentially in a rigid, inflexible format.

Accessibility

  Accessibility Standards
    Works well: The tool meets accessibility guidelines (e.g. local accessibility legislation and/or W3C WCAG 2.0 standards).
    Minor concerns: The tool has some limited capacity to meet accessibility guidelines.
    Serious concerns: The tool fails to meet accessibility guidelines, or no compliance information has been made available for the tool.

  User-focused Participation
    Works well: The tool is designed to address the needs of diverse users, their various literacies, and capabilities, thereby widening opportunities for participation in learning.
    Minor concerns: The tool has some limited capacity to address the needs of diverse users, their various literacies, and capabilities.
    Serious concerns: The tool is restrictive in meeting the diversity of needs reflected in the student body and likely restricts some learners from fully participating.

  Required Equipment
    Works well: Proper use of the tool does not require equipment beyond what is typically available to instructors and students (computer with built-in speakers and microphone, internet connection, etc.).
    Minor concerns: Proper use of the tool requires specialized equipment (e.g. a unique device) that likely requires purchase at a low cost.
    Serious concerns: Proper use of the tool requires specialized equipment requiring moderate to significant financial investment.

  Cost of Use
    Works well: All aspects of the tool can be used free of charge.
    Minor concerns: Limited aspects of the tool can be used for free, with other elements requiring payment of a fee, membership, or subscription.
    Serious concerns: Use of the tool requires a fee, membership, or subscription, or a purchase that is likely to pose a financial burden on students (exceeding $50 for a single half-term course).

Technical

  Integration / Embedding within a Learning Management System (LMS)
    Works well: The tool can be embedded (as an object via HTML code) or fully integrated (e.g. LTI-compliant tools) into an LMS while maintaining full functionality of the tool.
    Minor concerns: The tool can be embedded within an LMS, perhaps with limited functionality, but cannot be fully integrated.
    Serious concerns: The tool can only be accessed in an LMS through a hyperlink or static representations of the tool (e.g. file export), rather than a functional version of the tool itself.

  Desktop / Laptop Operating Systems
    Works well: Users can effectively utilize the tool with any standard, up-to-date operating system.
    Minor concerns: Users may encounter limited or altered functionality depending on the up-to-date operating system being used.
    Serious concerns: Users are limited to using the tool with one specific, up-to-date operating system.

  Browser
    Works well: Users can effectively utilize the tool with any standard, up-to-date browser.
    Minor concerns: Users may encounter limited or altered functionality depending on the up-to-date browser being used.
    Serious concerns: Users are limited to using the tool through one specific browser.

  Additional Downloads
    Works well: Users do not need to download additional software or browser extensions.
    Minor concerns: The tool uses a browser extension or software that requires a download and/or user permission to run.
    Serious concerns: The tool requires a past or specific version of a browser extension or software.

Mobile Design

  Access
    Works well: The tool can be accessed, either through the download of an app or via a mobile browser, regardless of the mobile operating system and device. Design of the mobile tool fully takes into consideration the constraints of a smaller-sized screen.
    Minor concerns: The tool offers an app, but only for a limited set of mobile operating systems. The tool is not accessible through a mobile browser. Design of the mobile tool is constrained by the limitations of the mobile device.
    Serious concerns: Access to the tool is limited or absent on a mobile device.

  Functionality
    Works well: There is little to no functional difference between the mobile and the desktop version, regardless of the device used to access it. There is no difference in functionality between apps designed for different mobile operating systems.
    Minor concerns: Core features of the main tool are functional on the mobile app, but advanced features are limited. There is some difference in functionality between apps designed for different mobile operating systems, but it has limited impact on learners' use of the tool.
    Serious concerns: The mobile app functions poorly, such that core features are not reliable or are non-existent. There is significant difference in functionality depending on the mobile device's operating system used to access the tool.

  Offline Access
    Works well: Offers an offline mode: core features of the tool can be accessed and utilized even when offline, maintaining functionality and content.
    Minor concerns: Offers a kind of offline mode, where the tool can be used offline but core functionality and content are affected.
    Serious concerns: The mobile platform cannot be used in any capacity offline.

Privacy, Data Protection, and Rights

  Sign Up / Sign In
    Works well: Use of the tool does not require the creation of an external account or additional login, such that no personal user information is collected and shared.
    Minor concerns: Either instructors are the only users required to provide personal information to set up an account; or the tool has been vetted through appropriate channels to ensure strict adherence to local, institutional, or personal policies/standards for protecting the collection and use of student personal data by a third-party group.
    Serious concerns: All users (instructors and learners) must provide personal information to a third party in creating an account, and there is some question or concern about adherence to local, institutional, or personal policies/standards for protecting the collection and use of such data by the third-party group.

  Data Privacy and Ownership
    Works well: Users maintain ownership and copyright of their intellectual property/data; the user can keep data private and decide if/how data is to be shared.
    Minor concerns: Users maintain ownership and copyright of their intellectual property/data; data is shared publicly and cannot be made private.
    Serious concerns: Users forfeit ownership and copyright of data; data is shared publicly and cannot be made private; or no details are provided.

  Archiving, Saving, and Exporting Data
    Works well: Users can archive, save, or import and export content or activity data in a variety of formats.
    Minor concerns: There are limitations to archiving, saving, or importing/exporting content or activity data.
    Serious concerns: Content and activity data cannot be archived, saved, or imported/exported.

Social Presence

  Collaboration
    Works well: The tool has the capacity to support a community of learning through both asynchronous and synchronous opportunities for communication, interactivity, and transfer of meaning between users.
    Minor concerns: The tool has the capacity to support a community of learning through asynchronous but not synchronous opportunities for communication, interactivity, and transfer of meaning between users.
    Serious concerns: Communication, interactivity, and transfer of meaning between users is not supported or is significantly limited.

  User Accountability
    Works well: Instructors can control learner anonymity; the tool provides technical solutions for holding learners accountable for their actions.
    Minor concerns: Instructors cannot control learner anonymity, but the tool provides some solution for holding learners accountable for their actions.
    Serious concerns: Instructors cannot control learner anonymity, and there is no technical solution for holding users accountable for their actions.

  Diffusion
    Works well: The tool is widely known and popular; it is likely that most learners are familiar with the tool and have basic technical competence with it.
    Minor concerns: Learners' familiarity with the tool is likely mixed; some will lack basic technical competence with its functions.
    Serious concerns: The tool is not well known or is unfamiliar; it is likely that learners are not familiar with the tool and lack basic technical competence with its functions.

Teaching Presence

  Facilitation
    Works well: The tool has easy-to-use features that would significantly improve an instructor's ability to be present with learners via active management, monitoring, engagement, and feedback.
    Minor concerns: The tool has limited functionality to effectively support an instructor's ability to be present with learners via active management, monitoring, engagement, and feedback.
    Serious concerns: The tool has not been designed to support an instructor's ability to be present with learners via active management, monitoring, engagement, and feedback.

  Customization
    Works well: The tool is adaptable to its environment: easily customized to suit the classroom context and targeted learning outcomes.
    Minor concerns: Limited aspects of the tool can be customized to suit the classroom context and learning outcomes.
    Serious concerns: The tool cannot be customized.

  Learning Analytics
    Works well: The instructor can monitor learners' performance on a variety of responsive measures; these measures can be accessed through a user-friendly dashboard.
    Minor concerns: The instructor can monitor learners' performance on limited measures, or data is not presented in a format that is easily interpreted.
    Serious concerns: The tool does not support the collection of learning analytics.

Cognitive Presence

  Enhancement of Cognitive Task(s)
    Works well: The tool enhances engagement in targeted cognitive task(s) that were once overly complex or inconceivable through other means.
    Minor concerns: The tool enables functional improvement to engagement in the targeted cognitive task(s).
    Serious concerns: The tool acts as a direct substitute, with no functional change to engagement in the targeted cognitive task(s).

  Higher Order Thinking
    Works well: Use of the tool easily facilitates learners to exercise higher-order thinking skills (given consideration to design, facilitation, and direction from the instructor).
    Minor concerns: The tool may engage learners in higher-order thinking skills (given significant consideration to design, facilitation, and direction from the instructor).
    Serious concerns: The tool likely does not engage learners in higher-order thinking skills (despite significant consideration to design, facilitation, and direction from the instructor).

  Metacognitive Engagement
    Works well: Through the tool, learners can regularly receive formative feedback on learning (i.e. they can track their performance, monitor their improvement, test their knowledge).
    Minor concerns: Opportunities for receiving formative feedback on learning are available but infrequent or limited (i.e. poor opportunities for tracking performance, monitoring improvement, testing knowledge on a regular basis).
    Serious concerns: There are no opportunities for formative feedback on learning (i.e. no opportunities for tracking performance, monitoring improvement, testing knowledge on a regular basis).

Copyright modification

Rubric for E-Learning Tool Evaluation by Lauren M. Anstey & Gavan P.L. Watson, copyright 2018 Centre for Teaching and Learning, Western University is made available under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Therefore, the contents of this page are available under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, http://creativecommons.org/licenses/by-nc-sa/4.0/

References

Anstey, L., & Watson, G. (2018). A rubric for evaluating e-learning tools in higher education. PDF version of the rubric: https://teaching.uwo.ca/pdf/elearning/Rubric-for-eLearning-Tool-Evaluation.pdf (retrieved Sept 1 2020).

Cited

  1. Anstey, L., & Watson, G. (2018). A rubric for evaluating e-learning tools in higher education. Educause Review, (September). Retrieved from https://er.educause.edu/articles/2018/9/a-rubric-for-evaluating-e-learning-tools-in-higher-education