Assessment management system
Latest revision as of 18:37, 16 May 2019
Introduction
In education, “The term assessment management refers in its broadest sense to ways of making assessment and feedback processes more efficient and effective for institutions, teachers and students alike. From an institutional perspective, ease of access to accurate, up-to-date assessment data is essential for the effective running of a range of business processes: from quality assurance to marketing, from course information for prospective students to long-term curriculum planning. Course information containing, for example, the type and frequency of assessments and the nature of the learning outcomes is especially important in larger, multi-site institutions running modular programmes where there is an increased risk of duplication, bunching of assignments or variation in workload from module to module.” (Assessment management, JISC Design Studio [1])
According to Ferrell & Gray (2013) [2], electronic management of assessment (EMA) is increasingly used to describe the way in which technology is used across the assessment life cycle to support the electronic submission of assignments, as well as marking and feedback. In that context, the term "assessment management system" (AMS) usually refers to an online platform that facilitates student assessment. In higher education, “An AMS supports, to a greater or lesser extent, a key set of university processes supporting the academic and administrative activity of managing assessments.” [3] The term is also used to designate all sorts of tracking and evaluation systems, e.g. for work performance or for compliance with rules.
“An assessment management system (or AMS, as they are often termed) is an ‘electronic system or structure’ that facilitates the gathering and reporting of assessment data on student learning outcomes (Shupe 2007, 51)”, as cited in Instruction and Assessment Management.
There are several AMS types, e.g. systems for defining, testing and administering standardized quizzes (e.g. Docimo), tacit performance assessment systems (where judges evaluate behavior), simulation/game-based assessment systems, or e-learning platforms that allow evaluating a collection of student productions.
See also:

- Learner assessment
Requirements
According to Barchino et al. [4], the EMFA model [5] defines the following requirements:
1. Flexibility: The assessment model must be able to describe assessments that are based on different theories and models.
2. Formalisation: The assessment model must be able to describe assessments and their processes in a formal way, so that they are machine-readable and automatic processing is possible.
3. Reusability: The assessment model must make it possible to identify, isolate, decontextualize and exchange useful objects (e.g. items, assessment units, competencies, assessment plans), and to reuse these in other contexts.
4. Interoperability and sustainability: Separation between the description standards and interpretation technique, thus becoming resistant to technical changes and conversion problems.
5. Completeness: The assessment model must cover the whole assessment process, including all the typed objects, the relationship between the objects and workflow.
6. Explicitly typed objects: The assessment model must be able to express the semantic meaning of different objects within the context of an assessment.
7. Reproducibility: The assessment model must describe assessments so that repeated execution is possible.
8. Medium neutrality: The description of an assessment, where possible, must be medium neutral, so that it can be used in different (publication) formats, like the web, or paper and pencil tests.
9. Compatibility: The assessment model must fit in available standards and specifications.
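To make the formalisation, reusability and explicitly typed objects requirements concrete, here is a minimal, hypothetical sketch of a machine-readable assessment model. All class and field names are illustrative inventions, not taken from the EMFA specification.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A single, explicitly typed assessment object (e.g. one quiz question)."""
    item_id: str
    prompt: str
    max_score: float

@dataclass
class AssessmentUnit:
    """A reusable collection of items, kept decontextualised so it can be
    exchanged between systems (the reusability requirement)."""
    unit_id: str
    items: list[Item] = field(default_factory=list)

    def total_score(self) -> float:
        # Machine-readable structure makes automatic processing trivial.
        return sum(item.max_score for item in self.items)

unit = AssessmentUnit("unit-1", [
    Item("q1", "Define formative assessment.", 5.0),
    Item("q2", "Contrast QTI import and export.", 3.0),
])
print(unit.total_score())  # 8.0
```

Because each object carries an explicit type and identifier, a unit like this could be serialized, exchanged and re-scored by another system without manual interpretation.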
Standards
QTI is an IMS standard for standardized test items that is fairly popular, in the sense that many systems (including LMSs) can import QTI data. Exporting seems to be more difficult.
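QTI items are XML documents. As a rough illustration of the kind of data such systems import, the sketch below builds a heavily simplified single-choice item with Python's standard library; real QTI 2.x items require additional declarations and attributes, and the identifiers here are invented for the example.

```python
import xml.etree.ElementTree as ET

# Namespace of the QTI 2.1 item schema; written as a literal attribute
# so the generated tags stay unprefixed in the output.
NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

item = ET.Element("assessmentItem", {
    "xmlns": NS,
    "identifier": "demo-item-1",   # illustrative identifier
    "title": "Capital of France",
    "adaptive": "false",
    "timeDependent": "false",
})
body = ET.SubElement(item, "itemBody")
interaction = ET.SubElement(body, "choiceInteraction", {
    "responseIdentifier": "RESPONSE",
    "maxChoices": "1",
})
prompt = ET.SubElement(interaction, "prompt")
prompt.text = "What is the capital of France?"
for ident, text in [("A", "Paris"), ("B", "Lyon")]:
    choice = ET.SubElement(interaction, "simpleChoice", {"identifier": ident})
    choice.text = text

xml_string = ET.tostring(item, encoding="unicode")
print(xml_string)
```

An importing system parses this structure to recover the prompt, the choices and the response rules, which is what makes items portable between quiz engines.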
Landscape of available systems
Ferrell (2014) [6] conducted a large UK survey about the use of Electronic Management of Assessment (EMA) and concluded that “the use of technology is now a fundamental part of the support for assessment and feedback practice across the sector but there are a few examples of fully integrated approaches to supporting the whole assessment and feedback life-cycle.”
Some example systems:
- Docimo
- Texas assessment
- Taskstream by Watermark
- TouchStone, an example of a non-educational assessment system
Links
Bibliography
Banta, T. W., & Palomba, C. A. (2014). Assessment essentials: Planning, implementing, and improving assessment in higher education. John Wiley & Sons.
Barrett, H. (2004). Differentiating electronic portfolios and online assessment management systems. In Society for Information Technology & Teacher Education International Conference (pp. 46-50). Association for the Advancement of Computing in Education (AACE).
Barrett, H. (2005). Storytelling in higher education: A theory of reflection on practice to support deep learning. In Society for Information Technology & Teacher Education International Conference (pp. 1878-1883). Association for the Advancement of Computing in Education (AACE).
Miller, A. H., Imrie, B. W., & Cox, K. (1998). Student assessment in higher education: A handbook for assessing performance. London: Kogan Page.
Douce, Christopher, David Livingstone, and James Orwell. 2005. Automatic test-based assessment of programming: A review. J. Educ. Resour. Comput. 5, 3, Article 4 (September 2005). DOI=http://dx.doi.org/10.1145/1163405.1163409
Mitri, Michel (2003) A Knowledge Management Framework for Curriculum Assessment, Journal of Computer Information Systems, 43:4, 15-2
Mitri, M. (2003). Applying tacit knowledge management techniques for performance assessment. Computers & Education, 41(2), 173-189.
Nicol, D. (2007). E‐assessment by design: using multiple‐choice tests to good effect. Journal of Further and Higher Education, 31(1), 53-64.
Yorke, M. (1998). The management of assessment in higher education. Assessment & Evaluation in Higher Education, 23(2), 101–116. doi:10.1080/0260293980230201
References
- ↑ JISC, The Design Studio / Assessment management. (n.d.). Retrieved May 16, 2019, from http://jiscdesignstudio.pbworks.com/w/page/52947117/Assessment%20management
- ↑ Ferell, G., & Gray, L. (2013). Electronic management of assessment | Jisc. Retrieved May 16, 2019, from https://www.jisc.ac.uk/guides/electronic-assessment-management
- ↑ JISC (n.d.). Proposal for an assessment management system – final report. https://jiscinfonetcasestudies.pbworks.com/w/file/fetch/47310124/Assessment%20Management%20System%20Final%20Report%20i1.pdf Retrieved May 2019.
- ↑ Barchino, R., Gutiérrez, J. M., Otón, S., Martínez, J. J., Hilera, J. R., & Gutiérrez, J. A. (2006). E-learning model for assessment. In IADIS Virtual Multiconference on Computer Science and Information Systems, MCCSIS 2006.
- ↑ EMFA, 2005, Educational model for Assessment Version 1.0, Educational Technology Expertise Centre (OTEC) Open University of the Netherlands. Secretary Development Programme P.O. Box 2960 6401 DL Heerlen the Netherlands.
- ↑ Ferrell, G. (2014). Electronic management of assessment (EMA): a landscape review. Bristol: JISC. http://repository.jisc.ac.uk/5599/1/EMA_REPORT.pdf Accessed May 2019.