Learner assessment
Revision as of 12:15, 26 March 2009
Definition
This article deals with the assessment of student performance, i.e. student evaluation. We will also focus on more sophisticated evaluation rubrics for activity-based instructional designs. I know that assessment is weak in this wiki, but that can't be helped for the moment; it's not in the center of my interests - Daniel K. Schneider 15:32, 15 August 2007 (MEST).
First principles
There are three purposes of assessment:
- Formative: Assist learning
- Summative: Measure individual achievement
- Accountability: Evaluate programs (e.g. see course evaluation)
- Assessment should always be based on clear pedagogical objectives, which should also be communicated to the learners. Pedagogical objectives can be very different: it is not the same to require that someone be able to recall "data or information", to "pull together knowledge and apply it in a real-world project", or to acquire knowledge that is useful for solving a given task! (See learning type and learning level).
According to Pellegrino, there are three major elements that should drive an assessment design and that must be coordinated:
- cognition (a model of how students represent knowledge and develop competence)
- observation (of learner tasks)
- interpretation (methods for making sense of the data)
Based on this model, evidence-centered design for assessment attempts to define:
- The claim space: exactly what knowledge do we want students to have? This requires that the domain be unpacked, i.e. that a panel of various stakeholders build an ontology of the domain that should be taught/learnt in the program.
- An evidence model: What is accepted as evidence, i.e. what features of student productions will provide evidence and how will we analyze it.
- A task model: What tasks will students perform to communicate their knowledge.
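As a rough illustration (not part of Pellegrino's framework itself), the three coordinated components of an evidence-centered design could be captured in a small data structure. All names and example entries below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceCenteredDesign:
    """Hypothetical sketch of the three coordinated ECD components."""
    # Claim space: the knowledge we want students to have,
    # unpacked into a list of domain claims by stakeholders.
    claims: list[str] = field(default_factory=list)
    # Evidence model: which features of student productions
    # count as evidence for each claim.
    evidence: dict[str, list[str]] = field(default_factory=dict)
    # Task model: the tasks students perform to produce that evidence.
    tasks: dict[str, str] = field(default_factory=dict)

design = EvidenceCenteredDesign(
    claims=["can interpret a scatter plot"],
    evidence={"can interpret a scatter plot":
              ["identifies the trend", "notes outliers"]},
    tasks={"can interpret a scatter plot":
           "describe the relationship shown in a given plot"},
)
print(design.claims[0])  # → can interpret a scatter plot
```

The point of the sketch is only that the three models reference each other: every claim needs both an evidence entry and a task that can elicit that evidence.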
The claim space for professional training, for example, can first be defined in terms of various foci. Pellegrino suggests the following table:
Foci for explanation   rationalist                 sociocultural
performance            task/knowledge analysis     communal practices
development            trajectories of learning    trajectories of participation
knowledge              mental representations      forms of mediated activity
Assessment in a transformative perspective
According to Pellegrino, assessment can be a Trojan horse for educational and training programs, in both a negative and a positive sense:
- negative, when it becomes the de facto driver of curriculum and instruction (in particular large-scale evaluations)
- positive, when it breaks down barriers (e.g. assumptions about the nature of competence and expertise), similar in perspective to edutech's role.
Formative evaluation
According to Nicol and Milligan (2006) and Nicol and Macfarlane (2006), good formative evaluation:
- helps clarify what good performance is (goals, criteria, expected standards);
- facilitates the development of self-assessment (reflection) in learning;
- delivers high-quality information to students about their learning;
- encourages teacher and peer dialogue around learning;
- encourages positive motivational beliefs and self-esteem;
- provides opportunities to close the gap between current and desired performance;
- provides information to teachers that can be used to help shape teaching. (Nicol & Macfarlane-Dick, 2006, p. 205)
Evaluation Tools
Rubrics
- RubiStar is a free tool to help teachers create quality rubrics.
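A rubric is essentially a criteria-by-levels grid, where each performance level of each criterion carries a point value. A minimal sketch of how such a grid might be represented and used to score a piece of work (all criterion and level names here are invented for illustration):

```python
# A rubric as a mapping from criterion to ordered performance levels,
# each level carrying a point value (names invented for illustration).
rubric = {
    "content accuracy": {"beginning": 1, "developing": 2, "proficient": 3},
    "organization":     {"beginning": 1, "developing": 2, "proficient": 3},
    "sources cited":    {"beginning": 1, "developing": 2, "proficient": 3},
}

def score(ratings: dict[str, str]) -> int:
    """Sum the point values of the level chosen for each criterion."""
    return sum(rubric[criterion][level] for criterion, level in ratings.items())

total = score({
    "content accuracy": "proficient",
    "organization": "developing",
    "sources cited": "proficient",
})
print(total)  # → 8
```

Real rubric tools add level descriptors (what "proficient organization" looks like), which is what makes the expected standard transparent to learners.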
Product-based evaluation
Plagiarism
There is a risk that learners use the Internet to copy/paste content, in particular when teachers do not apply project-oriented, step-wise learning designs.
- See plagiarism
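Automated plagiarism checkers typically flag copied passages by comparing overlapping word sequences between documents. A toy sketch of that idea using word trigrams and Jaccard similarity (an illustration of the general technique, not any particular tool's algorithm):

```python
def trigrams(text: str) -> set[tuple[str, ...]]:
    """Overlapping word trigrams of a lower-cased text."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def jaccard(a: str, b: str) -> float:
    """Share of trigrams common to both texts (0 = disjoint, 1 = identical)."""
    ta, tb = trigrams(a), trigrams(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

original = "the quick brown fox jumps over the lazy dog"
copied   = "the quick brown fox jumps over a sleeping dog"
print(round(jaccard(original, copied), 2))  # → 0.4
```

A high overlap score only flags a submission for human review; deciding whether the overlap is plagiarism, quotation, or coincidence remains the teacher's judgment.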
Checklists for auto-evaluation
- PBL Checklists, age-appropriate, customizable project checklists for written reports, multimedia projects, oral presentations, and science projects. The use of these checklists keeps students on track and allows them to take responsibility for their own learning through peer- and self-evaluation.
Links
- Practical Assessment, Research & Evaluation, A peer-reviewed electronic journal. ISSN 1531-7714.
- The Journal of Technology, Learning and Assessment (JTLA) is a peer-reviewed, scholarly on-line journal.
- Assessment and Evaluation (a short table associating learning levels to performance measures)
- E-Assessment Examples (Referred from Wikipedia: E-Assessment, to learn what E-assessments are)
References
- Nicol, D., & Milligan, C. (2006). Rethinking technology supported assessment practices in relation to the seven principles of good feedback practice. In C. Bryan & K. Clegg (Eds.), Innovative Assessment in Higher Education (pp. 64-77). London: Routledge. ISBN 0415356423
- Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
- Ross, Magnus and Mary Welsh (2007). Formative Feedback to Improve Learning on a Teacher Education Degree using a Personal Learning Environment. International Journal of Emerging Technologies in Learning (iJET), 2(3).
- Scriven, Michael (1999). The nature of evaluation part i: relation to psychology. Practical Assessment, Research & Evaluation, 6(11). Retrieved March 7, 2006 from [1]