Educational software evaluation

The educational technology and digital learning wiki


See [[instructional design method]] for the moment.
There are several ways to evaluate software:
== Technical evaluation ==
E.g. Dalgarno (2004)
Geissinger (1997) starts with the question "Can this product actually teach what it is supposed to?" and uses Barker & King's (1993: 309) four categories:
{| border="1"
! Category !! Discussion
|-
| Quality of end-user interface design || Investigation shows that the designers of the most highly-rated products follow well-established rules and guidelines. This aspect of design affects users' perception of the product, what they can do with it and how completely it engages them.
|-
| Engagement || Appropriate use of audio and moving video segments can contribute greatly to users' motivation to work with the medium.
|-
| Interactivity || Users' involvement in participatory tasks helped make the product meaningful and provoke thought.
|-
| Tailorability || Products which allow users to configure them and change them to meet particular individual needs contribute well to the quality of the educational experience.
|}
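Barker & King's four categories can be operationalized as a simple scoring rubric. The sketch below is purely illustrative: the 1-5 rating scale, the equal weighting, and the sample ratings are hypothetical and do not come from Barker & King (1993).

```python
# Minimal sketch of a scoring rubric based on Barker & King's (1993)
# four categories. The 1-5 scale and equal weighting are hypothetical
# choices for illustration, not part of the original methodology.
CATEGORIES = [
    "Quality of end-user interface design",
    "Engagement",
    "Interactivity",
    "Tailorability",
]

def evaluate(scores: dict) -> float:
    """Average the 1-5 ratings given for each of the four categories."""
    for cat in CATEGORIES:
        if not 1 <= scores[cat] <= 5:
            raise ValueError(f"Score for {cat!r} must be between 1 and 5")
    return sum(scores[cat] for cat in CATEGORIES) / len(CATEGORIES)

# Hypothetical ratings for one product under review.
ratings = {
    "Quality of end-user interface design": 4,
    "Engagement": 3,
    "Interactivity": 5,
    "Tailorability": 2,
}
print(evaluate(ratings))  # 3.5
```

In practice an evaluator might weight the categories differently depending on the learning context (e.g. weighting tailorability higher for mixed-ability classrooms); the flat average above is only the simplest possible aggregation.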


== References ==


* Albion, P. R. (1999). Heuristic evaluation of educational multimedia: from theory to practice. [http://www.usq.edu.au/users/albion/papers/ascilite99.html HTML]
* Barker, P. (1995). Evaluating a model of learning design. In H. Maurer (Ed.), Proceedings, World Conference in Educational Multimedia & Hypermedia. Graz, Austria: Association for the Advancement of Computing in Education.
* Barker, P. & King, T. (1993). Evaluating interactive multimedia courseware -- a methodology. Computers in Education, 21 (4), 307-319.
* Baumgartner, P. & Payr, S. (1996). Learning as action: A social science approach to the evaluation of interactive media. In Carlson, P. & Makedon, F. (Eds.), Proceedings, World Conference in Educational Multimedia & Hypermedia. Boston: Association for the Advancement of Computing in Education.
* Dalgarno, B. (2004). A classification scheme for learner-computer interaction. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds.), Beyond the comfort zone, 21st annual conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth, Australia. Available: [http://www.ascilite.org.au/conferences/perth04/procs/pdf/dalgarno.pdf PDF]. (This paper describes learning environments, but is useful for deciding on the criteria by which to select a tool.)
* Geissinger, H. (1997). "Educational Software: Criteria for Evaluation". ASCILITE '97. [http://www.ascilite.org.au/conferences/perth97/papers/Geissinger/Geissinger.html HTML]
* Reiser, R.A. & Kegelmann, H.W. (1994). Evaluating instructional software: A review and critique of current methods. Educational Technology, Research & Development, 42 (3), 63-69.


[[Category:Design_methodologies]]
[[Category: Educational technologies]]
[[Category:Evaluation methods and grids]]

Revision as of 17:03, 29 June 2007

Draft
