Clicker

David Clarke, Memorial University of Newfoundland

==Definitions and background==
Clickers are synchronous electronic voting devices (Tong, 2012) that are particularly popular with instructors of large lecture classes (MacGeorge et al., 2008). In the United States, clickers are commonly referred to as key-pads, and in the United Kingdom as handsets or zappers (Laxman, 2011). They are also known by various other names, such as audience response systems (ARSs), personal response systems (PRSs) and electronic voting systems (Laxman, 2011). Physically, these small handheld wireless devices (Tong, 2012) resemble a television remote control (Lundeberg et al., 2011), typically with a numeric keypad, function keys that allow for text entry, a send button and a power switch (Laxman, 2011). Students' clickers connect to a receiver attached to the instructor's computer, which runs what is essentially presentation software (Lundeberg et al., 2011). This technology therefore allows instructors to pose questions, receive immediate responses from students and project those responses onto a screen in a variety of formats, all in real time (Turban, 2011).
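To make this question-response cycle concrete, here is a minimal sketch in Python of how a single multiple-choice clicker question could be modelled. It is purely illustrative and assumes nothing about any vendor's actual receiver software or API: the ClickerPoll class, its submit and distribution methods, and the handset IDs are all invented for this example.

<syntaxhighlight lang="python">
from collections import Counter

class ClickerPoll:
    """Minimal model of one multiple-choice clicker question (illustrative only)."""

    def __init__(self, prompt, options):
        self.prompt = prompt      # question text shown on the projected slide
        self.options = options    # e.g. ["A", "B", "C", "D"]
        self.responses = {}       # handset id -> chosen option (latest press wins)

    def submit(self, clicker_id, choice):
        """Record a response from one handset; pressing again overwrites the earlier answer."""
        if choice in self.options:
            self.responses[clicker_id] = choice

    def distribution(self):
        """Return an anonymous tally of choices, ready to be projected to the class."""
        return Counter(self.responses.values())

# Example: three students answer a question posed mid-lecture.
poll = ClickerPoll("Which statement best describes the result just derived?",
                   ["A", "B", "C", "D"])
poll.submit("handset-017", "B")
poll.submit("handset-042", "A")
poll.submit("handset-101", "B")
print(poll.distribution())  # Counter({'B': 2, 'A': 1})
</syntaxhighlight>
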
==Affordances==
The principal advantage of clicker technology lies in its ability to facilitate interaction between students and instructor, albeit in a mediated format (Lennox Terrion & Aceti, 2012), thus providing instant feedback to both students and instructor on how well the class as a whole understands the concepts presented (Kay, 2009). These benefits have been observed primarily in large classrooms, where personal feedback and interaction with instructors are more challenging (Hill & Babbitt, 2013). Use of clickers in large lecture classrooms is especially helpful in overcoming challenges such as student inattention and distraction (Lennox Terrion & Aceti, 2012).
The principal feature of clicker technology is that it allows instructors to pose multiple-choice questions on slides projected onto a screen, to which students click their responses (Laxman, 2011). When all responses are received, the results are projected onto the screen for the entire class to see—typically anonymously, although it is also possible to identify respondents (Nielsen, Hansen, & Stav, 2013). Individual student and class data can be saved from each session, allowing responses to be analyzed and displayed in various graphical forms (Hill & Babbitt, 2013). As put forward by Lennox Terrion and Aceti (2012), clickers make it possible for large lecture class activities to mirror, to some extent, the exercises, quizzes and team-based activities that are used successfully to engage students in small classrooms. Quizzes are often employed at the beginning of lectures to assess student understanding of prior material and/or readings (Lennox Terrion & Aceti, 2012), to emphasize key concepts, and to help students focus and settle down at the start of the class (Laxman, 2011). Instructors may also quiz students following particularly challenging sections of lectures to check understanding and respond with additional examples or more detailed explanations if necessary (Nielsen et al., 2013). Clickers are likewise helpful in sustaining attention by punctuating content delivery (Laxman, 2011).
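As a rough illustration of how per-session response data might be stored and summarised, the sketch below logs each question's responses to a CSV file and prints a simple percentage bar chart. The file layout and the save_session and bar_chart helpers are assumptions made for this example rather than features of any particular clicker system; they simply show one way class-wide data could be saved and displayed graphically.

<syntaxhighlight lang="python">
import csv
from collections import Counter

def save_session(path, session_id, responses):
    """Append one question's (handset id, choice) pairs to a CSV log (illustrative layout)."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for clicker_id, choice in responses.items():
            writer.writerow([session_id, clicker_id, choice])

def bar_chart(responses, width=40):
    """Print an anonymous text bar chart of the class-wide answer distribution."""
    counts = Counter(responses.values())
    total = sum(counts.values())
    for option in sorted(counts):
        share = counts[option] / total
        print(f"{option} {'#' * round(share * width)} {share:.0%}")

# Example usage with invented data for one quiz question.
responses = {"handset-017": "B", "handset-042": "A",
             "handset-101": "B", "handset-220": "B"}
save_session("clicker_log.csv", "2013-11-07-q1", responses)
bar_chart(responses)
# A ########## 25%
# B ############################## 75%
</syntaxhighlight>
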
Importantly, clickers—simply by being clicked—allow students to anonymously signal their lack of comprehension of material, allowing instructors to respond without drawing attention to the student (Gachago, Morris, & Simon, 2011). Clickers thus encourage student participation through the anonymity they offer, which is especially valuable when the language of instruction is not the student’s primary language (Gachago, Morris, & Simon, 2011). As such, clickers facilitate immediate feedback to students from instructors in large classrooms and allow students to assess their knowledge of concepts as well as their level of understanding relative to their peers (Lennox Terrion & Aceti, 2012). As indicated by Laxman (2011), students have positive attitudes towards clicker technology, and its use is also associated with increased attention and higher interest and engagement levels during classes.


==Constraints==
Clicker technology in itself does not guarantee engaged, active students; it is how clickers are used in conjunction with the instructor's teaching methodology that dictates the level of interactivity (Trees & Jackson, 2007). As such, it is necessary that instructors be motivated, have precise goals and be consistent in the manner in which clickers are employed (Kay & Knaack, 2009). It is also essential that instructors do the necessary preparation in terms of methodology, software and quizzes (Nielsen et al., 2013), as clickers are most likely to be effective when implementation is thorough and purposeful (Lennox Terrion & Aceti, 2012). Hill and Babbitt (2013) found that incorporating clickers into instruction—at least initially—makes teaching more challenging. Additionally, inefficient use of clickers, time spent setting up the system and handing out clickers, increased class discussion and students having to adjust to a new style of teaching can all contribute to student confusion and frustration (Trees & Jackson, 2007). It is also necessary for instructors to create new educational materials (Lennox Terrion & Aceti, 2012), as there are few collections of good clicker questions available for most educational fields (Laxman, 2011). Moreover, Kay and Knaack (2009) point out that it has yet to be established whether clicker technology translates into more effective instruction across subject areas and age ranges, and they warn that younger children "may be too immature to handle the initial excitement created" by clickers (para. 5).
As with any technology, glitches do occur (Kay & Knaack, 2009): computers freeze, clickers stop responding (Hill & Babbitt, 2013) and clicker batteries die, creating distractions for both instructor and students (MacGeorge et al., 2008). Technical glitches can be particularly stressful for students during summative evaluation (Lennox Terrion & Aceti, 2012) and can lead students to resist this new method of learning (Kay & Knaack, 2009).
Employing clicker technology exclusively as an attendance-taking device is unpopular with students (Stagg & Lane, 2010), as are clicker responses used for summative assessment (Nielsen et al., 2013). While these may seem like time-saving strategies, they tend to result in resistance to the technology (Kay & Knaack, 2009). The data suggest that students prefer formative over summative assessment when using clickers (Kay & Knaack, 2009).


==Links==
[http://cft.vanderbilt.edu/teaching-guides/technology/clickers/ Classroom Response Systems (“Clickers”)]<br />
[http://www.einstruction.com/srs-overview Student Response Systems]<br />
[http://www.polleverywhere.com/ars-comparison ARS Vendor Comparison]<br />
[http://www.youtube.com/watch?v=M7RWteJyeVw&list=PLpblAv7_U6uK_cjvKp8t9O9IOh953w6yk&index=1 Elmo Student Response System Training Video 1: Setting up the SRS]<br />
[http://www.slideshare.net/Andreatej/tutorial-for-smart-student-response-system Tutorial for SMART Student Response System]<br />
==Works Cited==


Desrochers, M. N., & Shelnutt, J. M. (2012). Effect of answer format and review method on college students' learning. ''Computers & Education'', 59(3), 946-951. doi:10.1016/j.compedu.2012.04.002<br />
Gachago, D., Morris, A., & Simon, E. (2011). Engagement levels in a graphic design clicker class: Students’ perceptions around attention, participation and peer learning. ''Journal of Information Technology Education: Research'', 10(1), 253-269.<br />
Hill, A., & Babbitt, B. (2013). Examining the efficacy of personal response devices in army training. ''Journal of Information Technology Education: Innovations in Practice'', 12(1), 1-11. <br />
Kay, R. H. (2009). Examining gender differences in attitudes toward interactive classroom communications systems (ICCS). ''Computers & Education'', 52(4), 730-740. doi:10.1016/j.compedu.2008.11.015 <br />
Kay, R., & Knaack, L. (2009). Exploring individual differences in attitudes toward audience response systems. ''Canadian Journal of Learning and Technology/La Revue Canadienne de l’Apprentissage et de la Technologie'', 35(1). <br />
Laxman, K. (2011). A study on the adoption of clickers in higher education. ''Australasian Journal of Educational Technology'', 27(8), 1291-1303. <br />
Lennox Terrion, J., & Aceti, V. (2012). Perceptions of the effects of clicker technology on student learning and engagement: A study of freshmen chemistry students. ''Research in Learning Technology'', 20. <br />
Lin, Y. C., Liu, T. C., & Chu, C. C. (2011). Implementing clickers to assist learning in science lectures: The clicker-assisted conceptual change model. ''Australasian Journal of Educational Technology'', 27(6), 979-996. <br />
Lundeberg, M. A., Kang, H., Wolter, B., Armstrong, N., Borsari, B., Boury, N., ... & Herreid, C. F. (2011). Context matters: Increasing understanding with interactive clicker case studies. ''Educational Technology Research and Development'', 59(5), 645-671. doi:10.1007/s11423-010-9182-1<br />
MacGeorge, E. L., Homan, S. R., Dunning Jr, J. B., Elmore, D., Bodie, G. D., Evans, E., ... & Geddes, B. (2008). Student evaluation of audience response technology in large lecture classes. ''Educational Technology Research and Development'', 56(2), 125-145. doi:10.1007/s11423-007-9053-6<br />
Nielsen, K. L., Hansen, G., & Stav, J. B. (2013). Teaching with student response systems (SRS): Teacher-centric aspects that can negatively affect students’ experience of using SRS. ''Research in Learning Technology'', 21. doi:10.3402/rlt.v21i0.18989 <br />
Stagg, A., & Lane, M. (2010). Using clickers to support information literacy skills development and instruction in first-year business students. ''Journal of Information Technology Education: Research'', 9(1), 197-215. <br />
Tong, V. C. (2012). Using asynchronous electronic surveys to help in‐class revision: A case study. ''British Journal of Educational Technology'', 43(3), 465-473. doi:10.1111/j.1467-8535.2011.01207.x <br />
Trees, A. R., & Jackson, M. H. (2007). The learning environment in clicker classrooms: Student processes of learning and involvement in large university‐level courses using student response systems. ''Learning, Media and Technology'', 32(1), 21-40. doi:10.1080/17439880601141179 <br />
Turban, J. (2011). Students prefer audience response system for lecture evaluation. ''International Journal of Emerging Technologies in Learning (iJET)'', 6(4), 52-55. <br />




[[Category:educational technologies]][[Category:Affordances and constraints of learning technologies]]
