E-tutorial
Christopher Warren, Memorial University of Newfoundland
Definitions and background
An e-tutor (or automated tutor) is a software tool that offers “students guidance in undertaking specific tasks” (Albert & Thomas, 2000, p. 141). Curilem, Barbosa and Azevedo (2007) identified three different approaches to e-tutor design: in the first, the e-tutor acts as a guide and the software controls the lesson; in the second, the student controls where the lesson will lead; and the third mixes the first two, with the system determining its level of intervention based on student responses (p. 548).
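To make the mixed (third) approach concrete, the sketch below is purely illustrative and is not drawn from Curilem et al.'s formalization: a hypothetical tutor uses the learner's recent answers to decide who controls the next step of the lesson.

```python
# Minimal sketch (hypothetical, not from the cited systems): a mixed-initiative
# e-tutor chooses its level of intervention from the learner's recent answers.

def choose_intervention(recent_answers, threshold_low=0.4, threshold_high=0.8):
    """Return 'system_led', 'shared', or 'learner_led' based on recent accuracy."""
    if not recent_answers:
        return "system_led"              # no data yet: the software guides the lesson
    accuracy = sum(recent_answers) / len(recent_answers)
    if accuracy < threshold_low:
        return "system_led"              # struggling learner: tutor controls the lesson
    if accuracy < threshold_high:
        return "shared"                  # mixed approach: tutor intervenes selectively
    return "learner_led"                 # strong learner: student chooses where to go

# Example: 1 correct answer out of 5 -> the tutor takes control of the lesson.
print(choose_intervention([0, 1, 0, 0, 0]))   # system_led
```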
Some tutorials simply guide the user through a set of resources to help complete a specific assignment (Albert & Thomas, 2000). Videos are becoming a popular resource in these tutorial programs (van der Meij & van der Meij, 2014). Other e-tutors can be much more complex and interactive; many can adjust for skill level and will offer random content to challenge the learner (Adams, Yin, Vargas, Luis, & Mullen, 2014). Other programs offer “scaffolding”, which attempts to mimic a human tutor by assisting the learner (Albert & Thomas, 2000, p. 143). This assistance can be enhanced by smart ‘learning’ e-tutors that build on previous data to improve future suggestions (Barnes & Stamper, 2010, p. 11). Whetstone, Clark, and Flake (2014) noted that e-tutors are often used to help a human teacher adjust their instruction based on the data the e-tutor collects.
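As an illustration of how a ‘learning’ e-tutor might build on previous data, the toy sketch below suggests a hint from historical solution steps; the data structures and example states are invented for illustration and are far simpler than Barnes and Stamper's (2010) approach.

```python
# Toy illustration (hypothetical) of data-driven hinting: suggest the next step
# that past successful students most often took from the learner's current state.
from collections import Counter

# Historical records: (current_state, next_step) pairs from students who
# eventually solved the problem. The states and steps here are placeholders.
history = [
    ("p->q, p", "apply modus ponens"),
    ("p->q, p", "apply modus ponens"),
    ("p->q, p", "restate premise"),
    ("q, q->r", "apply modus ponens"),
]

def suggest_hint(current_state):
    """Return the most common next step taken from this state, if any."""
    steps = Counter(step for state, step in history if state == current_state)
    if not steps:
        return None   # no historical data for this state: fall back to the teacher
    return steps.most_common(1)[0][0]

print(suggest_hint("p->q, p"))   # "apply modus ponens"
```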
Affordances
According to Dalal (2014), an e-tutorial allows multimedia to be used to introduce complex topics in a manner that is convenient for the learner (p. 366). For example, in computer technology training, video tutorials can display sequential steps on screen at a pace suited to the learner (van der Meij & van der Meij, 2014). Video tutorials can also be tailored and adjusted as needed; narration can accompany video, which Winslow, Dickerson and Lee (2012) argue can improve learning. Wu, Lin, and Yang (2013) also observed that a text-based e-tutorial is convenient for the learner; text-based discussions allow the student to pick a convenient time and place to get assistance (p. 53). Computer-guided oral reading allows for more time practising reading aloud than is possible with a human being (Mostow, Nelson-Taylor & Beck, 2013).
A study by Curilem et al. (2007) noted that “A key function of any ITS (Intelligent Tutor System) is the ability to adapt, as closely as possible, pedagogical activities to individual student/learner needs” (p. 546). Tutoring programs can “assess, guide, and provide advice to learners without human input” (Fournier-Viger, Faghihi, Nkambou & Engelbert, 2010, p. 17). Crosby and Iding (1997) found that adaptive tutorials can be designed around certain personality types of the learner. Conversational computerized tutoring systems can adapt during a lesson in an objective manner (Latham, Crockett, McLean & Edmonds, 2012). An e-tutor can easily resist the human temptation to give an answer, patiently waiting for the student to think for themselves and arrive at it (Latham et al., 2012). Computer programs can also avoid human bias when tutoring, especially “stigmatizing below-grade-level readers” (Mostow et al., 2013, p. 251). E-tutorials allow tutoring to be tailored to an English Language Learner’s cultural background, providing culturally familiar examples when learning English; students can then focus on learning a new language rather than a new language and a new culture simultaneously, which can overwhelm an ELL student (Poulsen, Hastings, & Allbritton, 2007).
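As a deliberately simplified illustration of adapting pedagogical activities to individual needs, the hypothetical sketch below adjusts difficulty from a learner's recent results; it is not taken from any of the systems cited above.

```python
# Hypothetical sketch of adapting activity difficulty to an individual learner:
# a simple mastery estimate (fraction correct on recent items) picks the next level.

def next_difficulty(current_level, recent_correct, window=5):
    """Raise, hold, or lower the difficulty level based on recent results."""
    recent = recent_correct[-window:]
    if not recent:
        return current_level
    mastery = sum(recent) / len(recent)
    if mastery >= 0.8:
        return current_level + 1              # learner is comfortable: harder material
    if mastery <= 0.4:
        return max(1, current_level - 1)      # learner is struggling: easier material
    return current_level                      # otherwise stay at the same level

# Example: 4 correct out of the last 5 items -> move from level 3 to level 4.
print(next_difficulty(3, [1, 1, 0, 1, 1]))    # 4
```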
Van Laarhoven et al. (2008) observed teachers viewing e-tutorials for assistive technologies so that they could evaluate and become comfortable with the technology before classroom deployment. Some e-tutors can mark student assessments (such as written summaries) for human teachers, so that the teacher can offer more practice to the student while being freed up for other tasks in the teaching-learning process (He, Hui & Quan, 2009). Automated software can also help human tutors practice their tutoring skills with a virtual tutor (Walker, Rummel, & Koedinger, 2011).
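To illustrate the general idea of automated summary marking, the toy sketch below scores a student summary by word overlap with a model summary; this is a hypothetical simplification, far less sophisticated than the techniques He, Hui and Quan (2009) describe.

```python
# Toy sketch of automated summary marking (much simpler than He, Hui & Quan's
# system): score a student summary by word overlap with a model summary.

def overlap_score(student_summary, model_summary):
    """Return the fraction of model-summary words present in the student summary."""
    model_words = set(model_summary.lower().split())
    student_words = set(student_summary.lower().split())
    if not model_words:
        return 0.0
    return len(model_words & student_words) / len(model_words)

model = "photosynthesis converts light energy into chemical energy in plants"
student = "plants use light energy to make chemical energy"
print(round(overlap_score(student, model), 2))   # 0.5
```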
Constraints
Albert and Thomas (2000) found that an e-tutorial was more successful when a “clear objective” was given and “when the outcome of a computer application will be assessed by tutors” (p. 142). Human tutors can be better than software at giving more complex hints to the tutee (Walker et al., 2011, p. 300). Human tutors can also do a better job of picking up and correcting miscues than an e-tutor in certain situations (Mostow, Aist & Burkhead, 2003). Wu et al. (2013) found that learners were disappointed in the level of constructive criticism given by the e-tutor, and that the feedback given was sometimes “confusing” (p. 58). The same study also found that motivation can be an issue, especially where the learner had a “passive attitude” towards learning and had not made the effort to transition to the more active approach certain e-tutor programs require (p. 60).
Albert and Thomas (2000) also found that the teacher still needs to be involved in the creation and maintenance of the e-tutorial on a regular basis; they found that while the “academic wants to teach, it (the software development) demands the academic’s input and creative suggestions” (p. 147). While the teacher needs to give input into the material used by the intelligent tutor, the programmer must also understand some of the basic material in order to suggest and recommend the “most efficient and effective way the computer can translate that concept” (Albert & Thomas, 2000, p. 147). An e-tutorial session can also have unintended consequences that are not easily observable to its creators; for example, narration added to a video with accompanying text call-outs can overload and distract the learner (Winslow et al., 2012, p. 314). Jones and Alice (2010) also noted that instructions guiding learners in the use of e-tutoring systems may need to be written with more than one language in mind, since the instructions can be interpreted differently around the world (p. 1199).
Physical constraints are also evident; according to DeVaney (2009), the user needs to be at a computer or device with internet access in order to view tutorials (p. 3). Furthermore, Dalal (2014) noted that institutions find it “difficult to adapt” when special hardware and software are needed to deliver e-tutoring sessions (p. 366). This problem is exacerbated when complex systems with high hardware requirements are used (Curilem et al., 2007, p. 559).
Links
Discover Adaptive eLearning
eTutorials.org
Intelligent Tutoring Systems That Teach and Assess
Online English tutor a lesson in adaptive advertising
Tutorials Point
Works Cited
Adams, C., Yin, Y., Vargas, M., Luis, F. & Mullen, S. (2014). A phenomenology of learning large: the tutorial sphere of xMOOC video lectures. Distance Education, 35(2), 202-216. doi:10.1080/01587919.2014.917701
Albert, S. & Thomas, C. (2000). A new approach to computer-aided distance learning: the 'automated tutor'. Open Learning, 15(2), 141-150. doi:10.1080/02680510050050846
Barnes, T. & Stamper, J. (2010). Automatic hint generation for logic proof tutoring using historical data. Journal of Educational Technology & Society, 13(1), 3-12.
Crosby, M. & Iding, M. (1997). The influence of cognitive styles on the effectiveness of a multimedia tutor. Computer Assisted Language Learning, 10(4), 375-386.
Curilem, G., Barbosa, A. & Azevedo, F. (2007). Intelligent tutoring systems: Formalization as automata and interface design using neural networks. Computers & Education, 49(3), 545-561. doi:10.1016/j.compedu.2005.10.005
Dalal, M. (2014). Impact of Multi-media Tutorials in a Computer Science Laboratory Course -- An Empirical Study. Electronic Journal of e-Learning, 12(4), 366-374.
DeVaney, T. (2009). Impact of Video Tutorials in an Online Educational Statistics Course. Journal of Online Learning and Teaching, 5(4), 1-9.
Fournier-Viger, P., Faghihi, U., Nkambou, R. & Engelbert, M. (2010). Exploiting Sequential Patterns Found in Users' Solutions and Virtual Tutor Behavior to Improve Assistance in ITS. Journal of Educational Technology & Society, 13(1), 13-24.
He, Y., Hui, S. & Quan, T. (2009). Automatic summary assessment for intelligent tutoring systems. Computers & Education, 53(3), 890-899. doi:10.1016/j.compedu.2009.05.008
Jones, M. & Alice, Y. (2010). Comparison of teaching and learning outcomes between video-linked, web-based, and classroom tutorials: An innovative international study of profession education in physical therapy. Computers & Education, 54(4), 1193-1201. doi:10.1016/j.compedu.2009.11.005
Latham, A., Crockett, K., McLean, D. & Edmonds, B. (2012). A conversational intelligent tutoring system to automatically predict learning styles. Computers & Education, 59(1), 95-109. doi:10.1016/j.compedu.2011.11.001
Mostow, J., Aist, G. & Burkhead, P. (2003). Evaluation of an automated reading tutor that listens: comparison to human tutoring and classroom instruction. Journal of Educational Computing Research, 29(1), 61-117. doi:10.2190/06AX-QW99-EQ5G-RDCF
Mostow, J., Nelson-Taylor, J. & Beck, J. (2013). Computer-guided oral reading versus independent practice: comparison of sustained silent reading to an automated reading tutor that listens. Journal of Educational Computing Research, 49(2), 249-276. doi:10.2190/EC.49.2.g
Poulsen, R., Hastings, P. & Allbritton, D. (2007). Tutoring bilingual students with an automated reading tutor that listens. Journal of Educational Computing Research, 36(2), 191-221. doi:10.2190/A007-367T-5474-8383
Van der Meij, H. & van der Meij, J. (2014). A comparison of paper-based and video tutorials for software learning. Computers & Education, 78, 150-159. doi:10.1016/j.compedu.2014.06.003
Van Laarhoven, T., Munk, D., Zurita, L., Lynch, K., Zurita, B., Smith, T. & Chandler, L. (2008). The Effectiveness of Video Tutorials for Teaching Preservice Educators to Use Assistive Technologies. Journal of Special Education Technology, 23(4), 31-45.
Walker, E., Rummel, N. & Koedinger, K. (2011). Designing automated adaptive support to improve student helping behaviors in a peer tutoring activity. International Journal of Computer-Supported Collaborative Learning, 6(2), 279-306. doi:10.1007/s11412-011-9111-2
Whetstone, P., Clark, A. & Flake, M. (2014). Teacher perceptions of an online tutoring program for elementary mathematics. Educational Media International, 51(1), 79-90. doi:10.1080/09523987.2013.863552
Winslow, J., Dickerson, J. & Lee, C. (2012). Design Effects of Screen-Captured Tutorials on Student Achievement. International Journal of Instructional Media, 39(4), 309-317.
Wu, E., Lin, W. & Yang, S. (2013). An experimental study of cyber face-to-face vs. cyber text-based English tutorial programs for low-achieving university students. Computers & Education, 63, 52-61. doi:10.1016/j.compedu.2012.11.018