Educational technology research approaches

Draft

Introduction

This article aims to give an overview of different research approaches popular in educational technology and to present some of the debates.

At this stage, I rather suggest consulting the following:

  • Randolph, Justus J. (2007). Multidisciplinary Methods in Educational Technology Research and Development, HAMK Press, HTML / PDF. This is a free E-Book.

Research approaches

What do journals require?

The Journal of the Learning Sciences

(retrieved 23:03, 12 September 2006 (MEST))

  • Its statement of scope is not very explicit:
The Journal of the Learning Sciences (JLS) provides a multidisciplinary forum for the presentation of research on learning and education. It seeks to foster new ways of thinking about learning that will allow our understanding of cognition and social cognition to have impact in education. JLS publishes research articles that advance our understanding of learning in real-world situations and of promoting learning in such venues, including articles that report on the roles technology can play in promoting deep and lasting learning and in promoting engaged and thoughtful participation in learning activities, and articles reporting on new methodologies that enable rigorous investigation of learning in real-world situations. See: Scope
Educational Technology Research and Development

(retrieved 23:03, 12 September 2006 (MEST))

The Research Section assigns highest priority in reviewing manuscripts to rigorous original quantitative, qualitative, or mixed methods studies on topics relating to applications of technology or instructional design in educational settings. Such contexts include K-12, higher education, and adult learning (e.g., in corporate training settings). Analytical papers that evaluate important research issues related to educational technology research and reviews of the literature on similar topics are also published. This section features well documented articles on the practical aspects of research as well as applied theory in educational practice and provides a comprehensive source of current research information in instructional technology.

The Development Section publishes research on planning, implementation, evaluation and management of a variety of instructional technologies and learning environments. Empirically-based formative evaluations and theoretically-based instructional design research papers are welcome, as are papers that report outcomes of innovative approaches in applying technology to instructional development. Papers for the Development section may involve a variety of research methods and should focus on one or more aspect of the instructional development process; when relevant and possible, papers should discuss the implications of instructional design decisions and provide evidence linking outcomes to those decisions. See: About this Journal
Journal of Interactive Learning Research

(retrieved 23:03, 12 September 2006 (MEST)) This journal is the most explicit; its The Scope and Standards of the Journal of Interactive Learning Research is signed by Thomas C. Reeves.

Many researchers fail to distinguish clearly between the goals of their research and the methods they employ. Figures 1 and 2 present a classification scheme intended to distinguish between the goals and the methods of research. Most research studies submitted to JILR should be able to be classified according to the six research goals represented in Figure 1. This scheme reflects the debate about research "paradigms" that has dominated social science research literature for decades. For example, Soltis (1992) claims there are currently "three major paradigms, or three different ways of investigating important aspects of education" (p. 620): 1) the positivist or quantitative paradigm, 2) the interpretivist or qualitative paradigm, and 3) the critical theory or neomarxist paradigm. The three categories presented by Soltis (1992) fail to capture the full breadth of research goals in the fields of inquiry relevant to JILR, and therefore the scheme in Figure 1 includes more categories. However, the goals of inquiry represented in Figure 1 are not intended to be a complete and final listing of research goals.

A methodology classification scheme is represented in Figure 2. There are numerous methods available to researchers in areas as diverse as cognitive psychology, instructional technology, and computer science (cf., Driscoll, 1995), but for the sake of simplicity, these five methodological groupings provide sufficient discrimination to represent the major approaches likely to be used in investigations reported in JILR. This journal will be especially open to submissions that involve alternative methods (e.g., qualitative and critical theory) which seem to be underrepresented in more traditional publications.

See: Scope

See below for Figure 1 (Research goal classification scheme) and Figure 2 (Research methods classification scheme). This journal accepts a very broad spectrum of research but, of course, requires both rigor and responsibility. In addition, Reeves provides a few examples of possible submission types.

Thomas C. Reeves' Research classification for the Journal of Interactive Learning Research

  • Theoretical: research focused on explaining phenomena through the logical analysis and synthesis of theories, principles, and the results of other forms of research such as empirical studies.
  • Empirical: research focused on determining how education works by testing conclusions related to theories of communication, learning, performance, and technology.
  • Interpretivist: research focused on portraying how education works by describing and interpreting phenomena related to human communication, learning, performance, and the use of technology.
  • Postmodern: research focused on examining the assumptions underlying applications of technology in human communication, learning, and performance with the ultimate goal of revealing hidden agendas and empowering disenfranchised minorities.
  • Developmental: research focused on the invention and improvement of creative approaches to enhancing human communication, learning, and performance through the use of technology and theory.
  • Evaluation: research focused on a particular program, product, or method, usually in an applied setting, for the purpose of describing it, improving it, or estimating its effectiveness and worth.

Figure 1. Research goal classification scheme

  • Quantitative: experimental, quasi-experimental, correlational, and other methods that primarily involve the collection of quantitative data and its analysis using inferential statistics.
  • Qualitative: observation, case-studies, diaries, interviews, and other methods that primarily involve the collection of qualitative data and its analysis using grounded theory and ethnographic approaches.
  • Critical Theory: deconstruction of "texts" and the technologies that deliver them through the search for binary oppositions, hidden agendas, and the disenfranchisement of minorities.
  • Literature Review: various forms of research synthesis that primarily involve the analysis and integration of other forms of research, e.g., frequency counts and meta-analyses.
  • Mixed-methods: research approaches that combine a mixture of methods, usually quantitative and qualitative, to triangulate findings.

Figure 2. Research methods classification scheme
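
To make the "Quantitative" row above a little more concrete, the following sketch (in Python, with purely invented scores and group names) shows the kind of inferential analysis such studies typically report: a two-group comparison using a t-test, plus a standardized effect size. It is an illustration of the category, not a procedure prescribed by any of the journals quoted above.

  # Minimal sketch of a quantitative analysis on hypothetical data: post-test
  # scores of a treatment group (new tool) and a control group are compared
  # with an independent-samples (Welch) t-test and Cohen's d as effect size.
  from statistics import mean, stdev
  from scipy import stats  # a common choice; any statistics package would do

  treatment = [78, 85, 90, 74, 88, 92, 81, 79]  # hypothetical post-test scores
  control = [70, 72, 81, 68, 75, 77, 73, 69]

  result = stats.ttest_ind(treatment, control, equal_var=False)

  # Cohen's d with a pooled standard deviation (equal group sizes assumed)
  pooled_sd = ((stdev(treatment) ** 2 + stdev(control) ** 2) / 2) ** 0.5
  cohens_d = (mean(treatment) - mean(control)) / pooled_sd

  print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}, d = {cohens_d:.2f}")

A real study would of course also have to deal with sampling, assignment to conditions, assumption checking and the reliability of its measures; the snippet only shows the final computation step.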


The role of technology

See also the overview article on educational technology

Generally speaking, DSchneider believes that there is a trend away from content-driven approaches and global instructional designs towards the design and study of cognitive tools. A good example is Sawyer's introductory chapter in the Cambridge Handbook of the Learning Sciences. Sawyer (2006:9) states that “Learning scientists emphasize the powerful role that computers can play in transforming all learning. But their vision rejects instructionism and behaviorism and the CAI systems based on it, and presents a new vision of computers in schools”. Accordingly, this author lists "blended" technology scenarios that focus on "deep" and "higher-order" learning:

  • Computers can represent abstract knowledge in concrete form
  • Computer tools can allow learners to articulate their developing knowledge in a visual and verbal way
  • Computers can allow learners to manipulate and revise their developing knowledge via the user interface, in a complex process of design that supports simultaneous articulation, reflection, and learning
  • Computers support reflection in a combination of visual and verbal modes
  • Internet-based networks of learners can share and combine their developing understandings and benefit from the power of collaborative learning

On the other hand, there is an increasing trend toward simple designs in e-learning. However, e-learning does not represent a very large share of publications in the leading educational technology journals.

Typical research and publication subjects

Justus Randolph tried to answer the following four questions:

  • What are the methodological factors that need to be taken into consideration when designing and conducting educational technology research?
  • What types of research questions do educational technology researchers tend to ask?
  • How do educational technology researchers tend to conduct research? – What approaches do they use? What variables do they examine? What types of measures do they use? How do they report their research?
  • How can the state of educational technology research be improved?

As an example, we provide a few research questions that Randolph (2007:17) identified (a small sketch of the "across studies" type of question follows these lists):

Knowledge-based (literature analysis) questions
  • What is known about best practices in user-centered design?
  • Across studies, what are the academic effects of tools that help students visualize algorithms?
  • What variables are known to influence the effectiveness of educational interventions?
General empirical research questions
  • What are the effects of a new technological intervention on the long-term and short-term memory retention of vocabulary words?
  • To what degree do students and teachers report that they are satisfied with a new intervention?
  • In what ways do teachers and students report that a new intervention can be improved?
Sub-questions example

The question: "What is the essence of the experience of sense of community in online courses?" This question then could be further specified with these three sub-questions:

  • What do teachers experience in terms of the phenomenon of sense of community in online learning?
  • What do students experience in terms of the phenomenon of sense of community in online learning?
  • What medium-related contexts influence stakeholders’ experience of community in online learning?
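
As announced above, here is a small sketch of what the knowledge-based, "across studies" type of question can translate into computationally: pooling standardized effect sizes from several studies, as a meta-analysis does. The Python code and all numbers are invented and only illustrate the fixed-effect, inverse-variance idea; Bernard et al. (2003), listed in the references, is a full-scale example of such a synthesis.

  # Minimal fixed-effect meta-analysis sketch with invented numbers: each tuple
  # is (standardized mean difference d, variance of d) from one hypothetical
  # study of an algorithm-visualization tool; studies are pooled with
  # inverse-variance weights.
  studies = [
      (0.45, 0.04),  # hypothetical study 1
      (0.10, 0.02),  # hypothetical study 2
      (0.62, 0.09),  # hypothetical study 3
  ]

  weights = [1.0 / var for _, var in studies]
  pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
  pooled_se = (1.0 / sum(weights)) ** 0.5

  print(f"pooled d = {pooled_d:.2f} (SE = {pooled_se:.2f})")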

Some debates

Draft

Lack of quality

Research in educational technology comes under fire from very different angles.

From evaluation research (in distance education [1], but also present in educational technology), issues raised include the following (a short sketch of random assignment follows this list):

  • lack of experimental control
  • lack of procedures for randomly selecting research participants
  • lack of random assignment of participants to treatment conditions
  • poorly designed dependent measures that lack reliability and validity
  • failure to account for a variety of variables related to the attitudes of students and instructors.
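
To illustrate the random-assignment point announced before this list, here is a minimal Python sketch using hypothetical participant identifiers. It only shows the mechanical step of assigning participants to a treatment and a control condition at random; everything around it (sampling, consent, measures) is left out.

  # Minimal sketch of random assignment with a hypothetical participant list:
  # shuffle the list, then split it into two equally sized conditions, so that
  # assignment does not depend on any characteristic of the participants.
  import random

  participants = [f"student_{i:02d}" for i in range(1, 21)]  # hypothetical IDs
  random.seed(42)  # fixed seed only to make the illustration reproducible
  random.shuffle(participants)

  half = len(participants) // 2
  treatment_group = participants[:half]
  control_group = participants[half:]

  print("treatment:", treatment_group)
  print("control:", control_group)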

From design-oriented research:

The problem with experimental research

“Papert (1993) sums up the inadequacy of these traditional evaluation designs: "The method of controlled experimentation that evaluates an idea by implementing it, taking care to keep everything else the same, and measuring the result, may be an appropriate way to evaluate the effects of a small modification. However, it can tell us nothing about ideas that might lead to deep change" (p. 27).” (Reeves, 1997).

Quantitative vs. qualitative research

Links

References

  • Anderson, T. & Kanuka, H. (2002). E-Research: Issues, Strategies and Methods. Allyn Bacon.
  • Bernard, R. M., Lou, Y., Abrami, P. C., Wozney, L., Borokhovski, E., Wallet, P. A., Wade, A. & Fiset, M. How Does Distance Education Compare to Classroom Instruction? A Meta-analysis of the Empirical Literature. Centre for the Study of Learning and Performance, Concordia University, Montreal, Canada, and Louisiana State University. Presented as a Symposium at the Annual Meeting of The American Educational Research Association, Chicago, IL, April 2003.
  • Passi, B. K. & Mishra, S., Selecting Research Areas and Research Design Approaches in Distance Education: Process Issues, The International Review of Research in Open and Distance Learning, Vol 5, No 3 (2004), ISSN: 1492-3831. HTML
  • Johnson, S. D., Aragon, S. R., Shaik, N., & Palma-Rivas, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face-to-face learning environments. Journal of Interactive Learning Research, 11(1), 29-49.
  • Randolph, Justus J. (2007). Multidisciplinary Methods in Educational Technology Research and Development, HAMK Press, HTML / PDF. This is a free E-Book.
  • Reeves, T. C. (1992). Research foundations for interactive multimedia. In Promaco Conventions (Ed.), Proceedings of the International Interactive Multimedia Symposium, 177-190. Perth, Western Australia, 27-31 January. Promaco Conventions. HTML
  • Reeves, T. C. (1993). Pseudoscience in computer-based instruction: The case of learner control research. Journal of Computer-Based Instruction, 20(2), 39-46.
  • Reeves, Thomas C. (1997). Evaluating What Really Matters in Computer-Based Education. HTML - HTML copy
  • Reeves, T. C., (2000) Enhancing the Worth of Instructional Technology Research through "Design Experiments" and Other Development Research Strategies, Paper presented on April 27, 2000 at Session 41.29, "International Perspectives on Instructional Technology Research for the 21st Century," a Symposium sponsored by SIG/Instructional Technology at the Annual Meeting of the American Educational Research Association, New Orleans, LA, USA. PDF
  • Reeves, Thomas C. (1999) The Scope and Standards of the Journal of Interactive Learning Research, Journal of Interactive Learning Research (JILR) HTML, retrieved 19:43, 11 September 2006 (MEST).
  • Reeves, T. C., (1999). A Research Agenda for Interactive Learning in the New Millennium, HTML
  • Sawyer, Keith (2006), Introduction, in Sawyer, Keith (ed.) The Cambridge Handbook of the Learning Sciences, Cambridge University Press, ISBN 0521607773. PDF (This chapter only).
  • Sawyer, Keith (ed.) (2006), The Cambridge Handbook of the Learning Sciences, Cambridge University Press, ISBN 0521607773

To sort out

  • Berge, Z. L., and Mrozowski, S. (2001). Review of Research in Distance Education, 1990 to 1999. The American Journal of Distance Education, 15(3), 5 - 19.
  • Clements, Douglas H. & Sarama, Julie, Strip Mining for Gold: Research and Policy in Educational Technology - A Response to 'Fool's Gold'. PDF
  • Garrison, R & Anderson, T. (2003). E-Learning in the 21st Century: A framework for research and practice. Routledge
  • Kirkpatrick, D. (1979). Techniques for evaluating training programs. Training and Development Journal. 33(6), p. 78-92.
  • Honey, M., Culp, K. M., & Carrigg, F. (1999). Perspectives on technology and education research: Lessons from the past and present. New York: Center for Children and Technology. HTML summary, retrieved 19:43, 11 September 2006 (MEST).
  • Luppicini, Rocci (2003), Towards a Cyber-Constructivist Perspective (CCP) of Educational Design, Canadian Journal of Learning and Technology, Volume 29(1) Winter / hiver, 2003.
  • Nettles, K., Dziuban C., Cioffe, D., Moskal, P., & Moskal, P. (2000). Technology and learning: The 'No Significant Difference' phenomenon: A structural analysis of research on technology enhanced instruction. Distributed Learning Impact Evaluation. Dziuban & Moskal (Eds.) Orlando: University of Central Florida.
  • Perraton, Hilary (2000), The International Review of Research in Open and Distance Learning, Vol 1, No 1 (2000), ISSN: 1492-3831 HTML Abstract and PDF
  • Reeves, Thomas C. (1995), Questioning the Questions of Instructional Technology Research, Instructional Technology Research Online, HTML
  • Saba, F. (2000). Research in Distance Education: A Status Report. International Review of Research in Open and Distance Learning, 1(1). HTML
  • Author ???, Title (Design Research ??), PDF