Methodology tutorial - qualitative data acquisition methods


This is part of the methodology tutorial (see its table of contents).

In educational technology (as in most other social sciences) one works with a variety of qualitative data.

Since qualitative research most often focuses on "rich" data, sampling is more difficult than in quantitative research, and we shall start with this issue.

Sampling strategies in qualitative research

Often you only work with 1-2 big cases (e.g. classes, organizations), because qualitative analysis is highly labor-intensive.

But within each case you also have to think about sampling! Let's look at an example: an innovation researcher studying organizations may interact with various people and study/observe various processes:

  • informants within the organization
  • external experts (domain/subject experts/practitioners)
  • clients and other interacting organizations
  • observed processes (e.g. workflow analysis)
  • texts (e.g. written decisions, files, ...)

Another example: the impact of an initiative on a public space (e.g. publicly accessible computer rooms):

  • external decision makers and interest groups
  • organized local groups (e.g. parents’ associations)
  • population of the area
  • events and behaviors associated with this initiative

Sampling is often multi-stage (by waves): research in progress can reveal new phenomena that need investigation and therefore further sampling.

General sampling strategies

Adapted from Miles & Huberman (1994:28):

| Type of case | Usage | Broad category |
|---|---|---|
| maximal variation | will give better scope to your results (but needs more complex models!) | major strategies |
| homogeneous | provides better focus; conclusions will be "safer" since it is easier to identify explaining variables and to test relations | major strategies |
| critical | exemplify a theory with a "natural" example | major strategies |
| according to theory, i.e. your research questions | gives you better guarantees that you will be able to answer your questions | major strategies |
| confirming / disconfirming | test the limits of an explanation | validation |
| extreme and deviant cases | test the boundaries of your explanations, seek new adventures | validation |
| typical | show what is "normal", "average" or "typical" | validation |
| intense | complete a quantitative study with an in-depth study | specialization |
| according to dimension | study of particular phenomena | specialization |
| "snowball" | according to information received during the study | inductive approach |
| "opportune" | follow new "leads" | inductive approach |
| all | (rarely possible) | representativeness |
| quota | selection of subgroups | representativeness |
| according to reputation | recommendations of experts | comparative method |
| according to operative variables / according to criteria | according to the criteria you want to study | comparative method |
| convenient | those who are willing ... | bad |
| political | exclusion/inclusion for political reasons | bad |

Use this big list to think about your own strategy.

There are no general rules, but we can formulate a few heuristics and recommended practices!

  • Use this table to think about the kind of sampling you need for your own research.
  • Choose your cases well: this avoids trouble later ...
  • Avoid adopting a sampling-by-induction strategy (it is more difficult)
  • Look at your research questions!
    • Can you answer all of them (measure concepts, find causalities, etc.)?
  • Understand the scope of the sampling task (see also below):
    • roles (functions within an organization),
    • groups, organizations, institutions, ....
    • “programs”,
    • processes,
    • ....
Advice for intra-case sampling
  • identify the types of information you need.
  • sample all categories (activities, processes, events, dates, locations, agents, ...); see the sketch after this list
  • again: think about the theory you want to produce and its scope
  • reduce your ambitions (research questions) when your sampling lists get too large
  • you can always add cases later (snowball strategy)
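For illustration only, here is a minimal sketch (in Python; the categories and units are invented) of how such an intra-case sampling grid can be written down explicitly and checked for coverage while data collection proceeds:

```python
# Hypothetical intra-case sampling grid: categories and units are invented examples.
planned = {
    "agents":    {"teacher", "student", "administrator"},
    "processes": {"course design", "assessment", "tutoring"},
    "locations": {"classroom", "computer room"},
}

collected = {
    "agents":    {"teacher", "student"},
    "processes": {"course design"},
    "locations": set(),
}

# Report which planned units are still missing in each category.
for category, units in planned.items():
    missing = units - collected.get(category, set())
    print(f"{category}: still to sample -> {sorted(missing) if missing else 'done'}")
```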
Advice for inter-case sampling

It’s a good strategy to adopt a kind of similar-systems design:

  • Select similar cases that show a good variance on your operative variables (dependent and independent).
  • E.g. to test variants of e-learning designs, select relatively similar domains or relatively similar target populations.
  • You can then add contrasting (extreme) cases to test the external validity (generalization potential) of your analysis.

Remember: qualitative research is very expensive

  • 2-3 big cases (e.g. courses, schools, designs) are enough for a master’s thesis
  • 12-30 sub-cases within these big cases (e.g. people, processes) are enough for a master’s thesis
  • otherwise, complement qualitative strategies with quantitative approaches.

Data gathering techniques (empirical measures)

Here is an overview of various data gathering techniques:

| activity | medium | principal objective |
|---|---|---|
| look | observation | global observation of an organization, culture, activity, etc. See: Observation, transcription and text analysis |
| examine activities | transcriptions of natural activities | in-depth study of activities and interactions in context. See: Observation, transcription and text analysis |
| provoked activities | transcriptions of provoked activities | in-depth study of formal activities you engage somebody in. See: Observation, transcription and text analysis |
| study | texts | written traces of activities (e.g. decision protocols, guidelines). See: Observation, transcription and text analysis |
| ask | interviews | extraction of information in people's heads. See: Interviews |
| participate | share | participatory observation shares research and work |

Different roles for qualitative techniques

Don’t confuse the "technique" and "approach" levels when you talk about qualitative methods: "qualitative methods" can refer just to specific data-gathering techniques, but also to more global designs.

In the following table we show the different status of qualitative data acquisition techniques in quantitative vs. qualitative research.

Some different objectives and preferred techniques for different kinds of methodologies (approaches)

| method | quantitative research | qualitative research |
|---|---|---|
| look | preliminary work for questionnaire design | deep understanding of an institution’s or culture’s workings |
| examine activities | quick studies of work activities and interactions to prepare initial design specifications | systematic usability studies; dialogue analysis |
| provoked activities |  | understanding of reasoning processes |
| study | formal content analysis, most often word counting or more sophisticated techniques like LSA | categorization and understanding of concepts |
| ask | fixed questions to systematically gather relatively complex attitudes, opinions and descriptions of behaviors | open interviews or semi-structured interviews to engage subjects in ... |

This table is far from complete, but it shows that qualitative designs are geared more towards going in depth, whereas mostly-quantitative designs put more emphasis on scale or on preparing quantitative studies.
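To make the "study" row concrete, here is a minimal sketch of the simplest form of formal content analysis mentioned above, plain word counting; the file names and the crude tokenization are assumptions, not a recommendation:

```python
import re
from collections import Counter

# Minimal word-counting sketch; the file names are made-up placeholders.
def word_counts(paths):
    counts = Counter()
    for path in paths:
        with open(path, encoding="utf-8") as f:
            # crude tokenization: lower-cased runs of letters and apostrophes
            counts.update(re.findall(r"[a-z']+", f.read().lower()))
    return counts

print(word_counts(["interview_01.txt", "interview_02.txt"]).most_common(20))
```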

Observation, transcription and text analysis

Observation of behaviors in natural contexts

Observation in natural contexts is an essential instrument for in-depth studies of cultures and/or organizations.

  • Takes time and requires skills (see below)
  • Needs assessment:
    • of the researcher’s role in the organization, group, culture, ...
    • of investigation methods and research goals (in order to focus observations), etc.
  • Needs a good “field notes” technique:
    • notational conventions for in-session notes
    • notational conventions for after-session notes
    • a journaling technique

Example of a field note technique:

| Marks | Usage |
|---|---|
| “ ... ” | verbatim quotations |
| ‘ ... ’ | paraphrases |
| ( ... ) | contextual data (or researcher's interpretations) |
| < ... > | analytical categories derived from the subject's conceptual framework |
| / ... / | analytical categories derived from the researcher's conceptual framework |
| ____ | time elapsed |
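If field notes are kept as plain text files that use exactly these marker characters, a small script can separate the different kinds of annotations. This is only a sketch under that assumption:

```python
import re

# Regular expressions for the notation conventions in the table above
# (curly quote characters are written as Unicode escapes).
PATTERNS = {
    "verbatim quotation":  "\u201c(.*?)\u201d",   # “ ... ”
    "paraphrase":          "\u2018(.*?)\u2019",   # ‘ ... ’
    "contextual data":     r"\((.*?)\)",           # ( ... )
    "subject category":    r"<(.*?)>",             # < ... >
    "researcher category": r"/(.*?)/",             # / ... /
}

def parse_field_note(text):
    """Return, for each annotation type, the list of spans found in the note."""
    return {label: re.findall(pattern, text) for label, pattern in PATTERNS.items()}

note = "\u201cWe never use the forum\u201d \u2018students prefer email\u2019 (laughing) <workaround> /resistance/"
for label, spans in parse_field_note(note).items():
    print(f"{label}: {spans}")
```

The same conventions can of course be applied entirely by hand; the point is simply that a consistent notation makes the notes easier to process later.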


Computer-mediated transcriptions

  • ... are very popular in educational technology
  • Media: experimental artifacts, portals, CSCL, CSCW
  • Tools are sometimes instrumented to log detailed user acts for research purposes (see the sketch after this list)
  • Types of activities observed:
    • user-machine interactions
    • mediated user-user interactions
  • In addition, screen activities can be filmed or electronically registered
    • this gives extra information and also allows you to register user-user communication that is not computer-mediated
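As an illustration of such instrumentation, here is a minimal sketch that appends one timestamped JSON line per user act; the tool hook, file name and event fields are all invented:

```python
import json
import time

# Hypothetical instrumentation: append one timestamped JSON line per user act.
def log_event(logfile, user_id, action, **details):
    event = {
        "timestamp": time.time(),  # seconds since the epoch
        "user": user_id,
        "action": action,          # e.g. "open_page", "post_message"
        "details": details,        # free-form context (forum id, message length, ...)
    }
    logfile.write(json.dumps(event, ensure_ascii=False) + "\n")

with open("interaction_log.jsonl", "a", encoding="utf-8") as f:
    log_event(f, "student_42", "post_message", forum="week-3", length=87)
```

Logs of this kind can later be turned into transcriptions of user-machine or mediated user-user interactions.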
Data analysis

Analysis of transcriptions takes an enormous amount of time:

  • either you have to spend days/weeks for manual coding (preferably using specialized software adapted to the media type)
  • or you need high technical skills to write scripts to reduce and "massage" data

You will likely also have to invent your own data analysis and visualization techniques. Be sure to search the literature for coding and analysis techniques!
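As an example of the scripting option, here is a minimal sketch that reduces a manually coded transcript to code frequencies per speaker; the CSV file name and its "speaker" and "code" columns are assumptions about how the coding was stored:

```python
import csv
from collections import Counter

# Count how often each (speaker, code) pair occurs in a coded transcript.
def code_frequencies(path):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["speaker"], row["code"])] += 1
    return counts

for (speaker, code), n in sorted(code_frequencies("coded_transcript.csv").items()):
    print(f"{speaker}\t{code}\t{n}")
```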

Advice
  • Think very hard about the concepts you need to measure!
  • Stay as superficial as possible, i.e. do not go deeper than your research questions require; e.g. use quantitative data reduction techniques if you can figure out how to do so.

Elicitation of cognitive processes

  • The “thinking aloud” method combined with protocol analysis (Ericsson & Simon, 1983) is popular in cognitive science and expert system design
  • It is used to collect relatively "objective" data about thinking processes, problem solving in particular.

There can be important experimentation effects:

  • ex-post rationalization of behavior,
  • analytical thinking instead of case-based/pattern matching
  • influence of experimenter
  • subject may become silent and confused ...

Basic principle: users are given tasks and are asked to think aloud about what they do.

The Ericsson & Simon procedure for eliciting cognitive processes

  • The experimenter remains completely silent...
  • ...except when the subject has been silent for about 15 seconds; then the only prompt is:
  • “Keep talking”

According to Boren & Ramey, usability testing practice is different:

  • Subjects ask for help,
  • Testers ask questions (clarification, opinion, ...),
  • Testers ‘push’ subjects in certain directions.

Transcriptions of user activities in semi-formal situations

These are usually audio or video recordings.

  • They take time to analyze (as above)!
  • Ask permission to use a tape recorder or a camera if you do this in a work context
  • Recording can also modify users’ behavior

Texts

  • Text analysis (other than the "texts" mentioned above) concerns artifacts like official documents, student/teacher paper productions, etc.
  • Don’t ask for everything when you start your research
    • People don’t always like to give away written traces of their activities, so you need to establish a relationship of trust first.
  • There are many text analysis techniques (not covered in these tutorials for the moment)

Interviews

There are several types of interviews, each with its own purpose.

| Type | Composition | Function / advantages |
|---|---|---|
| Information interviews | check-list | initial studies |
| Semi-structured interviews | list of questions and “probes” | main interview type in qualitative research |
| Structured (directive) interviews | list of fixed questions | semi-quantitative studies |
| Interviews with a fixed list of questions and closed questions (see the quantitative modules) | list of questions with response items | quantitative studies: fast to conduct, reliable, easy to analyze, but they need a good understanding of the studied phenomenon |


General advice for interviews

Interviewing is a well-documented technique (covered in most methodology textbooks).

Interviewees (in natural settings) don’t have time to lose:

  • focus on the essential
  • check if some information is available in other forms (e.g. written memos, rules, etc.)
  • learn the jargon
  • consult all other available information before the interview

The information interview

Possible objectives
  • determine your research goals, e.g. find out whether your potential research subject is of any interest, etc.;
  • prepare your research questions;
  • prepare field research, e.g. gather information about the workings of an organization, process or procedure, and about people and their roles.
Find the person
  • often you may first interview a domain specialist;
  • sometimes any person who has knowledge of your subject area and some time will do.
  • In "natural contexts", avoid "over-taxing" key actors: you must make sure that key actors will agree to in-depth semi-structured interviews in later stages, and being interviewed twice may not please some of them.

The structured interview

  • Definition: a list of questions with open responses (usually a few sentences)
    • Useful to systematically gather comparable information about relatively complex variables (beliefs, behaviors, etc.)
  • The questionnaire needs a lot of preparation!
    • make sure that each concept can be measured reliably and leads to valid indicators.
  • To prepare the questionnaire you ought to do 2-3 semi-structured interviews (or at least some information interviews)
  • In addition, make pre-tests with 2-3 subjects in order to be sure that your questions are understandable
  • You have to think about analysis methods beforehand
    • manual or machine coding?
    • code books (a small sketch follows this list)
    • cost estimations, remember that any sort of text analysis is very costly (!)
    • etc.
  • Consider surveys with closed response items as a cheaper alternative!
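To make the "code book" point concrete, here is a minimal sketch; the codes, definitions and keyword matching are invented, and automatic keyword matching can at best suggest codes that a human coder then has to verify:

```python
# Hypothetical code book for open interview answers, plus a helper that suggests
# codes by simple keyword matching (a starting point, not a substitute for human coding).
CODE_BOOK = {
    "TIME": {"definition": "mentions lack of time", "keywords": ["time", "schedule", "deadline"]},
    "TECH": {"definition": "mentions technical problems", "keywords": ["bug", "crash", "slow", "login"]},
    "PEDA": {"definition": "mentions pedagogical issues", "keywords": ["learning", "feedback", "motivation"]},
}

def suggest_codes(answer):
    """Return the codes whose keywords appear in the (lower-cased) answer."""
    text = answer.lower()
    return [code for code, entry in CODE_BOOK.items()
            if any(keyword in text for keyword in entry["keywords"])]

print(suggest_codes("The platform is slow and I lack time to prepare feedback."))
# -> ['TIME', 'TECH', 'PEDA']
```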

The semi-structured interview

  • This is the preferred type of interview in typical qualitative research.
  • You will get answers to your questions.
  • At the same time, this interview type lets the interviewee develop his or her own reasoning.
General remarks
  • (again): preparation!
  • (again): read your research questions and identify the ones that need interviewing
Usual structure of the interview

You must prepare questions in two layers:

  1. First, a list of general questions.
  2. For each of these questions, you then write down a "secret" list of points ("probes") that need to be covered. During the interview you must "probe" the interviewee for all those points; a small sketch of such a guide follows this list.
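Here is a minimal sketch of such a two-layer guide (the topic, questions and probes are invented), including a simple way to track which probes have already been covered:

```python
# Hypothetical two-layer interview guide: general questions, each with hidden probes.
guide = [
    {
        "question": "How do you use the course platform in your teaching?",
        "probes": ["frequency of use", "which tools", "preparation time", "problems encountered"],
    },
    {
        "question": "How do students react to online activities?",
        "probes": ["participation", "quality of contributions", "complaints"],
    },
]

# Probes the interviewee has already addressed spontaneously.
covered = {"frequency of use", "participation"}

# List, for each question, the probes that still have to be asked explicitly.
for item in guide:
    remaining = [p for p in item["probes"] if p not in covered]
    print(item["question"], "->", remaining)
```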
Interviewer’s behavior
  • Let the person talk! Cover your detailed probes later if the person doesn't address them him/herself.
  • It is important that the interviewee is allowed to develop chains of reasoning (e.g. perceptions of causality, associations between concepts, etc.).
  • The goal is to extract "meaning", i.e. so-called "deep" or "thick" structures.
Carefully word your questions
  • Watch out for sensitive questions
    • put them at the end
    • if you are lucky the subject will mention them anyhow.
  • Use indirect questions that project the interviewee into a situation

Examples:

  • don’t ask: “do you work well with person A?”
    • but: “do you have frequent contacts with A?”, “how do you coordinate?”, etc.
  • don’t ask: “do you know how to use this software?”
    • but: “how frequently do you use this software?”, etc.

When appropriate, ask about concrete cases

    • e.g. present a hypothetical case and ask how they solve it.
    • e.g. (in usability testing) give them tasks to solve

In summary:

  • ask what people do rather than what they feel
  • in many situations it is useful to present the interviewee with a scenario, and to use it also to let people reflect on more general issues