PISA

Draft

Definition

PISA = Programme for International Student Assessment

“PISA assesses how far students near the end of compulsory education have acquired some of the knowledge and skills that are essential for full participation in society. In all cycles, the domains of reading, mathematical and scientific literacy are covered not merely in terms of mastery of the school curriculum, but in terms of important knowledge and skills needed in adult life. [...] In the PISA 2003 cycle, an additional domain of problem solving was introduced to continue the examination of cross-curriculum competencies.” ([1], retrieved 18:31, 30 September 2006 (MEST)).

PISA is often quoted in discussions about educational policy, although many people who cite the rankings do not really know what is actually measured.

The achievement studies

The PISA 2000 achievement study included three modules that measured applicable skills in three areas: “Are students well prepared to meet the challenges of the future? Are they able to analyse, reason and communicate their ideas effectively? Do they have the capacity to continue learning throughout life?” (Adams & Wu, 2002: Introduction)

  • Mathematics: 32 items, representing approximately 60 minutes of testing time
  • Science: 35 items, representing approximately 60 minutes of testing time
  • Reading: 141 items, representing approximately 270 minutes of testing time

“PISA 2000 was a paper-and-pencil test, with each student undertaking two hours of testing (i.e., answering one of the nine booklets). Pencils, erasers, rulers, and, in some cases, calculators, were provided. [...] The 141 main study reading items were organised into nine separate clusters, each with an estimated administration time of 30 minutes. The 32 mathematics items and the 35 science items were organised into four 15-minute mathematics clusters and four 15-minute science clusters respectively. These clusters were then combined in various groupings to produce nine linked two hour test booklets.” (Adams & Wu, 2002: 23).

These booklets (also called cognitive booklets, as opposed to the questionnaires) tested different modules. “Reading items occur in all nine booklets, and there are linkages between the reading in all booklets. This permits all sampled students to be assigned reading scores on common scales. Mathematics items occur in five of the nine booklets, and there are links between the five booklets, allowing mathematics scores to be reported on a common scale for five-ninths of the sampled students. Similarly, science material occurs in five linked booklets, allowing science scores to be reported on a common scale for five-ninths of the sampled students.” (Adams & Wu, 2002: 23).

PISA 2003 added:

  • Problem solving (cross-curricular competencies)

Options:

  • Countries can choose to add their own tests and surveys. E.g. in Switzerland there was an additional survey of 9th graders (in addition to 15-year-olds).

The survey questionnaires

In addition to tests, PISA also administers questionnaires to both individual learners and schools. Datasets that combine questionnaire results and test results are available. “A Student and a School Questionnaire were used in PISA 2000 to collect data that could be used in constructing indicators pointing to social, cultural, economic and educational factors that are thought to influence, or to be associated with, student achievement.” (Adams & Wu, 2002: 33)

The student questionnaire included items for:

  • Basic demographics
  • Family background and measures of socioeconomic status
  • Student description of school/instructional processes
  • Student attitudes towards reading and reading habits
  • Student access to educational resources outside school
  • Institutional patterns of participation and programme orientation
  • Student career and educational expectations

The school questionnaire included:

  • Basic school characteristics
  • School policies and practices
  • School climate
  • School resources

Optional:

Mathematics skills

Design of the tests

According to quotations from PISA (2003):


In total, 85 mathematics items were used in PISA 2003. These tasks, and also those in reading, science and problem solving, were arranged into half-hour clusters. Each student was given a test booklet with four clusters of items - resulting in two hours of individual assessment time.

These clusters were rotated in combinations that ensured that each mathematics item appeared in the same number of test booklets, and that each cluster appeared in each of the four possible positions in the booklets.

Such a design makes it possible to construct a scale of mathematical performance, to associate each assessment item with a point score on this scale according to its difficulty and to assign each student a point score on the same scale representing his or her estimated ability.

The relative ability of students taking a particular test can be estimated by considering the proportion of test items they answer correctly. The relative difficulty of items in a test can be estimated by considering the proportion of test takers getting each item correct.

Once the difficulty of individual items was given a rating on the scale, student performance could be described by giving each student a score according to the hardest task that they could be predicted to perform.
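
The scaling behind this design is an item response (Rasch-type) model: student ability and item difficulty are placed on the same scale, and the probability of a correct answer depends only on the distance between the two. As a rough illustration (the operational PISA models add partial-credit items and plausible values, which are not shown here), the one-parameter form is

P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}

where \theta_i is the ability of student i and b_j the difficulty of item j. A student's reported score then corresponds to the point on the \theta scale at which he or she is expected to answer items of a given difficulty correctly with a fixed response probability.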

The six proficiency levels of the PISA 2003 mathematical skills

PISA (2003:47):


At Level 6, students can conceptualise, generalise, and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understanding, along with a mastery of symbolic and formal mathematical operations and relationships, to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.

At Level 5, students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare, and evaluate appropriate problem-solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriately linked representations, symbolic and formal characterisations, and insight pertaining to these situations. They can reflect on their actions and can formulate and communicate their interpretations and reasoning.

At Level 4, students can work effectively with explicit models for complex concrete situations that may involve constraints or call for making assumptions. They can select and integrate different representations, including symbolic ones, linking them directly to aspects of real-world situations. Students at this level can utilise well-developed skills and reason flexibly, with some insight, in these contexts. They can construct and communicate explanations and arguments based on their interpretations, arguments and actions.

At Level 3, students can execute clearly described procedures, including those that require sequential decisions. They can select and apply simple problem-solving strategies. Students at this level can interpret and use representations based on different information sources and reason directly from them. They can develop short communications reporting their interpretations, results and reasoning.

At Level 2, students can interpret and recognise situations in contexts that require no more than direct inference. They can extract relevant information from a single source and make use of a single representational mode. Students at this level can employ basic algorithms, formulae, procedures or conventions. They are capable of direct reasoning and making literal interpretations of the results.

At Level 1, students can answer questions involving familiar contexts where all relevant information is present and the questions are clearly defined. They are able to identify information and to carry out routine procedures according to direct instructions in explicit situations. They can perform actions that are obvious and follow immediately from the given stimuli.

See also learning level

The six proficiency levels of the PISA 2006 science skills

According to the PISA 2006 technical report (OECD 2009), the six proficiency levels on the science scale are the following:

At Level 6 (1.3%), students can consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they demonstrate willingness to use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.

At Level 5 (9%), students can identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.

At Level 4 (29.3%), students can work effectively with situations and issues that may involve explicit phenomena requiring them to make inferences about the role of science or technology. They can select and integrate explanations from different disciplines of science or technology and link those explanations directly to aspects of life situations. Students at this level can reflect on their actions and they can communicate decisions using scientific knowledge and evidence.

At Level 3 (56.7%), students can identify clearly described scientific issues in a range of contexts. They can select facts and knowledge to explain phenomena and apply simple models or inquiry strategies. Students at this level can interpret and use scientific concepts from different disciplines and can apply them directly. They can develop short statements using facts and make decisions based on scientific knowledge.

At Level 2 (80.8%), students have adequate scientific knowledge to provide possible explanations in familiar contexts or draw conclusions based on simple investigations. They are capable of direct reasoning and making literal interpretations of the results of scientific inquiry or technological problem solving.

At Level 1 (94.8%), students have such a limited scientific knowledge that it can only be applied to a few, familiar situations. They can present scientific explanations that are obvious and that follow explicitly from given evidence.

These levels can then be described at a further operational level with other scales. E.g. the summary descriptions of the six proficiency levels for using science (OECD 2009: 300) define Level 5 performance in these terms:

General proficiencies students should have: Students at this level are able to interpret data from related datasets presented in various formats. They can identify and explain differences and similarities in the datasets and draw conclusions based on the combined evidence presented in those datasets.

Tasks a student should be able to do:

  • Compare and discuss the characteristics of different datasets graphed on the one set of axes.
  • Recognise and discuss relationships between datasets (graphical and otherwise) in which the measured variable differs.
  • Based on an analysis of the sufficiency of the data, make judgements about the validity of conclusions.
Test item example: GREENHOUSE Question 4

Setting up SPSS files

PISA 2012

You can download the data sets from Database - PISA 2012. You can then use the SPSS or SAS control files to produce ready-to-use data files.

Dealing with the student questionnaire (tested in January 2014 with SPSS 22)
  • Download Student questionnaire data file (about 230 MB compressed)
  • Uncompress. This will give a 1.17 GB INT_STU12_DEC03.txt file (or similar). We suggest uncompressing to a simple file location on Windows (e.g. c:\pisa\INT_STU12_DEC03.txt); do not copy it to "My Documents". On Linux it doesn't matter, since you probably know how to use a computer ...
  • Download SPSS syntax to read in student questionnaire data file
  • Open the *.SPS file in SPSS. It should open in the Syntax Editor
  • Fix the second line (specify the location of the *.txt file, e.g. c:\pisa\INT_STU12_DEC03.txt) !!!
  • Since the data uses the English decimal separator (a dot), you may have to keep the first line, but I had to remove it when using the French-language interface of SPSS. I finally switched SPSS to English, something I do anyhow after installing a new version (a rough syntax sketch follows after these steps).
  • Run the SPS file. Make sure to run all the commands to the bitter end or you will only import partial information. On a decent computer this won't take very long, at most a few minutes on a multi-core system with enough memory and an SSD.
Click the green Run button.
If SPSS seems to be happy, do the following:
Menu: Run -> To End
If SPSS is unhappy, try to get the first two lines right. E.g. if you can't switch to English, learn how to tell SPSS to handle the decimal separator locally (e.g. search for "SPSS set decimal dot PISA").
Menu: File -> Save
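
For orientation, the adjustments described above amount to something like the following minimal sketch in SPSS syntax. This is not the actual PISA control file (which contains hundreds of DATA LIST and labelling commands and may organise its first lines differently); the path, handle name and output file name are assumptions to adapt to your own setup.

SET DECIMAL=DOT.
* Point to the unzipped raw data file (adjust the path to your own location).
FILE HANDLE STUDENT /NAME='c:\pisa\INT_STU12_DEC03.txt'.
* ... the DATA LIST, variable label and value label commands of the control file go here ...
* Save the imported data as a regular SPSS data file.
SAVE OUTFILE='c:\pisa\INT_STU12_DEC03.sav'.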

To do country-specific analysis, we suggest creating new data files, e.g. by splitting the file on the first variable, the country code (see the Data menu, or the syntax sketch after the steps below):

Menu: Data -> Split into Files
Add the country code variable
Define an output directory (optional)
Click OK
The file names will use the ISO country code, e.g. CHE.sav (but you can change that)
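
If you prefer syntax over the menus, extracting a single country looks roughly like the following sketch. It assumes that the full student file was saved as shown above and that the country code variable is called CNT (check the codebook for the actual name); Switzerland (CHE) is used as the example.

* Keep only one country and save it as a separate file.
GET FILE='c:\pisa\INT_STU12_DEC03.sav'.
SELECT IF (CNT = 'CHE').
SAVE OUTFILE='c:\pisa\CHE.sav'.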

... Ready to go :)

Links

General

Australian websites - these have additional material such as sample questions and a mirror of the datasets

PISA 2012

PISA 2012 Assessment and Analytical Framework
PISA 2012 test items
PISA 2012 Student questionnaires
PISA 2012 included one main questionnaire and several other ones. As of January 2014, we did not find a single document covering all of them.
  • All the questionnaires are available from the Database - PISA 2012 page. There are four variants of the student questionnaire. However, we found that the Student Questionnaire (main survey) found on mypisa is the best starting point, but you also need the ICT and Educational career (EC) questionnaires.
  • Watch out for the question codes (in grey on top of each question)
  • If you plan to conduct data analysis, you might also download the codebook (i.e. a dump of the information that you will have in your SPSS/SAS file)
PISA 2012 Student Questionnaires (alternative)
PISA 2012 Data
  • These are available as a compressed txt file plus SPSS/SAS control files that you will have to run
  • Database - PISA 2012
  • The questionnaire file includes about 500'000 entries.
PISA 2012 Other questionnaires

PISA 2015

DataSet for download (including SAS / SPSS files)

There are several separate datasets for:

  • Student questionnaire
  • School questionnaire
  • Education career questionnaire for students (optional for countries)
  • ICT familiarity questionnaire for students (optional for countries)
  • Parents questionnaire (optional for countries)
  • Teacher questionnaire (optional for countries)

Download page:

Questionnaires

Country specific

Swiss PISA data
Canada

Bibliography

To do (there is a whole lot of literature).

  • PISA test items and school textbooks related to science: A textual comparison (2008). Science Education. (Abstract)
  • McGaw, B. (2002, October). Raising the bar and reducing failures: A possible dream. Invited paper given at the ACER conference “Providing world-class education: What can Australia learn from international achievement studies?”, Sydney.
  • Adams, R. J. & Wu, M. (Eds.) (2002). PISA 2000 Technical Report. Programme for International Student Assessment, Organisation for Economic Co-operation and Development. Paris: OECD Publishing. ISBN 9264199519. Also available as PISA (2000) Technical Report (PDF/English version).
  • PISA (2003), Learning for Tomorrow's World. PDF
  • Organisation for Economic Co-Operation and Development (2001). Knowledge and Skills for Life: First Results from PISA 2000. Paris: OECD Publications.
  • Organisation for Economic Co-Operation and Development (2002a). Manual for the PISA 2000 Database. Paris: OECD Publications.
  • Organisation for Economic Co-Operation and Development (2002b). Sample Tasks from the PISA 2000 Assessment. Paris: OECD Publications.
  • OECD (Organisation for Economic Co-Operation and Development) (2009). PISA 2006 Technical Report. Paris: OECD Publications. ISBN 978-92-64-04808-9.