Programme for International Student Assessment (2000 to 2012)

The Programme for International Student Assessment ran several times before the most recent cycle in 2012. The first PISA assessment was carried out in 2000. The results of each assessment cycle take about a year and a half to analyse: the first results were published in November 2001, and the raw data, technical report and data handbook followed only in spring 2002. The triennial repeats follow a similar schedule, so a single PISA cycle, start to finish, takes over four years. In PISA 2009, 470,000 15-year-old students representing 65 nations and territories participated; an additional 50,000 students representing nine nations were tested in 2010. [1]

Each assessment cycle focuses on one of three competence fields: reading, mathematics or science; the other two are tested as well, but in less depth. After nine years a full cycle is completed: reading, the main domain in 2000, was again the main domain in 2009.

Period | Focus | OECD countries | Partner countries | Participating students | Notes
2000 | Reading | 28 | 4 + 11 | 265,000 | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | UK disqualified from data analysis. Also included a test in problem solving.
2006 | Science | 30 | 27 | 400,000 | Reading scores for the US excluded from analysis due to a misprint in testing materials. [2]
2009 [3] | Reading | 34 | 41 + 10 | 470,000 | 10 additional non-OECD countries took the test in 2010. [4]
2012 [5] | Mathematics | 34 | 31 | 510,000 |

Results

PISA 2012

PISA 2009

PISA 2006

PISA 2003

The results for PISA 2003 were released on 14 December 2004. This cycle tested 275,000 15-year-olds on mathematics, science, reading and problem solving, and involved schools from 30 OECD member countries and 11 partner countries. [15] Note that the science and reading means displayed are for "all students", even though not every student answered questions in those two domains. The 2003 OECD Technical Report (pp. 208–209) gives different country means for the students who were actually tested in these domains. [16]

PISA 2000

The results for the first cycle of the PISA survey were released on 14 November 2001. 265,000 15-year-olds in 28 OECD countries and 4 partner countries were tested on mathematics, science and reading; a further 11 countries were tested later, in 2002. [17]

Comparison with other studies

The correlation between the PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science; the values fall to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes), or a homogeneous underlying factor of cognitive competence. European Economic Area countries perform slightly better in PISA; the Commonwealth of Independent States and Asian countries perform slightly better in TIMSS. Content balance and years of schooling explain most of the variation. [18]
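The sensitivity of such country-level correlations to a few extreme performers can be illustrated with a short calculation. The figures below are invented for illustration only, not actual PISA or TIMSS means:

```python
import numpy as np

# Hypothetical country means on two assessments; the last two entries
# play the role of the two worst-performing developing countries.
pisa = np.array([550, 540, 530, 520, 510, 505, 495, 360, 340])
timss = np.array([600, 585, 570, 590, 560, 575, 555, 380, 350])

# Pearson correlation across all countries.
r_all = np.corrcoef(pisa, timss)[0, 1]

# Dropping the two lowest performers shrinks the spread of the data,
# which typically lowers the correlation, as in the comparison above.
r_trimmed = np.corrcoef(pisa[:-2], timss[:-2])[0, 1]

print(f"all countries: r = {r_all:.2f}")
print(f"excluding two outliers: r = {r_trimmed:.2f}")
```

Because the two low-scoring countries sit far from the rest on both scales, they stretch the joint range and pull the correlation toward 1; removing them exposes the weaker association among the remaining, more similar countries.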

Reception

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman". [19]

China

Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and that the high scores in China are due to excessive workload and testing, adding that it's "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong." [20]

Students from Shanghai, China, had the top scores in every category (mathematics, reading and science) in PISA 2009. In discussing these results, PISA spokesman Andreas Schleicher, Deputy Director for Education and head of the analysis division at the OECD's directorate for education, described Shanghai as a pioneer of educational reform in which "there has been a sea change in pedagogy". Schleicher stated that Shanghai abandoned its "focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving." [21]

Schleicher also states that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further, as-yet-unpublished OECD research, Schleicher said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average." [22] Schleicher says that for a developing country, China's 99.4% enrollment in primary education is "the envy of many countries". He maintains that junior secondary school participation rates in China are now 99%, and that in Shanghai, not only has senior secondary school enrollment reached 98%, but admission into higher education has reached 80% of the relevant age group. Schleicher believes that this growth reflects quality, not just quantity, which he contends the top PISA ranking of Shanghai's secondary education confirms. [22] Schleicher believes that China has also expanded school access and has moved away from learning by rote. [23] According to Schleicher, Russia performs well in rote-based assessments but not in PISA, whereas China does well in both rote-based and broader assessments. [22]

Denmark

University of Copenhagen professor Svend Kreiner, who examined PISA's 2006 reading results in detail, noted that in 2006 only about ten percent of the students who took part in PISA were tested on all 28 reading questions. "This in itself is ridiculous," Kreiner told Stewart. "Most people don't know that half of the students taking part in PISA (2006) do not respond to any reading item at all. Despite that, PISA assigns reading scores to these children." [24]

Finland

The stable, high marks of Finnish students have attracted a lot of attention. According to Hannu Simola, [25] the results reflect a paradoxical mix of progressive policies implemented through a rather conservative pedagogic setting: teachers' high levels of academic preparation, social status, professionalism and motivation coexist with an adherence to traditional roles and methods by both teachers and pupils in Finland's changing, but still quite paternalistic, culture. Others advance Finland's low child-poverty rate as a reason for its success. [26] [27] Finnish education reformer Pasi Sahlberg attributes Finland's high educational achievement to its emphasis on social and educational equality and its stress on cooperation and collaboration, as opposed to the competition among teachers and schools that prevails in other nations. [28]

India

Of the 74 countries tested in the PISA 2009 cycle, including the "+" nations, the two Indian states placed 72nd and 73rd out of 74 in both reading and mathematics, and 73rd and 74th in science. In Himachal Pradesh, 57.9 percent of 15-year-olds in school performed at a level indistinguishable from having learned no science at all; in Tamil Nadu, 43.6 percent fell into this category, roughly ten times the share in the USA. The estimated fraction of Tamil Nadu or Himachal Pradesh students at level 6 in science proficiency was zero, as was the estimated fraction at level 5. By comparison, about 20 percent of Singapore's students reach at least level 5, while only 2.8 percent fall below level 1. [29]

India's poor performance may not be linguistic, as some have suggested. 12.87% of US students, for example, indicated that the language of the test differed from the language spoken at home, while 30.77% of Himachal Pradesh students did so, a significantly higher share. [30] However, unlike American students, those Indian students who spoke a different language at home did better on the PISA test than those who spoke the test language at home. [30]

India's poor performance on PISA is consistent with its poor performance in the only other instance in which India's government allowed an international organization to test its students, [31] and with India's own testing of its elite students in a study titled Student Learning in the Metros 2006. [32] These studies were conducted using TIMSS questions. The poor result in PISA was greeted with dismay in the Indian media. [33] The BBC reported that as of 2008, only 15% of India's students reach high school. [34]

Italy / South Tyrol

In 2003 South Tyrol (Provincia Autonoma di Bolzano / Autonome Provinz Bozen), a predominantly German-speaking province in the north of Italy, took part in the PISA project for the first time in order to obtain a regional result as an adjudicated region. In the rest of Italy PISA is conducted by INVALSI (Istituto nazionale per la valutazione del sistema educativo di istruzione e di formazione), a formally independent research institution affiliated with the Ministry of Education, whereas in South Tyrol PISA was carried out by the regional Education Authority itself (Intendenza scolastica / Schulamt, since 2018 renamed Bildungsdirektion), [35] which is part of the South Tyrolean regional government. At the end of 2004, in the months before the announcement of the test results, the regional Education Authority in Bolzano / Bozen downplayed the validity of the PISA assessment and commissioned alternative school evaluations, preparing the public for a mediocre result. According to the official PISA 2003 report, however, South Tyrol appeared even to beat the PISA world champion, Finland.

Critique

Right from the beginning, there was scepticism about how South Tyrol had succeeded in outdoing the neighbouring Italian and Austrian provinces. On the front page of its weekend edition of 29/30 January 2005, the South Tyrolean newspaper Neue Südtiroler Tageszeitung published a harsh critique, revealing that the South Tyrolean Education Authority had secretly eliminated more than 300 of the 1,500 students officially drawn as the South Tyrolean test sample by the PISA Consortium. More inconsistencies soon surfaced.

Comparison with similar assessments

The stunning South Tyrolean 2003 PISA results can hardly be reconciled with similar school evaluations that were not conducted or influenced by the South Tyrolean Education Authority itself. Three international and national large-scale assessment projects painted a gloomy picture of South Tyrolean students' performance.

United States

Two studies have compared high achievers in mathematics on PISA with those on the U.S. National Assessment of Educational Progress (NAEP). Comparisons were made between students scoring at the "advanced" and "proficient" levels in mathematics on the NAEP and the corresponding performance on PISA. Overall, 30 nations had higher percentages of students at the "advanced" level of mathematics than the U.S.; the only OECD countries with worse results were Portugal, Greece, Turkey, and Mexico. Six percent of U.S. students were "advanced" in mathematics, compared with 28 percent in Taiwan. Massachusetts, the highest-ranked U.S. state, would have placed just 15th in the world had it been compared with the nations participating in PISA. 31 nations had higher percentages of "proficient" students than the U.S.; Massachusetts was again the best U.S. state, but would have ranked just ninth in the world. [46] [47]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings. [48] This can likely be traced to the different material covered and to the United States teaching mathematics in a style less harmonious with the "Realistic Mathematics Education" approach that forms the basis of the exam. [49] Countries that commonly use this teaching method score higher on PISA and lower on TIMSS and other assessments. [50]

Poverty

Stephen Krashen, professor emeritus at the University of Southern California, [51] and Mel Riddile of the NASSP attributed the relatively low performance of students in the United States to the country's high rate of child poverty, which exceeds that of other OECD countries. [26] [27] However, individual US schools with poverty rates comparable to Finland's (below 10%), as measured by reduced-price school lunch participation, outperform Finland, and US schools in the 10–24% reduced-price lunch range are not far behind. [52]

Reduced-price school lunch participation is the only poverty indicator available for US schoolchildren at the school level. In the United States, schools where less than 10% of students qualified for free or reduced-price lunch averaged PISA scores of 551, higher than any OECD country. This can be compared with the other OECD countries, for which figures on children living in relative poverty are tabled: [27]

Country | Percent of reduced-price school lunches (US) [27] / percent of relative child poverty (other OECD countries) [53] | PISA score [54]
United States | < 10% | 551
Finland | 3.4% | 536
Netherlands | 9.0% | 508
Belgium | 6.7% | 506
United States | 10%–24.9% | 527
Canada | 13.6% | 524
New Zealand | 16.3% | 521
Japan | 14.3% | 520
Australia | 11.6% | 515
United States | 25%–49.9% | 502
Estonia | 40.1% | 501
United States | 50%–74.9% | 471
Russian Federation | 58.3% | 459
United States | > 75% | 446

Sampling errors

In 2013 Martin Carnoy of the Stanford University Graduate School of Education and Richard Rothstein of the Economic Policy Institute released a report, "What do international tests really show about U.S. student performance?", analyzing the 2009 PISA database. Their report found that U.S. PISA test scores had been lowered by a sampling error that over-represented adolescents from the most disadvantaged American schools in the test-taking sample. [55] The authors cautioned that international test scores are often "interpreted to show that American students perform poorly when compared to students internationally" and that school reformers then conclude that "U.S. public education is failing." Such inferences, made before the data has been carefully analyzed, they say, "are too glib" [56] and "may lead policymakers to pursue inappropriate and even harmful reforms." [57]

Carnoy and Rothstein observe that in all countries, students from disadvantaged backgrounds perform worse than those from advantaged backgrounds, and that the US has a greater percentage of students from disadvantaged backgrounds. The sampling error in the PISA results lowered U.S. scores for 15-year-olds even further, they say. The authors add, however, that in countries such as Finland the scores of disadvantaged students tend to be stagnant, whereas in the U.S. the scores of disadvantaged students have been steadily rising over time, albeit still lagging behind those of their more advantaged peers. Even when the figures are adjusted for social class, the PISA scores of US students would remain behind those of the highest-scoring countries; nevertheless, US students of all social backgrounds have shown a trajectory of improvement over time, notably in mathematics, a circumstance PISA's report fails to take into account.

Carnoy and Rothstein write that PISA spokesman Schleicher has been quoted saying that "international education benchmarks make disappointing reading for the U.S." and that "in the U.S. in particular, poverty was destiny. Low-income American students did (and still do) much worse than high-income ones on PISA. But poor kids in Finland and Canada do far better relative to their more privileged peers, despite their disadvantages" (Ripley 2011). [58] Carnoy and Rothstein state that their report's analysis shows Schleicher and Ripley's claims to be untrue. They further fault the way PISA's results have persistently been released to the press before experts have time to evaluate them; and they charge the OECD reports with inconsistency in explaining such factors as the role of parental education. Carnoy and Rothstein also note with alarm that the US secretary of education Arne Duncan regularly consults with PISA's Andreas Schleicher in formulating educational policy before other experts have been given a chance to analyze the results. [59] Carnoy and Rothstein's report (written before the release of the 2011 database) concludes:

We are most certain of this: To make judgments only on the basis of national average scores, on only one test, at only one point in time, without comparing trends on different tests that purport to measure the same thing, and without disaggregation by social class groups, is the worst possible choice. But, unfortunately, this is how most policymakers and analysts approach the field.

The most recent test for which an international database is presently available is PISA, administered in 2009. A database for TIMSS 2011 is scheduled for release in mid-January 2013. In December 2013, PISA will announce results and make data available from its 2012 test administration. Scholars will then be able to dig into TIMSS 2011 and PISA 2012 databases so they can place the publicly promoted average national results in proper context. The analyses we have presented in this report should caution policymakers to await understanding of this context before drawing conclusions about lessons from TIMSS or PISA assessments. [60]

References

  1. PISA 2009 Technical Report, 2012, OECD, http://www.oecd.org/dataoecd/60/31/50036771.pdf
  2. Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (10 December 2007), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context (PDF), NCES , retrieved 14 December 2013, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.
  3. 1 2 PISA 2009 Results: Executive Summary (PDF), OECD, 7 December 2010
  4. 1 2 ACER releases results of PISA 2009+ participant economies, ACER, 16 December 2011, archived from the original on 8 October 2014, retrieved 15 April 2016
  5. 1 2 3 4 5 6 PISA 2012 Results in Focus (PDF), OECD, 3 December 2013, retrieved 4 December 2013
  6. CB Online Staff. "PR scores low on global report card" [ usurped ], Caribbean Business , 26 September 2014. Retrieved on 3 January 2015.
  7. OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V), http://www.oecd-ilibrary.org/education/pisa-2012-results-skills-for-life-volume-v_9789264208070-en
  8. 1 2 3 4 PISA 2012 Results OECD. Retrieved 4 December 2013
  9. 1 2 Sedghi, Ami; Arnett, George; Chalabi, Mona (3 December 2013), "Pisa 2012 results: which country does best at reading, maths and science?", The Guardian , retrieved 14 February 2013
  10. Adams, Richard (3 December 2013), "Swedish results fall abruptly as free school revolution falters", The Guardian , retrieved 3 December 2013
  11. Kärrman, Jens (3 December 2013), "Löfven om Pisa: Nationell kris", Dagens Nyheter , retrieved 3 December 2013
  12. Multi-dimensional Data Request, OECD, 2010, archived from the original on 14 July 2012, retrieved 28 June 2012
  13. PISA 2009 Results: Executive Summary (Figure 1 only) (PDF), OECD, 2010, retrieved 28 June 2012
  14. Walker, Maurice (2011), PISA 2009 Plus Results (PDF), OECD, archived from the original (PDF) on 22 December 2011, retrieved 28 June 2012
  15. Learning for Tomorrow's World First Results from PISA 2003 (PDF), OECD, 14 December 2004, retrieved 6 January 2014
  16. PISA 2003 Technical Report (PDF), OECD
  17. Literacy Skills for the World of Tomorrow: Further Results from PISA 2000 (PDF), OECD, 2003, retrieved 6 January 2014
  18. M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
  19. "Waiting for "Superman" trailer". YouTube . 7 May 2010. Retrieved 8 October 2010.
  20. Yong Zhao (10 December 2010), A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance
  21. Gumbel, Peter (7 December 2010), "China Beats Out Finland for Top Marks in Education", Time , archived from the original on 10 December 2010, retrieved 27 June 2012
  22. 1 2 3 Cook, Chris (7 December 2010), "Shanghai tops global state school rankings", Financial Times , retrieved 28 June 2012
  23. Mance, Henry (7 December 2010), "Why are Chinese schoolkids so good?", Financial Times , retrieved 28 June 2012
  24. "Is the foundation under PISA solid? A critical look at the scaling model underlying international comparisons of student attainment" (PDF). Archived from the original (PDF) on 4 March 2016. Retrieved 15 April 2016.
  25. Simola, Hannu (2005), "The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education" (PDF), Comparative Education, 41 (4): 455–470, doi:10.1080/03050060500317810, S2CID   145325152
  26. 1 2 "The Economics Behind International Education Rankings Archived 26 April 2016 at the Wayback Machine " National Educational Association
  27. 1 2 3 4 Riddile, Mel (15 December 2010), PISA: It's Poverty Not Stupid, National Association of Secondary School Principals, archived from the original on 22 January 2014, retrieved 15 April 2016
  28. Cleland, Elizabeth (29 December 2011). "What Americans Keep Ignoring About Finland's School Success – Anu Partanen". The Atlantic.
  29. "The Leap Blog: The first PISA results for India: The end of the beginning". blog.theleapjournal.org. Retrieved 26 August 2025.
  30. 1 2 "Database – PISA 2009". Pisa2009.acer.edu.au. Archived from the original on 22 March 2016. Retrieved 15 April 2016.
  31. "DataBank Error Page" (PDF). ddp-ext.worldbank.org. Retrieved 6 April 2025.
  32. Initiatives, Educational (November 2006), "Student Learning in the Metros" (PDF), Educational Initiatives
  33. Vishnoi, Anubhuti (7 January 2012), "Poor PISA ranks: HRD seeks reason", The Indian Express
  34. Masani, Zareer (27 February 2008). "India still Asia's reluctant tiger". BBC News.
  35. Cf. http://www.provincia.bz.it/bildung-sprache/deutschsprachige-schule/mitteilungen.asp?publ_action=300&publ_image_id=469121. Retrieved 11 April 2021.
  36. Cf. INFO, December 2004 (i.e. a Circular Letter edited by the regional Education Authority): «In all fields, the South Tyrolean schools achieved a first-rate performance» (p. 2); Mr Höllrigl, then director of the Education Authority, and Mr Meraner, then head of the PISA board, in INFO, January 2005: «I am surprised that we have already become world leaders» (p. 12); Mr Meraner in the most-read daily Dolomiten, 18 February 2005: «We are the world champions even in Problem Solving».
  37. Cf. the South Tyrolean weekly FF, 17 February 2005, p. 10: "Land klagt Lehrer [Regional Government’s Action against Teacher]", and FF, 16 March 2006, in which Mr Durnwalder admits to the failed legal suit.
  38. Cf. Dolomiten, 27 January 2005: Mr Hilpold misinformed the press on behalf of the regional government: «South Tyrol was assessed as a nation [Land]. It is due to the fact that we were assessed as a nation that we may compare our results with other nations.» Mr Meraner, director of the Pedagogical Institute, also wrongly claimed that the South Tyrolean overall result may be compared to that of other «nations» because, as he falsely stated, South Tyrol had a national result of its own.
  39. Cf. the PISA report: Learning for Tomorrow’s World. First Results from PISA 2003. Paris, 2004. p. 469; online version: http://www.oecd.org/dataoecd/1/60/34002216.pdf. Retrieved 8 March 2012.
  40. Cf. the PISA report: Learning for Tomorrow’s World. First Results from PISA 2003. Paris, 2004. p. 321; online version: http://www.oecd.org/dataoecd/1/60/34002216.pdf.
  41. Cf. the EMS study published by the Austrian Ministry of Research in 2007: http://www.bmwf.gv.at/startseite/mini_menue/service/publikationen/wissenschaft/universitaetswesen/spiel_studie/. Retrieved 1 April 2012. Abridged version: https://docplayer.org/18915390-Evaluation-der-eignungstests-fuer-das-medizinstudium-in-oesterreich.html = Evaluation der Eignungstests für das Medizinstudium in Österreich - PDF Free Download (docplayer.org) – retrieved 10 January 2021.
  42. E.g. in the year 2007: c.f. the parliamentary question of an opposition party: https://suedtiroler-freiheit.com/2007/08/16/landtagsanfrage-zu-den-medizinstudium-ausbildungsplaetzen. Retrieved 6 April 2021. Or the more recent article in an online newspaper: https://www.salto.bz/de/article/25082016/braucht-suedtirol-die-oesterreicher-quote. Retrieved 6 April 2021.
  43. Cf. the interview in the Austrian daily Tiroler Tageszeitung, 3 November 2008, p. 4.
  44. An abridged version of the South Tyrolean DESI report was published by the Pedagogical Institute on its site: http://www.provinz.bz.it/news/de/news.asp?news_action=5&news_article_id=138926. Retrieved 8 April 2021.
  45. Cf. the reports published by INVALSI, or its predecessor SNV, Servizio nazionale di valutazione: http://www.invalsi.it/invalsi/index.php?page=snv. Retrieved 27 December 2009.
  46. Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state's international standing." Education Next 11:4 (Fall): 51–59. http://educationnext.org/are-u-s-students-ready-to-compete/
  47. Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10–18. http://educationnext.org/teaching-math-to-the-talented/
  48. Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (14 November); Gary W. Phillips (2009) The Second Derivative:International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  49. "PISA Mathematics: A Teacher's Guide" (PDF). 13 August 2019.
  50. Loveless, Tom (9 January 2013). "International Tests Are Not All the Same". Brookings Institution.
  51. quoted in Valerie Strauss, "How poverty affected U.S. PISA scores", The Washington Post, 9 December 2010.
  52. "Stratifying PISA scores by poverty rates suggests imitating Finland is not necessarily the way to go for US schools". Simply Statistics. 23 August 2013.
  53. "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  54. Highlights From PISA 2009, Table 3.
  55. See, Martin Carnoy and Richard Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, 28 January 2013.
  56. Valerie Strauss, "U.S. scores on international test lowered by sampling error: report", Washington Post, 15 January 2013.
  57. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, 28 January 2013
  58. Schleicher was quoted by Amanda Ripley to this effect in her 2011 book, The Smartest Kids in The World (Simon and Schuster).
  59. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, 28 January 2013. Another scholar, Matthew di Carlo of the Albert Shanker Institute, criticized PISA for reporting its results in the form of national rankings, since rankings can give a misleading impression that differences between countries' scores are far larger than is actually the case. Di Carlo also faulted PISA's methodology for disregarding factors such as margin of error. See Matthew di Carlo, "Pisa For Our Time: A Balanced Look", Albert Shanker Institute website, 10 January 2011.
  60. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013.

Further reading

Reception and political consequences

Germany

  • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33–34.
  • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v 32 n 5 pp 619–634 Nov 2006.

United Kingdom

  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland.
