Abbreviation | PISA |
---|---|
Formation | 1997 |
Purpose | Comparison of education attainment across the world |
Headquarters | OECD Headquarters |
Region served | World |
Membership | 79 government education departments |
Official language | English and French |
Head of the Early Childhood and Schools Division | Yuri Belfali |
Main organ | PISA Governing Body (Chair – Michele Bruniges) |
Parent organization | OECD |
Website | www |
The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) in member and non-member nations intended to evaluate educational systems by measuring 15-year-old school pupils' scholastic performance in mathematics, science, and reading. [1] First conducted in 2000, it has been repeated every three years. Its aim is to provide comparable data with a view to enabling countries to improve their education policies and outcomes. It measures problem solving and cognition. [2]
The results of the 2022 data collection were released in December 2023. [3]
PISA and similar international standardised assessments of educational attainment are increasingly used in education policymaking at both national and international levels. [4]
PISA was conceived to set the information provided by national monitoring of education system performance in a wider context, through regular assessments within a common, internationally agreed framework; by investigating relationships between student learning and other factors, such assessments can "offer insights into sources of variation in performances within and between countries". [5]
Until the 1990s, few European countries used national tests. In the 1990s, ten countries or regions introduced standardised assessment, and since the early 2000s ten more have followed suit. By 2009, only five European education systems had no national student assessments. [4]
The impact of these international standardised assessments in the field of educational policy has been significant, in terms of the creation of new knowledge, changes in assessment policy, and external influence over national educational policy more broadly. [6] [7] [8]
Data from international standardised assessments can be useful in research on causal factors within or across education systems. [4] Mons notes that the databases generated by large-scale international assessments have made it possible to carry out inventories and comparisons of education systems on an unprecedented scale, on themes ranging from the conditions for learning mathematics and reading to institutional autonomy and admissions policies. [9] They allow typologies to be developed that can be used for comparative statistical analyses of education performance indicators, thereby identifying the consequences of different policy choices. They have generated new knowledge about education: PISA findings have challenged deeply embedded educational practices, such as the early tracking of students into vocational or academic pathways. [10]
Barroso and de Carvalho find that PISA provides a common reference connecting academic research in education and the political realm of public policy, operating as a mediator between different strands of knowledge from the realm of education and public policy. [11] However, although the key findings from comparative assessments are widely shared in the research community, [4] the knowledge they create does not necessarily fit with government reform agendas; this leads to some inappropriate uses of assessment data.
Emerging research suggests that international standardised assessments are having an impact on national assessment policy and practice. PISA is being integrated into national policies and practices on assessment, evaluation, curriculum standards and performance targets; its assessment frameworks and instruments are being used as best-practice models for improving national assessments; many countries have explicitly incorporated and emphasise PISA-like competencies in revised national standards and curricula; others use PISA data to complement national data and validate national results against an international benchmark. [10]
PISA may influence national education policy choices in a variety of ways. Participation in international assessments like PISA has been linked to significant education policy changes and outcomes, such as higher student enrollments and education reforms. [6] However, critics have argued that participation could lead to undesirable outcomes, such as higher repetition rates and narrowing of curricula. [7] The impact of PISA may also vary according to the specific country context. [12]
Policy-makers in most participating countries see PISA as an important indicator of system performance; PISA reports can define policy problems and set the agenda for national policy debate; policymakers seem to accept PISA as a valid and reliable instrument for internationally benchmarking system performance and changes over time; most countries—irrespective of whether they performed above, at, or below the average PISA score—have begun policy reforms in response to PISA reports. [10]
Against this, impact on national education systems varies markedly. For example, in Germany, the results of the first PISA assessment caused the so-called 'PISA shock': a questioning of previously accepted educational policies; in a state marked by jealously guarded regional policy differences, it led ultimately to an agreement by all Länder to introduce common national standards and even an institutionalised structure to ensure that they were observed. [13] In Hungary, by comparison, which shared similar conditions to Germany, PISA results have not led to significant changes in educational policy. [14]
Because many countries have set national performance targets based on their relative rank or absolute PISA score, PISA assessments have increased the influence of their (non-elected) commissioning body, the OECD, as an international education monitor and policy actor, which implies an important degree of 'policy transfer' from the international to the national level; PISA in particular is having "an influential normative effect on the direction of national education policies". [10] Thus, it is argued that the use of international standardised assessments has led to a shift towards international, external accountability for national system performance; Rey contends that PISA surveys, portrayed as objective, third-party diagnoses of education systems, actually serve to promote specific orientations on educational issues. [4]
National policy actors refer to high-performing PISA countries to "help legitimise and justify their intended reform agenda within contested national policy debates". [15] PISA data can be "used to fuel long-standing debates around pre-existing conflicts or rivalries between different policy options, such as in the French Community of Belgium". [16] In such instances, PISA assessment data are used selectively: in public discourse governments often only use superficial features of PISA surveys such as country rankings and not the more detailed analyses. Rey (2010:145, citing Greger, 2008) notes that often the real results of PISA assessments are ignored as policymakers selectively refer to data in order to legitimise policies introduced for other reasons. [17]
In addition, PISA's international comparisons can be used to justify reforms with which the data themselves have no connection. In Portugal, for example, PISA data were used to justify new arrangements for teacher assessment (based on inferences that were not supported by the assessments and data themselves); they also fed the government's discourse on pupils repeating a year (which, according to research, fails to improve student results). [18] In Finland, the country's PISA results (deemed excellent in other countries) were used by ministers to promote new policies for 'gifted' students. [19] Such uses and interpretations often assume causal relationships that cannot legitimately be based on PISA data; establishing them would normally require fuller investigation through in-depth qualitative studies and longitudinal surveys based on mixed quantitative and qualitative methods, [20] which politicians are often reluctant to fund.
Recent decades have witnessed an expansion in the uses of PISA and similar assessments, from assessing students' learning, to connecting "the educational realm (their traditional remit) with the political realm". [21] This raises the question of whether PISA data are sufficiently robust to bear the weight of the major policy decisions that are being based upon them, for, according to Breakspear, PISA data have "come to increasingly shape, define and evaluate the key goals of the national / federal education system". [10] This implies that those who set the PISA tests – e.g. in choosing the content to be assessed and not assessed – are in a position of considerable power to set the terms of the education debate, and to orient educational reform in many countries around the globe. [10]
PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).
PISA aims to test the literacy, or competence, of students in three fields: reading, mathematics, and science, each reported on an indefinite (unbounded) scale. [22]
The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).
In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts." [23]
PISA also assesses students in innovative domains. In 2012 and 2015, in addition to reading, mathematics and science, students were tested in problem solving (collaborative problem solving in 2015). In 2018 the additional innovative domain was global competence.
PISA is sponsored, governed, and coordinated by the OECD, but paid for by participating countries.
The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students. This made it possible to study how age and school year interact.
To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.
Each student takes a two-hour computer-based test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material, but no single student is tested on all of it. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors fill in a questionnaire describing school demographics, funding, etc. In 2012, participants were, for the first time in the history of large-scale testing and assessment, offered a new type of problem: interactive (complex) problems requiring exploration of a novel virtual device. [24] [25]
In selected countries, PISA has begun experimenting with computer-adaptive testing.
Countries are allowed to combine PISA with complementary national tests.
Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E = Ergänzung = complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take the national test only. This large sample is needed to allow analysis by federal state. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests. [26]
From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores are scaled so that the OECD average in each domain (mathematics, reading and science) is 500 and the standard deviation is 100. [27] This held exactly only for the initial PISA cycle, when the scale was first introduced; subsequent cycles are linked to previous ones through IRT scale-linking methods. [28]
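The rescaling step can be illustrated with a minimal sketch. This is not the official PISA procedure (which estimates proficiency through an IRT model rather than linearly rescaling raw sums); it only shows what "mean 500, standard deviation 100" means operationally, using made-up raw scores:

```python
import statistics

def rescale(raw_scores, target_mean=500, target_sd=100):
    """Linearly map scores so the group mean is target_mean and SD is target_sd.
    Illustrative only; PISA derives scale scores from an IRT model instead."""
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores)  # population SD of the reference group
    return [target_mean + target_sd * (x - mean) / sd for x in raw_scores]

scaled = rescale([12, 15, 18, 21, 24])  # hypothetical raw scores
# The rescaled scores now have mean 500 and population SD 100.
```

Once a cycle's scores are on this scale, later cycles are not re-centred to 500 but linked back to it, so that, for example, a reading score of 520 in 2009 is comparable to one in 2000.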
The generation of proficiency estimates uses a latent regression extension of the Rasch model, a model of item response theory (IRT), also known as the conditioning model or population model. The proficiency estimates are provided in the form of so-called plausible values, which allow unbiased estimates of differences between groups. The latent regression, together with a Gaussian prior probability distribution of student competencies, allows estimation of the proficiency distributions of groups of participating students. [29] The scaling and conditioning procedures are described in nearly identical terms in the technical reports of PISA 2000, 2003 and 2006. NAEP and TIMSS use similar scaling methods.
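At the core of this machinery is the Rasch model's response function, which gives the probability that a student of ability theta answers an item of difficulty b correctly. A minimal sketch (the ability and difficulty values are hypothetical, and the full PISA model adds the latent regression and plausible-value draws on top of this):

```python
import math

def rasch_p_correct(theta, b):
    """Rasch (one-parameter logistic) model: probability of a correct
    response for a student of ability theta on an item of difficulty b,
    both expressed on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the chance of success is exactly 50%.
p_equal = rasch_p_correct(0.0, 0.0)   # 0.5
# A more able student has a better chance on the same item.
p_abler = rasch_p_correct(1.0, 0.0)   # ~0.73
```

Because only the difference theta - b matters, students and items sit on a single common scale, which is what makes linking booklets and cycles possible.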
All PISA results are tabulated by country; recent PISA cycles include separate provincial or regional results for some countries. Most public attention concentrates on a single outcome: countries' mean scores and their rankings against one another. In the official reports, however, country-by-country comparisons are given not as simple league tables but as cross tables indicating, for each pair of countries, whether the difference in mean scores is statistically significant (i.e., unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.
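The pairwise check behind those cross tables amounts to comparing a mean difference against its combined standard error. A rough sketch, assuming independent samples (the standard-error values below are hypothetical, and the official computation also accounts for linking error):

```python
import math

def mean_diff_significant(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Flag a country-pair mean difference as significant at the 5% level,
    assuming independent estimates with the given standard errors."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(mean_a - mean_b) > z_crit * se_diff

# With typical standard errors of about 3 points, a 9-point gap clears
# the 1.96 * sqrt(3^2 + 3^2) ~ 8.3-point threshold...
print(mean_diff_significant(504, 3.0, 495, 3.0))  # True
# ...but a 5-point gap does not.
print(mean_diff_significant(500, 3.0, 495, 3.0))  # False
```

This is why adjacent countries in a ranking often cannot be meaningfully distinguished: their score difference is smaller than the combined sampling uncertainty.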
PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.
The results of PISA 2022 were presented on 5 December 2023, and included data for around 700,000 participating students in 81 countries and economies, with Singapore emerging as the top performer in all categories. [30]
Both Lebanon and the Chinese provinces/municipalities of Beijing, Shanghai, Jiangsu and Zhejiang participated in this cycle, but their results were not published because COVID-19 restrictions prevented full data collection. [31]
Because of Russia's full-scale invasion of Ukraine, data were collected in only 18 of 27 Ukrainian regions; the results are therefore not representative of the following regions: Dnipropetrovsk Oblast, Donetsk Oblast, Kharkiv Oblast, Luhansk Oblast, Zaporizhzhia Oblast, Kherson Oblast, Mykolaiv Oblast, the Autonomous Republic of Crimea and the city of Sevastopol. [32]
Mathematics | ||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
Country | 2015 | 2012 | 2009 | 2006 | 2003 | 2000 | ||||||
Score | Rank | Score | Rank | Score | Rank | Score | Rank | Score | Rank | Score | Rank | |
International Average (OECD) | 490 | — | 494 | — | 495 | — | 494 | — | 499 | — | 492 | — |
Albania | 413 | 57 | 394 | 54 | 377 | 53 | — | — | — | — | 381 | 33 |
Algeria | 360 | 72 | — | — | — | — | — | — | — | — | — | — |
Argentina | 409 | 58 | — | — | — | — | — | — | — | — | 388 | 30 |
Australia | 494 | 25 | 504 | 17 | 514 | 13 | 520 | 12 | 524 | 10 | 533 | 6 |
Austria | 497 | 20 | 506 | 16 | 496 | 22 | 505 | 17 | 506 | 18 | 503 | 12 |
China B-S-J-G [a] | 531 | 6 | — | — | — | — | — | — | — | — | — | — |
Belgium | 507 | 15 | 515 | 13 | 515 | 12 | 520 | 11 | 529 | 7 | 520 | 8 |
Brazil | 377 | 68 | 389 | 55 | 386 | 51 | 370 | 50 | 356 | 39 | 334 | 35 |
Bulgaria | 441 | 47 | 439 | 43 | 428 | 41 | 413 | 43 | — | — | 430 | 28 |
Argentina CABA [b] | 456 | 43 | 418 | 49 | — | — | — | — | — | — | — | — |
Canada | 516 | 10 | 518 | 11 | 527 | 8 | 527 | 7 | 532 | 6 | 533 | 6 |
Chile | 423 | 50 | 423 | 47 | 421 | 44 | 411 | 44 | — | — | 384 | 32 |
Taiwan | 542 | 4 | 560 | 3 | 543 | 4 | 549 | 1 | — | — | — | — |
Colombia | 390 | 64 | 376 | 58 | 381 | 52 | 370 | 49 | — | — | — | — |
Costa Rica | 400 | 62 | 407 | 53 | — | — | — | — | — | — | — | — |
Croatia | 464 | 41 | 471 | 38 | 460 | 38 | 467 | 34 | — | — | — | — |
Cyprus | 437 | 48 | — | — | — | — | — | — | — | — | — | — |
Czech Republic | 492 | 28 | 499 | 22 | 493 | 25 | 510 | 15 | 516 | 12 | 498 | 14 |
Denmark | 511 | 12 | 500 | 20 | 503 | 17 | 513 | 14 | 514 | 14 | 514 | 10 |
Dominican Republic | 328 | 73 | — | — | — | — | — | — | — | — | — | — |
Estonia | 520 | 9 | 521 | 9 | 512 | 15 | 515 | 13 | — | — | — | — |
Finland | 511 | 13 | 519 | 10 | 541 | 5 | 548 | 2 | 544 | 2 | 536 | 5 |
France | 493 | 26 | 495 | 23 | 497 | 20 | 496 | 22 | 511 | 15 | 517 | 9 |
Macedonia | 371 | 69 | — | — | — | — | — | — | — | — | 381 | 33 |
Georgia | 404 | 60 | — | — | — | — | — | — | — | — | — | — |
Germany | 506 | 16 | 514 | 14 | 513 | 14 | 504 | 19 | 503 | 19 | 490 | 16 |
Greece | 454 | 44 | 453 | 40 | 466 | 37 | 459 | 37 | 445 | 32 | 447 | 24 |
Hong Kong | 548 | 2 | 561 | 2 | 555 | 2 | 547 | 3 | 550 | 1 | 560 | 1 |
Hungary | 477 | 37 | 477 | 37 | 490 | 27 | 491 | 26 | 490 | 25 | 488 | 17 |
Iceland | 488 | 31 | 493 | 25 | 507 | 16 | 506 | 16 | 515 | 13 | 514 | 10 |
Indonesia | 386 | 66 | 375 | 60 | 371 | 55 | 391 | 47 | 360 | 37 | 367 | 34 |
Ireland | 504 | 18 | 501 | 18 | 487 | 30 | 501 | 21 | 503 | 20 | 503 | 12 |
Israel | 470 | 39 | 466 | 39 | 447 | 39 | 442 | 38 | — | — | 433 | 26 |
Italy | 490 | 30 | 485 | 30 | 483 | 33 | 462 | 36 | 466 | 31 | 457 | 22 |
Japan | 532 | 5 | 536 | 6 | 529 | 7 | 523 | 9 | 534 | 5 | 557 | 2 |
Jordan | 380 | 67 | 386 | 57 | 387 | 50 | 384 | 48 | — | — | — | — |
Kazakhstan | 460 | 42 | 432 | 45 | 405 | 48 | — | — | — | — | — | — |
South Korea | 524 | 7 | 554 | 4 | 546 | 3 | 547 | 4 | 542 | 3 | 547 | 3 |
Kosovo | 362 | 71 | — | — | — | — | — | — | — | — | — | — |
Latvia | 482 | 34 | 491 | 26 | 482 | 34 | 486 | 30 | 483 | 27 | 463 | 21 |
Lebanon | 396 | 63 | — | — | — | — | — | — | — | — | — | — |
Lithuania | 478 | 36 | 479 | 35 | 477 | 35 | 486 | 29 | — | — | — | — |
Luxembourg | 486 | 33 | 490 | 27 | 489 | 28 | 490 | 27 | 493 | 23 | 446 | 25 |
Macau | 544 | 3 | 538 | 5 | 525 | 10 | 525 | 8 | 527 | 8 | — | — |
Malaysia | 446 | 45 | 421 | 48 | — | — | — | — | — | — | — | — |
Malta | 479 | 35 | — | — | — | — | — | — | — | — | — | — |
Mexico | 408 | 59 | 413 | 50 | 419 | 46 | 406 | 45 | 385 | 36 | 387 | 31 |
Moldova | 420 | 52 | — | — | — | — | — | — | — | — | — | — |
Montenegro | 418 | 54 | 410 | 51 | 403 | 49 | 399 | 46 | — | — | — | — |
Netherlands | 512 | 11 | 523 | 8 | 526 | 9 | 531 | 5 | 538 | 4 | — | — |
New Zealand | 495 | 21 | 500 | 21 | 519 | 11 | 522 | 10 | 523 | 11 | 537 | 4 |
Norway | 502 | 19 | 489 | 28 | 498 | 19 | 490 | 28 | 495 | 22 | 499 | 13 |
Peru | 387 | 65 | 368 | 61 | 365 | 57 | — | — | — | — | 292 | 36 |
Poland | 504 | 17 | 518 | 12 | 495 | 23 | 495 | 24 | 490 | 24 | 470 | 20 |
Portugal | 492 | 29 | 487 | 29 | 487 | 31 | 466 | 35 | 466 | 30 | 454 | 23 |
Qatar | 402 | 61 | 376 | 59 | 368 | 56 | 318 | 52 | — | — | — | — |
Romania | 444 | 46 | 445 | 42 | 427 | 42 | 415 | 42 | — | — | 426 | 29 |
Russia | 494 | 23 | 482 | 32 | 468 | 36 | 476 | 32 | 468 | 29 | 478 | 18 |
Singapore | 564 | 1 | 573 | 1 | 562 | 1 | — | — | — | — | — | — |
Slovakia | 475 | 38 | 482 | 33 | 497 | 21 | 492 | 25 | 498 | 21 | — | — |
Slovenia | 510 | 14 | 501 | 19 | 501 | 18 | 504 | 18 | — | — | — | — |
Spain | 486 | 32 | 484 | 31 | 483 | 32 | 480 | 31 | 485 | 26 | 476 | 19 |
Sweden | 494 | 24 | 478 | 36 | 494 | 24 | 502 | 20 | 509 | 16 | 510 | 11 |
Switzerland | 521 | 8 | 531 | 7 | 534 | 6 | 530 | 6 | 527 | 9 | 529 | 7 |
Thailand | 415 | 56 | 427 | 46 | 419 | 45 | 417 | 41 | 417 | 35 | 432 | 27 |
Trinidad and Tobago | 417 | 55 | — | — | 414 | 47 | — | — | — | — | — | — |
Tunisia | 367 | 70 | 388 | 56 | 371 | 54 | 365 | 51 | 359 | 38 | — | — |
Turkey | 420 | 51 | 448 | 41 | 445 | 40 | 424 | 40 | 423 | 33 | — | — |
United Arab Emirates | 427 | 49 | 434 | 44 | — | — | — | — | — | — | — | — |
United Kingdom | 492 | 27 | 494 | 24 | 492 | 26 | 495 | 23 | 508 | 17 | 529 | 7 |
United States | 470 | 40 | 481 | 34 | 487 | 29 | 474 | 33 | 483 | 28 | 493 | 15 |
Uruguay | 418 | 53 | 409 | 52 | 427 | 43 | 427 | 39 | 422 | 34 | — | — |
Vietnam | 495 | 22 | 511 | 15 | — | — | — | — | — | — | — | — |
Science | ||||||||
---|---|---|---|---|---|---|---|---|
Country | 2015 | 2012 | 2009 | 2006 | ||||
Score | Rank | Score | Rank | Score | Rank | Score | Rank | |
International Average (OECD) | 493 | — | 501 | — | 501 | — | 498 | — |
Albania | 427 | 54 | 397 | 58 | 391 | 54 | — | — |
Algeria | 376 | 72 | — | — | — | — | — | — |
Argentina | 432 | 52 | — | — | — | — | — | — |
Australia | 510 | 14 | 521 | 14 | 527 | 9 | 527 | 8 |
Austria | 495 | 26 | 506 | 21 | 494 | 28 | 511 | 17 |
China B-S-J-G [a] | 518 | 10 | — | — | — | — | — | — |
Belgium | 502 | 20 | 505 | 22 | 507 | 19 | 510 | 18 |
Brazil | 401 | 66 | 402 | 55 | 405 | 49 | 390 | 49 |
Bulgaria | 446 | 46 | 446 | 43 | 439 | 42 | 434 | 40 |
Argentina CABA [b] | 475 | 38 | 425 | 49 | — | — | — | — |
Canada | 528 | 7 | 525 | 9 | 529 | 7 | 534 | 3 |
Chile | 447 | 45 | 445 | 44 | 447 | 41 | 438 | 39 |
Taiwan | 532 | 4 | 523 | 11 | 520 | 11 | 532 | 4 |
Colombia | 416 | 60 | 399 | 56 | 402 | 50 | 388 | 50 |
Costa Rica | 420 | 58 | 429 | 47 | — | — | — | — |
Croatia | 475 | 37 | 491 | 32 | 486 | 35 | 493 | 25 |
Cyprus | 433 | 51 | — | — | — | — | — | — |
Czech Republic | 493 | 29 | 508 | 20 | 500 | 22 | 513 | 14 |
Denmark | 502 | 21 | 498 | 25 | 499 | 24 | 496 | 23 |
Dominican Republic | 332 | 73 | — | — | — | — | — | — |
Estonia | 534 | 3 | 541 | 5 | 528 | 8 | 531 | 5 |
Finland | 531 | 5 | 545 | 4 | 554 | 1 | 563 | 1 |
France | 495 | 27 | 499 | 24 | 498 | 25 | 495 | 24 |
Macedonia | 384 | 70 | — | — | — | — | — | — |
Georgia | 411 | 63 | — | — | — | — | — | — |
Germany | 509 | 16 | 524 | 10 | 520 | 12 | 516 | 12 |
Greece | 455 | 44 | 467 | 40 | 470 | 38 | 473 | 37 |
Hong Kong | 523 | 9 | 555 | 1 | 549 | 2 | 542 | 2 |
Hungary | 477 | 35 | 494 | 30 | 503 | 20 | 504 | 20 |
Iceland | 473 | 39 | 478 | 37 | 496 | 26 | 491 | 26 |
Indonesia | 403 | 65 | 382 | 60 | 383 | 55 | 393 | 48 |
Ireland | 503 | 19 | 522 | 13 | 508 | 18 | 508 | 19 |
Israel | 467 | 40 | 470 | 39 | 455 | 39 | 454 | 38 |
Italy | 481 | 34 | 494 | 31 | 489 | 33 | 475 | 35 |
Japan | 538 | 2 | 547 | 3 | 539 | 4 | 531 | 6 |
Jordan | 409 | 64 | 409 | 54 | 415 | 47 | 422 | 43 |
Kazakhstan | 456 | 43 | 425 | 48 | 400 | 53 | — | — |
South Korea | 516 | 11 | 538 | 6 | 538 | 5 | 522 | 10 |
Kosovo | 378 | 71 | — | — | — | — | — | — |
Latvia | 490 | 31 | 502 | 23 | 494 | 29 | 490 | 27 |
Lebanon | 386 | 68 | — | — | — | — | — | — |
Lithuania | 475 | 36 | 496 | 28 | 491 | 31 | 488 | 31 |
Luxembourg | 483 | 33 | 491 | 33 | 484 | 36 | 486 | 33 |
Macau | 529 | 6 | 521 | 15 | 511 | 16 | 511 | 16 |
Malaysia | 443 | 47 | 420 | 50 | — | — | — | — |
Malta | 465 | 41 | — | — | — | — | — | — |
Mexico | 416 | 61 | 415 | 52 | 416 | 46 | 410 | 47 |
Moldova | 428 | 53 | — | — | — | — | — | — |
Montenegro | 411 | 62 | 410 | 53 | 401 | 51 | 412 | 46 |
Netherlands | 509 | 17 | 522 | 12 | 522 | 10 | 525 | 9 |
New Zealand | 513 | 12 | 516 | 16 | 532 | 6 | 530 | 7 |
Norway | 498 | 24 | 495 | 29 | 500 | 23 | 487 | 32 |
Peru | 397 | 67 | 373 | 61 | 369 | 57 | — | — |
Poland | 501 | 22 | 526 | 8 | 508 | 17 | 498 | 22 |
Portugal | 501 | 23 | 489 | 34 | 493 | 30 | 474 | 36 |
Qatar | 418 | 59 | 384 | 59 | 379 | 56 | 349 | 52 |
Romania | 435 | 50 | 439 | 46 | 428 | 43 | 418 | 45 |
Russia | 487 | 32 | 486 | 35 | 478 | 37 | 479 | 34 |
Singapore | 556 | 1 | 551 | 2 | 542 | 3 | — | — |
Slovakia | 461 | 42 | 471 | 38 | 490 | 32 | 488 | 29 |
Slovenia | 513 | 13 | 514 | 18 | 512 | 15 | 519 | 11 |
Spain | 493 | 30 | 496 | 27 | 488 | 34 | 488 | 30 |
Sweden | 493 | 28 | 485 | 36 | 495 | 27 | 503 | 21 |
Switzerland | 506 | 18 | 515 | 17 | 517 | 13 | 512 | 15 |
Thailand | 421 | 57 | 444 | 45 | 425 | 45 | 421 | 44 |
Trinidad and Tobago | 425 | 56 | — | — | 410 | 48 | — | — |
Tunisia | 386 | 69 | 398 | 57 | 401 | 52 | 386 | 51 |
Turkey | 425 | 55 | 463 | 41 | 454 | 40 | 424 | 42 |
United Arab Emirates | 437 | 48 | 448 | 42 | — | — | — | — |
United Kingdom | 509 | 15 | 514 | 19 | 514 | 14 | 515 | 13 |
United States | 496 | 25 | 497 | 26 | 502 | 21 | 489 | 28 |
Uruguay | 435 | 49 | 416 | 51 | 427 | 44 | 428 | 41 |
Vietnam | 525 | 8 | 528 | 7 | — | — | — | — |
Reading | ||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
Country | 2015 | 2012 | 2009 | 2006 | 2003 | 2000 | ||||||
Score | Rank | Score | Rank | Score | Rank | Score | Rank | Score | Rank | Score | Rank | |
International Average (OECD) | 493 | — | 496 | — | 493 | — | 489 | — | 494 | — | 493 | — |
Albania | 405 | 63 | 394 | 58 | 385 | 55 | — | — | — | — | 349 | 39 |
Algeria | 350 | 71 | — | — | — | — | — | — | — | — | — | — |
Argentina | 425 | 56 | — | — | — | — | — | — | — | — | — | — |
Australia | 503 | 16 | 512 | 12 | 515 | 8 | 513 | 7 | 525 | 4 | 528 | 4 |
Austria | 485 | 33 | 490 | 26 | 470 | 37 | 490 | 21 | 491 | 22 | 492 | 19 |
China B-S-J-G [a] | 494 | 27 | — | — | — | — | — | — | — | — | — | — |
Belgium | 499 | 20 | 509 | 16 | 506 | 10 | 501 | 11 | 507 | 11 | 507 | 11 |
Brazil | 407 | 62 | 407 | 52 | 412 | 49 | 393 | 47 | 403 | 36 | 396 | 36 |
Bulgaria | 432 | 49 | 436 | 47 | 429 | 42 | 402 | 43 | — | — | 430 | 32 |
Argentina CABA [b] | 475 | 38 | 429 | 48 | — | — | — | — | — | — | — | — |
Canada | 527 | 3 | 523 | 7 | 524 | 5 | 527 | 4 | 528 | 3 | 534 | 2 |
Chile | 459 | 42 | 441 | 43 | 449 | 41 | 442 | 37 | — | — | 410 | 35 |
Taiwan | 497 | 23 | 523 | 8 | 495 | 21 | 496 | 15 | — | — | — | — |
Colombia | 425 | 57 | 403 | 54 | 413 | 48 | 385 | 49 | — | — | — | — |
Costa Rica | 427 | 52 | 441 | 45 | — | — | — | — | — | — | — | — |
Croatia | 487 | 31 | 485 | 33 | 476 | 34 | 477 | 29 | — | — | — | — |
Cyprus | 443 | 45 | — | — | — | — | — | — | — | — | — | — |
Czech Republic | 487 | 30 | 493 | 24 | 478 | 32 | 483 | 25 | 489 | 24 | 492 | 20 |
Denmark | 500 | 18 | 496 | 23 | 495 | 22 | 494 | 18 | 492 | 19 | 497 | 16 |
Dominican Republic | 358 | 69 | — | — | — | — | — | — | — | — | — | — |
Estonia | 519 | 6 | 516 | 10 | 501 | 12 | 501 | 12 | — | — | — | — |
Finland | 526 | 4 | 524 | 5 | 536 | 2 | 547 | 2 | 543 | 1 | 546 | 1 |
France | 499 | 19 | 505 | 19 | 496 | 20 | 488 | 22 | 496 | 17 | 505 | 14 |
Macedonia | 352 | 70 | — | — | — | — | — | — | — | — | 373 | 37 |
Georgia | 401 | 65 | — | — | — | — | — | — | — | — | — | — |
Germany | 509 | 11 | 508 | 18 | 497 | 18 | 495 | 17 | 491 | 21 | 484 | 22 |
Greece | 467 | 41 | 477 | 38 | 483 | 30 | 460 | 35 | 472 | 30 | 474 | 25 |
Hong Kong | 527 | 2 | 545 | 1 | 533 | 3 | 536 | 3 | 510 | 9 | 525 | 6 |
Hungary | 470 | 40 | 488 | 28 | 494 | 24 | 482 | 26 | 482 | 25 | 480 | 23 |
Iceland | 482 | 35 | 483 | 35 | 500 | 15 | 484 | 23 | 492 | 20 | 507 | 12 |
Indonesia | 397 | 67 | 396 | 57 | 402 | 53 | 393 | 46 | 382 | 38 | 371 | 38 |
Ireland | 521 | 5 | 523 | 6 | 496 | 19 | 517 | 6 | 515 | 6 | 527 | 5 |
Israel | 479 | 37 | 486 | 32 | 474 | 35 | 439 | 39 | — | — | 452 | 29 |
Italy | 485 | 34 | 490 | 25 | 486 | 27 | 469 | 32 | 476 | 29 | 487 | 21 |
Japan | 516 | 8 | 538 | 3 | 520 | 7 | 498 | 14 | 498 | 14 | 522 | 9 |
Jordan | 408 | 61 | 399 | 55 | 405 | 51 | 401 | 44 | — | — | — | — |
Kazakhstan | 427 | 54 | 393 | 59 | 390 | 54 | — | — | — | — | — | — |
South Korea | 517 | 7 | 536 | 4 | 539 | 1 | 556 | 1 | 534 | 2 | 525 | 7 |
Kosovo | 347 | 72 | — | — | — | — | — | — | — | — | — | — |
Latvia | 488 | 29 | 489 | 27 | 484 | 28 | 479 | 27 | 491 | 23 | 458 | 28 |
Lebanon | 347 | 73 | — | — | — | — | — | — | — | — | — | — |
Lithuania | 472 | 39 | 477 | 37 | 468 | 38 | 470 | 31 | — | — | — | — |
Luxembourg | 481 | 36 | 488 | 30 | 472 | 36 | 479 | 28 | 479 | 27 | 441 | 30 |
Macau | 509 | 12 | 509 | 15 | 487 | 26 | 492 | 20 | 498 | 15 | — | — |
Malaysia | 431 | 50 | 398 | 56 | — | — | — | — | — | — | — | — |
Malta | 447 | 44 | — | — | — | — | — | — | — | — | — | — |
Mexico | 423 | 58 | 424 | 49 | 425 | 44 | 410 | 42 | 400 | 37 | 422 | 34 |
Moldova | 416 | 59 | — | — | — | — | — | — | — | — | — | — |
Montenegro | 427 | 55 | 422 | 50 | 408 | 50 | 392 | 48 | — | — | — | — |
Netherlands | 503 | 15 | 511 | 13 | 508 | 9 | 507 | 10 | 513 | 8 | ‡ | ‡ |
New Zealand | 509 | 10 | 512 | 11 | 521 | 6 | 521 | 5 | 522 | 5 | 529 | 3 |
Norway | 513 | 9 | 504 | 20 | 503 | 11 | 484 | 24 | 500 | 12 | 505 | 13 |
Peru | 398 | 66 | 384 | 61 | 370 | 57 | — | — | — | — | 327 | 40 |
Poland | 506 | 13 | 518 | 9 | 500 | 14 | 508 | 8 | 497 | 16 | 479 | 24 |
Portugal | 498 | 21 | 488 | 31 | 489 | 25 | 472 | 30 | 478 | 28 | 470 | 26 |
Qatar | 402 | 64 | 388 | 60 | 372 | 56 | 312 | 51 | — | — | — | — |
Romania | 434 | 47 | 438 | 46 | 424 | 45 | 396 | 45 | — | — | 428 | 33 |
Russia | 495 | 26 | 475 | 40 | 459 | 40 | 440 | 38 | 442 | 32 | 462 | 27 |
Singapore | 535 | 1 | 542 | 2 | 526 | 4 | — | — | — | — | — | — |
Slovakia | 453 | 43 | 463 | 41 | 477 | 33 | 466 | 33 | 469 | 31 | — | — |
Slovenia | 505 | 14 | 481 | 36 | 483 | 29 | 494 | 19 | — | — | — | — |
Spain | 496 | 25 | 488 | 29 | 481 | 31 | 461 | 34 | 481 | 26 | 493 | 18 |
Sweden | 500 | 17 | 483 | 34 | 497 | 17 | 507 | 9 | 514 | 7 | 516 | 10 |
Switzerland | 492 | 28 | 509 | 14 | 501 | 13 | 499 | 13 | 499 | 13 | 494 | 17 |
Thailand | 409 | 60 | 441 | 44 | 421 | 46 | 417 | 40 | 420 | 35 | 431 | 31 |
Trinidad and Tobago | 427 | 53 | — | — | 416 | 47 | — | — | — | — | — | — |
Tunisia | 361 | 68 | 404 | 53 | 404 | 52 | 380 | 50 | 375 | 39 | — | — |
Turkey | 428 | 51 | 475 | 39 | 464 | 39 | 447 | 36 | 441 | 33 | — | — |
United Arab Emirates | 434 | 48 | 442 | 42 | — | — | — | — | — | — | — | — |
United Kingdom | 498 | 22 | 499 | 21 | 494 | 23 | 495 | 16 | 507 | 10 | 523 | 8 |
United States | 497 | 24 | 498 | 22 | 500 | 16 | — | — | 495 | 18 | 504 | 15 |
Uruguay | 437 | 46 | 411 | 51 | 426 | 43 | 413 | 41 | 434 | 34 | — | — |
Vietnam | 487 | 32 | 508 | 17 | — | — | — | — | — | — | — | — |
Period | Focus | OECD countries | Partner countries | Participating students | Notes |
---|---|---|---|---|---|
2000 | Reading | 28 | 4 + 11 | 265,000 | The Netherlands was disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | The UK was disqualified from data analysis due to its low response rate. [33] Also included a test in problem solving.
2006 | Science | 30 | 27 | 400,000 | Reading scores for the US were disqualified from analysis due to a misprint in testing materials. [34]
2009 [35] | Reading | 34 | 41 + 10 | 470,000 | 10 additional non-OECD countries took the test in 2010. [36] [37] |
2012 [38] | Mathematics | 34 | 31 | 510,000 |
China's participation in the 2012 test was limited to Shanghai, Hong Kong, and Macau as separate entities. In 2012, Shanghai participated for the second time, again topping the rankings in all three subjects, as well as improving scores in the subjects compared to the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about 3 school years ahead of pupils in average countries. Educational experts debated to what degree this result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China. [39] Hong Kong placed second in reading and science and third in maths.
Andreas Schleicher, PISA division head and co-ordinator, stated that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further as-yet-unpublished OECD research, he said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average." [40] Schleicher believes that China has also expanded school access and has moved away from learning by rote, [41] performing well in both rote-based and broader assessments. [40]
In 2018 the Chinese provinces that participated were Beijing, Shanghai, Jiangsu and Zhejiang. In 2015, the participating provinces were Jiangsu, Guangdong, Beijing, and Shanghai. [42] The Beijing-Shanghai-Jiangsu-Guangdong cohort scored a median 518 in science in 2015, while the 2012 Shanghai cohort scored a median 580.
Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade, and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop-off in the number of 15-year-olds residing there. [43] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US. [44] Following the 2015 testing, the OECD published in-depth studies on the education systems of a select few countries, including China. [45]
In 2014, Liz Truss, the British Parliamentary Under-Secretary of State at the Department for Education, led a fact-finding visit to schools and teacher-training centres in Shanghai. [46] Britain increased exchanges with Chinese teachers and schools to find out how to improve quality. In 2014, 60 teachers from Shanghai were invited to the UK to help share their teaching methods, support pupils who were struggling, and help train other teachers. [47] In 2016, Britain invited 120 Chinese teachers, planning to adopt Chinese styles of teaching in 8,000 aided schools. [48] By 2019, approximately 5,000 of Britain's 16,000 primary schools had adopted Shanghai's teaching methods. [49] The performance of British schools in PISA improved after adopting China's teaching styles. [50] [51]
Finland, which held several top positions in the earlier rounds, fell in all three subjects in 2012 but remained the best-performing country overall in Europe, achieving its best result in science with 545 points (5th) and its worst in mathematics with 519 (12th), in which it was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time, Finnish girls narrowly outperformed boys in mathematics. It was also the first time pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Former Minister of Education and Science Krista Kiuru expressed concern over the overall drop, as well as the increase in the share of low performers from 7% to 12%. [52]
India participated in the 2009 round of testing but pulled out of the 2012 PISA testing, with the Indian government attributing its decision to the unfairness of PISA testing to Indian students. [53] India had ranked 72nd out of 73 countries tested in 2009. [54] The Indian Express reported, "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's 'socio-cultural milieu'. India's participation in the next PISA cycle will hinge on this." [55] The Indian Express also noted that "considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".
India did not participate in the 2012, 2015 and 2018 PISA rounds. [56]
A Kendriya Vidyalaya Sangathan (KVS) committee, as well as a group of secretaries on education constituted by the Prime Minister of India, Narendra Modi, recommended that India participate in PISA. Accordingly, in February 2017 the Ministry of Human Resource Development, under Prakash Javadekar, decided to end the boycott and participate in PISA from 2020. To address the socio-cultural disconnect between the test questions and students, it was reported that the OECD would update some questions; for example, the word avocado in a question might be replaced with a more familiar Indian fruit such as mango. [57]
India did not participate in the 2022 PISA round, citing disruption caused by the COVID-19 pandemic. [58]
In 2015, the results from Malaysia were found by the OECD not to have met the minimum response rate. [59] Opposition politician Ong Kian Ming said the education ministry had tried to oversample high-performing students in rich schools. [60] [61]
Sweden's result dropped in all three subjects in the 2012 test, which was a continuation of a trend from 2006 and 2009. It saw the sharpest fall in mathematics performance with a drop in score from 509 in 2003 to 478 in 2012. The score in reading showed a drop from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects. [62] The leader of the opposition, Social Democrat Stefan Löfven, described the situation as a national crisis. [63] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as most severe. [63]
In 2020, Swedish newspaper Expressen revealed that Sweden had inflated their score in PISA 2018 by not conforming to OECD standards. According to professor Magnus Henrekson a large number of foreign-born students had not been tested. [64]
In the 2012 test, as in 2009, the United Kingdom's result was slightly above average, with the science ranking being highest (20th). [65] England, Wales, Scotland and Northern Ireland also participated as separate entities; Wales showed the worst result, ranking 43rd of the 65 countries and economies in mathematics. The Welsh Minister of Education, Huw Lewis, expressed disappointment in the results and said that there were no "quick fixes", but hoped that several educational reforms implemented over the preceding few years would give better results in the next round of tests. [66] The United Kingdom had a greater gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants. [65]
Writing in The Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overfocus on scholastic performance in East Asia might have contributed to the region's low birth rate, which he argued could harm future economic performance more than a good PISA score could offset. [67]
In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities. [68]
In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things," he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."
Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders PISA rankings "valueless". [69] Goldstein remarked that Morrison's objection highlights "an important technical issue", if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Morrison and Goldstein both expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he said. "I am still concerned." [70]
Professor Svend Kreiner, of the University of Copenhagen, agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves." [70]
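The "statistical model underlying PISA" at issue in these critiques is, in its simplest form, the Rasch item-response model, under which the probability that a student of ability θ answers item i, of difficulty b_i, correctly is:

```latex
P(X_i = 1 \mid \theta) = \frac{e^{\theta - b_i}}{1 + e^{\theta - b_i}}
```

Placing all countries on one scale assumes the item difficulties b_i are invariant across countries; Kreiner's analyses suggested that this invariance does not hold well for PISA's reading items, so a country's rank can shift depending on which subset of items is used in the scaling.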
Since 2012, a few US states have participated in the PISA tests as separate entities. Only the 2012 and 2015 results are available on a state basis. Puerto Rico also participated in 2015 as a separate US entity.
Mathematics
Race | 2018 [71] | 2015 | 2012 | 2009 | 2006 | 2003
---|---|---|---|---|---|---
Asian | 539 | 498 | 549 | 524 | 494 | 506 |
White | 503 | 499 | 506 | 515 | 502 | 512 |
US Average | 478 | 470 | 481 | 487 | 474 | 483 |
More than one race | 474 | 475 | 492 | 487 | 482 | 502 |
Hispanic | 452 | 446 | 455 | 453 | 436 | 443 |
Other | — | 423 | 436 | 460 | 446 | 446 |
Black | 419 | 419 | 421 | 423 | 404 | 417 |
Science
Race | 2018 [71] | 2015 | 2012 | 2009 | 2006
---|---|---|---|---|---
Asian | 551 | 525 | 546 | 536 | 499 |
White | 529 | 531 | 528 | 532 | 523 |
US Average | 502 | 496 | 497 | 502 | 489 |
More than one race | 502 | 503 | 511 | 503 | 501 |
Hispanic | 478 | 470 | 462 | 464 | 439 |
Other | — | 462 | 439 | 465 | 453 |
Black | 440 | 433 | 439 | 435 | 409 |
Reading
Race | 2018 [71] | 2015 | 2012 | 2009 | 2006 | 2003 | 2000
---|---|---|---|---|---|---|---
Asian | 556 | 527 | 550 | 541 | — | 513 | 546 |
White | 531 | 526 | 519 | 525 | — | 525 | 538 |
US Average | 505 | 497 | 498 | 500 | — | 495 | 504 |
More than one race | 501 | 498 | 517 | 502 | — | 515 | — |
Hispanic | 481 | 478 | 478 | 466 | — | 453 | 449 |
Black | 448 | 443 | 443 | 441 | — | 430 | 445 |
Other | — | 440 | 438 | 462 | — | 456 | 455 |
Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, since 2000, literature on the differences in PISA and TIMSS results and their possible causes has emerged. [72] Data from PISA have furnished several researchers, notably Eric Hanushek, Ludger Wößmann, Heiner Rindermann, and Stephen J. Ceci, with material for books and articles about the relationship between student achievement and economic development, [73] democratization, and health; [74] as well as the roles of such single educational factors as high-stakes exams, [75] the presence or absence of private schools and the effects and timing of ability tracking. [76]
David Spiegelhalter of Cambridge wrote: "Pisa does present the uncertainty in the scores and ranks - for example the United Kingdom rank in the 65 countries is said to be between 23 and 31. It's unwise for countries to base education policy on their Pisa results, as Germany, Norway and Denmark did after doing badly in 2001." [77]
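The rank uncertainty Spiegelhalter describes follows from sampling error in each country's mean score. A minimal simulation, using hypothetical country means and standard errors (illustrative values, not actual PISA estimates), shows how overlapping confidence intervals turn a single published rank into a range of plausible ranks:

```python
import random

random.seed(0)

# Hypothetical country mean scores and standard errors (illustrative only).
countries = {
    "A": (505, 3.0),
    "B": (500, 2.5),
    "C": (498, 3.5),
    "D": (496, 2.0),
    "E": (494, 3.0),
}

def simulate_ranks(countries, n_sims=10_000):
    """Draw plausible mean scores and record each country's rank per draw."""
    ranks = {name: [] for name in countries}
    for _ in range(n_sims):
        draws = {name: random.gauss(mu, se) for name, (mu, se) in countries.items()}
        ordered = sorted(draws, key=draws.get, reverse=True)
        for pos, name in enumerate(ordered, start=1):
            ranks[name].append(pos)
    return ranks

ranks = simulate_ranks(countries)
for name in countries:
    rs = sorted(ranks[name])
    lo, hi = rs[len(rs) // 20], rs[-(len(rs) // 20)]  # ~5th to ~95th percentile
    print(f"{name}: plausible rank range {lo}-{hi}")
```

With dozens of countries packed into a narrow score band, as in the real PISA tables, these plausible rank ranges widen considerably, which is why the OECD reports rank intervals (such as the UK's 23 to 31) rather than point ranks.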
According to a Forbes opinion article, some countries such as China, Hong Kong, Macau, and Argentina select PISA samples from only the best-educated areas or from their top-performing students, slanting the results. [78]
In an open letter to Andreas Schleicher, director of PISA, various academics and educators argued that "OECD and Pisa tests are damaging education worldwide". [79]
According to O Estado de São Paulo, Brazil shows a great disparity when classifying the results between public and private schools, where public schools would rank worse than Peru, while private schools would rank better than Finland. [80]
PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.