Randy Elliot Bennett

Born: Brooklyn, New York
Occupation: Educational researcher
Nationality: American
Notable works: Formative Assessment: A Critical Review
Cognitively Based Assessment of, for, and as Learning: A Preliminary Theory of Action for Summative and Formative Assessment
Educational Assessment: What to Watch in a Rapidly Changing World
The Changing Nature of Educational Assessment
Toward a Theory of Socioculturally Responsive Assessment
Notable awards: National Academy of Education elected member
AERA E.F. Lindquist Award
AERA Cognition and Assessment SIG Outstanding Contribution to Research in Cognition and Assessment Award
NCME Bradley Hanson Award
AERA Fellow
Teachers College, Columbia University Distinguished Alumni Award

Randy Elliot Bennett is an American educational researcher who specializes in educational assessment. He is the Norman O. Frederiksen Chair in Assessment Innovation at Educational Testing Service (ETS) in Princeton, New Jersey. His research and writing focus on bringing together advances in cognitive science, technology, and measurement to improve teaching and learning.

His honors include the ETS Senior Scientist Award in 1996, the ETS Career Achievement Award in 2005, the Teachers College, Columbia University Distinguished Alumni Award in 2016, Fellow status in the American Educational Research Association (AERA) in 2017, the National Council on Measurement in Education (NCME) Bradley Hanson Award for Contributions to Educational Measurement in 2019 (with H. Guo, M. Zhang, and P. Deane), the E. F. Lindquist Award from AERA and ACT in 2020, elected membership in the National Academy of Education in 2022, and the AERA Cognition and Assessment Special Interest Group Outstanding Contribution to Research in Cognition and Assessment Award in 2024. [1] [2] [3] [4] [5] [6] Bennett has also been elected president of both the International Association for Educational Assessment (IAEA), a worldwide organization constituted primarily of governmental and non-governmental measurement organizations, and NCME, whose members work in universities, testing organizations, state and federal education departments, and school districts.

Publications

Bennett is the author or editor of nine books, as well as over 100 journal articles, chapters, and technical reports. Those publications have concentrated on several themes. The 1998 publication Reinventing Assessment: Speculations on the Future of Large-Scale Educational Testing [7] presented a three-stage framework for how paper-and-pencil tests would gradually transition to digital form, eventually melding with online activities, blurring the distinction between learning and assessment, and leading to improvements in both pursuits. A series of subsequent publications built upon the work of Robert Glaser, Norman O. Frederiksen, Samuel Messick, James Pellegrino, Lorrie Shepard, and others to create a unified model for formative and summative assessment under the Cognitively Based Assessment of, for, and as Learning (CBAL) initiative. [8] [9] This work, noted in the citations for both the E. F. Lindquist Award and his AERA Fellow designation, [2] [4] is described in two publications, Transforming K-12 Assessment [10] and Cognitively Based Assessment of, for, and as Learning. [11] The latter articulated assumptions for the CBAL assessment model in a detailed "theory of action," which described the assessment system components, the intended outcomes, and the action mechanisms expected to lead to those outcomes, predating the now generally recommended use of that device in operational testing programs. [12] [13]

The journal article, Formative Assessment: A Critical Review, [14] questioned the magnitude of efficacy claims, the meaningfulness of existing definitions, and the general absence of disciplinary considerations in the conceptualization and implementation of formative assessment. [15] The article encouraged a deeper examination of premises, more careful consideration of effectiveness claims, and a move toward incorporating domain considerations directly into the structure and practice of formative assessment. [16] [17] [18]

Two reports, Online Assessment in Mathematics and Writing [19] and Problem Solving in Technology-Rich Environments, [20] documented studies that helped set the stage for moving the US National Assessment of Educational Progress (NAEP) from paper-based presentation to computer delivery. [21] [22]

Several recent articles called attention to the need for testing companies and state education departments to exercise caution in using artificial intelligence (AI) methods for scoring consequential tests. That theme was developed in a book chapter, Validity and Automated Scoring, [23] and summarized in The Changing Nature of Educational Assessment. [24] These publications note that in automated essay scoring, for example, caution is needed because of the inscrutability of some AI scoring methods, their use of correlates that can be easily manipulated for undeserved score gain, and the routine practice of building scoring algorithms to model the judgment of operational human graders, thereby unintentionally incorporating human biases.

Bennett's latest work centers on equity in assessment. The commentary, The Good Side of COVID-19, [25] makes the case that standardized testing, and educational assessment more generally, must be rethought to better align with the multicultural, pluralistic society the US is rapidly becoming. In a follow-up article, Toward a Theory of Socioculturally Responsive Assessment, [26] he assembles assessment design principles from multiple literatures and uses them to fashion a definition, a theory, and a suggested path for implementing measures more attuned to the social, cultural, and other relevant characteristics of diverse individuals and the contexts in which they live. That line of thinking is elaborated in Let's Agree to (Mostly) Agree: A Response to Solano-Flores. [27]

A logical extension of the ideas explored in Toward a Theory of Socioculturally Responsive Assessment is to personalize assessment so that it is adapted to the characteristics of the individual. In Personalizing Assessment: Dream or Nightmare? [28] Bennett articulates why personalized assessment is needed, what precedents exist for it in educational measurement, what it looks like in practice, and why we should worry about it.

Books

Andrade, H. L., Bennett, R. E., & Cizek, G. J. (Eds.). (2019). Handbook of formative assessment in the disciplines. New York: Routledge.

Bennett, R. E., & von Davier, M. (Eds.). (2017). Advancing human assessment: The methodological, psychological, and policy contributions of ETS. Cham, Switzerland: Springer Open.

Bennett, R. E., & Ward, W. C. (Eds.). (1993). Construction vs. choice in cognitive measurement: Issues in constructed response, performance testing, and portfolio assessment. Hillsdale, NJ: Lawrence Erlbaum Associates.

Willingham, W. W., Ragosta, M., Bennett, R. E., Braun, H. I., Rock, D. A., & Powers, D. E. (1988). Testing handicapped people. Boston, MA: Allyn & Bacon.

Bennett, R. E. (Ed.). (1987). Planning and evaluating computer education programs. Columbus, OH: Merrill.

Bennett, R. E., & Maher, C. A. (Eds.). (1986). Emerging perspectives in the assessment of exceptional children. New York: Haworth Press.

Cline, H. F., Bennett, R. E., Kershaw, R. C., Schneiderman, M. B., Stecher, B., & Wilson, S. (1986). The electronic schoolhouse: The IBM secondary school computer education program. Hillsdale, NJ: Lawrence Erlbaum Associates.

Bennett, R. E., & Maher, C. A. (Eds.). (1984). Microcomputers and exceptional children. New York: Haworth Press.

Maher, C. A., & Bennett, R. E. (1984). Planning and evaluating special education services. Englewood Cliffs, NJ: Prentice-Hall.

Related Research Articles

Psychometrics is a field of study within psychology concerned with the theory and technique of measurement. Psychometrics generally covers specialized fields within psychology and education devoted to testing, measurement, assessment, and related activities. Psychometrics is concerned with the objective measurement of latent constructs that cannot be directly observed. Examples of latent constructs include intelligence, introversion, mental disorders, and educational achievement. The levels of individuals on nonobservable latent variables are inferred through mathematical modeling based on what is observed from individuals' responses to items on tests and scales.
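
As a standard illustration of how such latent traits are modeled (a generic psychometric example, not specific to Bennett's work), the two-parameter logistic item response model expresses the probability that examinee i answers item j correctly as a function of the examinee's latent ability and the item's discrimination and difficulty parameters:

    P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + e^{-a_j(\theta_i - b_j)}}

Fitting such a model to observed item responses yields estimates of both the item parameters a_j and b_j and each examinee's standing theta_i on the latent trait.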

Educational Testing Service (ETS), founded in 1947, is the world's largest private educational testing and assessment organization. It is headquartered in Lawrence Township, New Jersey, but has a Princeton address.

Educational assessment or educational evaluation is the systematic process of documenting and using empirical data on knowledge, skills, attitudes, aptitudes, and beliefs to refine programs and improve student learning. Assessment data can be obtained directly from examining student work to assess the achievement of learning outcomes, or from other sources from which one can make inferences about learning. Assessment is often used interchangeably with test but is not limited to tests. Assessment can focus on the individual learner, the learning community, a course, an academic program, the institution, or the educational system as a whole. The word "assessment" came into use in an educational context after the Second World War.

Learning styles refer to a range of theories that aim to account for differences in individuals' learning. Although there is ample evidence that individuals express personal preferences on how they prefer to receive information, few studies have found validity in using learning styles in education. Many theories share the proposition that humans can be classified according to their "style" of learning, but differ on how the proposed styles should be defined, categorized and assessed. A common concept is that individuals differ in how they learn.

Computerized adaptive testing (CAT) is a form of computer-based test that adapts to the examinee's ability level. For this reason, it has also been called tailored testing. In other words, it is a form of computer-administered test in which the next item or set of items selected to be administered depends on the correctness of the test taker's responses to the most recent items administered.
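
As a rough sketch of this adapt-as-you-go logic (illustrative only: the item bank, selection rule, and ability update below are simplified stand-ins, not any operational CAT algorithm), a minimal Python loop might look like this:

    import math

    # Hypothetical item bank: item id -> Rasch difficulty b (illustrative values).
    ITEM_BANK = {"q1": -1.0, "q2": -0.5, "q3": 0.0, "q4": 0.5, "q5": 1.0}

    def p_correct(theta, b):
        # Rasch-model probability of a correct response given ability theta and difficulty b.
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def next_item(theta, administered):
        # Pick the unadministered item whose difficulty is closest to the current ability
        # estimate, a crude stand-in for maximum-information item selection.
        remaining = {k: b for k, b in ITEM_BANK.items() if k not in administered}
        return min(remaining, key=lambda k: abs(remaining[k] - theta))

    def update_theta(theta, b, correct, step=1.0):
        # Nudge the ability estimate toward the observed response; real systems use
        # maximum-likelihood or Bayesian estimation instead of this simple update.
        return theta + step * ((1.0 if correct else 0.0) - p_correct(theta, b))

    theta, administered = 0.0, []
    for simulated_correct in (True, True, False):  # made-up responses for demonstration
        item = next_item(theta, administered)
        administered.append(item)
        theta = update_theta(theta, ITEM_BANK[item], simulated_correct)
        print(item, round(theta, 2))

Each response moves the ability estimate toward the examinee's apparent level, and each new item is chosen to be informative at that level.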

Gwyneth M. Boodoo is an American psychologist and expert on educational measurement.

The Standards for Educational and Psychological Testing is a set of testing standards developed jointly by the American Educational Research Association (AERA), the American Psychological Association (APA), and the National Council on Measurement in Education (NCME). The most recent edition, the 7th, is available in printed form as well as freely downloadable PDFs in English and Spanish; ePub and PDF eBook editions are also available.

Formative assessment, formative evaluation, formative feedback, or assessment for learning, including diagnostic testing, is a range of formal and informal assessment procedures conducted by teachers during the learning process in order to modify teaching and learning activities to improve student attainment. The goal of a formative assessment is to monitor student learning to provide ongoing feedback that can help students identify their strengths and weaknesses and target areas that need work. It also helps faculty recognize where students are struggling and address problems immediately. It typically involves qualitative feedback for both student and teacher that focuses on the details of content and performance. It is commonly contrasted with summative assessment, which seeks to monitor educational outcomes, often for purposes of external accountability.

Education sciences, also known as education studies or education theory, and traditionally called pedagogy, seek to describe, understand, and prescribe education, including education policy. Subfields include comparative education, educational research, instructional theory, curriculum theory, and the psychology, philosophy, sociology, economics, and history of education. Related fields include learning theory and cognitive science.

Cognitive skills are skills of the mind, as opposed to other types of skills such as motor skills or social skills. Some examples of cognitive skills are literacy, self-reflection, logical reasoning, abstract thinking, critical thinking, introspection and mental arithmetic. Cognitive skills vary in processing complexity, and can range from more fundamental processes such as perception and various memory functions, to more sophisticated processes such as decision making, problem solving and metacognition.

Norman “Fritz” Frederiksen (1909-1998) was an American research psychologist and leading proponent of performance assessment, an approach to educational and occupational testing that focused on the use of tasks similar to the ones individuals actually encounter in real classroom and work environments. In keeping with the philosophy underlying this approach, Frederiksen was a critic of multiple-choice testing, which he felt negatively influenced school curricula and classroom practice. Much of his research centered upon creating and evaluating alternative approaches to the measurement of knowledge and skill, which he pursued over a 40-year career at Educational Testing Service (ETS) in Princeton, NJ. For his work, he received the American Psychological Association's Award for Distinguished Contributions to Knowledge in 1984 and, by the time of his retirement from ETS, had attained the position of Distinguished Scientist, the organization's highest-ranking scientific title at that time.

Adaptive comparative judgement is a technique borrowed from psychophysics which is able to generate reliable results for educational assessment – as such it is an alternative to traditional exam script marking. In the approach, judges are presented with pairs of student work and are then asked to choose which is better, one or the other. By means of an iterative and adaptive algorithm, a scaled distribution of student work can then be obtained without reference to criteria.
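
Comparative-judgement scaling of this kind is commonly based on a Bradley-Terry (equivalently, a pairwise Rasch) model. The sketch below, using made-up judgements and a plain iterative fit rather than the adaptive pairing logic of operational systems, shows how a scale can be recovered from "which is better?" decisions:

    from collections import defaultdict

    # Made-up judgements: each pair records that judges preferred the first script.
    judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]

    scripts = sorted({s for pair in judgements for s in pair})
    wins = defaultdict(int)         # number of comparisons each script won
    pair_counts = defaultdict(int)  # number of times each pair was compared
    for winner, loser in judgements:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1

    # Fit Bradley-Terry strengths with the classic iterative (minorization-maximization) updates.
    strength = {s: 1.0 for s in scripts}
    for _ in range(200):
        updated = {}
        for i in scripts:
            denom = sum(
                pair_counts[frozenset((i, j))] / (strength[i] + strength[j])
                for j in scripts if j != i
            )
            updated[i] = wins[i] / denom
        mean = sum(updated.values()) / len(updated)
        strength = {s: v / mean for s, v in updated.items()}  # rescale to fix the metric

    print({s: round(v, 2) for s, v in strength.items()})  # higher strength = judged better

The resulting strengths place the scripts on a common scale without reference to marking criteria, which is the core idea behind adaptive comparative judgement.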

ACT, Inc. is an American for-profit company primarily known for the ACT, a standardized test designed to assess high school students' academic achievement and college readiness. It was announced in April 2024 that the company, previously a 501(c)(3) nonprofit organization, had been purchased by the private equity firm Nexus Capital, raising concerns about transparency and accountability.

The National Council on Measurement in Education (NCME) is a U.S.-based professional organization for assessment, evaluation, testing, and other aspects of educational measurement. NCME was launched in 1938 and previously operated under the name National Council on Measurements Used in Education.

Lynn Fuchs is an educational psychologist known for research on instructional practice and assessment, reading disabilities, and mathematics disabilities. She is the Dunn Family Chair in Psychoeducational Assessment in the Department of Special Education at Vanderbilt University.

Alina Anca von Davier is a psychometrician and researcher in computational psychometrics, machine learning, and education. Von Davier is a researcher, innovator, and executive leader with over 20 years of experience in EdTech and the assessment industry. She is the Chief of Assessment at Duolingo, where she leads the Duolingo English Test research and development area. She is also the founder and CEO of EdAstra Tech, a service-oriented EdTech company. In 2022, she joined the University of Oxford as an Honorary Research Fellow and Carnegie Mellon University as a Senior Research Fellow.

Mark Daniel Reckase is an educational psychologist and expert on quantitative methods and measurement who is known for his work on computerized adaptive testing, multidimensional item response theory, and standard setting in educational and psychological tests. Reckase is University Distinguished Professor Emeritus in the College of Education at Michigan State University.

Jacqueline P. Leighton is a Canadian-Chilean educational psychologist, academic and author. She is a full professor in the Faculty of Education as well as vice-dean of Faculty Development and Faculty Affairs at the University of Alberta.

Matthias von Davier is a psychometrician, academic, inventor, and author. He is the executive director of the TIMSS & PIRLS International Study Center at the Lynch School of Education and Human Development and the J. Donald Monan, S.J., University Professor in Education at Boston College.

Fumiko Samejima (1930–c. 2021) was a prominent Japanese-born psychometrician best known for her development of the graded response model (GRM), a fundamental approach in item response theory (IRT). Her innovative methods became influential in psychological and educational measurement, particularly in improving the accuracy of tests involving Likert-scale questions and other graded responses. She published her seminal paper “Estimation of Latent Ability Using a Response Pattern of Graded Scores” in 1969. This publication became a foundational reference in psychometric literature, significantly advancing the analysis of ordered categorical data.
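
As a general illustration of the model (standard IRT notation, not tied to any particular application), the graded response model gives the probability that an examinee with latent ability theta responds in category k or higher on item j a two-parameter logistic form,

    P^{*}_{jk}(\theta) = \Pr(X_j \ge k \mid \theta) = \frac{1}{1 + e^{-a_j(\theta - b_{jk})}},

and the probability of responding in exactly category k is the difference between adjacent cumulative probabilities, P^{*}_{jk}(\theta) - P^{*}_{j,k+1}(\theta).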

References

  1. Levine, J. "Honoring the Very Best: Recognition for a Stellar Group of TC Alumni". Teachers College, Columbia University. Retrieved August 18, 2020.
  2. "2017 AERA Fellows". American Educational Research Association. Retrieved August 18, 2020.
  3. "Bradley Hanson Award for Contributions to Educational Measurement Recipients Announced". National Council on Measurement in Education. Retrieved August 20, 2020.
  4. "E.F. Lindquist Award: 2020 Award Recipient". American Educational Research Association. Retrieved August 18, 2020.
  5. "Seventeen Scholars Elected to Membership in the National Academy of Education". National Academy of Education. 28 January 2022. Retrieved January 28, 2022.
  6. "Current Award: 2024 Outstanding Contribution to Research in Cognition and Assessment". American Educational Research Association. 8 April 2024. Retrieved April 8, 2024.
  7. Bennett, R.E. "Reinventing Assessment: Speculations on the Future of Large-Scale Educational Testing". Educational Testing Service.
  8. Rubenstein, G. (March 18, 2008). "Ending Hit-and-Run Testing: ETS Sets Out to Revolutionize Assessment". Edutopia.
  9. Ash, K. (March 14, 2011). "Tailoring Testing with Digital Tools". Education Week, 30(25). pp. 35, 37.
  10. Bennett, R.E.; Gitomer, D.H. (2009). "Transforming K-12 assessment: Integrating accountability testing, formative assessment, and professional support". In C. Wyatt-Smith & J. Cumming (Eds.), Educational Assessment in the 21st Century. New York: Springer. pp. 43–61.
  11. Bennett, R.E. (2010). "Cognitively based assessment of, for, and as learning: A preliminary theory of action for summative and formative assessment". Measurement: Interdisciplinary Research and Perspectives, 8. pp. 70–91.
  12. NCME (July 26, 2018). "National Council on Measurement in Education (NCME) Position Statement on Theories of Action for Testing Programs" (PDF). NCME.
  13. Chalhoub-Deville, M. (2016). "Validity theory: Reform policies, accountability testing, and consequences". Language Testing, 33(4): 453–472. doi:10.1177/0265532215593312. S2CID 152167855.
  14. Bennett, R.E. (2011). "Formative Assessment: A Critical Review". Assessment in Education: Principles, Policy & Practice, 18: 5–25. doi:10.1080/0969594X.2010.513678. S2CID 14804319.
  15. Sawchuk, S. (May 21, 2009). "Has the Research on Formative Assessment Been Oversold?". Education Week Teacher Beat.
  16. Baird, J.; Hopfenbeck, T.N.; Newton, P.; Stobart, G.; Steen-Utheim, A.T. State of the Field Review: Assessment and Learning (PDF). Norwegian Knowledge Centre for Education.
  17. Heritage, M.; Wiley, E.C. (2020). Formative Assessment in the Disciplines: Framing a Continuum of Professional Learning. Cambridge, MA: Harvard Education Press. pp. 15–47.
  18. Nishizuka, K. (2020). "A Critical Review of Formative Assessment Research and Practice in Japan". International Journal of Curriculum Development and Practice. pp. 15–47.
  19. Sandene, B.; Horkay, N.; Bennett, R.E.; Allen, N.; Braswell, J.; Kaplan, B.; Oranje, A. (2005). Online Assessment in Mathematics and Writing: Reports From the NAEP Technology-Based Assessment Project, Research and Development Series. Washington, DC: IES. Retrieved August 18, 2020.
  20. Bennett, R.E.; Persky, H.; Weiss, A.R.; Jenkins, F. (2007). Problem Solving in Technology-Rich Environments: A Report From the NAEP Technology-Based Assessment Project. Washington, DC: IES. Retrieved August 18, 2020.
  21. Cavanagh, S. (August 17, 2007). "Computerized Tests Measure Problem-Solving". Education Week.
  22. Tucker, B. (November 2009). "The Next Generation of Testing". Educational Leadership, 67(3): 48–53.
  23. Bennett, R.E.; Zhang, M. (2016). "Validity and automated scoring". In F. Drasgow (Ed.), Technology and Testing: Improving Educational and Psychological Measurement. New York: Routledge. pp. 142–173.
  24. Bennett, R.E. (2015). "The Changing Nature of Educational Assessment". Review of Research in Education, 39: 370–407. doi:10.3102/0091732X14554179. S2CID 145592665.
  25. Bennett, R.E. (2022). "The Good Side of COVID-19". Educational Measurement: Issues and Practice, 41: 61–63. doi:10.1111/emip.12496. S2CID 246588079.
  26. Bennett, R.E. (2023). "Toward a Theory of Socioculturally Responsive Assessment". Educational Assessment, 28(2): 83–104. doi:10.1080/10627197.2023.2202312.
  27. Bennett, R.E. (2023). "Let's Agree to (Mostly) Agree: A Response to Solano-Flores". Educational Assessment, 28(2): 122–127. doi:10.1080/10627197.2023.2215978. S2CID 258933453.
  28. Bennett, R.E. (2024). "Personalizing Assessment: Dream or Nightmare?". Educational Measurement: Issues and Practice. doi:10.1111/emip.12652.

Randy E. Bennett publications indexed by Google Scholar.