KEYNOTE SPEAKERS
Dr Harold Hislop
Title: “Assessing learning in schools – Reflections on lessons and challenges in the Irish context”
Professor Emer Smyth
Title: “Assessment research: listening to students, looking at consequences”
Professor Derek Briggs
Title: Content-Referenced Growth
Dr Paula Lehane
Winner of the Kathleen Tattersall New Assessment Researcher Award 2022
Title: The Impact of Test Items Incorporating Multimedia Stimuli on the Performance and Attentional Behaviour of Test-Takers
Professor Derek Briggs
Derek Briggs is a professor in the Research and Evaluation Methodology program at the University of Colorado Boulder, where he also directs the Center for Assessment Design Research and Evaluation. Dr. Briggs’s research focuses on advancing methods for the measurement and evaluation of student learning. His daily agenda is to challenge conventional wisdom and methodological chicanery as they manifest themselves in educational research, policy and practice.
As a psychometrician, Dr. Briggs works with states and other entities to provide technical advice on the design and use of large-scale student assessments. He has a special interest in the use of learning progressions as a method for facilitating student-level inferences about growth and for bridging the use of test scores for formative and summative purposes. Other interests include the use and analysis of statistical models to support causal inferences about the effects of educational interventions on student achievement.
Dr. Briggs is a past president of the National Council on Measurement in Education (2021–22), a past editor of the journal Educational Measurement: Issues and Practice, and the author of the book Historical and Conceptual Foundations of Measurement in the Human Sciences: Credos and Controversies (Routledge).
Abstract: In this presentation I will describe an approach to modeling the results from an educational assessment in a way that focuses attention on the qualitative distinctions in student learning that can be inferred from a quantitative measuring scale. This approach, which I call “content-referenced growth,” has four ingredients that require a significant investment in research and design: (1) a learning progression; (2) a cross-grade scale; (3) item mapping; and (4) an interactive reporting system. The goal of content-referenced growth is to support interpretations of students’ scores in terms of both the status of their understanding at one point in time and their growth in understanding across points in time, relative to the content contained in the assessment. I introduce each ingredient and show how the four fit together in the context of newly developed learning progressions in mathematics and reading. I also discuss some preliminary results from piloting a prototype of an interactive reporting system with teachers who have experience administering and interpreting the results from the i-Ready Diagnostic, a large-scale assessment developed by the American company Curriculum Associates to support formative assessment purposes.
Dr Paula Lehane
Winner of the Kathleen Tattersall New Assessment Researcher Award 2022: Understanding Technology-Based Assessments that use Multimedia Stimuli
Dr Paula Lehane is an Assistant Professor in the School of Inclusive and Special Education. A graduate of the B.Ed in Education and Psychology programme at Mary Immaculate College Limerick, Paula started her career as a primary school teacher in a developing school in Dublin. While working as a primary school teacher, she gained extensive experience in the areas of digital education, literacy, assessment and inclusive education.
During her time as a primary school teacher, Paula also completed a Graduate Diploma in Special Educational Needs (GradDip SEN) and a Master’s in Additional Support Needs (M.Ed. ASN) at University College Dublin.
Paula subsequently worked as a research assistant with DCU’s Centre for Assessment Research, Policy & Practice in Education (CARPE), where her work focused on various issues in assessment and test development in educational and workplace settings. She received funding from the Irish Research Council (IRC) to complete her PhD in the field of digital tests and assessments for post-primary learners. It is hoped that the findings of this research will support the effective design and deployment of computer-based exams within post-primary education systems.
Technology-Based Assessments (TBAs) use items that employ a broad array of interactive, dynamic or static stimuli (e.g. simulations, animations, text-image displays). Although it is assumed that these features can make TBAs more authentic and effective, their impact on test-taker performance and behaviour has yet to be fully clarified.
This research investigated the extent to which the use of different multimedia stimuli can affect test-taker performance and behaviour, using a mixed methods approach. Guided by four main research questions, an experiment was conducted with 251 Irish post-primary students using animated and text-image versions of the same TBA of scientific literacy. Eye movement and interview data were also collected from subsets of these students (n=32 and n=12 respectively) to determine how differing multimedia stimuli can affect test-taker attentional behaviour. A second study involving 24 test-takers completing a series of simulation-type items was also undertaken. Eye movement, interview and test-score data were gathered to provide insight into test-taker engagement with these items.
The results indicated that, overall, there was no significant difference in test-taker performance when identical items used animated or text-image stimuli. However, items with dynamic stimuli often had higher discrimination indices, indicating that these items were better at distinguishing between test-takers with high and low levels of knowledge. Eye movement data also revealed that dynamic item stimuli encouraged longer average fixation durations on the response area of an item. An examination of the data on test-taker performance and behaviour for simulation-type items found a weak to moderate relationship between task performance and time-to-first-fixation on relevant information/areas.
Education systems around the world, including those of New Zealand and Ireland, are now attempting to devise their own TBAs for their terminal post-primary examinations. It is hoped that the findings of this research will act as a resource for those who wish to use TBAs in this manner. In particular, insights into test-takers’ eye movements may help to support more appropriate inferences from test scores.
Dr Harold Hislop
Dr Harold Hislop has been Chief Inspector and a member of the Management Board of the Department of Education in Ireland since 2010. Harold has led a series of reforms in the inspection and evaluation of schools and other education settings, including the introduction of school self-evaluation, the extension of inspections to early learning and care settings, and the development of a co-professional, collaborative approach to inspection that combines both evaluative and advisory functions.
As head of the Inspectorate, Dr. Harold Hislop has led the Inspectorate’s work in supporting educational policy development within the Department, especially in areas such as curriculum and assessment policy, special education policy and teacher education policy. Harold played a key role in developments such as Ireland’s successful Literacy and Numeracy Strategy 2011-2020, curricular reform at lower and upper secondary education, and the alternative end-of-schooling assessment arrangements that the Department of Education put in place during the Covid-19 crisis.
Harold has lectured or advised on school evaluation at universities in Ireland and in several other countries, including Austria, France, Malta, the United Arab Emirates and Wales. He is a Vice-Chair of the Governing Board and the Bureau of the Centre for Educational Research and Innovation at the OECD.
Professor Emer Smyth
Emer Smyth is a Research Professor at the Economic and Social Research Institute (ESRI) in Ireland. Her main research interests centre on education, school-to-work transitions, gender and comparative methodology. She has conducted a number of studies on the effects of schooling contexts on student outcomes, including Do Schools Differ?
Professor Emer Smyth led the Post-Primary Longitudinal Study (PPLS), which followed a cohort of young people from the first year of secondary education onwards, and included a survey of, and interviews with, the young people’s parents. Educational inequality has been an important focus of her research, with work on an evaluation of the Youthreach programme, a review of the School Completion Programme and the evaluation of the Delivering Equality of Opportunity in Schools (DEIS) programme.
She is Principal Investigator of Growing Up in Ireland (GUI) and has used GUI data to write reports and journal articles on the transition into primary school, arts and cultural participation among children and young people, spatial variation in child outcomes and the effects of being in a multi-grade class, among other topics.
Abstract: This keynote address considers the kinds of information that should be used in looking at reform of assessment systems. The first part of the presentation focuses on the value of taking account of student voice in looking at the effects of different approaches to assessment, especially high-stakes examinations. While students provide invaluable insights, we also need to understand the consequences of different assessment approaches for educational inequality. The second part of the presentation discusses the extent to which different forms of assessment can reinforce (or indeed counter) social inequalities in student outcomes.