
Unpacking ISA reports: Key concepts

Feature · 6-minute read

ACER’s most recent webinar unpacked the key concepts behind the International Schools’ Assessment (ISA) reports. Facilitated by Kimberley Veart, the session provided practical advice on how to use the reports effectively and how the data can support teaching. Sarah Elder, ISA Project Director, took attendees through the scale scores, proficiency levels and international comparisons that make up the core of the ISA reporting suite, while Marc Kralj, Educational Consultant, demonstrated the ISA reports in practice.

The purpose of the ISA program 

Launched by ACER in 2002, the ISA program evolved from discussions with international schools in the East Asian region. It fulfilled the need for an assessment that catered specifically to students from diverse linguistic and cultural backgrounds, rather than a repurposed national assessment.

The ISA program supports schools to: 

  • monitor student progress over time 

  • compare school performance internationally with ‘like school’ groups, schools in the same region (Africa, the Americas, Asia-Pacific, Europe or the Middle East) and all schools 

  • inform improvements in teaching and learning. 

Offered for Grades 3–10, the assessment includes a variety of question types across Mathematical Literacy, Reading, Writing and Scientific Literacy.  

The reporting suite compiles all the test results in a variety of displays so that teachers can identify their students’ strengths, weaknesses or gaps in knowledge and see how their performance compares to the international cohort. 

Scale scores and proficiency levels 

In line with ACER’s Progressive Achievement approach, ISA reports use scale scores, which estimate students’ level of attainment in a learning area. Because the scale is shared across grade levels, all students can be compared across grades and from one year to the next, providing quantitative evidence of student progress over time.

The scale score ranges that fall within each level for Mathematical Literacy

Mathematical Literacy level    Scale score range
Level 9                        670 and above
Level 8                        608–669
Level 7                        545–607
Level 6                        483–544
Level 5                        421–482
Level 4                        359–420
Level 3                        297–358
Level 2                        235–296
Level 1                        173–234
Below Level 1                  Below 173

Proficiency levels tie these scale scores to a described progression for each learning area. The levels provide qualitative information about what progress typically looks like for students and can inform their next steps. Each learning area has its own defined scale and levels.
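To make the score-to-level relationship concrete, here is a minimal sketch in Python of how a Mathematical Literacy scale score maps to a proficiency level, using the cut-points from the table above. This is illustrative only: the function and data structure are our own, and the ISA reports perform this conversion for you.

```python
# Illustrative only: the ISA reports map scale scores to levels for you.
# Cut-points are the lower bounds of each Mathematical Literacy level,
# taken from the table above.
MATH_LITERACY_CUT_POINTS = [
    (670, 'Level 9'),
    (608, 'Level 8'),
    (545, 'Level 7'),
    (483, 'Level 6'),
    (421, 'Level 5'),
    (359, 'Level 4'),
    (297, 'Level 3'),
    (235, 'Level 2'),
    (173, 'Level 1'),
]

def proficiency_level(scale_score: int) -> str:
    """Return the proficiency level for a Mathematical Literacy scale score."""
    for lower_bound, level in MATH_LITERACY_CUT_POINTS:
        if scale_score >= lower_bound:
            return level
    return 'Below Level 1'

# A scale score of 512 falls in the 483-544 band:
print(proficiency_level(512))  # Level 6
```

Reading off the described levels below, a student at Level 6 in Uncertainty and Data would typically be interpreting statistical information and linking different information sources.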

The described proficiency levels for Uncertainty and Data in Mathematical Literacy

Level 9: Use high-level thinking and reasoning skills in statistical or probabilistic contexts to create mathematical representations of real-world situations. Use insight and reflection to solve problems, and to formulate and communicate arguments and explanations.

Level 8: Apply probabilistic and statistical knowledge in problem situations that are somewhat structured and where the mathematical representation is partially apparent. Use reasoning and insight to interpret and analyse given information, to develop appropriate models and to perform sequential calculation processes. Communicate reasons and arguments.

Level 7: Use basic statistical and probabilistic concepts combined with numerical reasoning in less familiar contexts to solve problems. Carry out multi-step or sequential calculation processes. Use and communicate argumentation based on interpretation of data.

Level 6: Interpret statistical information and data, and link different information sources. Use basic reasoning with simple probability concepts, symbols and conventions and communication of reasoning.

Level 5: Locate statistical information presented in a variety of familiar forms. Understand basic statistical concepts and conventions. Solve probability problems using ideas related to mathematical games and experiments.

Level 4: Solve problems using data presented in simple graphs or tables. Understand and use basic probabilistic ideas in familiar experimental contexts.

Level 3: Locate information presented in simple graphs or tables. Investigate and order chance events.

Level 2: Sort and order data to create graphs in a variety of forms. Use the language of chance to order the possible outcomes of familiar events.

Level 1: Sort and order information from the immediate environment to compare quantities and create simple graphs. Use the everyday language of chance.

Level 0: Locate information presented in a simple pictograph.

The relationship between scale scores and proficiency levels shown in the reporting is evidence that students of the same age and the same grade level can be at very different points in their learning. Teachers should consider where each student sits: who is performing below the cohort, who is at level and who is performing above level.

The ISA in practice 

Using the learning area of Mathematical Literacy as an example, Mr Kralj went through a detailed, practical demonstration of analysing the data from the ISA reports. The key takeaways were:  

  • How students engage with the assessment
    How students approach and respond to assessments is influenced by a range of factors: students may guess when they are unsure, and their language capability, literacy within the discipline, and prior knowledge and experience can vary. ‘When we're watching our students being assessed or engaging in the assessment, we're wondering what they're thinking and what's going through their minds.’

  • Learning progressions
    In addition to identifying differences in students’ starting points, teachers should consider where each student sits in their understanding. A student at a particular level is not static: they may show strengths in some areas and a particular weakness in others.

  • Areas for improvement
    Reviewing the questions a student answers correctly or incorrectly informs teachers’ understanding of how the student, and the class, engaged with the test: did they get a difficult question wrong? If so, did most students get that question wrong? A formative assessment task to check in could provide further information. (A simple version of this check is sketched after this list.)

  • Weaknesses versus gaps
    A weakness is most likely something that has already been covered but needs reinforcement before students master it. A gap is something that has not been covered for one reason or another. Identifying both can help schools plan to adjust their approach. ‘[From this,] we begin to develop a broader idea about what our short and long-term plans are for the school based upon the evidence we're collecting to ensure that we are meeting and supporting the outcomes of our students.’ An example of this in practice is in our interview with Southbank International School, London.  

  • Teachers know best
    Teachers inherently know the capabilities of their students and can most effectively interrogate the ‘why’ behind results. ‘[Teachers] don't just do an assessment based upon the results [students are] doing. We also observe how students perform in the classroom, as well as how they perform in an assessment.’ 
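As a rough sketch of the item-level check described under ‘Areas for improvement’ above, the snippet below computes the proportion of a class answering each question correctly. The question labels and results are hypothetical; the ISA reports surface this item-level information directly.

```python
# Hypothetical class results: 1 = correct, 0 = incorrect, one list per question.
# The ISA reports provide this item-level view; this sketch just shows the idea.
class_results = {
    'Q1 (easy)': [1, 1, 1, 0, 1, 1],
    'Q2 (difficult)': [0, 1, 0, 0, 1, 0],
    'Q3 (moderate)': [1, 0, 1, 1, 1, 1],
}

for question, answers in class_results.items():
    percent_correct = 100 * sum(answers) / len(answers)
    print(f'{question}: {percent_correct:.0f}% of the class answered correctly')

# If most students missed a difficult question, a formative check-in can help
# distinguish a class-wide gap from an individual weakness.
```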

Find out more 

Watch the complete webinar session

If you have any questions or queries, please contact the team at isa@acer.org.

Subscribe to the Discover newsletter
