<p class="fine-print"><em>NAPLAN future – The Conversation</em></p>
<h1>NAPLAN results inform schools, parents and policy. But too many kids miss the tests altogether</h1>
<p class="fine-print"><em>Published 14 March 2023</em></p>
<figure><img src="https://images.theconversation.com/files/515027/original/file-20230313-20-dv146l.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6000%2C3997&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Today the NAPLAN testing window starts for more than a million students in Years 3, 5, 7 and 9. Over the next nine days students will sit literacy and numeracy tests which are designed to measure their reading, writing, numeracy, grammar, punctuation and spelling. </p>
<p>Education decision makers will be holding their breath about how many students turn up for NAPLAN. Last year saw the <a href="https://www.edresearch.edu.au/resources/naplan-participation-who-missing-tests-and-why-it-matters">steepest declines</a> on record in secondary school student participation. </p>
<p>This is an issue because NAPLAN results help inform parents, teachers, schools and education authorities about student learning and can influence <a href="https://www.pc.gov.au/inquiries/completed/school-agreement#report">decisions about policies</a>, resources and additional supports for students. Declining NAPLAN participation may result in decisions being based on incomplete data. </p>
<p>In our <a href="https://www.edresearch.edu.au/resources/naplan-participation-who-missing-tests-and-why-it-matters">new paper</a> for the Australian Education Research Organisation, we look at who is not sitting the tests and why that matters.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/maths-anxiety-is-a-real-thing-here-are-3-ways-to-help-your-child-cope-200822">'Maths anxiety' is a real thing. Here are 3 ways to help your child cope</a>
</strong>
</em>
</p>
<hr>
<h2>Who is not sitting the tests?</h2>
<p>While primary school student participation in NAPLAN has been steady at about 95% since 2014, secondary student participation has been in persistent decline. Last year only 87% of Year 9 students sat the tests. </p>
<p>A sharper decline in participation in 2022 <a href="https://acara.edu.au/reporting/national-report-on-schooling-in-australia/national-report-on-schooling-in-australia-data-portal/student-attendance">was partly due to</a> flooding in regions across Australia, high rates of illness and COVID-19 isolation requirements – circumstances we hope will not be repeated. It is the long-term decline in NAPLAN participation in secondary schools that needs attention. </p>
<p>The participation rate is alarmingly low for some groups of students. The figure below shows 79% of Year 9 students living in remote Australia sat NAPLAN last year. First Nations students and students from educationally disadvantaged backgrounds also had low participation rates in 2022: 66% and 75% respectively. </p>
<hr>
<iframe src="https://flo.uri.sh/visualisation/13050285/embed" title="Interactive or visual content" class="flourish-embed-iframe" frameborder="0" scrolling="no" style="width:100%;height:600px;" sandbox="allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation" width="100%" height="400"></iframe>
<div style="width:100%!important;margin-top:4px!important;text-align:right!important;"><a class="flourish-credit" href="https://public.flourish.studio/visualisation/13050285/?utm_source=embed&utm_campaign=visualisation/13050285" target="_top"><img alt="Made with Flourish" src="https://public.flourish.studio/resources/made_with_flourish.svg"> </a></div>
<hr>
<p>Our analysis reveals low-performing students are also less likely to participate in the tests. Students who performed poorly in NAPLAN in Year 7 were nearly five times more likely to miss the Year 9 tests than high-performing students. These findings were replicated for primary students.</p>
<p>Students who are educationally at risk need the best decisions from schools and education authorities. If NAPLAN participation rates are low for these smaller populations, the data is less reliable and the ability to make informed decisions may be compromised. </p>
<h2>Why aren’t students sitting the tests?</h2>
<p>Students do not sit NAPLAN for three official reasons: they may be exempt from taking the tests, withdrawn by their parents, or absent on the day. </p>
<p>The main reason for the long-term decline in NAPLAN participation is that more parents have been withdrawing their children from the tests. In 2022 over 11,000 Year 9 students didn’t sit the writing test because they had been withdrawn from it.</p>
<p>Being absent is also a contributing factor in the decline in participation; more so for secondary students than primary. In 2022, more Year 9 students than usual were absent from the writing test (in total over 28,600). </p>
<iframe src="https://flo.uri.sh/visualisation/13049784/embed" title="Interactive or visual content" class="flourish-embed-iframe" frameborder="0" scrolling="no" style="width:100%;height:600px;" sandbox="allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation" width="100%" height="400"></iframe>
<div style="width:100%!important;margin-top:4px!important;text-align:right!important;"><a class="flourish-credit" href="https://public.flourish.studio/visualisation/13049784/?utm_source=embed&utm_campaign=visualisation/13049784" target="_top"><img alt="Made with Flourish" src="https://public.flourish.studio/resources/made_with_flourish.svg"> </a></div>
<p>There are many reasons students are absent and withdrawn from NAPLAN. Parents who are worried about how their child may be affected by taking the tests and <a href="https://theconversation.com/what-parents-should-and-shouldnt-say-when-talking-to-their-child-about-naplan-results-189636">receiving results</a> may choose to keep them at home or formally withdraw them from the tests. Anecdotally, there have also been <a href="https://www.heraldsun.com.au/news/schools-can-cheat-naplan-exams/news-story/8160b68c1e79ce869538913e730cdad4">reports</a> of schools asking low-performing students to stay home on testing days, so they don’t “drag down” school averages.</p>
<p>On the positive side, our analysis showed Year 9 students with language backgrounds other than English participated in higher proportions than average (92% compared to 87%). This suggests cultural differences and family attitudes to education and testing might play an important role in participation. </p>
<h2>Why is high NAPLAN participation important?</h2>
<p>NAPLAN data is used by education authorities to <a href="https://theconversation.com/five-things-we-wouldnt-know-without-naplan-94286">better understand the learning progress of all Australian students</a> to inform system-wide policies and support.</p>
<p>It also helps schools, systems and sectors to monitor and evaluate the effectiveness of educational approaches, and identifies schools which need more support. </p>
<p>For example, <a href="https://www.theeducatoronline.com/k12/news/writing-the-next-chapter-in-student-learning/281525">in NSW</a>, NAPLAN data has been used to understand whether a new teaching role and giving students more practice time have been effective in improving students’ writing skills.</p>
<p>In Victoria, <a href="https://www.theage.com.au/national/victoria/naplan-starts-this-week-here-s-what-the-changes-mean-for-students-and-parents-20230312-p5crfr.html?ref=rss&utm_medium=rss&utm_source=rss_national">Brandon Park Primary School</a> used its NAPLAN results to inform a whole school change to its teaching of reading, which brought remarkable success. </p>
<p>Given the benefits that good use of NAPLAN data can bring, it is critical the results are representative of the student groups being tested. </p>
<p>While the Australian Curriculum, Assessment and Reporting Authority estimates data for withdrawn and absent students, our analysis suggests student proficiency is likely to be overestimated.</p>
<p>That’s because students not sitting the test are more likely to be lower-performing students from their respective demographic groups. Real data is always better than estimates.</p>
<h2>What now?</h2>
<p>The Australian <a href="https://www.education.gov.au/alice-springs-mparntwe-education-declaration">education system</a> is meant to achieve equitable outcomes from education for all students.</p>
<p>Equity is something we should all expect and support. </p>
<p>To achieve it, we need accurate information about student progress on a national scale. NAPLAN is meant to provide that information, so we should support and encourage students to turn up for the tests and try their best. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-avoid-annoying-your-kids-and-getting-stressed-by-proxy-during-exam-season-200719">How to avoid annoying your kids and getting 'stressed by proxy' during exam season</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Lucy Lu is the Senior Manager, Analytics and Strategic Projects for the Australian Education Research Organisation (AERO). </span></em></p><p class="fine-print"><em><span>Olivia Groves is a Principal Researcher for the Australian Education Research Organisation (AERO).</span></em></p>
<p class="fine-print"><em>Lucy Lu, Adjunct Senior Lecturer, Faculty of Education and Social Work, University of Sydney; Olivia Groves, Adjunct Research Fellow, Curtin University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>NAPLAN 2018 summary results: a few weeks late, but otherwise little change from previous years</h1>
<p class="fine-print"><em>Published 27 August 2018</em></p>
<figure><img src="https://images.theconversation.com/files/233401/original/file-20180824-149469-1iekuqy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The current debate about comparability would be more concerning if 2018 results showed radically different trends compared to previous years, but they don't.</span> <span class="attribution"><span class="source">www.shutterstock.com</span></span></figcaption></figure><p>This year’s NAPLAN results have finally landed. The results are a few weeks behind schedule, <a href="http://www.abc.net.au/news/2018-08-08/naplan-results-delayed-over-concerns-results-invalid/10082734">due to disagreement</a> between the body that administers the test and state education officials over how scores should be reported. </p>
<p>Debate centres on whether data from the <a href="https://www.nap.edu.au/online-assessment">new online version</a> of the test and the pen-and-paper version are statistically comparable. The online version is being phased in between now and 2020, and is designed to be <a href="https://www.nap.edu.au/online-assessment/FAQs">more effective</a> due to its adaptive testing design.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-the-naplan-results-delay-is-a-storm-in-a-teacup-101321">Why the NAPLAN results delay is a storm in a teacup</a>
</strong>
</em>
</p>
<hr>
<p>The <a href="http://acara.edu.au/">Australian Curriculum, Assessment and Reporting Authority</a> (ACARA), which is responsible for NAPLAN, maintains the online and paper tests are comparable. ACARA has sought assurance from assessment experts, who say <a href="https://www.smh.com.au/education/a-storm-in-a-teacup-naplan-results-to-be-released-20180810-p4zwsi.html">the results are comparable</a>. Others disagree, including two <a href="http://www.abc.net.au/news/2018-08-27/naplan-testing-report-says-results-should-be-discarded/10160596">United States assessment experts</a> who yesterday said the online and paper results are “<a href="https://www.smh.com.au/education/a-futile-exercise-why-the-2018-naplan-results-should-be-dumped-20180824-p4zznk.html">inherently incompatible</a>” and “<a href="http://www.abc.net.au/news/2018-08-27/naplan-testing-report-says-results-should-be-discarded/10160596">should be discarded</a>”. </p>
<p>Such comments add fuel to an already red-hot fire, driven by those who <a href="https://www.theguardian.com/australia-news/2018/may/04/nsw-governments-call-to-scrap-naplan-rejected-by-simon-birmingham">want NAPLAN scrapped</a>, such as New South Wales education minister Rob Stokes, and those who want <a href="http://www.abc.net.au/news/2018-02-15/naplan-testing-faces-scrutiny-and-push-for-changes/9446842">a broad-scale national review</a>, such as Queensland education minister Grace Grace. </p>
<p>But ultimately, we can only work with the data ACARA has released, which combines online and paper data. Overall, it shows 2018 results differ very little from <a href="https://theconversation.com/naplan-2017-results-have-largely-flat-lined-and-patterns-of-inequality-continue-88132">last year’s results</a> or longer-term trends.</p>
<h2>How is NAPLAN run?</h2>
<p><a href="http://www.nap.edu.au/about">NAPLAN</a> tests all young people in all schools (government and non-government) across Australia. It takes place every year, assessing Australian school students in years three, five, seven and nine across four domains: reading, writing, language conventions (spelling, and grammar and punctuation) and numeracy.</p>
<p>This year, 20% of students completed the new test online, with the remaining 80% doing the pen-and-paper version.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">NAPLAN results were delayed due to debate about whether data from the new online version of the test and the pen-and-paper version are comparable.</span>
<span class="attribution"><span class="source">www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>NAPLAN uses an <a href="http://www.nap.edu.au/results-and-reports/how-to-interpret/scales">assessment scale</a> divided into ten bands to report student progress through years three, five, seven and nine. Band one is the lowest and ten is the highest.</p>
<p>ACARA has responsibility for the test (on behalf of federal, state and territory governments) and each year publishes NAPLAN data for every school in the nation on the publicly accessible <a href="https://www.myschool.edu.au/">My School</a> website.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/five-things-we-wouldnt-know-without-naplan-94286">Five things we wouldn't know without NAPLAN</a>
</strong>
</em>
</p>
<hr>
<h2>What did we learn this year?</h2>
<p>Working from the assumption that the two test delivery methods are comparable, ACARA’s 2018 data indicate: </p>
<ul>
<li>Tasmania and the ACT had a statistically significant decline in year five writing performance from 2017</li>
<li>WA had a statistically significant improvement in year nine grammar and punctuation performance from 2017</li>
<li>NSW, Victoria and the ACT continue to be the highest-performing systems, scoring at or above the national average across all domains and year levels</li>
<li>the Northern Territory continues to under-perform across all domains and year levels, relative to the other states and territories and in relation to national minimum standards</li>
<li>year nine students who completed the writing test online performed better, on average, than those who completed the writing test with pen and paper (according to ACARA, these differences in results are at least partly attributable to the test mode used).</li>
</ul>
<hr>
<p><iframe id="9mMlu" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/9mMlu/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<p>Similar to previous years, there are large discrepancies between year nine reading and writing results across all states. </p>
<h2>What about longer-term trends?</h2>
<p>The current debate about comparability would be more concerning if 2018 results showed radically different trends compared to previous years. But they don’t. </p>
<p>For example, we see very little change to longer-term trends, which show:</p>
<ul>
<li><p>statistically significant improvements at the national level in spelling (years three and five), reading (years three and five), numeracy (years five and nine), and grammar and punctuation (years three and seven) </p></li>
<li><p>statistically significant declines in writing achievement at the national level in years five, seven and nine (based on data from 2011 to 2018).</p></li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-is-ten-years-old-so-how-is-the-nation-faring-81565">NAPLAN is ten years old – so how is the nation faring?</a>
</strong>
</em>
</p>
<hr>
<p>It’s also very likely the final results (to be released in December) will show a continuation of <a href="https://theconversation.com/naplan-2017-results-have-largely-flat-lined-and-patterns-of-inequality-continue-88132">long-standing patterns of achievement</a> between young people from different backgrounds, which reflect broader inequalities in Australia.</p>
<h2>What are the implications moving forward?</h2>
<p>Debate over NAPLAN is unlikely to subside any time soon and it may be the case that a national review of the program ultimately emerges. It will be interesting to see what comes from the current <a href="https://qed.qld.gov.au/programs-initiatives/education/naplan-2018-review">NAPLAN review in Queensland</a> and how this contributes to broader national conversations.</p>
<p>Federal politics is also a moveable feast, with <a href="https://au.educationhq.com/news/50961/so-who-is-the-new-federal-minister-for-education/">Dan Tehan</a> assuming the role of federal education minister over the weekend, following last week’s leadership spill. Tehan replaces Simon Birmingham, who has <a href="https://www.theguardian.com/australia-news/2018/may/04/nsw-governments-call-to-scrap-naplan-rejected-by-simon-birmingham">defended</a> the merits of NAPLAN and has been central to promoting a broader reform agenda in schools. This includes recommendations coming out of the <a href="https://www.education.gov.au/review-achieve-educational-excellence-australian-schools">Gonski 2.0 report</a> released earlier this year.</p>
<p>The future of Gonski 2.0 may very well hold clues to the future of NAPLAN. The report recommends <a href="https://theconversation.com/gonski-review-reveals-another-grand-plan-to-overhaul-education-but-do-we-really-need-it-93119">an online formative assessment tool</a> be developed. This raises questions about whether such a tool, if created, might ultimately replace or serve as a supplement to NAPLAN.</p>
<p>In the short term, we will continue to see NAPLAN move online, unless major new roadblocks emerge.</p>
<p class="fine-print"><em><span>Glenn C. Savage receives funding from the Australian Research Council.</span></em></p><p class="fine-print"><em><span>Jessica Holloway and Steven Lewis do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Glenn C. Savage, ARC DECRA Fellow and Senior Lecturer in Education Policy and Sociology of Education, The University of Western Australia; Jessica Holloway, Research Fellow, Research for Educational Impact (REDI), Deakin University; Steven Lewis, Alfred Deakin Postdoctoral Research Fellow, Deakin University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>How NAPLAN could assess creativity and critical thinking</h1>
<p class="fine-print"><em>Published 22 May 2018</em></p>
<figure><img src="https://images.theconversation.com/files/219546/original/file-20180518-119911-1c6x2fb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Trends in education suggest an increased focus on the assessment and teaching of thinking skills in the future.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, the experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>Recently, some have suggested critical thinking and problem-solving skills <a href="http://www.abc.net.au/news/2018-03-07/naplan-call-review-after-report-reveals-no-change-in-decade/9519840">should be measured</a> by NAPLAN, in line with other international tests.</p>
<p>The ability to solve problems, generate creative outcomes and to analyse and evaluate are seen as <a href="http://www.p21.org/our-work/p21-framework">key capabilities for living in the 21st century</a>. Both national and state curricula and employment bodies <a href="https://www.qcaa.qld.edu.au/senior/new-snr-assessment-te/redev-snr-syll/21st-century-skills">argue these need to be taught</a>. If they have a place in contemporary education, it makes sense their gradual acquisition and use by students be monitored and assessed as they progress through school. </p>
<p>But can and should they be assessed in a NAPLAN-type context? A number of issues to do with how problem-solving, creative and critical thinking skills are learned bear on any analysis of their assessment. </p>
<h2>These are complex capacities</h2>
<p>In the case of creative thinking, researchers distinguish between <a href="https://www.researchgate.net/publication/264618782_Creative_Potential_and_Its_Measurement">creative potential</a> and the actual production of creative outcomes. This distinction applies as well to problem solving and critical thinking. </p>
<p>To generate the outcomes in each case, you need the potential to think creatively. It’s the potential that <a href="https://hal-univ-paris10.archives-ouvertes.fr/hal-01392522/document">can be assessed</a> in the school context. The <a href="https://hal-univ-paris10.archives-ouvertes.fr/hal-01392522/document">potential is displayed</a> either in the skills that are likely to lead to the outcome (such as the ability to think inferentially or divergently) or in the quality of the response to a provided task or scenario. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-teach-all-students-to-think-critically-35331">How to teach all students to think critically</a>
</strong>
</em>
</p>
<hr>
<p><a href="http://researchbank.acu.edu.au/fea_pub/4036">Research</a> on complex problem solving shows you can assess its components. Similarly, a range of tasks has been used to assess creative potential, including the <a href="http://www.ststesting.com/ngifted.htm">Torrance Test</a>. These assess and analyse components such as fluency, divergent thinking and originality. As well, there are many tests of critical thinking. You can assess your own critical thinking skills <a href="https://www.testpartnership.com/free/critical/1/">online</a>.</p>
<p>The <a href="https://www.australiancurriculum.edu.au/f-10-curriculum/general-capabilities/critical-and-creative-thinking/learning-continuum/?isFirstPageLoad=false&level=Level+1&level=Level+2&level=Level+3&level=Level+4&level=Level+5&level=Level+6">Learning Continuum of Critical and Creative Thinking</a> in the <a href="https://www.australiancurriculum.edu.au/">Australian Curriculum</a> specifies the types of thinking students will typically achieve at foundation (the first year of school), year two, four, six, eight and ten. Tasks can be used to monitor this gradual acquisition.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219676/original/file-20180521-42238-kt9c7f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Assessing problem-solving, creative and critical thinking skills is not new to Australian schools. NAPLAN has always assessed these skills.</span>
<span class="attribution"><span class="source">Eva Rinaldi/flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<h2>We have a history</h2>
<p>Assessing problem-solving, creative and critical thinking skills is not new in Australian schools. NAPLAN tasks have always assessed these skills. The more difficult tasks of NAPLAN reading ask students to evaluate and analyse texts they read, to infer and to think divergently, to synthesise and to compare. </p>
<p>NAPLAN numeracy requires students to apply these to quantitative and numerical contexts. Several assessment tools developed by <a href="https://www.acer.org/">Australian Council for Educational Research</a> (ACER) since the 1980s have assessed a range of thinking and problem solving skills. One example is the <a href="https://www.acer.org/tsa/jenkins-non-verbal-test/">Jenkins Non-Verbal Reasoning Test</a>. </p>
<p>Teachers and schools in Victoria are <a href="https://theconversation.com/schools-will-teach-soft-skills-from-2017-but-assessing-them-presents-a-challenge-68749">currently required</a> to report student progress against the critical and creative thinking capabilities. The <a href="http://www.vcaa.vic.edu.au/">Victorian Curriculum and Assessment Authority</a> (VCAA) is researching contemporary assessment procedures and has developed an <a href="https://fuse.education.vic.gov.au/ResourcePackage/LandingPage?ObjectId=bc6bd542-5bb0-4887-9063-d6db7e1d1d31&SearchScope=All">online assessment tool</a> to assist schools to implement the critical and creative thinking capabilities.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/gonski-2-0-teaching-creativity-and-critical-thinking-through-the-curriculum-is-already-happening-95922">Gonski 2.0: teaching creativity and critical thinking through the curriculum is already happening</a>
</strong>
</em>
</p>
<hr>
<p>This move is in line with international trends. Creative thinking, for example, will be included as a test domain <a href="http://www.mitchellinstitute.org.au/opinion/pisa-moving-towards-creativity/">in PISA 2021</a>.</p>
<h2>Should thinking skills be assessed nationally in NAPLAN?</h2>
<p>A number of questions are relevant.</p>
<p><strong>What specifically will be assessed?</strong> One decision that needs to be made is whether to measure the outcomes or the thinking skills that can potentially contribute to the outcomes. </p>
<p><strong>What assessment tools might be used?</strong> If we assume the assessment might examine students’ use of thinking skills, then it would be possible to draw on the current work being done internationally. For example, the work of assessment authorities and the various international projects examining the assessment of <a href="http://www.p21.org/our-work/p21-framework">21st century skills</a>. </p>
<p>Multiple choice formats have been used for decades to assess the application of thinking skills. In a contemporary online context, these formats could be combined with the use of branched testing to offer students the opportunity to display more complex and demanding thinking skills. This format is essentially similar to the adaptive procedure in <a href="http://www.vcaa.vic.edu.au/Pages/prep10/ondemand/index.aspx">On Demand Testing</a> in Victoria. The multiple choice format obviously comes with the assumption the assessment of separate skills is valid for the intended purpose. </p>
<p>The written response format <a href="http://fusecontent.education.vic.gov.au/858ab3f9-32f7-4f3d-a74e-cdb022cc1306/CCT_JoannasTrip.pdf">currently being trialled by the Victorian Curriculum and Assessment Authority</a> to assess critical and creative thinking exemplifies a second type of task format. These tasks attempt to take account of the influence of the domain of knowledge about which the student is thinking. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219678/original/file-20180521-42203-9v6or6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The shift in focus to creativity and critical thinking is in line with a broader international shift.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p><strong>How will the data be interpreted and used?</strong> This is a current criticism of NAPLAN. Every item on NAPLAN Reading <a href="https://theconversation.com/criticise-if-you-must-the-naplan-tests-are-valuable-for-teachers-32108">assesses</a> a student’s ability to independently use specific skills that contribute to efficient reading comprehension. This has not been sufficient to stem the <a href="https://education.unimelb.edu.au/news_and_activities/projects/archived_projects/naplan/naplans_opinion">current criticisms made of it</a>. Assessment authorities would need to explicitly clarify the intended purposes and align them with the protocols used. </p>
<p>Dialogue about the assessment of these skills has recommended a shift from assessing students on the outcome of a program to assessing students’ development at a particular time through <a href="http://www2.curtin.edu.au/edusummit/local/docs/Pre-summit_brief_paper_TWG5_-_Assessment.pdf">regular assessment during learning</a>. Contemporary online assessments could support this, and it may be possible to collect data appropriate for both purposes. </p>
<p><strong>Why and by whom will the data that is collected be seen as useful?</strong> You will have your perspective on possible benefits in assessing skills in this area. These are the skills that determine the quality of knowledge that students construct and their capacity to comprehend, to make decisions and to innovate. Some educators see its relevance to <a href="https://www.teachermagazine.com.au/articles/teaching-thinking-skills-in-school">planning students’ future</a> educational experiences. Education policy makers <a href="https://www.premier.vic.gov.au/wp-content/uploads/2017/01/170126-World-First-Education-Target-For-Victorian-Students.pdf">are interested</a> in the capacity of students to think independently and autonomously in the ways that are assessed. </p>
<h2>Is testing creativity critical?</h2>
<p>The Victorian government has adopted the Education State <a href="https://www.theage.com.au/national/victoria/do-you-have-the-new-skills-victorian-students-are-being-tested-for-20180202-p4yz9n.html">goal</a> that by 2025, 25% more year 10 students will reach the highest levels of achievement in critical and creative thinking skills. Links with the future development of Australian culture <a href="https://research.acer.edu.au/cgi/viewcontent.cgi?article=1018&context=transitions_misc">have been made</a>. </p>
<p>Trends in education suggest an increased focus on the assessment and teaching of thinking skills in the future. One possible direction is a dual assessment approach that includes both school and state or national tiers. </p>
<p>Schools would monitor and assess the knowledge, thinking and emotional engagement displayed by individual students in their pursuit of subject specific outcomes. This can be through a range of avenues such as problem solution, projects, productions and portfolios. </p>
<p>State or national educational authorities could monitor the gradual acquisition of broad-based thinking skills: the potential to think creatively and critically and to solve problems.</p>
<p class="fine-print"><em><span>John Munro does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>To assess problem-solving, creative and critical thinking skills on NAPLAN would fit with broader movements in education internationally, but there are some questions to address first.John Munro, Professor, Faculty of Education and Arts, Australian Catholic UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/933562018-05-18T04:53:45Z2018-05-18T04:53:45ZBeyond ‘dumb’ tests: NAPLAN needs to value regional, rural and remote students<figure><img src="https://images.theconversation.com/files/217153/original/file-20180502-153884-1tevdes.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">NAPLAN is a "dumb" test administered to all students, regardless of contextual factors such as location and culture.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the Future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, the experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>The recently released <a href="https://www.education.gov.au/independent-review-regional-rural-and-remote-education">Independent Review into Rural, Regional and Remote Education</a> again noted students in non-metropolitan areas perform, on average, below those in metropolitan areas. One such measure commonly used to make this claim is students’ NAPLAN results.</p>
<p>But the problem with NAPLAN is that it’s a “dumb” test: administered to all students, regardless of contextual factors such as <a href="https://theconversation.com/standardised-tests-are-culturally-biased-against-rural-students-86305">location and culture</a>. This is said to be a benefit, as it provides a measure comparable across all students.</p>
<p>Monitoring the performance of a system is an important and necessary public policy process. But in practice, no tool is able to neutralise the influence of contextual factors. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/standardised-tests-are-culturally-biased-against-rural-students-86305">Standardised tests are culturally biased against rural students</a>
</strong>
</em>
</p>
<hr>
<h2>Relevance over one size fits all</h2>
<p>The review questioned how relevant the “metro-centric” nature of these assessments was to those in rural areas. The review gave the example “of a question asking what you could see at the busy train station — of which many rural, regional and remote students have no experience and therefore no concepts to be able to respond accurately”. </p>
<p>The review went on to ask “do rural, regional and remote students (and others) have knowledge and skills which they value and find useful but which are not measured and therefore not valued more widely?” </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=374&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=374&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=374&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=469&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=469&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=469&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Asking metropolitan students questions that draw on life in the country could promote a common understanding of country life.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The problem of relevance raises two possibilities for the future of NAPLAN in its current form: the inclusion of different questions drawing on different knowledge, or questions asked in different ways that still assess the same basic skill. </p>
<p>In modern educational assessment, there is no need for all students to sit the same test, asking the same questions, on the same day. We are smarter than that. We can test the same skills that need monitoring through different types of questions that differentiate for students’ contexts and prior learning. In this way, we can build on the experiences of students in city or rural schools, or in two metropolitan schools in culturally diverse communities. </p>
<p>Does a question have to draw on the example of a train timetable to assess numeracy, or ask students to write about a “day at the beach” to assess writing? Logically the answer would be no. This may be where the move to purpose-designed online testing could be beneficial.</p>
<p>We could, for instance, include examples of country life in the curriculum, such as calving, and ask metropolitan students questions drawing on life in the country, to promote a common understanding of country life.</p>
<p>Currently we don’t, but we do include metropolitan examples, and this highlights the bias. This is again an issue picked up by the review of rural, regional and remote education, which noted the need for direct participation of educators and communities in the development of curriculum and assessment. Too often consultation is optional, and not representative of the community itself.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Creating a curriculum based on more universal concepts might be a positive for the future of NAPLAN testing.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-solve-australias-rural-school-challenge-focus-on-research-and-communities-94979">How to solve Australia's 'rural school challenge': focus on research and communities</a>
</strong>
</em>
</p>
<hr>
<h2>Making education contextually meaningful</h2>
<p><a href="http://www.canberra.edu.au/researchrepository/file/b0d2d92b-b2c0-4691-aed3-d456128768b2/1/full_text_published.pdf">Students do better</a> when the course material is more relevant to them. Creating a curriculum based on more universal concepts and allowing teachers to choose the examples they use to illustrate the concepts is achievable. It happens in the <a href="http://www.bsss.act.edu.au/information_for_students/whats_moderation">ACT</a>, <a href="https://www.sace.sa.edu.au/teaching/assessment/school-assessment">South Australia</a> and Queensland, where teachers moderate assessment across schools <a href="https://www.qcaa.qld.edu.au/k-12-policies/student-assessment/p-12-resources">as part of their work</a>.</p>
<p>Previously we had the “<a href="https://trove.nla.gov.au/work/5767388?selectedversion=NBD7309995">Country Areas Program</a>” which was aimed at working with communities to make schooling meaningful and relevant to students’ lives.</p>
<p>But this was replaced in 2009 by <a href="https://www.acara.edu.au/reporting/national-report-on-schooling-in-australia-2011/national-initiatives-and-achievements/partnerships">national partnerships</a>, which were based on literacy and numeracy benchmarks. These also assumed all non-metropolitan schools were low socioeconomic status schools because of their NAPLAN scores, ignoring the cultural relevancy problems of the test. As such, a metro-centric, one-size-fits-all approach to school improvement was applied. </p>
<p>This was around the same time the tests that preceded NAPLAN (for instance the basic skills test in NSW) went “high-stakes” by being publicly reported on the MySchool website. Clearly we want all kids reaching minimum benchmarks, but that approach doesn’t examine the nature of the benchmark or the way it’s measured. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=337&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=337&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=337&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">There have been many tests, like the basic skills test in NSW, for many years without the current problems of NAPLAN.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Instead, the result is a high-stakes measure applied in a census-like fashion – the same test for everyone – that distorts practice. It’s perhaps not surprising that analysis shows results have not improved, and <a href="http://www.abc.net.au/news/2018-03-07/naplan-call-review-after-report-reveals-no-change-in-decade/9519840">suggests a decline in results for disadvantaged groups</a> in the ten years NAPLAN has been operating. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-only-tells-part-of-the-story-of-student-achievement-86144">NAPLAN only tells part of the story of student achievement</a>
</strong>
</em>
</p>
<hr>
<h2>The medium is not the message</h2>
<p>We’ve had tests for many years without the current problems of NAPLAN. Part of the problem is the publication of results to compare achievement, and the high stakes this creates. This is exacerbated by the nature of NAPLAN as a “dumb test”.</p>
<p>A simple change would be to continue NAPLAN testing, but using “smarter tests” and valuing teachers’ professionalism by engaging them in moderation practices. In this way we can get a genuine picture of how all students, regardless of location and cultural background, are travelling. </p>
<p>This approach would enhance our appreciation of the diversity of students and enhance the skills of the profession. Teachers would be comparing work samples in response to questions drawing on the local knowledge of students from Menindee and Manly. This would force teachers to focus on the skills of literacy and numeracy, and understand how they are enacted differently in different places. </p>
<p>Alternatively, testing random schools as samples is an equally valid approach to monitoring a system. This is used in international tests such as <a href="https://www.acer.org/ozpisa">PISA</a> and <a href="https://www.acer.org/timss">TIMSS</a>.</p>
<p>Ultimately we need to be smarter. We need to move away from dumb tests that treat the profession as incapable of measuring student performance in a valid and meaningful way, and students and communities as neutral cultures without histories.</p>
<p class="fine-print"><em><span>Philip Roberts receives funding from the Australian Government. </span></em></p>There is no need for all students to sit the same test, that asks the same questions, on the same day. We are smarter than that.Philip Roberts, Associate professor, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/942862018-05-14T20:18:34Z2018-05-14T20:18:34ZFive things we wouldn’t know without NAPLAN<figure><img src="https://images.theconversation.com/files/218528/original/file-20180511-34021-ja3n9v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">We may need to rethink how NAPLAN is used, but overall it's an important tool for researchers and policymakers.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the Future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, the experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>NAPLAN, the National Assessment Program – Literacy and Numeracy, has been a prominent part of Australia’s education landscape since 2008, when it was introduced by the then education minister, Julia Gillard. </p>
<p>It’s a controversial test, lauded by some but disliked by many. </p>
<p>Ten years on, the role of NAPLAN is under question. Some argue it should be dropped entirely. Here’s why it’s a vital navigation tool for policymakers and researchers. </p>
<h2>What is NAPLAN?</h2>
<p>Every May, Australian school students in years 3, 5, 7 and 9 sit standardised tests in reading, writing, numeracy, spelling and grammar. </p>
<p>A great virtue of NAPLAN is that each domain is scored on a single scale. Achievement can be compared across different school year levels, courtesy of a common learning progression for all levels of the NAPLAN tests. This lets us analyse the learning growth of specific groups of students as they move through school. </p>
<p>I have <a href="https://grattan.edu.au/report/targeted-teaching-how-better-use-of-data-can-improve-student-learning/">consistently argued</a> the best way to lift achievement is to maximise individual learning progress. The same theme underpins the <a href="https://docs.education.gov.au/system/files/doc/other/662684_tgta_accessible_final_0.pdf">Gonski 2.0</a> report. And if we want to lift learning progress at scale, we must be able to measure it.</p>
<h2>What is NAPLAN used for?</h2>
<p>There are many claims about the benefits of NAPLAN, each of which deserves scrutiny on its merits. For example, using NAPLAN:</p>
<ul>
<li><p>policymakers and researchers can better understand student performance, to inform system-wide policies, support and resource allocation for schools</p></li>
<li><p>teachers can use the data as a diagnostic tool to improve teaching in the classroom</p></li>
<li><p>parents can make more informed choices about where to send their children, via the <a href="https://www.myschool.edu.au">My School website</a>, which publishes school-level results</p></li>
<li><p><a href="http://www.nap.edu.au/docs/default-source/default-document-library/naplan-2018-information-brochure-for-parents-and-carers.pdf?sfvrsn=2">parents have more information</a> about how their child is progressing relative to others.</p></li>
</ul>
<p>Focusing just on the first point, here are five things we know a lot more about because of NAPLAN.</p>
<h2>1. Achievement gaps for Indigenous students</h2>
<p>Indigenous students don’t achieve at the same level as their non-Indigenous peers. While this has been known for decades, we would not know just how large some of these gaps are without NAPLAN, or how the gaps have changed over time. </p>
<p><a href="https://theconversation.com/closing-the-gap-in-indigenous-literacy-and-numeracy-not-remotely-or-in-cities-88704">At a national level</a>, year 9 Indigenous students are on average three years behind non-Indigenous students in numeracy, 3.4 years behind in reading, and 4.2 years behind in writing. </p>
<p><strong>Indigenous students are three to four years behind by year 9</strong></p>
<hr>
<p><iframe id="tc-infographic-232" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/232/28f1b896efa03a8df4a31cca8c5719f3ab438d30/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<figure><figcaption>Mean NAPLAN scores from 2010 to 2017 (2011 to 2017 for Writing) translated to equivalent year level (EYL) using an updated version of the methodology in Widening Gaps (2016). The EYL scale is based on the average performance of metropolitan non-Indigenous students over 2010-2016. <b> Source: Grattan Institute/The Conversation</b></figcaption>
</figure>
<hr>
<p>Translating NAPLAN scores into <a href="https://grattan.edu.au/report/widening-gaps/">equivalent year levels</a> makes it much easier to understand and compare performance across student groups. But the Indigenous gap is so large that no fancy mathematics is needed: year 9 Indigenous students scored on average 465 in NAPLAN writing in 2017, below the 480 non-Indigenous students scored in year 5. </p>
<p>The gaps are even larger in very remote areas where Indigenous students are more than seven years behind in writing.</p>
<h2>2. Progress gaps for students in disadvantaged schools</h2>
<p>Students in disadvantaged schools perform worse. Again, not news.</p>
<p>What’s more of a surprise is that, when we tracked a cohort of Victorian students across all four of their NAPLAN tests, the size of the gap tripled, from one year and three months in year 3 to three years and eight months in year 9. </p>
<p>Even more concerning was the finding when we compared students with comparable capabilities early in their schooling. From the same year 3 starting score, students in disadvantaged schools fall more than two years behind by year 9, with potentially high-achieving students missing out the most. </p>
<p><strong>Students with similar early potential do worse in disadvantaged schools, especially high-achieving students</strong></p>
<hr>
<p><iframe id="tc-infographic-271" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/271/51f3dcb640932a8e4c3aac59cf77c771f923c095/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<h2>3. Comparison among states</h2>
<p>The states and territories are responsible for running school education in Australia. Different states and territories take different approaches. </p>
<p>In theory, this means jurisdictions can learn from each other. But this requires accurate comparisons, which take account of socio-economic differences. For example, parents in some states have higher levels of education than in others.</p>
<p>On a like-for-like basis, comparable students are achieving at very different levels depending where they live in Australia. </p>
<p><strong>States and territories have very different levels of achievement when compared on a like-for-like basis</strong></p>
<hr>
<iframe src="https://datawrapper.dwcdn.net/hhTNy/4/" scrolling="no" frameborder="0" allowtransparency="true" width="100%" height="857"></iframe>
<hr>
<p>Looking at the next level of detail makes it clear no state or territory can afford to be complacent. </p>
<p>For example, New South Wales has the highest levels of achievement of any state for students whose parents have a university degree, but its disadvantaged students make less progress than the national average. By contrast, Victoria has the highest achievement levels for students whose parents didn’t finish school, but is not stretching its most advantaged students in the same way. </p>
<h2>4. Changes over time</h2>
<p>NAPLAN has now been running for long enough to identify trends over time. Too often, the story is one of stagnation. But there are bright spots, including the early years in Queensland. </p>
<p><strong>Relative to the rest of Australia, Queensland has increased its year 3 numeracy and reading scores by three to four months since 2010</strong></p>
<hr>
<p><iframe id="5aOP2" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/5aOP2/3/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<p>It’s interesting to note 2010 was the first NAPLAN cohort where Queensland students started school with a prep year. This probably accounts for some of the improvement. But it’s also notable that the relative levels of achievement have improved over time, not just in a single step. This suggests Queensland’s education system is getting some other things right.</p>
<p>The richness of NAPLAN data allows us to spot much more subtle patterns as well. For example, while very remote Indigenous students are doing very poorly in writing, there are <a href="https://theconversation.com/closing-the-gap-in-indigenous-literacy-and-numeracy-not-remotely-or-in-cities-88704">signs of improvement in this cohort in NSW</a>. This level of granular analysis would not be possible without the NAPLAN tests being done every year by all schools.</p>
<h2>5. Identifying high-growth schools</h2>
<p>The “holy grail” for many advocates of NAPLAN is to use it to identify the schools that are most effective in maximising student learning growth, and to apply lessons from those schools to others not adding as much value. </p>
<p>This is easier said than done, not least because the socioeconomic mix in each school affects the rate of learning growth as well as the students’ achievement. </p>
<p>New analysis, taking socioeconomic factors into account, shows that about 8% of schools have “beaten their odds” for all five cohorts for which we have reliable NAPLAN progress data. Given this would only occur 3% of the time for a coin toss, we can confidently say that at least 5% of Australia’s schools are routinely outperforming. </p>
<p><strong>About 5% of schools are routinely doing better than we would expect given their student population mix</strong></p>
<hr>
<p><iframe id="bAwme" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/bAwme/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
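The coin-toss reasoning behind the 5% figure can be checked with a few lines of arithmetic. The 8% observed share is the figure given above, and the null assumption that a school "beats its odds" in any one cohort half the time is the article's own simplification.

```python
# Under the null hypothesis, a school beats its odds in any one cohort
# about half the time, so beating the odds in all five cohorts by luck
# alone has probability 0.5 ** 5.
p_chance = 0.5 ** 5        # 0.03125, i.e. about 3% of schools by chance
p_observed = 0.08          # share of schools beating the odds in all five cohorts
p_excess = p_observed - p_chance   # likely genuine outperformers
print(f"{p_chance:.1%} by chance, {p_excess:.1%} excess")
# prints: 3.1% by chance, 4.9% excess
```

The excess of roughly five percentage points over what luck alone would produce is what supports the claim that at least 5% of schools are routinely outperforming.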
<p>Of course, NAPLAN can’t tell us why these schools are different. Maybe it’s what the schools and their teachers are doing. Maybe it’s the nature of their incoming cohort. Whatever it is, we need to know. </p>
<h2>Where to from here</h2>
<p>NAPLAN is an imperfect navigation tool. It certainly doesn’t have GPS-like levels of precision. But giving up on NAPLAN would be like 19th-century sailors dumping sextants and chronometers in favour of returning to using the stars, wind and currents to navigate. </p>
<p>Maybe we need to rethink how NAPLAN is used, but overall it should be kept.</p>
<hr>
<p><em>This article has been updated since publication to remove quotes from Les Perelman from the write-off at the top of the piece.</em></p>
<p class="fine-print"><em><span>Grattan Institute began with contributions to its endowment of $15 million from each of the Federal and Victorian Governments. In order to safeguard its independence, Grattan Institute’s board controls this endowment. The funds are invested and Grattan uses the income to pursue its activities.</span></em></p>While we may need to rethink how we use NAPLAN, it is an important and useful tool for researchers and policy makers.Peter Goss, School Education Program Director, Grattan InstituteLicensed as Creative Commons – attribution, no derivatives.