<h1>The latest NAPLAN results don’t look great but we need to go beyond the headline figures</h1>
<p class="fine-print"><em><span>Published 2023-08-23</span></em></p>
<p>This year’s national NAPLAN <a href="https://www.acara.edu.au/reporting/national-report-on-schooling-in-australia/naplan-national-results">results are out</a>, with the <a href="https://www.abc.net.au/news/2023-08-23/one-in-three-students-not-meeting-naplan-standards/102756262">news that only two-thirds</a> of Australian students met <a href="https://www.nap.edu.au/naplan/results-and-reports/proficiency-level-descriptions">minimum achievement levels</a> in literacy and numeracy.</p>
<p>The headlines are everything we would expect them to be – full of panic. Most reporting is focused on the number of Australian students not meeting the new proficiency standards, with talk of “<a href="https://www.skynews.com.au/australia-news/politics/albanese-government-under-pressure-following-failed-naplan-expectations/video/121439875cd205797ed7a27122a1135b">failure</a>” and “<a href="https://www.theaustralian.com.au/subscribe/news/1/?sourceCode=TAWEB_WRE170_a_GGL&dest=https%3A%2F%2Fwww.theaustralian.com.au%2Fnation%2Fpolitics%2F662bn-debacle-one-in-three-kids-fails-naplan-literacy-numeracy%2Fnews-story%2Ffdb0cde16efe5262ffe08024d5a7bc2c&memtype=anonymous&mode=premium&v21=dynamic-low-test-score&V21spcbehaviour=append">debacles</a>”. </p>
<p>The numbers certainly don’t look great, but should we be worried?</p>
<h2>Changes to NAPLAN</h2>
<p>NAPLAN was introduced in 2008 and is an annual test of all Australian students in years 3, 5, 7 and 9. It aims to see whether students are developing basic skills in literacy and numeracy.</p>
<p>On Wednesday, we saw the overall results released. Individual student reports will go home during term 3, via schools. </p>
<p>Earlier this year, NAPLAN <a href="https://theconversation.com/what-do-the-naplan-test-changes-mean-for-schools-and-students-199764">underwent significant changes</a>. These changes included a shift to online testing, moving the testing dates forward and new proficiency standards. </p>
<p>At the time of the announcement, many education experts <a href="https://www.afr.com/politics/federal/naplan-changes-aim-to-fix-the-underachievement-problem-20230210-p5cjhr">warned</a> that 2023 results might be lower than usual. </p>
<p>Many pointed to the shift from ten proficiency bands to four achievement levels (“needs additional support,” “developing,” “strong” and “exceeding”). This likely explains a lot of what we’re seeing today. It also means we cannot compare this year’s results with previous results.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1626766536273715203&quot;}"></div>
<h2>The shift to online testing</h2>
<p>The shift to online testing may also have had a significant <a href="https://www.future-ed.org/paper-vs-online-testing-whats-the-impact-on-test-scores/">impact</a> on results. </p>
<p>Disparities in access to technology <a href="https://www.urban.org/urban-wire/even-pandemic-students-limited-technology-access-lagged-behind-their-peers">can impact</a> how students perform on the test. Students who regularly use computers and the internet at home are likely to feel more confident while taking an online test. Students without such access might struggle with basic computer skills. This can lead to more mistakes that have nothing to do with numeracy and literacy.</p>
<p>The change in the testing window from May to March also means schools had less time to prepare students for NAPLAN in 2023. Theoretically, this might have a positive impact on education in the long run. Less time can be devoted to “test prep” or “teaching to the test”, freeing up time for more authentic learning activities. But this year, the change caught schools off guard, which may have affected student performance.</p>
<p>We also shouldn’t forget about the impact of COVID. It is hard to estimate all the ways students have been affected by the pandemic. We can assume these effects will be felt for years to come, and we should continue to interpret NAPLAN results with this in mind.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1566940423569932288&quot;}"></div>
<h2>Disparities and funding</h2>
<p>What we should be worried about is the clear disparity between Australia’s most vulnerable students and their peers. </p>
<p>Like every other year, NAPLAN results <a href="https://www.theguardian.com/australia-news/2023/aug/23/australia-naplan-results-literacy-numeracy-nsw-qld-vic-sa-nt-tas-wa">show</a> significant gaps between Indigenous students and their peers. About one-third of Indigenous students “need additional support”, compared to one-tenth of students overall. Some 50% of students in the most remote regions of Australia also “need additional support”. </p>
<p>This is not a new concern, and one that experts have been <a href="https://www.afr.com/policy/health-and-education/an-inquisition-into-australia-s-great-school-funding-rort-20220913-p5bhtj">worried</a> about for many years. While politicians often blame schools and teachers, the real problem is inequitable funding. Public schools are responsible for teaching most students who require additional support, yet they are <a href="https://www.theguardian.com/education/2023/jul/17/gonski-review-government-funding-private-public-schools">not adequately funded</a> to do so.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/fewer-than-1-in-5-students-who-are-behind-in-year-3-catch-up-and-stay-caught-up-211516">Fewer than 1 in 5 students who are behind in Year 3 catch up and stay caught up</a>
</strong>
</em>
</p>
<hr>
<h2>Proceed with caution</h2>
<p>We must interpret this year’s NAPLAN results with caution. Our instinct might be to panic, but the reality is that significant changes to the test have driven these results. It might take a few years before we can make meaningful sense of overall progress and change.</p>
<p>We can also look to some experts’ <a href="https://www.afr.com/policy/health-and-education/rebooted-naplan-may-be-the-wake-up-call-australia-needs-20230315-p5cs9w#:%7E:text=NAPLAN%20is%20a%20cornerstone%20of,ammunition%20to%20drive%20important%20improvements.&text=The%20annual%20tests%20of%20school,numeracy%20are%20in%20full%20swing.">optimism</a> about the changes. They say the new achievement levels and earlier testing dates will eventually lead to simpler and more useful results. They hope this means better communication between schools and families, as well as more time for schools to act.</p>
<p>Importantly, we should not interpret this year’s results as an indictment of schools. Rather, we should press governments to <a href="https://theconversation.com/a-new-report-proposes-full-public-funding-for-private-schools-but-theres-a-catch-203840">fully fund schools</a> to the level they have said is necessary. This year’s results leave no question about the urgency of equitable funding.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-type-of-school-does-matter-when-it-comes-to-a-childs-academic-performance-199886">The type of school does matter when it comes to a child's academic performance</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Jessica Holloway receives funding from the Australian Research Council.</span></em></p>
<p class="fine-print"><em><span>Jessica Holloway, Senior Research DECRA Fellow, Institute for Learning Sciences and Teacher Education, Australian Catholic University. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<h1>NAPLAN results inform schools, parents and policy. But too many kids miss the tests altogether</h1>
<p class="fine-print"><em><span>Published 2023-03-14</span></em></p>
<figure><img src="https://images.theconversation.com/files/515027/original/file-20230313-20-dv146l.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C6000%2C3997&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure>
<p>Today the NAPLAN testing window starts for more than a million students in Years 3, 5, 7 and 9. Over the next nine days students will sit literacy and numeracy tests designed to measure their reading, writing, numeracy, grammar, punctuation and spelling.</p>
<p>Education decision makers will be holding their breath about how many students turn up for NAPLAN. Last year saw the <a href="https://www.edresearch.edu.au/resources/naplan-participation-who-missing-tests-and-why-it-matters">steepest declines</a> on record in secondary school student participation. </p>
<p>This is an issue because NAPLAN results help inform parents, teachers, schools and education authorities about student learning and can influence <a href="https://www.pc.gov.au/inquiries/completed/school-agreement#report">decisions about policies</a>, resources and additional supports for students. Declining NAPLAN participation may result in decisions being based on incomplete data. </p>
<p>In our <a href="https://www.edresearch.edu.au/resources/naplan-participation-who-missing-tests-and-why-it-matters">new paper</a> for the Australian Education Research Organisation, we look at who is not sitting the tests and why that matters.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/maths-anxiety-is-a-real-thing-here-are-3-ways-to-help-your-child-cope-200822">'Maths anxiety' is a real thing. Here are 3 ways to help your child cope</a>
</strong>
</em>
</p>
<hr>
<h2>Who is not sitting the tests?</h2>
<p>While primary school student participation in NAPLAN has been steady at about 95% since 2014, secondary student participation has been in persistent decline. Last year only 87% of Year 9 students sat the tests. </p>
<p>A sharper decline in participation in 2022 <a href="https://acara.edu.au/reporting/national-report-on-schooling-in-australia/national-report-on-schooling-in-australia-data-portal/student-attendance">was partly due to</a> flooding in regions across Australia, high rates of illness and COVID-19 isolation requirements – circumstances we hope will not be repeated. It is the long-term decline in NAPLAN participation in secondary schools that needs attention. </p>
<p>The participation rate is alarmingly low for some groups of students. The figure below shows only 79% of Year 9 students living in remote Australia sat NAPLAN last year. First Nations students and students from educationally disadvantaged backgrounds also had low participation rates in 2022: 66% and 75% respectively.</p>
<hr>
<iframe src="https://flo.uri.sh/visualisation/13050285/embed" title="Interactive or visual content" class="flourish-embed-iframe" frameborder="0" scrolling="no" style="width:100%;height:600px;" sandbox="allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation" width="100%" height="400"></iframe>
<div style="width:100%!important;margin-top:4px!important;text-align:right!important;"><a class="flourish-credit" href="https://public.flourish.studio/visualisation/13050285/?utm_source=embed&utm_campaign=visualisation/13050285" target="_top"><img alt="Made with Flourish" src="https://public.flourish.studio/resources/made_with_flourish.svg"> </a></div>
<hr>
<p>Our analysis reveals low-performing students are also less likely to participate in the tests. Students who performed poorly in NAPLAN in Year 7 were nearly five times more likely to miss the Year 9 tests than high-performing students. These findings were replicated for primary students.</p>
<p>Students who are educationally at risk need the best decisions from schools and education authorities. If NAPLAN participation rates are low for these smaller populations, the data is less reliable and the ability to make informed decisions may be compromised. </p>
<h2>Why aren’t students sitting the tests?</h2>
<p>Students do not sit NAPLAN for three official reasons: they may be exempt from taking the tests, withdrawn by their parents, or absent on the day. </p>
<p>The main reason for the long-term decline in NAPLAN participation is that more parents have been withdrawing their children from the tests. In 2022 over 11,000 Year 9 students didn’t sit the writing test because they had been withdrawn from it.</p>
<p>Being absent is also a contributing factor in the decline in participation; more so for secondary students than primary. In 2022, more Year 9 students than usual were absent from the writing test (in total over 28,600). </p>
<iframe src="https://flo.uri.sh/visualisation/13049784/embed" title="Interactive or visual content" class="flourish-embed-iframe" frameborder="0" scrolling="no" style="width:100%;height:600px;" sandbox="allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation" width="100%" height="400"></iframe>
<div style="width:100%!important;margin-top:4px!important;text-align:right!important;"><a class="flourish-credit" href="https://public.flourish.studio/visualisation/13049784/?utm_source=embed&utm_campaign=visualisation/13049784" target="_top"><img alt="Made with Flourish" src="https://public.flourish.studio/resources/made_with_flourish.svg"> </a></div>
<p>There are many reasons students are absent and withdrawn from NAPLAN. Parents who are worried about how their child may be affected by taking the tests and <a href="https://theconversation.com/what-parents-should-and-shouldnt-say-when-talking-to-their-child-about-naplan-results-189636">receiving results</a> may choose to keep them at home or formally withdraw them from the tests. Anecdotally, there have also been <a href="https://www.heraldsun.com.au/news/schools-can-cheat-naplan-exams/news-story/8160b68c1e79ce869538913e730cdad4">reports</a> of schools asking low-performing students to stay home on testing days, so they don’t “drag down” school averages.</p>
<p>On the positive side, our analysis showed Year 9 students with language backgrounds other than English participated in higher proportions than average (92% compared to 87%). This suggests cultural differences and family attitudes to education and testing might play an important role in participation. </p>
<h2>Why is high NAPLAN participation important?</h2>
<p>NAPLAN data is used by education authorities to <a href="https://theconversation.com/five-things-we-wouldnt-know-without-naplan-94286">better understand the learning progress of all Australian students</a> to inform system-wide policies and support.</p>
<p>It also helps schools, systems and sectors to monitor and evaluate the effectiveness of educational approaches, and identifies schools which need more support. </p>
<p>For example, <a href="https://www.theeducatoronline.com/k12/news/writing-the-next-chapter-in-student-learning/281525">in NSW</a>, NAPLAN data has been used to understand whether a new teaching role and giving students more practice time have been effective in improving students’ writing skills.</p>
<p>In Victoria, <a href="https://www.theage.com.au/national/victoria/naplan-starts-this-week-here-s-what-the-changes-mean-for-students-and-parents-20230312-p5crfr.html?ref=rss&utm_medium=rss&utm_source=rss_national">Brandon Park Primary School</a> used its NAPLAN results to inform a whole school change to its teaching of reading, which brought remarkable success. </p>
<p>Given the benefits that good use of NAPLAN data can bring, it is critical the results are representative of the student groups being tested. </p>
<p>While the Australian Curriculum, Assessment and Reporting Authority estimates data for withdrawn and absent students, our analysis suggests student proficiency is likely to be overestimated.</p>
<p>That’s because students not sitting the test are more likely to be lower-performing students from their respective demographic groups. Real data is always better than estimates.</p>
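The selection effect described above can be sketched with a toy simulation (hypothetical numbers and participation probabilities, not AERO's data or method): if low performers are more likely to miss the test, the average score of those who do sit it overstates the true cohort average.

```python
import random

random.seed(0)

# Hypothetical scores for 1,000 students on a 0-100 scale (illustrative only).
scores = [random.gauss(60, 12) for _ in range(1000)]

# Assume bottom-quartile students sit the test with probability 0.7,
# while everyone else sits it with probability 0.95.
cutoff = sorted(scores)[len(scores) // 4]
sat = [s for s in scores if random.random() < (0.7 if s < cutoff else 0.95)]

true_mean = sum(scores) / len(scores)
observed_mean = sum(sat) / len(sat)

# Because absentees skew towards low performers, the mean of test-sitters
# overstates the true cohort proficiency.
print(f"true mean: {true_mean:.1f}")
print(f"observed mean of test-sitters: {observed_mean:.1f}")
```

Any estimate built only from the test-sitters inherits this upward bias unless the missingness is explicitly modelled, which is why representative participation matters.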
<h2>What now?</h2>
<p>The Australian <a href="https://www.education.gov.au/alice-springs-mparntwe-education-declaration">education system</a> is meant to be about achieving equitable outcomes from education for all students.</p>
<p>Equity is something we should all expect and support. </p>
<p>To achieve it, we need accurate information about student progress on a national scale. NAPLAN is meant to provide that information, so we should support and encourage students to turn up for the tests and try their best. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-avoid-annoying-your-kids-and-getting-stressed-by-proxy-during-exam-season-200719">How to avoid annoying your kids and getting 'stressed by proxy' during exam season</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Lucy Lu is the Senior Manager, Analytics and Strategic Projects for the Australian Education Research Organisation (AERO).</span></em></p>
<p class="fine-print"><em><span>Olivia Groves is a Principal Researcher for the Australian Education Research Organisation (AERO).</span></em></p>
<p class="fine-print"><em><span>Lucy Lu, Adjunct Senior Lecturer, Faculty of Education and Social Work, University of Sydney; Olivia Groves, Adjunct Research Fellow, Curtin University. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<h1>NAPLAN 2018 summary results: a few weeks late, but otherwise little change from previous years</h1>
<p class="fine-print"><em><span>Published 2018-08-27</span></em></p>
<figure><img src="https://images.theconversation.com/files/233401/original/file-20180824-149469-1iekuqy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The current debate about comparability would be more concerning if 2018 results showed radically different trends compared to previous years, but they don't.</span> <span class="attribution"><span class="source">www.shutterstock.com</span></span></figcaption></figure>
<p>This year’s NAPLAN results have finally landed. They are a few weeks behind schedule, <a href="http://www.abc.net.au/news/2018-08-08/naplan-results-delayed-over-concerns-results-invalid/10082734">due to disagreement</a> between the body that administers the test and state education officials over how scores should be reported.</p>
<p>Debate centres on whether data from the <a href="https://www.nap.edu.au/online-assessment">new online version</a> of the test and the pen-and-paper version are statistically comparable. The online version is being phased in between now and 2020, and is designed to be <a href="https://www.nap.edu.au/online-assessment/FAQs">more effective</a> due to its adaptive testing design.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-the-naplan-results-delay-is-a-storm-in-a-teacup-101321">Why the NAPLAN results delay is a storm in a teacup</a>
</strong>
</em>
</p>
<hr>
<p>The <a href="http://acara.edu.au/">Australian Curriculum, Assessment and Reporting Authority</a> (ACARA), which is responsible for NAPLAN, maintains the online and paper tests are comparable. ACARA has sought assurance from assessment experts, who say <a href="https://www.smh.com.au/education/a-storm-in-a-teacup-naplan-results-to-be-released-20180810-p4zwsi.html">the results are comparable</a>. Others disagree, including two <a href="http://www.abc.net.au/news/2018-08-27/naplan-testing-report-says-results-should-be-discarded/10160596">United States assessment experts</a> who yesterday said the online and paper results are “<a href="https://www.smh.com.au/education/a-futile-exercise-why-the-2018-naplan-results-should-be-dumped-20180824-p4zznk.html">inherently incompatible</a>” and “<a href="http://www.abc.net.au/news/2018-08-27/naplan-testing-report-says-results-should-be-discarded/10160596">should be discarded</a>”. </p>
<p>Such comments add fuel to an already red-hot fire, driven by those who <a href="https://www.theguardian.com/australia-news/2018/may/04/nsw-governments-call-to-scrap-naplan-rejected-by-simon-birmingham">want NAPLAN scrapped</a>, such as New South Wales education minister Rob Stokes, and those who want <a href="http://www.abc.net.au/news/2018-02-15/naplan-testing-faces-scrutiny-and-push-for-changes/9446842">a broad scale national review</a>, such as Queensland education minister Grace Grace. </p>
<p>But ultimately, we can only work with the data ACARA has released, which combines online and paper data. Overall, it shows 2018 results differ very little from <a href="https://theconversation.com/naplan-2017-results-have-largely-flat-lined-and-patterns-of-inequality-continue-88132">last year’s results</a> or longer-term trends.</p>
<h2>How is NAPLAN run?</h2>
<p><a href="http://www.nap.edu.au/about">NAPLAN</a> tests all young people in all schools (government and non-government) across Australia. It takes place every year, assessing Australian school students in years three, five, seven and nine across four domains: reading, writing, language conventions (spelling, and grammar and punctuation) and numeracy.</p>
<p>This year, 20% of students completed the new test online, with the remaining 80% doing the pen-and-paper version.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/233654/original/file-20180827-149475-8ro8kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">NAPLAN results were delayed due to debate about whether data from the new online version of the test and the pen-and-paper version are comparable.</span>
<span class="attribution"><span class="source">www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>NAPLAN uses an <a href="http://www.nap.edu.au/results-and-reports/how-to-interpret/scales">assessment scale</a> divided into ten bands to report student progress through years three, five, seven and nine. Band one is the lowest and ten is the highest.</p>
<p>ACARA has responsibility for the test (on behalf of federal, state and territory governments) and each year publishes NAPLAN data for every school in the nation on the publicly accessible <a href="https://www.myschool.edu.au/">My School</a> website.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/five-things-we-wouldnt-know-without-naplan-94286">Five things we wouldn't know without NAPLAN</a>
</strong>
</em>
</p>
<hr>
<h2>What did we learn this year?</h2>
<p>Working from the assumption that the two test delivery methods are comparable, ACARA’s 2018 data indicate: </p>
<ul>
<li>Tasmania and the ACT had a statistically significant decline in year five writing performance from 2017</li>
<li>WA had a statistically significant improvement in year nine grammar and punctuation performance from 2017</li>
<li>NSW, Victoria and the ACT continue to be the highest-performing systems, scoring at or above the national average across all domains and year levels</li>
<li>the Northern Territory continues to under-perform across all domains and year levels, relative to the other states and territories and in relation to national minimum standards</li>
<li>year nine students who completed the writing test online performed better, on average, than those who completed the writing test with pen and paper (according to ACARA, these differences in results are at least partly attributable to the test mode used).</li>
</ul>
<hr>
<p><iframe id="9mMlu" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/9mMlu/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<hr>
<p>Similar to previous years, there are large discrepancies between year nine reading and writing across all states. </p>
<h2>What about longer-term trends?</h2>
<p>The current debate about comparability would be more concerning if 2018 results showed radically different trends compared to previous years. But they don’t. </p>
<p>For example, we see very little change to longer-term trends, which show:</p>
<ul>
<li><p>statistically significant improvements at the national level in spelling (years three and five), reading (years three and five), numeracy (years five and nine), and grammar and punctuation (years three and seven) </p></li>
<li><p>statistically significant declines in writing achievement at the national level in years five, seven and nine (based on data from 2011 to 2018).</p></li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-is-ten-years-old-so-how-is-the-nation-faring-81565">NAPLAN is ten years old – so how is the nation faring?</a>
</strong>
</em>
</p>
<hr>
<p>It’s also very likely the final results (to be released in December) will show a continuation of <a href="https://theconversation.com/naplan-2017-results-have-largely-flat-lined-and-patterns-of-inequality-continue-88132">long-standing patterns of achievement</a> between young people from different backgrounds, which reflect broader inequalities in Australia.</p>
<h2>What are the implications moving forward?</h2>
<p>Debate over NAPLAN is unlikely to subside any time soon, and a national review of the program may ultimately emerge. It will be interesting to see what comes from the current <a href="https://qed.qld.gov.au/programs-initiatives/education/naplan-2018-review">NAPLAN review in Queensland</a> and how this contributes to broader national conversations.</p>
<p>Federal politics is also a moveable feast, with <a href="https://au.educationhq.com/news/50961/so-who-is-the-new-federal-minister-for-education/">Dan Tehan</a> assuming the role of federal education minister over the weekend, following last week’s leadership spill. Tehan replaces Simon Birmingham, who has <a href="https://www.theguardian.com/australia-news/2018/may/04/nsw-governments-call-to-scrap-naplan-rejected-by-simon-birmingham">defended</a> the merits of NAPLAN and has been central to promoting a broader reform agenda in schools. This includes recommendations coming out of the <a href="https://www.education.gov.au/review-achieve-educational-excellence-australian-schools">Gonski 2.0 report</a> released earlier this year.</p>
<p>The future of Gonski 2.0 may very well hold clues to the future of NAPLAN. The report recommends <a href="https://theconversation.com/gonski-review-reveals-another-grand-plan-to-overhaul-education-but-do-we-really-need-it-93119">an online formative assessment tool</a> be developed. This raises questions about whether such a tool, if created, might ultimately replace or serve as a supplement to NAPLAN.</p>
<p>In the short term, we will continue to see NAPLAN move online, unless major new roadblocks emerge.</p>
<p class="fine-print"><em><span>Glenn C. Savage receives funding from the Australian Research Council.</span></em></p>
<p class="fine-print"><em><span>Jessica Holloway and Steven Lewis do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em><span>Glenn C. Savage, ARC DECRA Fellow and Senior Lecturer in Education Policy and Sociology of Education, The University of Western Australia; Jessica Holloway, Research Fellow, Research for Educational Impact (REDI), Deakin University; Steven Lewis, Alfred Deakin Postdoctoral Research Fellow, Deakin University. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<h1>Beyond ‘dumb’ tests: NAPLAN needs to value regional, rural and remote students</h1>
<p class="fine-print"><em><span>Published 2018-05-18</span></em></p>
<figure><img src="https://images.theconversation.com/files/217153/original/file-20180502-153884-1tevdes.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">NAPLAN is a "dumb" test administered to all students, regardless of contextual factors such as location and culture.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure>
<p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>The recently released <a href="https://www.education.gov.au/independent-review-regional-rural-and-remote-education">Independent Review into Rural, Regional and Remote Education</a> again noted students in non-metropolitan areas perform, on average, below those in metropolitan areas. One such measure commonly used to make this claim is students’ NAPLAN results.</p>
<p>But the problem with NAPLAN is that it’s a “dumb” test: administered to all students, regardless of contextual factors such as <a href="https://theconversation.com/standardised-tests-are-culturally-biased-against-rural-students-86305">location and culture</a>. This is said to be a benefit, as it provides a measure comparable across all students.</p>
<p>Monitoring the performance of a system is an important and necessary public policy process. But in practice, no tool is able to neutralise the influence of contextual factors. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/standardised-tests-are-culturally-biased-against-rural-students-86305">Standardised tests are culturally biased against rural students</a>
</strong>
</em>
</p>
<hr>
<h2>Relevance over one size fits all</h2>
<p>The review questioned how relevant the “metro-centric” nature of these assessments was to those in rural areas. The review gave the example “of a question asking what you could see at the busy train station — of which many rural, regional and remote students have no experience and therefore no concepts to be able to respond accurately”. </p>
<p>The review went on to ask “do rural, regional and remote students (and others) have knowledge and skills which they value and find useful but which are not measured and therefore not valued more widely?” </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=374&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=374&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=374&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=469&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=469&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217143/original/file-20180502-135810-dj8wf7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=469&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Asking metropolitan students questions that draw on life in the country could promote a common understanding of country life.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The problem of relevance raises two possibilities for the future of NAPLAN in its current form: the inclusion of different questions drawing on different knowledge, or questions asked in different ways that still assess the same basic skill. </p>
<p>In modern educational assessment, there is no need for all students to sit the same test, asking the same questions, on the same day. We are smarter than that. We can test the same skills that need monitoring through different types of questions that differentiate for students’ contexts and prior learning. In this way, we can build on the experiences of students in city or rural schools, or in two metropolitan schools in culturally diverse communities. </p>
<p>Does a question have to draw on the example of a train timetable to assess numeracy, or ask students to write about a “day at the beach” to assess writing? Logically, the answer would be no. This may be where the move to purposely designed online testing could be beneficial.</p>
<p>We could, for instance, include examples of country life in the curriculum, such as calving, and ask metropolitan students questions drawing on life in the country, to promote a common understanding of it.</p>
<p>Currently we don’t, but we do include metropolitan examples, and this is the basis of the bias. It is again an issue picked up by the review of rural, regional and remote education, which noted the need for direct participation of educators and communities in the development of curriculum and assessment. Too often, consultation is optional and not representative of the community itself.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217145/original/file-20180502-135814-3a8fzy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Creating a curriculum based on more universal concepts might be a positive for the future of NAPLAN testing.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-solve-australias-rural-school-challenge-focus-on-research-and-communities-94979">How to solve Australia's 'rural school challenge': focus on research and communities</a>
</strong>
</em>
</p>
<hr>
<h2>Making education contextually meaningful</h2>
<p><a href="http://www.canberra.edu.au/researchrepository/file/b0d2d92b-b2c0-4691-aed3-d456128768b2/1/full_text_published.pdf">Students do better</a> when the course material is more relevant to them. Creating a curriculum based on more universal concepts and allowing teachers to choose the examples they use to illustrate the concepts is achievable. It happens in the <a href="http://www.bsss.act.edu.au/information_for_students/whats_moderation">ACT</a>, <a href="https://www.sace.sa.edu.au/teaching/assessment/school-assessment">South Australia</a> and Queensland, where teachers moderate assessment across schools <a href="https://www.qcaa.qld.edu.au/k-12-policies/student-assessment/p-12-resources">as part of their work</a>.</p>
<p>Previously we had the “<a href="https://trove.nla.gov.au/work/5767388?selectedversion=NBD7309995">Country Areas Program</a>” which was aimed at working with communities to make schooling meaningful and relevant to students’ lives.</p>
<p>But this was replaced in 2009 by <a href="https://www.acara.edu.au/reporting/national-report-on-schooling-in-australia-2011/national-initiatives-and-achievements/partnerships">national partnerships</a>, which were based on literacy and numeracy benchmarks. These also assumed all non-metropolitan schools were low socioeconomic status schools because of their NAPLAN scores, ignoring the cultural relevancy problems of the test. As such, a metro-centric, one-size-fits-all approach to school improvement was applied. </p>
<p>This was around the same time the tests that preceded NAPLAN (for instance the basic skills test in NSW) went “high-stakes” by being publicly reported on the MySchool website. Clearly we want all kids reaching minimum benchmarks, but that approach doesn’t examine the nature of the benchmark or the way it’s measured. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=337&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=337&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=337&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217149/original/file-20180502-135851-1p8od8h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">There have been many tests, like the basic skills test in NSW, for many years without the current problems of NAPLAN.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Instead, the result is a high-stakes measure applied in a census-like fashion – the same test for everyone – that distorts practice. It’s perhaps not surprising that analysis shows results have not improved, and <a href="http://www.abc.net.au/news/2018-03-07/naplan-call-review-after-report-reveals-no-change-in-decade/9519840">suggests a decline for disadvantaged groups</a>, in the ten years NAPLAN has been operating. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-only-tells-part-of-the-story-of-student-achievement-86144">NAPLAN only tells part of the story of student achievement</a>
</strong>
</em>
</p>
<hr>
<h2>The medium is not the message</h2>
<p>We’ve had tests for many years without the current problems of NAPLAN. Part of the problem is the publication of results to compare achievement, and the high stakes this creates. This is exacerbated by the nature of NAPLAN as a “dumb test”.</p>
<p>A simple change would be to continue NAPLAN testing, but using “smarter tests” and valuing teachers’ professionalism by engaging them in moderation practices. In this way, we can get a genuine picture of how all students, regardless of location and cultural background, are travelling. </p>
<p>This approach would enhance our appreciation of the diversity of students and enhance the skills of the profession. Teachers would be comparing work samples in response to questions drawing on the local knowledge of students from Menindee and Manly. This would force teachers to focus on the skills of literacy and numeracy, and understand how they are enacted differently in different places. </p>
<p>Alternatively, testing random samples of schools is an equally valid approach to monitoring a system. This is used in international tests such as <a href="https://www.acer.org/ozpisa">PISA</a> and <a href="https://www.acer.org/timss">TIMSS</a>.</p>
<p>Ultimately we need to be smarter. We need to move away from dumb tests that treat the profession as incapable of measuring student performance in a valid and meaningful way, and students and communities as neutral cultures without histories.</p>
<p class="fine-print"><em><span>Philip Roberts receives funding from the Australian Government. </span></em></p>There is no need for all students to sit the same test, that asks the same questions, on the same day. We are smarter than that.Philip Roberts, Associate professor, University of CanberraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/935552018-05-16T20:14:04Z2018-05-16T20:14:04ZWe need to reform NAPLAN to make it more useful<figure><img src="https://images.theconversation.com/files/213994/original/file-20180410-114121-smsilr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">4.5 million young Australians between the ages of nine and 24 have taken NAPLAN tests at some point during their schooling.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the Future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, the experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>The National Assessment Program - Literacy and Numeracy <a href="https://www.nap.edu.au/results-and-reports/national-reports">(NAPLAN)</a> has now been in place for a decade. Some 4.5 million young Australians between the ages of nine and 24 have taken NAPLAN tests at some point during their schooling. </p>
<p>But NAPLAN has its <a href="http://sydney.edu.au/education_social_work/news_events/resources/No_NAPLAN.pdf">critics</a> and, as with all testing programs, would benefit from ongoing review and refinement. </p>
<p>Here are two suggestions that might make these tests more useful to classroom teaching and learning.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/classroom-design-should-follow-evidence-not-architectural-fads-89861">Classroom design should follow evidence, not architectural fads</a>
</strong>
</em>
</p>
<hr>
<h2>Abandon public comparisons of school results</h2>
<p>In common with the state-based tests (for example, the <a href="https://www.records.nsw.gov.au/series/18928">NSW Basic Skills Test</a>) it replaced, NAPLAN was introduced to provide parents, teachers and schools with objective information about students’ foundational literacy and numeracy skills. </p>
<p>This was after unacceptably low levels of reading and numeracy had gone undetected and unaddressed in Australian schools.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/213998/original/file-20180410-71151-cvv6ae.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">NAPLAN testing has reportedly led to inappropriate levels of practice testing and increased student test anxiety.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Since the introduction of NAPLAN, there has been a marked increase in the stakes attached to these tests. School results have been made available for public comparison on the <a href="https://myschool.edu.au/">My School</a> website. Some schools even use NAPLAN in their marketing and student selection processes. </p>
<p>Other schools and school systems use NAPLAN to hold teachers and school leaders accountable for improvement, including making test results part of performance reviews. And there have been proposals to make NAPLAN results the basis of teacher <a href="https://www.smh.com.au/national/teacher-wages-to-be-linked-to-test-results-20091114-ifnw.html">performance pay</a> and financial <a href="http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22library%2Fpartypol%2FIGOX6%22">rewards</a> for school improvement.</p>
<p>As a result, parents, teachers and schools now place greater importance on NAPLAN results in comparison to the earlier state-based tests. This has led to <a href="https://theconversation.com/naplan-testing-does-more-harm-than-good-26923">reports</a> of inappropriate levels of practice testing and increased student test anxiety. It has also narrowed teaching to the test, and led to occasional cheating.</p>
<p>The decision to make all schools’ NAPLAN results public was based on a belief this would provide parents with better information when choosing schools. </p>
<p>This was a market-driven belief that, for schools, the risk of losing students would be a powerful incentive to improve. But test-based incentives have proven largely <a href="https://michaelfullan.ca/wp-content/uploads/2016/06/13396088160.pdf">ineffective</a> in driving school improvement.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/213999/original/file-20180410-71160-119ixgp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The current paper test is given to students in years three, five, seven and nine.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Parents have sometimes drawn incorrect conclusions about the quality of a school from publicly reported test results. And public comparisons of schools have resulted in a range of unanticipated negative <a href="https://www.goodschools.com.au/insights/education-updates/concerns-over-naplan-testing">consequences</a> such as narrowing teaching and increasing levels of teacher and student stress.</p>
<p>An obvious strategy is to stop reporting school results publicly and to restrict access to school-level NAPLAN data to individual schools and school systems. The primary focus of literacy and numeracy testing might then return to its original purpose of informing teaching and learning.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-only-tells-part-of-the-story-of-student-achievement-86144">NAPLAN only tells part of the story of student achievement</a>
</strong>
</em>
</p>
<hr>
<h2>Enhance the instructional value of NAPLAN</h2>
<p>NAPLAN is a paper test given to students in years three, five, seven and nine, although this year some schools will administer the test online for the first time. The instructional value of these tests appears to be limited in several ways.</p>
<ol>
<li><p>Because common <a href="https://www.teachermagazine.com.au/articles/naplan-reading-the-achievement-spread">year-level tests</a> are too difficult for some students in each year level and too easy for others, NAPLAN provides little information to guide the teaching of these students.</p></li>
<li><p>Because the marking of paper tests is a time-consuming process, results are provided many weeks after testing, limiting their usefulness to teaching.</p></li>
<li><p>NAPLAN tests include only a few items on each literacy and numeracy skill and so are of limited diagnostic value for individual students.</p></li>
</ol>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/214000/original/file-20180410-71173-1lkl0di.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">An online version of NAPLAN could provide opportunities to improve the quality of the test.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The delivery of NAPLAN online, being gradually implemented from this year, offers an opportunity to address these limitations. </p>
<p>The first step is to continue phasing out fixed, year-level tests and replace them fully with online “<a href="https://www.edglossary.org/computer-adaptive-test/">adaptive</a>” tests. In adaptive testing, students are given questions directly targeted to their individual skill levels. Adaptive testing provides more precise information about the point each student has reached in their learning, regardless of year level.</p>
<p>The benefits of adaptive testing are best realised when the purpose of testing is to establish and understand where individuals are in their skill development. This requires the substantive interpretation of test results by reference to a well-constructed <a href="https://rd.acer.org/article/progress-towards-a-global-measurement-scale">map</a> of long-term skill development. </p>
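<p>The selection logic behind adaptive testing can be sketched in a few lines. The snippet below is a deliberately simplified illustration – a basic “staircase” rule, not ACARA’s actual item-selection algorithm – in which questions get harder after a correct answer and easier after an incorrect one:</p>

```python
# Illustrative sketch of adaptive item selection (not the real NAPLAN
# algorithm): a simple "staircase" that serves a harder question after a
# correct answer and an easier one after an incorrect answer.

def run_adaptive_test(answer_correctly, num_items=10, levels=range(1, 8)):
    """Step through `num_items` questions, adjusting difficulty each time.

    `answer_correctly(level)` is a stand-in for the student's response
    to a question at the given difficulty level.
    """
    level = levels[len(levels) // 2]  # start at a medium difficulty
    history = []
    for _ in range(num_items):
        correct = answer_correctly(level)
        history.append((level, correct))
        if correct:
            level = min(level + 1, levels[-1])   # harder next time
        else:
            level = max(level - 1, levels[0])    # easier next time
    return history

# A hypothetical student who can answer questions up to level 5:
history = run_adaptive_test(lambda level: level <= 5)
print(history)
```

<p>Real computer-adaptive tests estimate ability statistically (for example, with item response theory) rather than stepping one level at a time, but the principle is the same: the test quickly homes in on questions near the student’s demonstrated level, which is what yields the more precise information described above.</p>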
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-online-will-australian-schools-and-students-be-ready-25327">NAPLAN online: will Australian schools and students be ready?</a>
</strong>
</em>
</p>
<hr>
<p>NAPLAN scores are expressed on a numerical scale that extends across years three, five, seven and nine. These scores are interpreted with reference to a hierarchy of skill “bands” (or <a href="https://www.nap.edu.au/nap-sample-assessments/civics-and-citizenship/proficiency-levels">proficiency levels</a>). </p>
<p>The instructional usefulness of NAPLAN will be enhanced by working to describe and illustrate these skill levels in ways that maximise guidance to teaching and learning and by making them the direct reference for understanding students’ NAPLAN performances. </p>
<p>Online delivery and scoring will provide more immediate feedback to teachers and students thus improving NAPLAN’s instructional value.</p>
<p>Delivery in an online environment also introduces the possibility of changing NAPLAN itself. For example, the wrong answers a student gives in numeracy could be used to draw automatic conclusions about the mistakes they are making. This could be checked by giving more questions of the same type and feeding the results back to the teacher.</p>
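<p>As a rough sketch of that idea, each wrong option (distractor) on a question can be mapped to the misconception that typically produces it. The question ids, answers and misconception labels below are invented for illustration, not drawn from any real NAPLAN item bank:</p>

```python
# Hypothetical data for illustration only: each wrong option on a
# numeracy question is mapped to the misconception that typically
# produces it, so wrong answers can be turned into teacher feedback.

DISTRACTOR_MAP = {
    # question id -> {chosen answer: likely misconception}
    "q1": {"27": "added instead of multiplied",
           "120": "place-value error"},
    "q2": {"1/5": "added numerators and denominators"},
}

def diagnose(responses, correct_answers):
    """Return {question: misconception} for each incorrect response."""
    findings = {}
    for qid, answer in responses.items():
        if answer != correct_answers[qid]:
            misconception = DISTRACTOR_MAP.get(qid, {}).get(answer)
            findings[qid] = misconception or "unrecognised error"
    return findings

report = diagnose({"q1": "27", "q2": "2/7"},
                  correct_answers={"q1": "180", "q2": "2/7"})
print(report)  # q2 was answered correctly, so only q1 is flagged
```

<p>A report built this way tells the teacher <em>why</em> an answer was probably wrong, not just that it was wrong, and could trigger follow-up questions of the same type to confirm the diagnosis.</p>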
<p class="fine-print"><em><span>Geoff Masters works for ACER which undertakes commissioned research and development related to NAPLAN.</span></em></p>NAPLAN has now been in place for a decade and needs ongoing review and refinement to make it more useful to classroom teaching and learning.Geoff Masters, CEO, Australian Council for Educational ResearchLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/950352018-05-13T20:28:21Z2018-05-13T20:28:21ZRe-envisioning NAPLAN: use technology to make the tests more authentic and relevant<figure><img src="https://images.theconversation.com/files/216403/original/file-20180426-175054-5ie9lh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">We need to look for more engaging and relevant assessments that use the tools available in an online environment for re-envisioning NAPLAN.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>NAPLAN testing starts this week. With calls for a review, many education experts are calling the Future of NAPLAN into question. In <a href="https://theconversation.com/au/topics/future-of-naplan-53601">this series</a>, the experts look at options for removing, replacing, rethinking or resuming NAPLAN.</em></p>
<hr>
<p>Think about where and how you read and write most often. It’s probably not on paper. It’s most likely to be online – using the internet, email, messenger, or Facebook. While print-based literacy skills are necessary in these forms of communication, they are not the only “basic” literacy skills we use. </p>
<p>In classrooms, students learn to read and write using a range of resources, such as books, pens, paper, apps and websites. But they also learn the critical thinking and problem solving, collaboration, creativity and communication necessary to achieve their future goals. Classrooms are sites of excitement enriched with learning, where students are <a href="https://theconversation.com/naplan-only-tells-part-of-the-story-of-student-achievement-86144">encouraged to take risks</a> within a broad curriculum.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/tailored-online-naplan-better-for-monitoring-high-and-low-achievers-42613">Tailored online NAPLAN better for monitoring high and low achievers</a>
</strong>
</em>
</p>
<hr>
<p>NAPLAN online provides students with different pathways through the test based on their responses using <a href="https://theconversation.com/tailored-online-naplan-better-for-monitoring-high-and-low-achievers-42613">tailored testing</a>. Questions get either harder or easier based on the answers to previous questions. But the texts students read and types of questions to answer have not changed to take advantage of the online environment. Tailored testing may provide quicker access to scores, but it doesn’t provide any new or additional information.</p>
<p>Calls to review NAPLAN are coming from <a href="http://www.abc.net.au/news/2018-03-07/naplan-call-review-after-report-reveals-no-change-in-decade/9519840">principal associations</a> and <a href="http://www.abc.net.au/news/2018-04-21/timeout-on-naplan-needed-ex-principals-boss-stephen-breen-says/9682192">state authorities</a>, but we can’t afford to wait for the full roll-out of NAPLAN online. We need to consider alternatives now, re-envisioning NAPLAN so it can assess the challenging, more complex skills our students need for their future. NAPLAN needs to be more relevant to students’ lives and educational experiences. Using online delivery in a meaningful way is one way we can change tack. </p>
<h2>Alternative tests</h2>
<p>One example is the Online Reading Comprehension Assessment (<a href="http://www.orca.uconn.edu/professional-development/show-me/show-me-overview/">O.R.C.A.</a>). Researchers at the University of Connecticut designed performance-based assessments which assess students during an actual online assessment task. </p>
<p>During the test, students access a limited number of internet sites included within the boundaries of the test system. The O.R.C.A., pitched at a year seven level, asks students to conduct research on topics in the human body systems, with an avatar as a guide through the assessment task. </p>
<p>It measures reading to locate information using a search engine, reading to synthesise information across multiple webpages, reading to critically evaluate the reliability of information found on the internet, and writing to communicate a short report of research in an email or wiki. It is a validated and reliable test, and has been used with 3,000 students across two states in the United States. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/aXxrR2wBR5Y?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>An ORCA-type assessment mirrors the embedded literacy and numeracy practices of everyday life and schooling, reflecting today’s students’ world. Testing of language, vocabulary and spelling could be included based on the websites. There are more possibilities for the writing assessment: it could use another mode, such as video, a multimodal product or images, for students’ responses. </p>
<p>Another possible alternative is the set of digitally based assessments developed in the United States for the National Assessment of Educational Progress (<a href="https://nces.ed.gov/nationsreportcard/dba/">NAEP</a>). NAEP digital assessments use tablet or computer technology to ask a variety of questions and task types. </p>
<p>They assess what students know and are able to do in more authentic or direct ways, including scenario-based tasks, interactive computer tasks, and hybrid hands-on tasks. Some questions include multimedia, such as audio and video, or digital tools, such as an onscreen calculator. Schools are provided with the technology if required. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/-RJ4k0I6h2c?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>International achievement comparisons</h2>
<p>If Australian 15-year-old students are to demonstrate the skills they need to thrive, such as working and communicating with others as required in the Programme for International Student Assessment (<a href="http://www.oecd.org/pisa/aboutpisa/">PISA</a>), then NAPLAN online will not give educators the information or opportunities they need to develop these skills with students. </p>
<p>In PISA, students are asked to interpret texts, solve mathematics problems or explain a phenomenon scientifically using their knowledge and reasoning skills. In NAPLAN, they answer multiple choice questions. If we want to improve our standing internationally then we need to change the assessment tasks students have to complete.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/naplan-is-ten-years-old-so-how-is-the-nation-faring-81565">NAPLAN is ten years old – so how is the nation faring?</a>
</strong>
</em>
</p>
<hr>
<p>We need to look for more engaging and relevant assessments that use the tools available in an online environment to re-envision NAPLAN. In doing this, we will also broaden the complexity of the skills being assessed, making NAPLAN a more reliable indicator of competency and of standards of literacy and teaching than the current online test.</p>
<hr>
<p><em>This article has been updated since publication to remove quotes from Les Perelman from the write-off at the top of the piece.</em></p>
<p class="fine-print"><em><span>Katina Zammit does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Assessments need to be relevant to the real world and test more complex skills to better predict competency, standards of literacy and teaching.Katina Zammit, Director of Academic Program - Primary, Western Sydney UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/879372017-11-23T19:11:21Z2017-11-23T19:11:21ZEvidence-based education needs standardised assessment<figure><img src="https://images.theconversation.com/files/195979/original/file-20171123-6027-1t7k8zl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Standardised assessments can inform what teachers teach, based on evidence of student learning.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The latest <a href="https://www.education.gov.au/review-achieve-educational-excellence-australian-schools">Gonski</a> review aims to improve evidence-based decision-making in Australian school education. It recognises that governments’ educational investment must be based on more than politics, just as teachers’ instructional decisions must be based on more than intuition. Like other professional sectors, Australian education must be guided by rigorous evidence of what works, for whom and in what contexts. </p>
<p>Standardised assessments, like <a href="https://www.nap.edu.au/naplan">NAPLAN</a>, are powerful tools in building a strong evidence base for education policy and practice. As NAPLAN enters its second decade, it is timely to reflect on how Australia can make best use of standardised assessment to drive system improvement. This does not deny valid criticisms of punitive standardised testing regimes. Instead, it considers how we might avoid a “baby and bathwater” scenario, and retain the benefits of standardised testing with fewer flaws.</p>
<h2>Comparison not competition</h2>
<p>Comparison of standardised assessments across systems, schools and classrooms can guide evidence-based policy and practice in many ways. Analysis of NAPLAN trends can help identify <a href="https://www.teachermagazine.com.au/columnists/geoff-masters/how-well-are-we-learning-from-naplan">policies and practices</a> that may have contributed to improvements. The first <a href="https://docs.education.gov.au/system/files/doc/other/what_is_the_schooling_resource_standard_and_how_does_it_work.pdf">Gonski review</a> used comparisons of NAPLAN data as evidence to estimate the costs of quality school education. </p>
<p>Australia participates in international standardised tests like <a href="http://www.oecd.org/pisa/">PISA</a>, <a href="https://timssandpirls.bc.edu/">TIMSS</a> and <a href="https://timssandpirls.bc.edu/">PIRLS</a>. This is part of a broader global conversation about how to make education systems work better for everyone. Many teachers and school leaders are now using standardised test data to <a href="https://research.acer.edu.au/cgi/viewcontent.cgi?article=1019&context=tll_misc">guide school improvement</a>.</p>
<p>On the other hand, standardised assessment can fuel unhealthy competition. The worst effects of My School can be seen in <a href="https://theconversation.com/unfair-funding-is-turning-public-schools-into-sinks-of-disadvantage-751">residualised</a> schools abandoned by students and families who can afford to go elsewhere. The worst effects of NAPLAN itself can be seen in students placed under pressure to gain the score they need to get into a selective school, or top-stream class. </p>
<p>Internationally, simplistic PISA league tables risk undermining the global improvement agenda that the assessment was designed to support. </p>
<p>Standardised testing does not have to be used this way. It is most effective when used for <a href="http://nepc.colorado.edu/publication/data-driven-improvement-accountability">system improvement</a>, not sanctions or exclusion. Australia has not followed other nations in linking assessment to sanctions for schools or pay for teachers. This is something to be celebrated and sustained.</p>
<h2>Standardised not homogenised</h2>
<p>Standardised assessments work best when they adapt to students’ individuality, for example through “<a href="https://www.nap.edu.au/online-assessment">tailored testing</a>” in NAPLAN online.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/fbX8FudbeDs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
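<p>As a loose illustration of the idea behind tailored testing: the test adjusts the difficulty of upcoming questions based on the student’s answers so far. The sketch below is a toy rule (step up after a correct answer, step down after an incorrect one); NAPLAN online’s actual adaptive branching follows a more sophisticated, ACARA-defined design, and the function and parameter names here are illustrative assumptions only.</p>

```python
# Toy sketch of adaptive ("tailored") testing: NOT ACARA's actual algorithm.
# Difficulty moves up one band after a correct answer, down one after an
# incorrect answer, bounded by the easiest and hardest bands.
def run_adaptive_test(answers_correct: list[bool], levels: int = 5) -> list[int]:
    """Return the difficulty band presented at each step (1 = easiest)."""
    level = levels // 2 + 1          # start in the middle band
    presented = []
    for correct in answers_correct:
        presented.append(level)
        if correct:
            level = min(levels, level + 1)   # harder next question
        else:
            level = max(1, level - 1)        # easier next question
    return presented

# A student who answers correctly, correctly, then incorrectly:
print(run_adaptive_test([True, True, False]))  # prints [3, 4, 5]
```

<p>The appeal of this design is that each student spends most of the test answering questions near their own level, rather than a fixed set pitched at the middle of the year group.</p>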
<p>There is <a href="https://www.teachermagazine.com.au/columnists/geoff-masters/shifting-the-focus-of-naplan">potential</a> for Australia to go much further by assessing students across the full continuum of learning, instead of bundling them into year-level groups. </p>
<p>ACER is also <a href="https://www.acer.org/cari/projects/new-metric-projects/assessment-of-general-capabilities">developing</a> standardised assessments that use a wider range of methods to capture the skills of students who may not perform their best on a written test. This makes standardised tests more inclusive of different learning styles and cultures, as well as disability.</p>
<h2>Assessment for teaching</h2>
<p>Standardised assessments can inform what teachers teach, based on evidence of student learning. This happens most effectively when assessments are mapped to curriculum. More work needs to be done to strengthen the connection between curriculum and assessment in Australia. This would help teachers make better use of NAPLAN results to inform their teaching. Current work on describing national learning progressions in literacy and numeracy will help connect the Australian Curriculum to NAPLAN assessment. </p>
<p>We also need to assess the right things. Australia’s <a href="https://www.nap.edu.au/nap-sample-assessments/assessment-frameworks">National Assessment Program</a> covers a broad range of subject areas, beyond literacy and numeracy. <a href="https://rd.acer.org/article/assessing-general-capabilities">Research</a> is also underway about assessing general capabilities, such as critical and creative thinking, and collaboration, which are essential to students’ success in modern workplaces.</p>
<h2>Pluralism not hegemony</h2>
<p>A healthy education system will have multiple assessments (large-scale and small), each designed to suit the purpose at hand. NAPLAN is an imperfect measure by nature, and cannot be expected to measure children’s learning as competently as the teacher who spends hours with them every day. </p>
<p>On the other hand, individual teachers’ judgements cannot map learning across the entire education system. Teachers may be experts on the progress of their students, but they cannot compare that progress with students in the school down the road, let alone a school in another state or territory. Standardised assessment provides the best birds-eye view of where the system is working, and where additional attention is required.</p>
<p>Most importantly, standardised assessment is part of the social contract between governments and populations, to provide a quality education for every child. </p>
<p>ACER works with many countries developing standardised assessments, hungry for information about how well their system is working. In countries where government investment is limited, standardised assessments have even been developed by <a href="https://www.acer.org/gem/citizen-led-assessments-evaluation-reports">citizen-led groups</a> to meet parents’ demands for information about their children’s learning. This is the best illustration of the purpose of standardised assessment: as evidence that empowers education stakeholders to focus on positive change.</p>
<p class="fine-print"><em><span>Jen Jackson works for the Australian Council for Educational Research. She receives funding from the Australian Government Department of Foreign Affairs and Trade.</span></em></p>
<p class="fine-print"><em><span>Raymond J Adams heads the Centre for Global Education Monitoring at ACER, which is funded by ACER and DFAT. Ray chairs ACARA’s Measurement Advisory Group.</span></em></p>
<p class="fine-print"><em><span>Ross Turner works for the Australian Council for Educational Research.</span></em></p>
<p><em>Standardised tests are a powerful tool for building an evidence base of what works to guide education policy.</em></p>
<p>Jen Jackson, Research Fellow, Educational Monitoring and Research, Australian Council for Educational Research; Raymond J Adams, Head, Centre for Global Education Monitoring - ACER, Australian Council for Educational Research; Ross Turner, Principal Research Fellow, Australian Council for Educational Research</p>
<p>Licensed as Creative Commons – attribution, no derivatives.</p>
<hr>
<p>tag:theconversation.com,2011:article/86144 (2017-11-15T19:17:19Z)</p>
<h1>NAPLAN only tells part of the story of student achievement</h1>
<figure><img src="https://images.theconversation.com/files/193347/original/file-20171106-1068-8x3q38.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">After 10 years of minimal breakthroughs, NAPLAN doesn't seem to be going anywhere but online.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><hr>
<p><em>Since it was introduced in the 1800s, standardised testing in Australian schools has attracted controversy and divided opinion. In this <a href="https://theconversation.com/au/topics/standardised-testing-series-46310">series</a>, we examine its pros and cons, including appropriate uses for standardised tests and which students are disadvantaged by them.</em></p>
<hr>
<p>The National Assessment Program – Literacy And Numeracy (<a href="https://www.nap.edu.au/naplan">NAPLAN</a>) had its 10th birthday this year, but few well-wishers came to the party. </p>
<p>Administered in Years 3, 5, 7 and 9, NAPLAN measures each student’s literacy and numeracy achievements against benchmarks, and in aggregate the performance of schools and educational programs. In short, the aim of NAPLAN is to ensure that the literacy and numeracy skills of students, and of the nation, are improving. This year, 10 years of <a href="http://www.nap.edu.au/results-and-reports/national-reports">data</a> revealed that <a href="http://www.abc.net.au/news/2017-08-02/naplan-results-show-small-change-in-school-students-performance/8764994">little has changed</a> since NAPLAN began. </p>
<p>After millions of dollars of investment, as well as the abundance of data that NAPLAN has created, we are still not seeing amazing leaps and bounds in achievement. The nation is effectively standing still. </p>
<h2>NAPLAN is good at measuring differences and change over time</h2>
<p>NAPLAN gives us a picture of several aspects of students’ learning. These include performance under test conditions; basic use of punctuation, grammar and spelling; numeracy skills; and the ability to write an exposition or narrative text. </p>
<p>NAPLAN has provided data to help us quantify the gap between Indigenous and non-Indigenous students’ literacy and numeracy, and provide indicators of <a href="http://closingthegap.pmc.gov.au/education">where the gap is closing</a>. </p>
<p>We can see the <a href="http://ftp.iza.org/dp9535.pdf">differences in boys’ and girls’ achievements</a>, and the significant difference that a <a href="https://grattan.edu.au/report/widening-gaps/">parent’s level of education</a> makes to results. </p>
<p>NAPLAN can, importantly, track a student’s improvement, or lack thereof, from one test to the next. It can also highlight changes, although it cannot specify the factors behind them. </p>
<p>Finally, NAPLAN can <a href="https://books.google.com.au/books?id=yJ7hCgAAQBAJ&pg=PT171&dq=identify+disadvantage+NAPLAN&hl=en&sa=X&ved=0ahUKEwj2yd7R2qjXAhUCmJQKHeMjASoQ6AEIPzAF#v=snippet&q=disadvantage&f=false">identify areas of disadvantage</a> or need, for example geographical areas, state or territory differences or demographics.</p>
<h2>NAPLAN cannot measure creativity or engagement</h2>
<p>Despite all that NAPLAN can measure, it only tells part of the story of literacy and numeracy achievement. Results may not show growth of learning in schools with students from low socio-economic backgrounds or culturally and linguistically diverse students, because the test measures only a narrow skill set on one particular day of the year. It does not represent student achievements across the year, nor across the breadth of the curriculum which schools use to evaluate their programs. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=856&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=856&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=856&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1076&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1076&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193340/original/file-20171106-1014-tqsoqs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1076&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The Bard of Avon’s creative use of language and love of making up words would likely earn him a poor score on a NAPLAN test.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>It also does not measure engagement in learning. Engagement can look like students enjoying reading, a willingness to engage in numeracy tasks, or use of these skills outside a test situation. </p>
<p>The prescribed style of writing also leaves little room for creative play, promoting very structured teaching of text types. It is far easier to provide students with a simplistic structure and key language features than to encourage a creative response with more complexity. An assessor may not value a different writing style, as it is not reflected in the marking criteria. One wonders how Shakespeare would have performed on NAPLAN. Our prediction is that his phrase “the world’s my oyster” would have placed him in the bottom two bands. </p>
<h2>NAPLAN’s influence on learning</h2>
<p>This narrow version of literacy, numeracy and writing isn’t reflected in the rich learning that occurs in classrooms. NAPLAN places children as young as eight in an exam environment, and asks them to think in a way they aren’t used to. Year 3 classrooms are <a href="http://www.pdst.ie/sites/default/files/Session%203%20-%20PS%20Co%20-%20Op%20%EF%80%A2%20Group%20Work.pdf">usually built around</a> collaborative learning, where problem-solving and discovery methods are essential for building knowledge and understanding. </p>
<p>In contrast, NAPLAN reflects little of the ways that children understand and interpret the world. Two weeks before NAPLAN, many Year 3 and 5 teachers start teaching to the test: developing exam skills, practising multiple choice questions, and teaching the structure and language features of an exposition and/or narrative text. Teachers feel they must simulate the exam environment, practise for it, and even guess the questions that might be asked.</p>
<p>This narrows the curriculum as well as the types of literacy and numeracy activities that students usually engage in as part of their learning. It takes up classroom time that could be spent teaching literacy and numeracy skills meaningfully, by reading quality children’s literature, creating various text types, engaging in <a href="https://ukla.org/research/projects/details//agentic-writing-across-the-primary-curriculum">process drama pedagogies</a> or trying creative tasks. </p>
<p>Standardised tests like NAPLAN also diminish the joy of learning. Teachers have <a href="http://www.whitlam.org/__data/assets/pdf_file/0011/694199/The_experience_of_education_-_Qualitative_Study.pdf">reported</a> that 90% of students feel stressed before the test. In fact, a 2016 <a href="https://espace.library.uq.edu.au/data/UQ_383374/s4261111_phd_final.pdf?Expires=1509676745&Signature=OcZOE8k7ACzOXg86mBn21rIlOM-AP2j5Jxp57p8pLWjER9U5RlHnWj7WpsiYRqpgiqHcj8ra86i8kH%7EbDv3xLoI1QoYz8RLW-Cqs2lmtcNIiYA6HlbNXyZjiIHbToxVa0UcszBfVQAfBYOkOu-l6ns4dMpWuCHTLuUYez0T61adCZ3KtB7nZbN183JLhMbwQWPfgMo5WLjTizf25jIYusOqFdNVRCB3X7kMvas14gEHSW2PnBQ7CRq2YDnDYTqxV9-59UUg4gpwJlCjO4NLXW2PaEd99UyztpIWeTtFaP-coKiWihkY0goWukNdCVGQNT1JblWVEEDtcO8cXat7-MQ__&Key-Pair-Id=APKAJKNBJ4MJBJNC6NLQ">study</a> found that students in high school do not see the relevance of NAPLAN to their education. Year 7 students even felt it stopped their learning.</p>
<p>Dangerously, NAPLAN frames mistakes as bad. Mistakes are essential if schools are going to encourage original thoughts. Lateral and creative thinking is required to conquer challenges like climate change, global inequality and rising global conflicts. Students will need to take risks, understand that problems may have multiple solutions, and not look only for a single right answer. </p>
<h2>Creative alternatives</h2>
<p>Teachers are not to blame for any of these issues. Throughout the teaching year, teachers use creative strategies to improve students’ outcomes. Philosophy has been <a href="http://www.sapere.org.uk/Portals/0/SAPERE%20P4C%20Research%20map%20-%20first%20draft%20June%202011.pdf">found</a> to make a significant and impressive difference. Sydney Theatre Company’s School Drama <a href="https://www.aare.edu.au/blog/?tag=stc">program</a> has been <a href="http://www.tandfonline.com/doi/abs/10.1080/14452294.2015.1083152">found</a> to improve literacy outcomes as well as empathy, confidence, motivation and engagement. But these approaches are not available to every child in every school. Making them available should be a priority, but education is like a large ship: slow to turn around. Wide-scale reform that prioritises creativity and philosophical thinking takes time. </p>
<p>NAPLAN, on the other hand, is high-stakes testing. Schools are required to administer it, with <a href="http://www.education.vic.gov.au/about/educationstate/Pages/catchup.aspx">additional funding</a> tied to the results of underachieving students. Those results may not accurately represent the whole <a href="https://theconversation.com/naplan-data-and-school-funding-a-dangerous-link-46021">school population</a> if not all students complete the tests. Using a centrally created test puts enormous pressure on every student, teacher and principal to perform. It discards teachers’ contextual knowledge about the students and the learning environment. Results are published as a comparative analysis of schools on the <a href="https://www.myschool.edu.au/">My School</a> website. </p>
<p>Despite 10 years of minimal breakthroughs and a <a href="http://www.whitlam.org/__data/assets/pdf_file/0008/276191/High_Stakes_Testing_Literature_Review.pdf">plethora of evidence</a> that shows that NAPLAN may do more harm than good, there is no sign it’s going anywhere except online. </p>
<p>Governments love NAPLAN. It contains all of their favourite buzzwords: transparency, accountability, data and quality. Meanwhile, look at what it denies our students: innovation, creativity, risk, originality and joy. These are far less attractive to politicians, and more difficult to measure in a national multiple choice test, but far more relevant to children’s future achievements.</p>
<p class="fine-print"><em><span>Rachael Jacobs is a member of Teachers for Refugees, the NTEU, the NSW Greens and on the board of Drama Australia.</span></em></p>
<p class="fine-print"><em><span>Katina Zammit does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>NAPLAN is great at tracking changes over time and between demographics, but not so great at measuring what factors effect change, engagement or creativity.</em></p>
<p>Rachael Jacobs, Lecturer in Arts Education, Western Sydney University; Katina Zammit, Director of Academic Program - Primary, Western Sydney University</p>
<p>Licensed as Creative Commons – attribution, no derivatives.</p>
<hr>
<p>tag:theconversation.com,2011:article/41076 (2015-05-04T03:07:10Z)</p>
<h1>Who needs teachers when computers can mark exams?</h1>
<figure><img src="https://images.theconversation.com/files/80215/original/image-20150504-23856-7hbzz6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Will a computer algorithm understand the creativity, flair, irony, wit and unconventional approaches used in kids' writing?</span> <span class="attribution"><span class="source">from www.shutterstock.com.au</span></span></figcaption></figure><p>Early May would be incomplete without some NAPLAN controversy. This year’s comes from the <a href="http://www.theaustralian.com.au/news/nation/naplan-written-tests-to-be-marked-by-computers/story-e6frg6nf-1227326263697">announcement</a> last week that the national exam sat by students across the country in Years 3, 5, 7 and 9 is to be marked by computers in 2017.</p>
<p>Part of the argument for moving to online marking is that it will decrease turnaround time from months to just weeks. While this is uncontroversial for multiple-choice-style tests, which have a correct answer, it is much more problematic when applied to creative writing.</p>
<h2>Can computers mark creative writing?</h2>
<p>The NAPLAN written task is usually a narrative or persuasive task and is an extended piece of prose. The <a href="http://www.nap.edu.au/naplan/writing/writing.html">marking criteria</a> include audience, text structure, cohesion, vocabulary, paragraphing, sentence structure, punctuation and spelling.</p>
<p>When writing persuasive texts, the guide explains that:</p>
<blockquote>
<p>students are required to write their opinion and to draw on personal knowledge and experience when responding to test topics.</p>
</blockquote>
<p>The guide also explains that for narrative texts, there should be a:</p>
<blockquote>
<p>growing understanding that the middle of the story needs to involve a problem or complication that introduces conflict, danger or tension that must be resolved. It is this uncertainty that draws the reader in and builds suspense.</p>
</blockquote>
<p>The question is whether computers can appropriately mark students’ creative writing with this level of sophistication. </p>
<p>According to the Australian Curriculum, Assessment and Reporting Authority (ACARA), <a href="http://www.canberratimes.com.au/nsw/naplan-education-chief-defends-computers-over-teachers-marking-creative-writing-20150430-1mwitw.html">they can</a>. </p>
<p>The approach being taken is one that uses supervised machine learning, where sample tests marked by humans are fed into an algorithm that learns how to recognise quality responses by reverse-engineering scoring decisions. <a href="http://www.itnews.com.au/News/403322,how-australia-plans-to-mark-naplan-with-cognitive-computing.aspx">Trials</a> conducted by ACARA have demonstrated that:</p>
<blockquote>
<p>artificial intelligence solutions perform as well, or even better, than the teachers involved.</p>
</blockquote>
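<p>To make “supervised machine learning” concrete: human-marked samples serve as training examples, and the program infers scores for new responses from them. The sketch below is a deliberately crude stand-in that scores an unseen essay by borrowing the mark of its most similar training essay (cosine similarity over word counts). ACARA’s actual system is proprietary and far more sophisticated; every function, feature and sample here is an illustrative assumption, not their method.</p>

```python
# Toy "learning from marked samples" scorer: NOT ACARA's actual system.
# We score an unseen essay with the human mark of the most similar
# already-marked essay, using cosine similarity over bags of words.
import math
from collections import Counter

def bag_of_words(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def score_essay(essay: str, marked: list[tuple[str, int]]) -> int:
    """Return the human score of the most similar training essay."""
    vec = bag_of_words(essay)
    best = max(marked, key=lambda pair: cosine(vec, bag_of_words(pair[0])))
    return best[1]

# Two hypothetical human-marked training essays (score out of, say, 6):
marked_samples = [
    ("the dog ran fast and the dog was happy", 2),
    ("the ancient mariner surveyed the restless grey ocean with quiet dread", 5),
]
print(score_essay("a sailor watched the grey restless ocean with dread",
                  marked_samples))  # prints 5
```

<p>Even this toy version hints at the concern raised below: the scorer rewards surface similarity to highly-marked samples, so a response engineered to resemble them could score well regardless of its actual meaning.</p>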
<p>One argument is that computer marking has less variability than human markers, although these <a href="http://www.nytimes.com/2012/04/23/education/robo-readers-used-to-grade-test-essays.html?_r=0">claims</a> to marker reliability are <a href="http://graphics8.nytimes.com/packages/pdf/science/Critique_of_Shermis.pdf">contested</a>. </p>
<p>For example, what would happen if a student were to submit a <a href="http://chronicle.com/article/Writing-Instructor-Skeptical/146211/">nonsense piece</a> that happened to meet the expectations of the algorithm?</p>
<p>Automated marking is not a new thing. It has been particularly visible since the rise of <a href="http://theconversation.com/computer-thinks-youre-dumb-automated-essay-grading-in-the-world-of-moocs-13321">MOOCs</a> and the search for a cheap alternative for marking student papers. </p>
<p>The <a href="http://www.sciencedirect.com/science/article/pii/S1075293513000512">research literature</a> provides a mixed picture of potential <a href="http://www.journalofwritingassessment.org/article.php?article=58">benefits and pitfalls</a>, yet there has been vocal <a href="http://www.salon.com/2013/09/30/computer_grading_will_destroy_our_schools/">opposition</a> to computer marking from <a href="http://www.independent.co.uk/news/uk/home-news/professors-angry-over-essays-marked-by-computer-8562276.html">academics</a> and <a href="http://dianeravitch.net/2014/09/03/why-computers-should-not-grade-student-essays-2/">educationalists</a>. </p>
<p>The rise of <a href="http://aeon.co/magazine/technology/steven-poole-can-algorithms-ever-take-over-from-humans/">algorithms</a> can be seen in many places, including chess-playing computers, self-driving cars, metadata analysis to predict behaviour, online advertising, speech-recognition software and auto-completing search engines. It seems only logical that algorithms would enter our classrooms.</p>
<h2>What actually matters in education?</h2>
<p>One thing that strikes me as ironic is that we would be using computers, which can’t actually read or write, to test the reading and writing of our students. Is the next step to replace our teachers with robot instructors who can provide standardised, objective and completely emotionless feedback in the classroom?</p>
<p>How can a computer assess creativity and flair? How would it recognise irony, wit and humour? What about writers who use unconventional approaches for effect?</p>
<p>While algorithms can easily process literal meaning, what happens with inferential meaning or drawing on rich contexts, background knowledge, prior learning, cultural and social discourses? These are all part of the complex tapestry of human meaning-making in reading and writing.</p>
<p>As one example, the NAPLAN <a href="http://www.nap.edu.au/verve/_resources/Amended_2013_Persuasive_Writing_Marking_Guide_-With_cover.pdf">marking guide</a> refers to the use of classical rhetorical discourse in persuasive writing, including:</p>
<blockquote>
<p>Pathos - appeal to emotion </p>
<p>Ethos - appeal to values </p>
<p>Logos - appeal to reason. </p>
</blockquote>
<p>I have not yet come across a computer except in science-fiction films that has emotions or values that could be appealed to in any persuasive sense.</p>
<p>There are serious concerns that computer marking of the NAPLAN writing task will have unintended effects on <a href="http://www.smh.com.au/comment/naplan-what-will-autoscoring-accomplish-20150430-1mww17.html">teaching and learning</a>, including online reading and writing <a href="http://onlinelibrary.wiley.com/doi/10.1598/JAAL.55.1.1/full">strategies</a> different to those of traditional print-based comprehension and composition.</p>
<p>A further concern is that computer marking will have a reductive effect on student writing, with “teaching to the test” becoming more of a problem than it already is.</p>
<p>Maybe it isn’t that far-fetched to imagine computers marking assignments and robots teaching classrooms. After all, there are <a href="http://www.theguardian.com/technology/2014/feb/22/robots-google-ray-kurzweil-terminator-singularity-artificial-intelligence">predictions</a> that we will reach the singularity, the point at which artificial intelligence overtakes humans, in 2029. </p>
<p>Wouldn’t ACARA be better off putting the money into something that has an impact on the quality of learning of students in Australian schools, rather than conducting this particular experiment? Focusing on test scoring that is faster and cheaper seems at odds with what actually matters in education.</p>
<p>Until we reach the singularity, perhaps we should focus on improving equity and access for students who are most disadvantaged in our education system, and leave the robots out of it.</p>
<p class="fine-print"><em><span>Stewart Riddle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>NAPLAN is going to be marked by computers from 2017. Can an algorithm understand the complex and emotional writing techniques we want our children to be learning?</em></p>
<p>Stewart Riddle, Senior Lecturer, University of Southern Queensland</p>
<p>Licensed as Creative Commons – attribution, no derivatives.</p>