<h1>Writing for our (digital) lives: war, social media and the urgent need to update how we teach English</h1>
<p class="fine-print"><em>School assessment – The Conversation, 2022-05-24</em></p>
<figure><img src="https://images.theconversation.com/files/460507/original/file-20220429-25-f6iayg.jpg?ixlib=rb-1.1.0&rect=0%2C14%2C1920%2C1264&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Pixabay</span></span></figcaption></figure><p>The war in Ukraine is being described as the first social media war, or even “<a href="https://www.theguardian.com/commentisfree/2022/mar/05/ukraines-victories-in-the-tiktok-war-wont-stop-vlad-the-invaders-missiles">the TikTok war</a>”. Memes, tweets, videos and blog posts communicate both vital information and propaganda, potentially changing the course of history. This highlights the importance of agile and critical social media use.</p>
<p>English in schools, in contrast, still focuses on reading books and writing exam essays. Despite <a href="https://link.springer.com/article/10.1007/s13384-021-00457-5">mentions of media</a> in the Australian Curriculum for English, the study of digital writing via social media is not prioritised in senior assessment or national high-stakes testing. This approach seems increasingly out of touch with modern communication.</p>
<p><a href="https://www.vice.com/en/article/epxq3j/russia-ukraine-invasion-memes">Meme-ification</a> is a feature of media coverage of the Ukraine war. This new word describes the explosion of ordinary people creating shareable, and potentially influential, digital content.</p>
<p>Anyone with a smartphone and internet access can participate in a war that is being fought both on the ground and on digital platforms. And this content frequently references other popular digital culture. For example, users have photoshopped the head of Ukrainian President Volodymyr Zelenskyy onto the body of Marvel’s Captain America and tweeted the image, casting him as <a href="https://www.insidehook.com/article/internet/ukraine-war-twitter-main-character-ww3-memes">Captain Ukraine</a>.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1498941347939692546&quot;}"></div>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/guns-tanks-and-twitter-how-russia-and-ukraine-are-using-social-media-as-the-war-drags-on-180131">Guns, tanks and Twitter: how Russia and Ukraine are using social media as the war drags on</a>
</strong>
</em>
</p>
<hr>
<h2>English education for our age</h2>
<p>This “writing” contributes to narratives and debates about heroism, military morale, fan fiction and US cultural imperialism. This kind of immediate, vibrant and global communication needs to be the basis of study in English.</p>
<p>The ability to critically consume and strategically create social media is vital to the health of democracies. Yet writing social media posts for powerful platforms such as Twitter, TikTok and Facebook is not central to how we teach English.</p>
<p>Students need to be able to create memes, write rolling news blogs and produce digital news podcasts, all for networked audiences. They need to determine aims, invent concepts, manipulate images, combine different media, compose compelling text and respect copyright law. This is impactful and purposeful writing to achieve influence in the world.</p>
<p>Research initiatives such as the <a href="https://files.eric.ed.gov/fulltext/EJ1285112.pdf">Digital Self Portrait project</a> demonstrate how students can create vivid new forms of “writing” that explore tensions between their own digitally rich lives and traditional literacies.</p>
<p>Digital writing is often collaborative, and a recent <a href="https://www.edresearch.edu.au/sites/default/files/2022-02/writing-instruction-literature-review.pdf">Australian Education Research Organisation review</a> recommends more collaborative writing in classrooms. Community organisations such as <a href="http://www.write4change.org/">Write4Change</a> are making this possible by connecting youth to write together using digital media via private, communal and moderated sites on mainstream platforms.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;785273794692784128&quot;}"></div>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/in-an-age-of-digital-disinformation-dropping-level-1-media-studies-in-nz-high-schools-is-a-big-mistake-151475">In an age of digital disinformation, dropping level 1 media studies in NZ high schools is a big mistake</a>
</strong>
</em>
</p>
<hr>
<h2>Our approach is outdated</h2>
<p>Yet education’s high-stakes assessment regimes don’t value these forms of writing. Sadly, the National Assessment Program – Literacy and Numeracy (NAPLAN) has <a href="https://naplanreview.com.au/pdfs/2020_NAPLAN_review_final_report.pdf">narrowed the kinds of writing</a> taught in schools even further. One <a href="https://nap.edu.au/docs/default-source/default-document-library/naplan-narrative-prompt---the-box.pdf">sample NAPLAN writing task</a> says, basically, “Here is a picture of a box. Write a story about it.”</p>
<p>This approach needs to change so students are practising the forms of writing and communication that are meaningful in today’s world. This will support citizens of the future to participate fully in workplaces and, most importantly, in democracies.</p>
<p>The Australian government, through the Australian Research Council, has recognised this and funded a new study into the importance of contemporary writing in education. This is through a <a href="https://www.arc.gov.au/grants/discovery-program/discovery-early-career-researcher-award-decra">Discovery Early Career Researcher Award</a> (DECRA) titled <a href="https://teachingdigitalwriting.wordpress.com/">Teaching digital writing in secondary English</a>. This project will explore how teachers can conceptualise and enact the teaching of real-world writing.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1503976832865304577&quot;}"></div>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-well-does-the-new-australian-curriculum-prepare-young-people-for-climate-change-183356">How well does the new Australian Curriculum prepare young people for climate change?</a>
</strong>
</em>
</p>
<hr>
<h2>It’s not a choice of classics or digital writing</h2>
<p>Of course, studying the classics remains important, as does mastering basic skills. Zelenskyy himself quoted <em>Hamlet</em> in a recent <a href="https://www.nytimes.com/2022/03/12/opinion/zelensky-ukraine-russia-biden.html">address to the British parliament</a>. So this is not an either/or situation, but what digital writing expert Professor Troy Hicks calls “<a href="https://hickstro.org/2022/02/13/embracing-the-both-and-of-digital-writing/">both/and</a>”. We can study <em>both</em> <em>Hamlet</em> as a play <em>and</em> how other media quote its main character in powerful ways.</p>
<p>Students can themselves explore making strategic literary references in their own social media posts and interventions. The study of rhetoric (argument and persuasion) and aesthetics (cultural value) needs to include diverse media for contemporary relevance. </p>
<p>Human conflicts, projects, imaginings and achievements are now happening in new forms. The devastating theatre of war playing out in Ukraine and online has offered “<a href="https://www.theguardian.com/world/2022/apr/16/zelenskiy-ukraine-war-writers-journalists">a masterclass in message</a>”.</p>
<p>If a key aim of Australia’s compulsory literacy education is to “<a href="https://www.australiancurriculum.edu.au/f-10-curriculum/english/">create confident communicators, imaginative thinkers and informed citizens</a>” then students need to learn to communicate in the modes of contemporary society. They need to enjoy the engagement and learning that comes from participating in genuinely important dialogues and situations, even if just in protected classroom and school-based versions of these. </p>
<p>Social media use potentially both threatens and supports democracy. Yet media education remains devalued in the English curriculum and classroom, largely in favour of reproducing print literature forms and essays.</p>
<p>It is time for English to join the 21st century and embrace all the diverse and digital means of communication that are part of our lives today. Our freedom and futures depend on it.</p>
<p class="fine-print"><em><span>Dr Lucinda McKnight receives funding from the Australian government through the Australian Research Council (ARC). She is a Discovery Early Career Researcher Award (DECRA) recipient.</span></em></p>
<p class="fine-print"><em>Lucinda McKnight, Senior Lecturer in Pedagogy and Curriculum, Deakin University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Sats – why bringing back tests for 14-year-olds could help disadvantaged students</h1>
<p class="fine-print"><em>Published 2021-10-05</em></p>
<p>The UK government is <a href="https://www.theguardian.com/politics/2021/oct/01/dfe-considering-return-of-sats-at-14-and-axing-teaching-hours-limits">reportedly</a> considering bringing back national curriculum tests (known as Sats) for 14-year-olds in England. The reasons cited were that, without formal assessment to mark the end of key stage three (KS3 – years seven, eight and nine of <a href="https://theconversation.com/covid-19-has-made-the-transition-from-primary-to-secondary-school-harder-heres-how-parents-can-help-166313">secondary school</a>), children were at risk of losing focus, and losing out.</p>
<p>The KS3 Sats were abolished in 2008. These tests had first been introduced in 1988 and were used across all national curriculum subjects, including English, maths, science, history, geography, modern foreign languages, design and technology and art and design. </p>
<p>Although <a href="https://twitter.com/ChrisHillidge/status/1443982578298540035">many teachers rejoiced</a> when they were scrapped because doing so apparently reduced their workload, <a href="https://ofqual.blog.gov.uk/2021/05/17/bias-in-teacher-assessment-results/">research suggests</a> that disadvantaged students – particularly those from ethnic minorities and lower socio-economic backgrounds – may be losing out under the current teacher assessment regime. Various education stakeholders – including the former chief inspector of schools, <a href="https://www.bbc.com/news/education-25336254">Michael Wilshaw</a> – have since called for some form of external assessment in year nine to resume. </p>
<p>England is not an outlier in terms of the amount of testing that students face when compared to other countries – or in terms of the importance placed on these tests. There may be less testing <a href="https://www.tes.com/news/exams-assessment-testing-uk-france-germany-usa-spain">in Sweden</a>, for example. But <a href="https://www.palgrave.com/gp/book/9780230230255">research shows</a> that early testing in Germany, Italy, Austria, the Czech Republic and many other countries can determine a child’s future in a way that does not happen in England (except in <a href="https://www.tandfonline.com/doi/full/10.1080/01425692.2018.1443432">a tiny number</a> of grammar schools). </p>
<p>So is bringing back KS3 Sats a good idea?</p>
<h2>Why the tests were abolished</h2>
<p>When the Labour government <a href="https://www.theguardian.com/education/2008/oct/14/sats-scrapped">scrapped them</a> in October 2008, it was considered a historic move. The then education secretary, Ed Balls, claimed that the tests served no real purpose. Additional reasons given for the change included a reduced assessment workload for teachers, the idea that the tests distorted the nature of education and the test anxiety experienced by pupils. </p>
<p>But these reasons do not really stand up to scrutiny. The Sats were replaced by teacher assessments, which meant changing – but not reducing – the hours teachers had to spend on assessing their pupils’ progress. Many schools still use KS3 tests and exams for their own purposes anyway, even if they are not a <a href="https://www.tes.com/news/do-we-still-need-key-stage-3-exams">statutory requirement</a>. Schools use such assessments to inform students, teachers and parents about progress, identify areas of support that school leaders can attend to and provide a measure of school performance for governors and others. </p>
<p>And if key-stage testing was abolished because it was distorting the so-called true nature of education, it is not clear why the similar testing of pupils at key stages one, two and four was retained. There was no reason to suggest that “teaching to the test” was somehow less problematic for pupils aged seven, 11, or 16 than it was at age 14. </p>
<p>Beyond that, in terms of anxiety, wellbeing and happiness, evidence suggests that sitting exams (even at a younger age than 14) <a href="https://inews.co.uk/news/education/sats-tests-year-6-pupils-ks2-exams-happiness-wellbeing-unaffected-1214675">does not cause major problems</a> for children. </p>
<p>Further reasons for abolishing KS3 tests have been suggested. But perhaps the biggest factor was the 2008 collapse of the marking contract with the US-based testing firm ETS, which led to many tests remaining unmarked and boxes of pupils’ responses <a href="http://news.bbc.co.uk/1/hi/education/7507113.stm">just sitting around</a>. Perhaps abolition was a political response to the perceived chaos in schools, rather than a step towards educational improvement. </p>
<h2>Does bringing them back make sense?</h2>
<p>So, should KS3 Sats start again (even if they never fully disappeared)? It would certainly be more convenient for researchers such as ourselves who look at pupil progress and how to improve it. Currently we have access to data on assessments at age seven and 11 and then a long gap until age 16. However, this is not a reason likely to sway politicians, teachers or the general public.</p>
<p>A sound reason for a return to testing would be if the teacher assessments currently in use were somehow inaccurate or unfair. Whether that is the case is not entirely clear.</p>
<p><a href="https://acamh.onlinelibrary.wiley.com/doi/full/10.1111/jcpp.13070">Research suggests</a> that teacher assessment is as stable and reliable as formal testing. However, there is also substantial evidence from the office of Qualifications and Examinations Regulation (the UK government’s exam watchdog) and others that disadvantaged students <a href="https://www.tes.com/news/assessment-disadvantaged-students-will-lose-out">may lose out</a> to their peers in teacher assessment. This <a href="https://ofqual.blog.gov.uk/2021/05/17/bias-in-teacher-assessment-results/">applies especially</a> for pupils from some ethnic minorities and lower socio-economic backgrounds. </p>
<p>This issue of children possibly losing out in teacher assessment is important because KS3 results can be part of the step to accessing subject and qualification choices at KS4. If there are some <a href="https://theconversation.com/four-things-that-can-bias-how-teachers-assess-student-work-142135">unconscious biases</a> in assessment this may then bias pupils’ entire future trajectories.</p>
<p>Despite the abolition of KS3 Sats in 2008, heavy workload <a href="https://researchfeatures.com/rethinking-complex-determinants-teacher-shortages/">continues to be cited</a> as an obstacle to keeping teachers in the workforce. Many schools <a href="https://www.gl-assessment.co.uk/case-studies/assessment-at-key-stage-3-from-levels-to-validity/">are using</a> increasingly sophisticated digital tools, including standardised, self-marking tests. Such approaches could help minimise workloads in the event that KS3 Sats (or something similar) were reinstated. </p>
<p>The <a href="https://theconversation.com/a-level-and-gcse-cancellation-a-missed-opportunity-to-rethink-assessment-152846">whole assessment process</a> has, of course, been hit by the pandemic. Two years of national key stage tests results, up to and including A-levels, have been lost. Perhaps now is the time to deal with those problems, and not to add a further test – at least yet.</p>
<p>Whatever it chooses to do, the Department for Education must plan a robust evaluation of any proposed change at KS3, with policy led by evidence of its benefits and a solid plan for how to implement it.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Stephen Gorard, Professor of Education and Public Policy, Durham University, and Nadia Siddiqui, Research Fellow in the School of Education, Durham University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>GCSE and A-level teacher assessments: benefits of replacing exams undermined by lack of transparency</h1>
<p class="fine-print"><em>Published 2021-02-26</em></p>
<figure><img src="https://images.theconversation.com/files/386694/original/file-20210226-17-z9gv2m.jpg?ixlib=rb-1.1.0&rect=17%2C8%2C5973%2C3979&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/teacher-hand-holding-red-pen-checking-1388901272">NuPenDekDee/Shutterstock</a></span></figcaption></figure><p>In January, UK education secretary Gavin Williamson announced that GCSE and A-level exams in England would <a href="https://www.theguardian.com/education/2021/jan/06/gcse-a-levels-and-sats-exams-to-be-scrapped-in-england-this-year">not go ahead</a>. Now, Williamson has outlined <a href="https://www.gov.uk/government/publications/awarding-qualifications-in-summer-2021/awarding-qualifications-in-summer-2021">further information</a> about how assessments for pupils will take place.</p>
<p>Teachers’ judgements will be at the heart of grading decisions this year, based on a range of possible assessment methods including coursework, mock exams, essays and in-class tests.</p>
<p>Williamson emphasised that <a href="https://www.gov.uk/government/publications/awarding-qualifications-in-summer-2021/awarding-qualifications-in-summer-2021">fairness</a> and trust in teachers will be central to the assessment approach. Teachers will also have the option to use a common set of questions based on past exams and prepared by exam boards – but these, along with guidance and marking criteria, will not be available until the end of March. </p>
<h2>Mixed messages</h2>
<p>The use of a range of assessment methods is to be commended. It reflects the fact that different methods can suit different students, while also giving all students a richer variety of ways to demonstrate their achievements. The involvement of teachers in assessment is also welcome, because teaching, learning and assessment are not separate activities, but are very much dependent on one another. </p>
<p>What’s more, the government has stated that there will be a quality assurance process undertaken by exam boards to ensure consistency across schools and to identify malpractice. And this process will not be based on an algorithm like last year’s <a href="https://theconversation.com/gavin-williamson-ofqual-and-the-great-a-level-blame-game-144766">disastrous attempt</a> at moderation, or <a href="https://www.bbc.co.uk/iplayer/episode/m000t3b2/house-of-commons-education-and-exams-statement?page=1">pegged to past results</a>.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1365002066431602688&quot;}"></div>
<p>If this new quality assurance approach means that we are finally abandoning a <a href="https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1057&context=pare">norm-based system</a> of assessment in which results depend on where a student comes in a ranking, then this is also excellent news. It also does far more to allay concerns about so-called “grade inflation” than simply comparing grades between years, which may not take account of improved levels of attainment.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/learning-from-exam-results-crisis-the-way-students-work-is-assessed-needs-to-change-144710">Learning from exam results crisis: the way students' work is assessed needs to change</a>
</strong>
</em>
</p>
<hr>
<p>However, there are significant problems with the government’s approach to A-level and GCSE assessments in 2021. It has failed to be timely, and remains, even after today’s announcements, lacking in transparency.</p>
<p>Timeliness is essential to fair assessment, and students should know as early as possible how they will be assessed. Disconnecting assessment from learning in this way is badly out of date and contrary to <a href="https://www.tandfonline.com/doi/full/10.1080/0969594X.2014.898128">research on assessment</a>. </p>
<p>Students should have an understanding of how they will be assessed as they learn, so the two processes can support one another. This is hard to achieve during a global pandemic, but the significant delay in cancelling exams in England, compared with earlier decisions in Scotland and Wales, has exacerbated the problem. </p>
<p>Second, timeliness is essential for teacher preparation. Again, research on assessment is clear that it is not only closely linked to learning, but <a href="https://www.tandfonline.com/doi/full/10.1080/0969594X.2018.1430685">also to teaching</a>. It is only reasonable that a teacher supporting a student’s preparation for an assessment should know how the assessment will be done. But the truly staggering aspect of the announcement is that teachers will not be given marking criteria, guidance or training from exam boards until the end of the spring term. </p>
<p>Timeliness and transparency are closely related. The lateness of government plans to reveal the details of both marking criteria and quality assurance processes seriously impairs the genuine and meaningful transparency of their approach. </p>
<p>It is perfectly appropriate that the “trust” in teachers is to be mediated by some quality assurance mechanism. But without explicit details of how this will be done, I remain concerned that exam boards are being given power without clear accountability. </p>
<h2>Focus on fairness</h2>
<p>Gavin Williamson has made very different claims about assessment fairness. He has stated that this year’s arrangements, based on teacher assessments, are all <a href="https://www.gov.uk/government/news/teacher-assessed-grades-for-students">about fairness</a>. </p>
<p>However, in the period following last summer’s results crisis, the position of the UK government appeared to remain fixed on the view that exams, and exams alone, are the fairest form of assessment. Gavin Williamson repeatedly made this claim, doing so even at the very moment he <a href="https://www.theguardian.com/education/2021/jan/06/gcse-a-levels-and-sats-exams-to-be-scrapped-in-england-this-year">cancelled exams for 2021</a>. </p>
<p>This belief rests on assumptions rather than evidence, and reflects the way in which exams are often considered to be a neutral form of judging student ability. But no form of assessment is neutral, and all involve making choices between competing priorities.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1364978528828219393&quot;}"></div>
<p>The problem is that those who claim exams are fair assume they are reliable and that consistent marking is easier to achieve, without considering whether exams are the best way for students to demonstrate particular knowledge and skills. The dominance of exams within the English system reflects highly politicised <a href="https://www.tandfonline.com/doi/full/10.1111/eie.12110">values and judgements</a> rather than innate educational worth.</p>
<p>A central pillar of this year’s approach is to trust teachers and schools as the people who <a href="https://www.theguardian.com/education/2021/jan/06/gcse-a-levels-and-sats-exams-to-be-scrapped-in-england-this-year">know their students best</a> – but this emphasis on teachers is at considerable odds with the government’s continuing focus on exam boards when making decisions. The government has been working with the exam boards, according to <a href="https://www.bbc.co.uk/iplayer/episode/m000t3b2/house-of-commons-education-and-exams-statement?page=1">Williamson’s comments</a> in Parliament, while teachers are left waiting to find out what they will be required to do.</p>
<p>This year, the government may well be trying to do the best it can for A-level and GCSE students in very difficult circumstances. But entrenched assumptions and failures of timeliness and transparency significantly reduce our ability to judge whether or not a fair system has been put in place.</p>
<p class="fine-print"><em><span>Jan McArthur does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Jan McArthur, Senior Lecturer in Education and Social Justice, Lancaster University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Testing literacy today requires more than a pencil and paper</h1>
<p class="fine-print"><em>Published 2019-04-09</em></p>
<figure><img src="https://images.theconversation.com/files/267432/original/file-20190403-177199-1pqgbts.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Large-scale literacy testing has not kept pace with how literacy is practised in classrooms, assessed by teachers and mandated by curriculum.</span> <span class="attribution"><span class="source">tim gouw/unsplash</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>Large-scale testing, or what many know as standardized testing,
often carries important consequences for students. The results of large-scale tests may be used by schools or policy-makers to make important decisions <a href="https://www.tandfonline.com/doi/full/10.1080/09695940802164226">such as grouping students by ability or assessing how well schools are doing</a>.</p>
<p>Yet when it comes to literacy testing, <a href="https://www.routledge.com/Literacy-Lives-in-Transcultural-Times/Zaidi-Rowsell/p/book/9781138225169">while the competencies of literacy have changed in our digital, globalized world</a>, the methods that many educational systems use to assess literacy have not. </p>
<p>One recent analysis of standardized tests in the United States, for example, found tests haven’t changed much over the last 100 years: tests are <a href="https://ed.stanford.edu/news/history-high-school-english-told-through-100-years-exams">mostly multiple choice, with questions geared toward assessing skills like vocabulary, recall and comprehension</a>.</p>
<p>In Canada today, on such large-scale standardized tests, students are <a href="https://www.edcan.ca/articles/standards-accountability-and-student-assessment-systems/">likely to read a passage and answer a series of multiple-choice questions</a>. Students might have an opportunity to write a short answer or essay response. Provincial tests, for the most part, continue to prioritize measuring traditional literacy skills of reading and writing with answers primarily communicated via pencil-to-paper. Such a testing structure forms the basis for public accountability in many provinces. </p>
<p>Across Canada, researchers and educators have documented the need <a href="https://www.edcan.ca/articles/large-scale-assessment/">to transform how the provinces assess literacy</a> and consider more innovative designs. Testing should accurately capture what children are learning <a href="https://www.washingtonpost.com/education/2019/03/04/if-all-that-testing-had-been-improving-us-we-would-have-been-highest-achieving-nation-world-heres-what-does-work-school-reform/?noredirect=on&utm_term=.1755420d9ed6">without detracting from authentic teaching and learning</a>.</p>
<h2>What literacy means today</h2>
<p>Formerly, literacy was broadly understood to encompass four domains: reading, writing, speaking and listening. But today, how we define literacy has changed. </p>
<p>Firstly, literacy is now understood to involve skills and knowledge related to all modes of visual representation and digital communications. Today’s students tend to read shorter texts within a variety of platforms on social media, websites and apps. Schools now teach literacy through visual, moving image and even sound-based texts that children and teenagers encounter when reading and writing online. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/268414/original/file-20190409-2921-1i7tobc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Pencil-to-paper tests won’t capture the full range of competencies involved with literacy today.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Secondly, literacy today is also understood to be about <a href="https://www.routledge.com/Re-theorizing-Literacy-Practices-Complex-Social-and-Cultural-Contexts/Bloome-Castanheira-Leung-Rowsell/p/book/9780815368625">how students can use knowledge and skills related to personal and citizen engagement and agency</a>. According to the United Nations Educational, Scientific and Cultural Organization (UNESCO), literacy involves the <a href="http://www.unesco.org/education/GMR2006/full/chapt6_eng.pdf">“capacity for social awareness and critical reflection as a basis for personal and social change</a>.” </p>
<p>These forms of literacy teaching and learning — both multimedia literacy related to varied forms of representation <a href="https://www.tandfonline.com/doi/abs/10.1080/07370008.2015.1029609?journalCode=hcgi20">and expression</a> and applied literacy — <a href="https://www.springer.com/gp/book/9789462092006">are called multiliteracies</a>.</p>
<p>Large-scale literacy testing needs to keep pace with how the skills related to these concepts are practised in classrooms, assessed by teachers and mandated by provincial curricula. </p>
<p>Overall, curriculum is increasingly emphasizing <a href="https://www.tcpress.com/critical-encounters-in-secondary-english-9780807756232?page_id=95">a more holistic concept of literacy development</a>. <a href="http://www.edu.gov.on.ca/eng/curriculum/secondary/english910currb.pdf">The English curriculum in Ontario</a> acknowledges students’ literacy development is not understood solely as reading and writing. <a href="https://curriculum.gov.bc.ca/curriculum/english-language-arts/10/new-media">B.C.</a> <a href="http://www.learnalberta.ca/ProgramOfStudy.aspx?lang=en&ProgramId=404703#">and Alberta</a> similarly recognize the changing nature of literacy. </p>
<h2>Revamping large-scale testing for the 21st century</h2>
<p>The need to <a href="https://www.routledge.com/The-PISA-Effect-on-Global-Educational-Governance-1st-Edition/Volante/p/book/9781138217416">reconsider large-scale testing formats</a> was recently acknowledged by <a href="http://www.oecd.org/education/andreas-schleicher.htm">Andreas Schleicher</a>,
director for the directorate of education and skills at the Organisation for Economic Co-operation and Development (OECD) — the organization that administers the most prominent cross-comparative test in the world, <a href="http://www.oecd.org/pisa/">the Programme for International Student Assessment</a> (PISA). He recently <a href="https://www.smh.com.au/education/international-education-testing-program-set-to-change-20190324-p516zq.html">said PISA is trying to move away from multiple choice towards more adaptive, engaging formats</a>.</p>
<p>In Singapore — <a href="http://www.oecd.org/pisa/pisa-2015-results-in-focus.pdf">the country that performs highest in PISA global tests</a> — the minister of education recently announced a <a href="https://www.moe.gov.sg/news/speeches/opening-address-by-mr-ong-ye-kung--minister-for-education--at-the-schools-work-plan-seminar">reduction in testing for students to better balance rigour and “the joy of learning</a>.”</p>
<p>When we understand literacy to also be about developing adaptive and connective <a href="https://theconversation.com/reduce-childrens-test-anxiety-with-these-tips-and-a-re-think-of-what-testing-means-111730">skills in our rapidly changing world</a>, we can see that such decisions to transform assessment are not potentially downplaying literacy, but rather, potentially enhancing it. </p>
<p>In Canada, assessment reforms and innovations are slowly taking shape.
For example, British Columbia <a href="https://www2.gov.bc.ca/gov/content/education-training/k-12/administration/program-management/assessment/foundation-skills-assessment/fsa-samples">revised its Foundation Skills Assessment in 2018 to include collaboration and self-reflection</a>. Alberta also made changes to large-scale provincial achievement tests to focus on <a href="https://globalnews.ca/news/548713/alberta-gets-rid-of-provincial-achievement-tests/">assessment <em>for</em> learning rather than assessment <em>of</em> learning</a>. </p>
<p>And in Ontario, a 2018 report to the premier recommended <a href="https://www.oise.utoronto.ca/preview/lhae/UserFiles/File/OntarioLearningProvince2018.pdf">replacing the Ontario secondary school literacy test</a>, now a graduation requirement. Researchers who conducted the review (including one of the authors of this story, Carol), as well as those invited to comment as assessment experts (Chris and Louis), made a number of other recommendations including integrating technology for large-scale assessment of students’ learning and progress. </p>
<p>If we are to support literacy skills for the 21st century then we must explore how large-scale testing might capture students’ contemporary literacy competencies, and also how the testing itself might integrate contemporary practices and <a href="https://en.unesco.org/themes/literacy-all">understandings of literacy</a>. </p>
<p>For example, computerized testing could allow for timely feedback that would close the gap between testing and feedback for learning. Right now, any curricular changes to address demonstrated gaps in learning are often communicated months after the large-scale test.</p>
<p>We need to change how we assess literacy. Ministries of education have the expertise and capacity to modernize our assessment systems. We are hoping there is the political will to do so.</p>
<p class="fine-print"><em><span>Louis Volante receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).</span></em></p><p class="fine-print"><em><span>Carol Campbell led the Independent Review of Assessment and Reporting for the Government of Ontario (2017-18).</span></em></p><p class="fine-print"><em><span>Christopher DeLuca receives funding from the Social Sciences and Humanities Research Council of Canada.</span></em></p><p class="fine-print"><em><span>Lorenzo Cherubini received funding from The Social Sciences and Humanities Research Council of Canada (previously funded). </span></em></p><p class="fine-print"><em><span>Jennifer Rowsell does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Are current forms of standardized literacy tests really measuring children’s capacity to read and interact with our rapidly-changing world?Louis Volante, Professor, Faculty of Education, Brock UniversityCarol Campbell, Associate Professor of Leadership and Educational Change, Ontario Institute for Studies in Education, University of TorontoChristopher DeLuca, Associate Professor in Classroom Assessment and Acting Associate Dean, Graduate Studies & Research, Faculty of Education, Queen's University, OntarioJennifer Rowsell, Canada Research Chair in Multiliteracies, Brock UniversityLorenzo Cherubini, Professor, Brock UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/974412018-06-07T20:27:16Z2018-06-07T20:27:16ZExplainer: what’s the difference between formative and summative assessment in schools?<figure><img src="https://images.theconversation.com/files/222093/original/file-20180607-137309-1bx3631.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Both formative and summative 
assessments are important parts of a well-rounded assessment program.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The recent Gonski <a href="https://docs.education.gov.au/system/files/doc/other/662684_tgta_accessible_final_0.pdf">report</a> argues Australia needs assessment and reporting models that capture both achievement progress and long-term learning progress. This, according to the review panel, involves low-stakes, low-key, and regular formative assessments to support learning progressions. The report used international <a href="https://grattan.edu.au/wp-content/uploads/2015/07/827-Targeted-Teaching.pdf">evidence</a> on individualised teaching to demonstrate ongoing formative assessment and feedback is fundamental to supporting students to do better in school. </p>
<p>The NSW Education Minister, Rob Stokes, has <a href="https://www.smh.com.au/education/naplan-is-being-used-abused-and-must-be-urgently-dumped-stokes-20180503-p4zd3z.html">called for</a> NAPLAN to be urgently replaced with less high-stakes tests. Mark Scott, the secretary of the NSW Department of Education, echoed Stokes’ remarks. He <a href="http://www.abc.net.au/news/2018-05-29/naplan-will-look-a-little-dated-when-new-testing-catches-on/9796860">stated</a>: </p>
<blockquote>
<p>I think [NAPLAN] will become obsolete because the kinds of information that the new assessment schemes will give us will be richer and deeper and more meaningful for teachers, for parents and for education systems.</p>
</blockquote>
<p>So, what’s the difference between formative and summative assessment? And when should each be used? Formative and summative assessment have different purposes and <a href="http://gottesman.pressible.org/cjr2142/balanced-assessment-from-formative-to-summative">both have an important role to play</a> in a balanced assessment program.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222094/original/file-20180607-137309-8bwo1n.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Formative assessments provide students with feedback and show where gaps in learning are.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Formative assessment</h2>
<p><a href="http://www.nuffieldfoundation.org/sites/default/files/files/beyond_blackbox.pdf">Formative assessment</a> includes a range of strategies such as classroom discussions and quizzes designed to generate feedback on student performance. This is done so teachers can <a href="https://www.tandfonline.com/doi/abs/10.1080/0969595980050104">make changes</a> in teaching and learning based on what students need. </p>
<p>It involves finding out what students know and do not know, and continually monitoring student progress during learning. Both teachers and students <a href="https://www.tandfonline.com/doi/abs/10.1080/0969594970040304">are involved</a> in decisions about the next steps in learning. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/marking-answers-with-a-tick-or-cross-wont-enhance-learning-48732">Marking answers with a tick or cross won't enhance learning</a>
</strong>
</em>
</p>
<hr>
<p>Teachers use the feedback from formative tasks to identify what students are struggling with and adjust instruction appropriately. This could involve re-teaching key concepts, changing how they teach or modifying teaching resources to provide students with additional support. Students also use feedback from formative tasks to reflect on and improve their own work. </p>
<p><strong>Regular classroom tasks, whether formal (for example, traditional pen and paper tests) or informal (such as classroom discussions), can be adapted into effective formative tasks by:</strong> </p>
<ul>
<li><p>making students aware of the learning goals/success criteria using rubrics and carefully tracking student progress against them </p></li>
<li><p>including clear instructions to guide students through a series of activities to demonstrate the success criteria. A teacher might, for example, design a series of activities to guide students through an inquiry or research process in science</p></li>
<li><p>providing regular opportunities for feedback from the teacher, other students or parents (this feedback may be face-to-face, written, or online)</p></li>
<li><p>making sure students have opportunities to reflect on and make use of feedback to improve their work. This may involve asking students to write a short reflection about the feedback on their draft essay and using this to improve their final version.</p></li>
</ul>
<p><strong>There are many advantages of formative assessment:</strong></p>
<ul>
<li><p>feedback from formative assessment helps students become aware of any
gaps between their goal and their current knowledge, understanding, or skill</p></li>
<li><p>tasks <a href="https://www.tandfonline.com/doi/abs/10.1080/0969595980050104">guide students</a> through the actions necessary to hit learning goals </p></li>
<li><p>tasks encourage students to focus their attention on the task (such as undertaking an inquiry or research process) rather than on simply getting the right answer </p></li>
<li><p>students and teachers receive ongoing feedback about student progress towards learning goals, which enables teachers to adjust their instructional approach in response to what students need</p></li>
<li><p>students build their <a href="https://blogs.deakin.edu.au/innovation-in-psychology/wp-content/uploads/sites/24/2013/11/Nichol_2006.pdf">self-regulation skills</a> by setting learning goals and monitoring their progress towards them </p></li>
<li><p>results of formative assessments can also be used for grading and reporting.</p></li>
</ul>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222095/original/file-20180607-137298-3lx69f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Summative assessments are generally standardised and rarely provide feedback.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Summative assessment</h2>
<p>This includes end-of-unit examinations and the NSW <a href="http://educationstandards.nsw.edu.au/wps/portal/nesa/11-12/hsc/about-HSC">Higher School Certificate</a> (HSC) examination.</p>
<p><a href="https://www.tandfonline.com/doi/abs/10.1080/0969594970040304">Summative assessment</a> provides students, teachers and parents with an understanding of the pupil’s overall learning. Most commonly thought of as formal, time-specific exams, these assessments may include major essays, projects, presentations, art works, creative portfolios, reports or research experiments. These assessments are designed to measure the student’s achievement relative to the subject’s overall learning goals as set out in the relevant curriculum standards. </p>
<p>The design and goals of summative assessments are generally standardised so they can be applied to large numbers of students, multiple cohorts and time periods. <a href="https://research.acer.edu.au/cgi/viewcontent.cgi?article=1004&context=aer">Data collected</a> on individual student, cohort, school or system performance provides schools and principals with a tool to evaluate student knowledge relative to the learning objectives. They can also compare them with previous cohorts and other schools. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/evidence-based-education-needs-standardised-assessment-87937">Evidence-based education needs standardised assessment</a>
</strong>
</em>
</p>
<hr>
<p>The measurement and evaluation of student achievement this way <a href="https://au.sagepub.com/en-gb/oce/assessment-of-learning/book230814">gives us necessary information</a> about how we can continuously improve learning and teaching. </p>
<p>There are a number of <a href="https://books.google.com.au/books?id=wPSIAgAAQBAJ&pg=PP4&lpg=PP4&dq=Assessment+and+Examination+in+the+Secondary+School:+A+Practical+Guide+for+Teachers+and+Trainers:+Taylor+%26+Francis&source=bl&ots=_yQGqNq5D-&sig=O5QOjNDVJHnbvVnFseS3eEBuZ28&hl=en&sa=X&ved=0ahUKEwj72ZvBr7nbAhXSq5QKHYB7BdYQ6AEIPjAF#v=onepage&q=limitations&f=false">limitations</a> of summative assessment. While formative assessments usually provide feedback for the student to review and develop their learning, summative assessments are rarely returned to students. When assessments provide only a numerical grade and little or no feedback, as the NSW HSC does, it’s hard for students and teachers to pinpoint learning needs and determine the way forward. </p>
<p>Additionally, because summative assessments are a form of “high stakes” assessment, their results may be perceived as a way of ranking students. For high-achieving students there is recognition and reward, while for lower-performing students there is potential embarrassment and shame. Neither of these things should be associated with an equal opportunity education system. </p>
<hr>
<p><em>The author would like to acknowledge the work of David McDonald, a PhD student in assessment at Macquarie University, in writing this article.</em></p>
<hr>
<p class="fine-print"><em><span>Rod Lane does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>There are benefits and drawbacks to both formative and summative assessment. Both are important parts of a rigorous assessment program.Rod Lane, Senior Lecturer in Educational Assessment, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/889592017-12-21T22:56:53Z2017-12-21T22:56:53ZWhy we need to rethink supplementary examinations<figure><img src="https://images.theconversation.com/files/200292/original/file-20171220-4951-1cw5lbw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The biggest problem with supplementary examinations is the punitive nature of the assessment.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>In Australian schools, assessment drives learning. </p>
<p>If there’s one thing that preoccupies students during the examination period, it’s the fear of failure and having to resit the dreaded supplementary examination.</p>
<h2>What is a supplementary exam?</h2>
<p>A supplementary exam is a form of further assessment offered to students who have not satisfied the passing criteria set by the educational institution for a particular course. <a href="https://www.student.qut.edu.au/learning-and-assessment/grades-reviews-and-academic-issues/supplementary-assessment">Each</a> <a href="https://www.acu.edu.au/policy/student_policies/assessment_policy_and_assessment_procedures/supplementary_assessment">institution</a> has a different <a href="https://www.monash.edu/medicine/study/student-services/policies/assessments">supplementary</a> <a href="http://ask.unimelb.edu.au/app/answers/detail/a_id/4351/related/1/session/L2F2LzEvdGltZS8xNTEzNzMxMjk3L3NpZC9mVXZfb3ZMaFdaZElKdmJEaHlvWkc1d0dZMSU3RXMzUmlRQklESGdCQjdjWFhzaDVLeWJ6R1h1NFdwZ1kwVW9ZMHg0Z3REJTdFZ0JRZVMycklhMldsT19YQkdiNWNXMkd2aGUyUG1FblVXN2NWMzk1TzNZOEVPa1ZtVlRBJTIxJTIx">assessment</a> <a href="https://www.swinburne.edu.au/current-students/manage-course/exams-results-assessment/results/last-to-complete/">policy</a>.</p>
<p>Common characteristics of supplementary examinations are:</p>
<ul>
<li><p>they’re offered to students who achieved below the cut-off score, normally 50% or 60%, in a subject</p></li>
<li><p>they’re offered after the formal examination period, usually four to six weeks following the final examination</p></li>
<li><p>they’re assessed on a pass/fail basis</p></li>
<li><p>it’s recorded on the academic transcript that the student passed via a supplementary assessment, which limits their future opportunities. Students are also ineligible for any awards or commendations.</p></li>
</ul>
<p>Some institutions define a specific range to be eligible for a supplementary assessment - below which students have to repeat the unit. In other cases, there is also a limit to the number of units for which a student can be offered a supplementary assessment per semester.</p>
<h2>What’s wrong with supplementary assessment?</h2>
<p>The biggest problem with supplementary examinations is the punitive nature of the assessment. It doesn’t take into account circumstances which may have affected a student’s performance on the day of examination. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-causes-mind-blanks-during-exams-67380">What causes mind blanks during exams?</a>
</strong>
</em>
</p>
<hr>
<p>Factors like anxiety may play a role, especially in situations where students are directly observed by an assessor and when there were limited opportunities for them to practise.</p>
<p>Supplementary assessment occurs very close to the final examination to ensure students can enrol in the next semester. This means there’s very little time for students to prepare and for faculty to offer any meaningful help in achieving the desired outcome. </p>
<p>There’s <a href="https://doi-org.ezproxy.library.uwa.edu.au/10.3109/0142159X.2012.643262">evidence</a> short-term help in this context has little to offer in terms of learning gains. From the faculty perspective, it adds to the workload, as academics have to prepare two sets of examinations with new and often equal numbers of items or scenarios for each set.</p>
<p>Take, for example, a scenario from a medical exam in final year. </p>
<p>A clinical examination is arranged where students are presented with sixteen clinical scenarios for management. There are 200 students, of which 70% passed at least 12 scenarios, 5% could manage only five cases and 25% managed to pass between six and 11 scenarios. </p>
<p>It’s the performance of that 25% of students on the day that raises doubts about their readiness to work as interns. We do not know the reasons for each student’s poor performance, but they will all be required to sit a supplementary examination. Additional help is offered to these students, which is intensive and takes up a lot of the academics’ time. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/200293/original/file-20171220-4957-1ckqogs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Short-term help before a supplementary exam is often not enough to achieve sufficient improvement.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>The supplementary assessment is offered after six weeks in a similar format with the same number of scenarios. Again, 5% of the students fail. There are also doubts about five other students in this group who remain on the margin, although they’ve managed to get through. </p>
<p>On average, marks improve by no more than 11%. Four to six weeks of focused teaching cannot compensate for a semester’s worth of course content, and will result in only marginal improvement. </p>
<h2>What can we do?</h2>
<p>The current system would be much improved by replacing supplementary assessment with a new model called “<a href="http://dx.doi.org/10.1111/medu.12136">sequential assessment</a>” and a shorter duration of final assessment.</p>
<p>In the example discussed earlier, instead of 16, let’s present ten scenarios. </p>
<p>Students who pass at least eight scenarios are a clear pass, while those who pass four or fewer clearly fail. Neither group needs evidence from 16 scenarios. But those who pass between five and seven scenarios need to provide more evidence they can graduate as safe doctors. This evidence can be in the form of another examination, offering ten more scenarios. </p>
<p>If students provide the evidence, their marks from the final examination and the reassessment can be averaged to give them a final score. There’s no need to record on the academic transcript that the student passed the assessment in a supplementary exam. Those who still fail now have two assessments at different points in time, informing the institution they are not ready to graduate and will need to repeat the year or semester.</p>
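<p>The sequential model amounts to a simple decision rule plus an averaging step, which can be sketched in code. This is a minimal illustration only: the thresholds (at least eight of ten passed for a clear pass, four or fewer for a clear fail) and the function names are assumptions drawn from the worked example above, not any institution’s actual policy.</p>

```python
def classify_first_sitting(passed: int) -> str:
    """Classify a first sitting of ten scenarios under the sequential model.

    Illustrative thresholds from the worked example: at least 8 passed
    scenarios is a clear pass, 4 or fewer a clear fail, and 5-7 means
    the student sits a further set of scenarios.
    """
    if passed >= 8:
        return "clear pass"
    if passed <= 4:
        return "clear fail"
    return "more evidence needed"


def final_mark(first_sitting: float, reassessment: float) -> float:
    """Average the two sittings into a single final mark."""
    return (first_sitting + reassessment) / 2
```

<p>A borderline student who then passes the second sitting would, for example, average marks of 55 and 65 into a final mark of 60, with no supplementary-exam notation on the transcript.</p>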
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/should-we-do-away-with-exams-altogether-no-but-we-need-to-rethink-their-design-and-purpose-67647">Should we do away with exams altogether? No, but we need to rethink their design and purpose</a>
</strong>
</em>
</p>
<hr>
<p>This might also result in fewer or no appeals from students as it’s not about supplementary examination, but rather asking them to provide more evidence. </p>
<p>A student who has a bad day may be able to provide that evidence - while those who have not attained the learning outcomes need to repeat. A longer examination also provides more sampling of the content being examined and may be more reliable.</p>
<p>This puts less strain on faculty to plan final examinations. It also reduces time spent marking, which is economical in terms of time and resources, both physical and human.</p>
<p>Another approach to consider is <a href="http://onlinelibrary.wiley.com/doi/10.1111/1467-8535.00296/full">Computerised Adaptive Testing</a> for examinations. This approach would require additional resources to set up the infrastructure, like computer labs and a large bank of questions. But it would provide a better estimate of each student’s ability.</p>
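<p>The adaptive principle can be illustrated with a toy sketch: everything here is assumed for illustration, and the simple up/down ability update and nearest-difficulty selection rule stand in for the statistical models (such as item response theory) that real adaptive testing systems use.</p>

```python
def adaptive_test(answer, items):
    """Toy sketch of computerised adaptive testing.

    `items` maps an item id to a difficulty on an arbitrary scale, and
    `answer(item_id)` returns True for a correct response. The next
    question is always the unused item whose difficulty is closest to
    the current ability estimate; the estimate moves up on a correct
    answer and down on an incorrect one, by a shrinking step so that
    it settles near the student's level.
    """
    ability = 0.0
    step = 1.0
    unused = dict(items)
    while unused:
        # choose the unused item closest in difficulty to the estimate
        item = min(unused, key=lambda i: abs(unused[i] - ability))
        del unused[item]
        ability += step if answer(item) else -step
        step *= 0.7  # smaller corrections as evidence accumulates
    return ability
```

<p>Because each question is pitched near the current estimate, a student mostly sees items around their own level, which is why adaptive tests can locate ability with fewer questions than a fixed paper.</p>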
<h2>A better approach is needed</h2>
<p>In the current environment, academics have competing time demands and employers want work-ready employees. </p>
<p>A supplementary examination may assist students to cross over the line in the short term. But these students may fail again in the subsequent year - and it won’t necessarily equip students to cope with work demands. </p>
<p>A better approach would be for institutions to use time and resources to provide students with support and required skills during the semester.</p>
<p class="fine-print"><em><span>Zarrin Seema Siddiqui does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>In Australian schools, assessment drives learning, but there are better models to consider than the current system of supplementary examinations.Zarrin Seema Siddiqui, Associate Professor in Medical Education, The University of Western AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/879372017-11-23T19:11:21Z2017-11-23T19:11:21ZEvidence-based education needs standardised assessment<figure><img src="https://images.theconversation.com/files/195979/original/file-20171123-6027-1t7k8zl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Standardised assessments can inform what teachers teach, based on evidence of student learning.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The latest <a href="https://www.education.gov.au/review-achieve-educational-excellence-australian-schools">Gonski</a> review aims to improve evidence-based decision-making in Australian school education. It recognises that governments’ educational investment must be based on more than politics, just as teachers’ instructional decisions must be based on more than intuition. Like other professional sectors, Australian education must be guided by rigorous evidence of what works, for whom and in what contexts. </p>
<p>Standardised assessments, like <a href="https://www.nap.edu.au/naplan">NAPLAN</a>, are powerful tools in building a strong evidence base for education policy and practice. As NAPLAN enters its second decade, it is timely to reflect on how Australia can make best use of standardised assessment to drive system improvement. This does not deny valid criticisms of punitive standardised testing regimes. Instead, it considers how we might avoid a “baby and bathwater” scenario, and retain the benefits of standardised testing with fewer flaws.</p>
<h2>Comparison not competition</h2>
<p>Comparison of standardised assessments across systems, schools and classrooms can guide evidence-based policy and practice in many ways. Analysis of NAPLAN trends can help identify <a href="https://www.teachermagazine.com.au/columnists/geoff-masters/how-well-are-we-learning-from-naplan">policies and practices</a> that may have contributed to improvements. The first <a href="https://docs.education.gov.au/system/files/doc/other/what_is_the_schooling_resource_standard_and_how_does_it_work.pdf">Gonski review</a> used comparisons of NAPLAN data as evidence to estimate the costs of quality school education. </p>
<p>Australia participates in international standardised tests like <a href="http://www.oecd.org/pisa/">PISA</a>, <a href="https://timssandpirls.bc.edu/">TIMSS</a> and <a href="https://timssandpirls.bc.edu/">PIRLS</a>. This is part of a broader global conversation about how to make education systems work better for everyone. Many teachers and school leaders are now using standardised test data to <a href="https://research.acer.edu.au/cgi/viewcontent.cgi?article=1019&context=tll_misc">guide school improvement</a>.</p>
<p>On the other hand, standardised assessment can fuel unhealthy competition. The worst effects of My School can be seen in <a href="https://theconversation.com/unfair-funding-is-turning-public-schools-into-sinks-of-disadvantage-751">residualised</a> schools abandoned by students and families who can afford to go elsewhere. The worst effects of NAPLAN itself can be seen in students placed under pressure to gain the score they need to get into a selective school, or top-stream class. </p>
<p>Internationally, simplistic PISA league tables risk undermining the global improvement agenda that the assessment was designed to support. </p>
<p>Standardised testing does not have to be used this way. It is most effective when used for <a href="http://nepc.colorado.edu/publication/data-driven-improvement-accountability">system improvement</a>, not sanctions or exclusion. Australia has not followed other nations in linking assessment to sanctions for schools or pay for teachers. This is something to be celebrated and sustained.</p>
<h2>Standardised not homogenised</h2>
<p>Standardised assessments work best when they adapt to students’ individuality, for example through “<a href="https://www.nap.edu.au/online-assessment">tailored testing</a>” in NAPLAN online. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/fbX8FudbeDs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>There is <a href="https://www.teachermagazine.com.au/columnists/geoff-masters/shifting-the-focus-of-naplan">potential</a> for Australia to go much further by assessing students across the full continuum of learning, instead of bundling them into year-level groups. </p>
<p>ACER is also <a href="https://www.acer.org/cari/projects/new-metric-projects/assessment-of-general-capabilities">developing</a> standardised assessments that use a wider range of methods to capture the skills of students who may not perform their best on a written test. This makes standardised tests more inclusive of different learning styles and cultures, as well as disability.</p>
<h2>Assessment for teaching</h2>
<p>Standardised assessments can inform what teachers teach, based on evidence of student learning. This happens most effectively when assessments are mapped to curriculum. More work needs to be done to strengthen the connection between curriculum and assessment in Australia. This would help teachers make better use of NAPLAN results to inform their teaching. Current work on describing national learning progressions in literacy and numeracy will help connect the Australian Curriculum to NAPLAN assessment. </p>
<p>We also need to assess the right things. Australia’s <a href="https://www.nap.edu.au/nap-sample-assessments/assessment-frameworks">National Assessment Program</a> covers a broad range of subject areas, beyond literacy and numeracy. <a href="https://rd.acer.org/article/assessing-general-capabilities">Research</a> is also underway about assessing general capabilities, such as critical and creative thinking, and collaboration, which are essential to students’ success in modern workplaces.</p>
<h2>Pluralism not hegemony</h2>
<p>A healthy education system will have multiple assessments (large-scale and small), each designed to suit the purpose at hand. NAPLAN is an imperfect measure by nature, and cannot be expected to measure children’s learning as competently as the teacher who spends hours with them every day. </p>
<p>On the other hand, individual teachers’ judgements cannot map learning across the entire education system. Teachers may be experts on the progress of their students, but they cannot compare that progress with students in the school down the road, let alone a school in another state or territory. Standardised assessment provides the best bird’s-eye view of where the system is working, and where additional attention is required.</p>
<p>Most importantly, standardised assessment is part of the social contract between governments and populations, to provide a quality education for every child. </p>
<p>ACER works with many countries that are developing standardised assessments and are hungry for information about how well their systems are working. In countries where government investment is limited, standardised assessments have even been developed by <a href="https://www.acer.org/gem/citizen-led-assessments-evaluation-reports">citizen-led groups</a> to meet parents’ demands for information about their children’s learning. This is the best illustration of the purpose of standardised assessment: as evidence that empowers education stakeholders to focus on positive change.</p>
<p class="fine-print"><em><span>Jen Jackson works for the Australian Council for Educational Research. She receives funding from the Australian Government Department of Foreign Affairs and Trade.</span></em></p>
<p class="fine-print"><em><span>Raymond J Adams heads the Centre for Global Education Monitoring at ACER, which is funded by ACER and DFAT. Ray chairs ACARA’s Measurement Advisory Group.</span></em></p>
<p class="fine-print"><em><span>Ross Turner works for the Australian Council for Educational Research.</span></em></p>
Standardised tests are a powerful tool for building an evidence base of what works to guide education policy.
Jen Jackson, Research Fellow, Educational Monitoring and Research, Australian Council for Educational Research
Raymond J Adams, Head, Centre for Global Education Monitoring – ACER, Australian Council for Educational Research
Ross Turner, Principal Research Fellow, Australian Council for Educational Research
Licensed as Creative Commons – attribution, no derivatives.

NAPLAN has done little to improve student outcomes<hr>
<p><em>Since it was introduced in the 1800s, standardised testing in Australian schools has attracted controversy and divided opinion. In this <a href="https://theconversation.com/au/topics/standardised-testing-series-46310">series</a>, we examine its pros and cons, including appropriate uses for standardised tests and which students are disadvantaged by them.</em></p>
<hr>
<p>In recent years, we have seen a global surge in standardised testing as nations attempt to improve student outcomes. Rich nations, as well as many middle- and low-income nations, have participated in international assessments such as the Programme for International Student Assessment (<a href="http://www.oecd.org/pisa/">PISA</a>), and also developed their own national standardised assessments. But can such assessments improve student outcomes?</p>
<h2>Information from standardised tests is too limited to improve outcomes</h2>
<p>The National Assessment Program – Literacy and Numeracy (<a href="https://www.nap.edu.au/">NAPLAN</a>) was introduced in Australia in 2008. It is a standardised test administered annually to all Australian students in Years 3, 5, 7 and 9. These tests are supposed to perform two functions: provide information to develop better schooling policies, and provide teachers with information to improve student outcomes.</p>
<p>However, a decade on and many millions of dollars later, student outcomes on NAPLAN have shown <a href="http://theconversation.com/naplan-is-ten-years-old-so-how-is-the-nation-faring-81565">little improvement</a>. Australia’s performance on international assessments such as PISA has <a href="http://www.oecd.org/edu/education-at-a-glance-19991487.htm">actually fallen</a> over these years. Standardised testing has not produced a positive effect on student learning outcomes. </p>
<p>Supporters of standardised testing see NAPLAN as necessary to know which schools and school systems are doing well and which ones are not. It is undoubtedly useful to know if certain parts of the country (such as regional or rural areas), or certain student populations (for example, students with an immigrant or low-SES background), are underperforming. Such information is also crucial when it comes to <a href="https://theconversation.com/gonski-model-was-corrupted-but-labor-and-coalition-are-both-to-blame-65875">arguing for resource redistribution</a>, as we see in debates about <a href="http://www.abc.net.au/news/2017-06-19/gonski-2.0-school-funding-explainer/8630594">Gonski</a>. </p>
<p>However, there are clear limits to what NAPLAN can tell us. While it helps us understand schooling at the system level, the information gained from NAPLAN about individual students, classrooms and schools is <a href="http://www.edmeasurement.com.au/_publications/margaret/NAPLAN_for_lay_person.pdf">too limited</a> and error-prone to be of use. </p>
<p>For instance, there is a limit to the number of questions NAPLAN can ask to assess a particular student’s skill or understanding. It may determine that a student cannot perform addition using “carrying over” based on their performance on one or two such items on the 40-item test. This means the error margins in these assessments are very high. </p>
<p>Such errors may be neutralised at a system level, when the test is performed at a sufficiently large scale and with a large sample of students, but when used at the level of individual students, classrooms or schools, NAPLAN assessment data <a href="http://www.edmeasurement.com.au/_publications/margaret/NAPLAN_for_lay_person.pdf">is seriously flawed</a>.</p>
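The contrast between individual-level and system-level error can be illustrated with a back-of-envelope sketch (illustrative numbers only, not NAPLAN's actual psychometric model): the standard error of a proportion estimated from n independent binary responses is √(p(1−p)/n), so an estimate based on one or two items is extremely noisy, while the same quantity estimated across thousands of responses is not.

```python
import math

def standard_error(p: float, n: int) -> float:
    """Standard error of a proportion estimated from n independent binary responses."""
    return math.sqrt(p * (1 - p) / n)

# Hypothetical true success rate on the specific skill being probed.
p = 0.6

# One student, judged on only 2 relevant items: the estimate is very noisy.
se_student = standard_error(p, 2)        # roughly 0.35 on a 0-1 scale

# The same skill estimated over a sample of 10,000 responses: noise mostly cancels.
se_system = standard_error(p, 10_000)    # roughly 0.005

print(f"per-student SE: {se_student:.2f}, system-level SE: {se_system:.3f}")
```

On these assumed numbers, the per-student error is around 70 times larger than the system-level error, which is the statistical core of the argument that the same test can be informative about a system yet unreliable about an individual child.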
<h2>Assessment versus standardised testing</h2>
<p>Assessment is integral to the teaching process and occurs almost constantly in good classrooms. Teachers have a range of assessment techniques, including questioning during the course of a lesson, setting assignments, using data from standardised testing, and developing more formal exams. These different assessment techniques fulfil a variety of different purposes: diagnosing student knowledge, shaping student learning and assessing what has been learned.</p>
<p>Increasingly, teachers are encouraged to individualise their teaching in order to accommodate the needs of individual students. This focus on “inclusion” extends to assessment, and teachers are expected to provide a variety of formats and opportunities for students to demonstrate their learning. Education policy statements, such as the 2008 <a href="http://www.curriculum.edu.au/verve/_resources/National_Declaration_on_the_Educational_Goals_for_Young_Australians.pdf">Melbourne Declaration on Educational Goals for Young Australians</a>, emphasise the valuing of student diversity.</p>
<p>Standardised assessments, on the other hand, assume that particular levels of achievement are expected of certain ages or year levels. Students are then classified as meeting, exceeding or being below these expectations. This flies in the face of the realities that teachers observe daily in their classrooms: students do not present themselves as “standardised” humans. </p>
<p>Geoff Masters, Chief Executive of the <a href="https://www.acer.org/">Australian Council for Educational Research</a>, <a href="https://research.acer.edu.au/cgi/viewcontent.cgi?article=1033&context=columnists">claims</a> that in any given classroom, the differences between students can span several years of learning: </p>
<blockquote>
<p>Some Year 9 students perform at the same level as some Year 5, and possibly some Year 3, students. </p>
</blockquote>
<p>By this logic, the notion of providing a standardised NAPLAN test for all Year 3, 5, 7 and 9 students is inappropriate. </p>
<p>Teachers who see their students all year long will always have a deeper knowledge of their students than point-in-time standardised tests can offer. Teachers can make better, more nuanced, more useful and more timely assessments of their students. They may choose to include standardised assessments in the suite of approaches they use, but NAPLAN should not be privileged over teacher assessments. </p>
<p>Despite this, enormous amounts of money and time have been spent training teachers to use NAPLAN results to inform their teaching. This not only provides an unnecessary and misleading distraction for already over-burdened teachers but also undermines their own professional knowledge and judgement. </p>
<h2>Stepping up accountability doesn’t necessarily translate to better outcomes</h2>
<p>One of the goals of NAPLAN was to enhance accountability. By judging all schools on the same measure, comparing schools with similar populations, and then making these comparisons public, it was expected that all schools would lift their game. </p>
<p>This strategy assumed that schools could improve but were choosing not to, and that the inducement of market logics (such as <a href="https://www.schoolchoice.com.au/">school choice</a>) would motivate all schools to do better. It also ignored the many out-of-school factors, such as poverty and geography, that affect the ability of teachers and schools to improve student outcomes.</p>
<p>The other logic was that schools that performed worse could learn from schools that were doing better. Besides minimising the importance of local factors to student learning and suggesting there are universal “<a href="https://www.aare.edu.au/blog/?p=1755">silver bullets</a>”, setting schools in competition with one another hardly provides incentives for better performing schools to share their knowledge. </p>
<h2>Blame alone is not the answer</h2>
<p>Accountability is important and standardised testing can inform policies and improve accountability. But to function as an instrument of accountability, these tests should not be high-stakes, high-stress or high-visibility, particularly since they are so error prone at the student, classroom and school levels. </p>
<p>The use of sample-based tests, such as the United States’ National Assessment of Educational Progress (<a href="https://nces.ed.gov/nationsreportcard/about/">NAEP</a>), may instead provide useful information by state and territory, as well as by categories such as social capital, ethnicity and gender. This information could highlight problematic areas, and trigger closer and more targeted explorations. </p>
<p>To get this type of information, the tests need not be conducted every year, since effects of any reforms are seldom evident in one year. The error margins also make year-on-year comparisons of limited value. Sample-based tests will also remove the pressures placed on schools and students, which have proven so detrimental.</p>
<p>As <a href="http://www.abc.net.au/news/2016-12-06/australian-school-performance-in-absolute-decline-globally/8098028">recent NAPLAN results</a> have shown, “blame and shame” alone does not improve student learning. Indeed, focusing solely on NAPLAN scores distracts from broader efforts to provide teachers, schools and school systems with the support needed to ensure all students are given the best chance to learn and succeed.</p>
<p>To date, NAPLAN has been largely used by politicians and the education system to hold teachers and schools accountable. But accountability can work both ways. If NAPLAN is to be used, we should use it to also hold the education system and politicians accountable for the resources and funding they provide to schools and to the local communities they serve. Perhaps then we would see some real and sustained improvements in student outcomes.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
NAPLAN is good at measuring some aspects of education, including knowledge difference between demographics, but has not produced a positive effect on student learning outcomes.
Radhika Gorur, DECRA Fellow and Senior Lecturer in Education, Deakin University
Steven Lewis, Alfred Deakin Postdoctoral Research Fellow, Deakin University
Licensed as Creative Commons – attribution, no derivatives.

Support for standardised tests boils down to beliefs about who benefits from it<figure><img src="https://images.theconversation.com/files/194085/original/file-20171109-13317-1gvp3wn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">To make sure we get the most out of education, we may need to both broaden our narrative about standardised testing and try to minimise its negative influences.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><hr>
<p><em>Since it was introduced in the 1800s, standardised testing in Australian schools has attracted controversy and divided opinion. In this <a href="https://theconversation.com/au/topics/standardised-testing-series-46310">series</a>, we examine its pros and cons, including appropriate uses for standardised tests and which students are disadvantaged by them.</em></p>
<hr>
<p>If any topic is likely to divide a room of individuals interested in education, it is the use of standardised assessment. A standardised test is any test that requires all test-takers to respond to the same tasks in the same way. It is administered in a consistent manner and scored using a scale of standards in knowledge and skills. One example is <a href="https://www.nap.edu.au/naplan/the-tests">NAPLAN</a>. </p>
<h2>Standardised tests have been used in Australia for approximately 200 years</h2>
<p>Standardised testing <a href="http://www.emeraldinsight.com/doi/abs/10.1108/eb009684">began</a> in Australia in the 1800s. Itinerant school inspectors used it to monitor the quality of education being provided. External examination boards assessed student achievement on sets of tasks in primary, secondary and tertiary education. </p>
<p>In the early 1900s, it was also used to assess learning ability. The famous “<a href="http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8279.1951.tb02787.x/abstract">mazes</a>” tasks, a psychological test designed to measure planning capacity and foresight, were used in many international intelligence assessments. They were first developed by <a href="http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8279.1951.tb02787.x/abstract">Stanley Porteus</a>, the inaugural principal of the first state special school in Victoria, in 1913, as part of a range of screening tools.</p>
<p>The threat of the USA losing the space race in the 1950s and 1960s <a href="https://news.harvard.edu/gazette/story/2007/10/how-sputnik-changed-u-s-education/">led to a focus</a> on the quality of educational outcomes. Standardised assessment procedures were used nationally to monitor this. Over the decades since, testing has become more frequent and more centralised, with a shift to making schools and educators accountable for scores. </p>
<h2>Five narratives that influence support for standardised testing</h2>
<p>Reasons <a href="https://standardizedtests.procon.org">for and against</a> the use of standardised tests in these and other ways are numerous. </p>
<p>Our opinions and views on this issue are shaped by the dialogue in which we participate. In 2017, <a href="https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?referer=https://www.google.com.au/&httpsredir=1&article=7448&context=etd">Jensen and colleagues</a> analysed the perspectives of over 120 prominent authors from the domains of education, policy, economics, psychology/psychometry and history, published across the last century. They identified the most common narratives in each domain and their consequences. The most common theme across all domains is the use of testing to control education. The domains differ in their disposition to this control and how it is implemented: </p>
<ul>
<li><p>The education domain sees this as a political and negative control mechanism. Its dialogue rarely explores how testing can benefit the educational process. </p></li>
<li><p>The policy domain sees control as positive. Standardised tests provide “pure”, “trustworthy” measures of achievement that have improved school accountability, classroom practices and learning. </p></li>
<li><p>The economic domain also sees it as positive. Standardised test data predict economic outcomes. Policy makers use these economic analyses more than educators, especially practitioners. </p></li>
<li><p>The psychology/psychometry domain notes the frequent inappropriate use of standardised tests, with misinterpreted outcomes. For example, the belief that test scores are precise and can be interpreted as such.</p></li>
<li><p>The history discipline sees testing used to control what is valued as knowledge (the curriculum), who gets to learn it (sorting and selecting students) and school organisation and teaching practices. It discusses how testing is used to make teachers and pupils accountable. </p></li>
</ul>
<p>Four additional stakeholders with a voice in education not examined directly by this research are parents, the community, industry and politicians. <a href="http://www.worldcat.org/title/defending-standardized-testing/oclc/58678532">Phelps</a>, in 2005, noted that in the USA at least, all four groups were strongly committed to standardised testing. </p>
<p>In other words, like other concepts in contemporary education, there are multiple perspectives on standardised assessment. Our position on its value, relevance and valid use is informed by our more fundamental beliefs about the purposes of education in a culture, our conception of students, our roles and responsibilities as educators and our understanding of learning and teaching. Awareness of the five narratives can contribute to our personal views about standardised testing.</p>
<h2>How is test data used?</h2>
<p>One reason for the debate relates to how standardised test data is used. Some of the most common purposes are to inform decisions about: </p>
<ul>
<li><p>the knowledge and skills students can display independently at any time, </p></li>
<li><p>a student’s learning profile,</p></li>
<li><p>the teaching that matches a student’s learning profile, </p></li>
<li><p>the additional knowledge and skills a student needs to meet particular educational criteria,</p></li>
<li><p>the success or effectiveness of educational provision in a school, </p></li>
<li><p>“academic standards” and comparative educational performance between schools, states or countries, and </p></li>
<li><p>resourcing educational provision.</p></li>
</ul>
<h2>Standardised testing is necessary, but not sufficient</h2>
<p>Standardised assessment data plays a key role in my work as an educator. Part of this involves identifying the most appropriate learning pathways for students who learn differently from their peers. Standardised assessment data helps me see where and how they differ. </p>
<p>But this is insufficient. I also need to analyse more specifically how each student learns, often using individual interviews and error analysis. I use dynamic assessment procedures to examine how they interpret and respond to regular and to differentiated teaching.</p>
<p>I also need data that standardised assessment procedures have difficulty providing. For example, a student’s emotional engagement with the teaching, their attitudes to it, their identities as learners, their ability to manage and direct their learning activity and how culturally relevant they see the teaching. Standardised assessments contribute to my data collection but are certainly not enough. </p>
<p>Standardised testing is likely to be with us for some time. As educators, many of us object to how and why it is used beyond the teaching-learning context and the preference and priority it is given over other forms of assessment. To ensure it optimally benefits our students’ learning outcomes, we may need to both broaden our narrative about it and take whatever steps we can to minimise its negative influences.</p>
<p class="fine-print"><em><span>John Munro has received research council grants in the past and has been contracted to evaluate assessment tools.</span></em></p>
The use of standardised testing is a divisive topic, and most of the disagreement comes down to beliefs about whether using it to control education is a good or bad thing.
John Munro, Professor, Faculty of Education and Arts, Australian Catholic University
Licensed as Creative Commons – attribution, no derivatives.

Parents shouldn’t rely on My School data when choosing a school for their child<figure><img src="https://images.theconversation.com/files/160262/original/image-20170310-3700-19kkc7h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The most reliable way to find the best school for your child is to visit and find out about its philosophy and programs.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Parents are often overwhelmed by the task of choosing the right school for their children. The <a href="https://www.myschool.edu.au/">My School</a> website, which was updated last week, provides useful information, but should not be used as the sole guide to the best school for your child.</p>
<h2>What is the My School website?</h2>
<p>My School provides national comparable data on all primary and secondary schools. It shows annual NAPLAN results and demographic and financial reporting, so that parents can supposedly make informed choices about where to send their children.</p>
<p>The site allows users to compare schools by postcode and also with “like” schools that share similar student populations. It contains useful information on things such as school finances, staffing, student background and other contextual factors.</p>
<p>It is popular, with over 1.4 million site visits in 2016.</p>
<p>In 2008, the prime minister, Kevin Rudd, <a href="http://www.smh.com.au/news/national/rudd-sets-tough-rules-for-school-funding/2008/08/27/1219516564909.html">announced plans</a> to introduce these national school comparisons. He said this was <a href="http://press-files.anu.edu.au/downloads/press/p6031/pdf/ch09.pdf">because</a> school standards were not high enough and parents should “<a href="http://www.smh.com.au/news/national/rudd-sets-tough-rules-for-school-funding/2008/08/27/1219516564909.html">vote with their feet</a>” in choosing more successful schools.</p>
<p>In other words, if you pick a school with higher NAPLAN results, you’ll be guaranteed a return on investment. </p>
<p>But this is simply wrong.</p>
<p>In his book, <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674035218">Measuring Up</a>, Daniel Koretz warns of the dangers of using test scores to make judgments about schools, students and teachers. He argues that the two common misunderstandings about testing are:</p>
<blockquote>
<p>That scores on a single test tell us all we need to know about student achievement, and that this information tells us all we need to know about school quality.</p>
</blockquote>
<h2>Things to keep in mind when using My School</h2>
<p>Recent news reports suggested that parents could use My School data to <a href="http://www.abc.net.au/news/2017-03-08/my-school-website-publishes-data-from-naplan-test-results/8333872">compare children’s results</a>, referred to students as <a href="http://www.theage.com.au/victoria/struggler-coaster-or-improver-what-sort-of-naplan-student-are-you-20170307-guspke.html">strugglers, coasters or improvers</a>, and generated <a href="http://www.dailytelegraph.com.au/newslocal/manly-daily/naplan-results-balgowlah-boys-win-most-improved-while-manly-selective-comes-top/news-story/8fb488ef31ff0a214e58b57e1a624346">league tables</a> of schools.</p>
<p>Each of these reports misuses or misunderstands the data in some way and is therefore misleading for parents.</p>
<p>There are two main issues with using My School and NAPLAN data to make judgments about schools and teachers.</p>
<h2>1. Test results say nothing about teaching quality</h2>
<p>There is no <a href="https://theconversation.com/test-scores-arent-good-quality-indicators-for-schools-or-students-43475">clear link</a> between student achievement on tests and school performance or the quality of teaching within particular classrooms. </p>
<p>Take a hypothetical Year 9 student, who has multiple teachers across the school week, not to mention an array of teachers before starting Year 9. How much of that student’s success on NAPLAN can be attributed to their Year 9 history teacher, as opposed to their Year 8 mathematics teacher?</p>
<p>What if our hypothetical student has moved around a lot during their schooling, or has English as a second or third language, or parents who are unemployed and have low literacy levels? </p>
<p>How are any of these individual student factors accounted for in the aggregated scores presented on My School? The answer is simple: they are not. </p>
<p>While the <a href="http://www.acara.edu.au/resources/About_icsea_2014.pdf">Index of Community Socio-Educational Advantage (ICSEA)</a> attempts to allow for socioeconomic differences at the school level, so that schools with similar student populations can be compared, it has its <a href="http://insidestory.org.au/what-my-school-really-says-about-our-schools">own issues</a>. </p>
<h2>2. School rankings and comparisons are misleading</h2>
<p>There are multiple concerns with the overly simplified way that test data are extrapolated to construct <a href="http://www.theaustralian.com.au/national-affairs/in-depth/schools/interactive#browse">league tables</a> of schools, despite this being <a href="http://www.acara.edu.au/docs/default-source/Media-Releases/20170308-my-school-2017-media-release.pdf?sfvrsn=2">inappropriate</a> as a measure of <a href="http://www.aph.gov.au/DocumentStore.ashx?id=dab4b1dc-d4a7-47a6-bfc8-77c89c5e9f74">school performance</a>.</p>
<p>Most of the <a href="http://dx.doi.org/10.1080/10714413.2012.643737">difference</a> in student performance can be attributed to socioeconomic background. The <a href="https://acaraweb.blob.core.windows.net/resources/ICSEA_2015_technical_report.pdf">2015 ICSEA Technical Report</a> states that:</p>
<blockquote>
<p>78% of variance in school performance is accounted for by ICSEA values.</p>
</blockquote>
<p>In other words, less than a quarter of school performance is due to school-based factors such as the quality of teaching.</p>
<p>The attempt to make <a href="https://theconversation.com/naplan-data-is-not-comparable-across-school-years-63703">comparisons across years</a> for individual students, schools or states is problematic due to assumptions about the <a href="http://www.edmeasurement.com.au/_publications/margaret/NAPLAN_for_lay_person.pdf">reliability</a> and comparability of the data.</p>
<p>Also, the onus is placed on parents to calculate the worth and performance of schools using My School. As Curtin University researcher Brad Gobby <a href="http://dx.doi.org/10.1080/02680939.2015.1083124">explains</a>, it is assumed that:</p>
<blockquote>
<p>parents share the website’s normative assumptions about the value of tests, the quality of the measures, and what defines performance.</p>
</blockquote>
<h2>What is important when choosing a school?</h2>
<p>A <a href="https://docs.education.gov.au/documents/review-my-school-website-0">recent review</a> of My School, commissioned by the federal government, argued that:</p>
<blockquote>
<p>… parent choice of school is informed by a range of factors such as the “feel” of a school, relationships and behaviour management, extra-curricular activities and other qualitative factors that are best determined by visiting a school and talking to teachers and other parents.</p>
</blockquote>
<p>The best way to get a sense of the community and culture of a school is to visit that school, speak with the principal and see the students and staff in the classroom and out in the playground.</p>
<p>Attending school information evenings and parent-teacher conferences, as well as talking with other parents, also provides useful information.</p>
<p>My School cannot tell you about the quality of a school’s facilities, nor about the engagement, respect and relationships in the school. It does not tell you if there is a vibrant arts program or whether sporting activities are available. It also tells you nothing about the subject options available, particularly for secondary schools.</p>
<p>Parents can end up spending many thousands of dollars sending their children to expensive private schools with glossy brochures and lush sporting fields.</p>
<p>But often the <a href="https://theconversation.com/why-im-choosing-the-local-state-school-even-though-it-doesnt-have-all-the-bells-and-whistles-48154">best school</a> for your child is the public school just down the road.</p>
<p class="fine-print"><em><span>Stewart Riddle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>My School data does not show the quality of teaching, and school comparisons and rankings can be misleading.Stewart Riddle, Senior Lecturer, University of Southern QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/712192017-02-06T19:14:45Z2017-02-06T19:14:45ZRethinking how we assess learning in schools<figure><img src="https://images.theconversation.com/files/152511/original/image-20170112-25897-13ytzq4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Students don't always know if they are making any progress in their learning.</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p><em>In <a href="https://theconversation.com/au/topics/educating-australia-35445">this series</a> we’ll explore how to improve schools in Australia. Some of the most prominent experts in the sector tackle key questions, including why we are not seeing much progress; whether we are assessing children in the most effective way; why parents need to listen to what the evidence tells us, and much more.</em></p>
<hr>
<p>There is a major flaw in the way we currently assess school students. Because we label them as either “good” or “poor” learners based on their overall grades at the end of each year, students have no clear idea whether they are making progress over extended periods of time.</p>
<p>We need to move away from focusing on what grade a child will get at the end of a year, to assessing the progress that students make over time.</p>
<h2>How students are assessed</h2>
<p>This is how most parents, teachers and students likely view the school process:</p>
<p>It begins with a curriculum that spells out what teachers should teach and students should learn in each year of school.</p>
<p>The role of teachers is to deliver this curriculum by making it engaging and meaningful, and ensuring that all students have an opportunity to learn what the curriculum prescribes. </p>
<p>The role of students is to learn what teachers teach, and it is accepted that some students – the better learners – will learn more of this than others.</p>
<p>The role of assessment is to establish how well students have learnt what teachers have taught. This can be done at the end of a period of teaching such as a semester or school year. Such assessments are sometimes called “summative” or assessments of learning. </p>
<p>Alternatively, assessments can be undertaken during teaching to establish how well students have learnt so far. These assessments are sometimes called “formative” or assessments for learning, because they provide information about gaps in learning and material that may need to be retaught.</p>
<p>Students are then graded on how well they have learnt the curriculum for their year level. Those who can demonstrate most of this curriculum receive high grades; those who demonstrate relatively little receive low grades.</p>
<h2>Unintended consequences</h2>
<p>In support of this way of organising teaching and learning is the argument that the best way to raise achievement levels in schools is to set clear curriculum standards for each year of school, rigorously assess how well students meet those expectations and report performances honestly and fearlessly. If a student has failed, say so.</p>
<p>All of this may be appropriate if all students in each year of school began the year at the same starting point. This is patently not the case. </p>
<p>In any year of school, the gap between the most advanced 10% of students and the least advanced 10% is the equivalent of at <a href="http://www.nap.edu.au/docs/default-source/default-document-library/2016-naplan-national-report.pdf">least five to six years of school</a>. If school were a running race, students would begin the year widely spread out along the running track. Despite this, all students would be judged against the same finish line (the year-level expectations).</p>
<p>And the consequences are predictable. Students at the back of the pack, who are two or three years behind the bulk of students and the year-level curriculum, struggle and generally achieve low grades, often year after year. </p>
<p>A student who receives a “D” this year, a “D” next year and a “D” the year after is given little sense of the progress they are actually making and, worse, may conclude that there is something stable about their ability to learn (they are a “D student”). Many of these students <a href="http://www.aitsl.edu.au/docs/default-source/default-document-library/engagement_in_australian_schools__grattan">eventually disengage</a> from the schooling process.</p>
<p>At the front of the pack, more advanced students generally begin the school year on track to receive high grades. Many receive high grades on the middling expectations for their age group without being overly stretched or challenged. There is <a href="http://www.theage.com.au/national/education/results-flatline-fortop-students-20130109-2cgud.html">evidence</a> that these students often make the least year-on-year progress.</p>
<h2>An alternative – monitoring learning</h2>
<p>An alternative is to recognise that the fundamental purpose of assessment is to establish and understand where individuals are in their long-term learning progress at the time of assessment. </p>
<p>This usually means establishing what they know, understand and can do – something that can be done before, during or after teaching, or without reference to a course of instruction at all.</p>
<p>Underpinning this alternative is a belief that every learner is capable of further progress if they can be engaged, motivated to make the appropriate effort and provided with targeted learning opportunities. </p>
<p>This is a more positive and optimistic view than a belief that there are inherently good and poor learners as confirmed by their performances on year-level expectations. </p>
<p>It also recognises that successful learning is unlikely when material is much too difficult or too easy, but depends instead on providing every learner with well-targeted, personalised stretch challenges.</p>
<p>A good understanding of where students are in their learning provides starting points for teaching and a basis for monitoring learning progress over time. </p>
<p>One of the best ways to build students’ confidence as learners is to help them see the progress they are making over extended periods of time.</p>
<p>A focus on monitoring learning encourages a long-term perspective. Rather than being defined only in terms of year-level expectations, successful learning is defined as the progress or growth that students make over time. </p>
<p>Under this approach, every student is expected to make excellent progress every year towards the achievement of high standards – regardless of their current levels of attainment.</p>
<hr>
<p>• <em>Geoff Masters explores this theme further in a new book called <a href="https://www.mup.com.au/items/165663">Educating Australia: Challenges for the Decade Ahead</a>.</em></p>
<p class="fine-print"><em><span>Geoff Masters is the CEO of the Australian Council for Educational Research, a body that provides assessment resources to schools and advice to governments. </span></em></p>Our current way of assessing students doesn’t let them see the progress they are making over extended periods of time.Geoff Masters, CEO, Australian Council for Educational ResearchLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/720802017-02-02T00:51:29Z2017-02-02T00:51:29ZWhy do we need a phonics test for six-year-olds?<figure><img src="https://images.theconversation.com/files/155093/original/image-20170201-12672-16auiog.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Children need to learn how to sound out words they haven't seen before.</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>Education Minister Simon Birmingham has announced that all Australian six-year-olds will soon be <a href="http://www.senatorbirmingham.com.au/Media-Centre/Media-Releases/ID/3350/Literacy-and-numeracy-check-for-all-Aussie-schools-under-the-Turnbull-Governments-quality-reforms">required to do a phonics test</a>.</p>
<p>Researchers, parents and others concerned about our system’s failure to identify children who initially struggle to learn to read – and can go on to have a reading disability – have pushed for this test to ensure children are getting the support they need early on.</p>
<p>But the announcement has divided opinions. And at first glance it does look like yet another impost from on high – on already overwhelmed teachers. </p>
<p><a href="https://theconversation.com/a-new-phonics-test-is-pointless-we-shouldnt-waste-precious-money-buying-it-from-england-69355">Some are concerned</a> it is an unnecessary waste of money that should be channelled into intervention, or that the test will prompt teachers to practise test items. </p>
<p>Admittedly, measuring children’s early phonics skills alone won’t make a difference to how early reading develops. But if you don’t properly measure something, you can’t properly manage it. </p>
<p>And the very practice some critics object to – children reading short, real and made-up words – is precisely what will help develop phonic knowledge, and should be encouraged. </p>
<p>But, looking more closely, this test has significant potential to reduce teacher workloads across the school system by identifying students at risk of reading failure early. </p>
<p>Early identification allows targeted support to be provided, and prevents teachers having to cater for an increasingly wide ability range as students move through primary and into secondary school. </p>
<p>It also has the potential to <a href="https://theconversation.com/why-australia-should-trial-the-new-phonics-screening-check-69717">sharpen teachers’ focus</a> on a key area – reading – that students nationwide continue to struggle with. </p>
<p>While national average performance may have shown a statistically significant, but relatively small, <a href="https://theconversation.com/naplan-results-reveal-little-change-in-literacy-and-numeracy-performance-here-are-some-key-takeaway-findings-70208">improvement</a> since national testing (such as NAPLAN) was introduced, this is yet to be seen in high school years. And not all states have improved to the same extent.</p>
<h2>Is it actually a test?</h2>
<p>The word “test” conjures up ideas of an external assessor and associated stresses, but the child’s classroom teacher would administer the literacy screener individually. </p>
<p>It will be not unlike the <a href="http://www.det.wa.edu.au/educationalmeasurement/detcms/navigation/on-entry/">on-entry assessments</a> five-year-olds typically complete when they begin the foundation year of school in some states.</p>
<p>Children would be presented with a list of real and made-up words – and teachers would record their score. </p>
<p>This in itself is highly informative for teachers. And it’s preferable to sending students to a literacy specialist for assessment, which is common practice in many schools. </p>
<p>After listening to each child, teachers will know whether children can blend single sounds, or which letter combinations (for example, /sh/) they need to reteach.</p>
<p>The test should take between five and seven minutes per child. The aim is to identify children who aren’t learning to sound words out well, and to detect this early, before they fall too far behind their peers.</p>
<p>Many young children can give the false impression that they are learning to read, when in fact they are mostly guessing words from pictures or context. </p>
<p>This guesswork is often aided by the provision of repetitive, predictable texts. </p>
<p>It is also sometimes encouraged by teachers who were taught the “three-cueing” model of reading at university, and by some government and non-government education authorities that recommend particular methods. </p>
<p>Rather than apply the letter-sound relationships to systematically decode words, children are encouraged to use unreliable strategies such as looking at the illustrations, rereading the sentence, saying the first sound, or guessing what word might “fit”.</p>
<p><a href="http://www.balancedreading.com/3cue-adams.html">Research shows</a> that the three-cueing model lacks a scientific basis. Yet people continue to use it because it is familiar and it is marketed as a strategy to promote reading comprehension.</p>
<p>While the goal of reading is undeniably to extract meaning, children who cannot accurately read the words on the page are invariably very poor comprehenders. </p>
<p>To become a strong reader, a young child must learn how to sound words out accurately and quickly. No exceptions. Decades of <a href="https://seidenbergreading.net">research</a> back this up.</p>
<p>Sounding out words is very difficult for around 20% of children in the general population, and typically a much higher percentage in areas of disadvantage. We know that such children, if left unassisted, usually <a href="http://www.readingrockets.org/article/waiting-rarely-works-late-bloomers-usually-just-wilt">never catch up</a>.</p>
<h2>But don’t teachers already do this?</h2>
<p>Regular monitoring of the critical precursor skills young children need to become fluent and accurate readers, such as identifying the first sound in spoken words, is something effective teachers already do. </p>
<p>For those who don’t, the requirement to listen to every six-year-old read the same list of made-up and real words will at the very least flag those children who are struggling and draw attention to their instructional needs. </p>
<p>Many schools use <a href="https://dibels.uoregon.edu/market/assessment/dibels">free one-minute assessments</a> to <a href="http://www.motif.org.au">test</a> these skills. These are very similar to the literacy test being proposed. The cost of the UK Phonics Check <a href="https://www.cis.org.au/app/uploads/2016/11/rr22.pdf?">has been estimated</a> at £10-12 (around A$20) per child.</p>
<p>The most useful tests investigate children’s ability to read both real words and short made-up words such as lib, mep and gax; these are examples used in the <a href="https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/532604/2016_Phonics_screening_check_pupils__materials_-_standard__STA167501e_.pdf">2016 Phonics Test in England</a> – the model Australia will be using.</p>
<p>What’s important is that students have not seen these made-up words before. </p>
<p>If they have been taught the precursor skills – letter sound knowledge (phonics) and the strategy of decoding – this assessment will show it.</p>
<p>All of us have to be able to attack words we’ve never seen before. Look at Pokémon cards featuring names such as Pikachu and Nidoran; place names such as Naringal; brands like Bupa; or characters in a book, such as Hagrid. </p>
<p>The earlier children can develop this skill, the better their chance of reading and spelling well.</p>
<h2>Current assessments in schools</h2>
<p>The problem is that the assessments some schools use don’t always include made-up words. </p>
<p>Some children start school being able to recognise words because of their shape or associated picture clue, but cannot independently decode. Made-up words are objective and favour no child.</p>
<p>The reading assessments schools typically use, such as the <a href="https://readingrecovery.org/reading-recovery/teaching-children/observation-survey">Observation Survey</a> or <a href="http://www.learnnc.org/lp/editions/readassess/1.0">Running Record</a> – which tend to be more labour-intensive – focus more on reading comprehension, vocabulary and fluency. </p>
<p>These are important, but if a student is struggling in any of these areas, the main reason is often due to poor sounding-out skills.</p>
<p>Children who struggle to sound out words must be identified and given extra help as early as possible, both at a classroom level and then in small groups. </p>
<p>Many parents quietly pay tutors for expert help outside school hours. Many other parents can’t afford this. </p>
<p>The <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4906364">consequence for taxpayers </a> is a much larger bill for things like unemployment benefits, forgone taxes, adult literacy courses and prisons. The school-to-prison pipeline is real. </p>
<h2>How will the phonics test be implemented?</h2>
<p>We don’t know exactly how the phonics test will work here, because the minister’s expert panel has not yet completed its work. </p>
<p>However, we can be encouraged by <a href="https://www.gov.uk/government/publications/phonics-screening-check-evaluation-final-report">research into the impact</a> of a similar test in England. </p>
<p>There is some evidence that, in helping sharpen teachers’ focus on phonics, the test led to a greater emphasis on systematically and explicitly teaching children about sounds and their spellings. This was something our national inquiry into the teaching of reading <a href="http://tinyurl.com/d6v2v9y">recommended over a decade ago</a>.</p>
<p>For teachers who are ideologically opposed to <a href="https://theconversation.com/explainer-what-is-phonics-and-why-is-it-important-70522">explicit, systematic phonics instruction</a>, this literacy check is an unwelcome impost. </p>
<p>However, for many schools that include phonological awareness and systematic decoding instruction, it is simply a validation of their effective early reading instruction.</p>
<p>• <em>This piece was co-authored by Alison Clarke, a speech pathologist at the Clifton Hill Child and Adolescent Therapy Group in Melbourne.</em></p>
<p class="fine-print"><em><span>Lorraine Hammond is the President of Learning Difficulties Australia. </span></em></p>Many young children can give the false impression that they are learning to read, when in fact they are mostly guessing words from pictures or context. This test will help to identify these students.Lorraine Hammond, Senior Lecturer in Education, Edith Cowan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/720922017-02-01T19:06:42Z2017-02-01T19:06:42ZEducating Australia – why our schools aren’t improving<figure><img src="https://images.theconversation.com/files/154687/original/image-20170130-27056-1jv94zc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">New evidence-based methods of teaching and learning are being taken up very slowly.</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p><em>In this series we’ll explore how to improve schools in Australia. Some of the most prominent experts in the sector tackle key questions, including why we are not seeing much progress; whether we are assessing children in the most effective way; why parents need to listen to what the evidence tells us, and much more.</em></p>
<hr>
<p>Australian schooling has undergone major changes over the last decade, mainly through national policy reforms agreed by federal and state governments. These include:</p>
<ul>
<li><p>an <a href="http://www.australiancurriculum.edu.au">Australian Curriculum</a></p></li>
<li><p>standardised national assessments in literacy and numeracy (<a href="https://www.nap.edu.au">NAPLAN</a>)</p></li>
<li><p>national reporting on schools through the <a href="https://www.myschool.edu.au/">My School website</a></p></li>
<li><p>professional standards for <a href="http://www.aitsl.edu.au/australian-professional-standards-for-teachers/standards/list">teachers</a> and <a href="http://www.aitsl.edu.au/australian-professional-standard-for-principals">principals</a></p></li>
<li><p>a <a href="https://www.education.gov.au/universal-access-early-childhood-education">universally accessible</a> year of preschool</p></li>
<li><p>partial implementation of the <a href="https://theconversation.com/gonski-model-was-corrupted-but-labor-and-coalition-are-both-to-blame-65875">“Gonski”</a> needs-based funding reforms.</p></li>
</ul>
<p>During the same decade, rapid economic, social, technological and cultural changes have generated new pressures and possibilities for education systems – and the people who work in them.</p>
<p>For example, Australia continues to become more ethnically and culturally diverse, and more closely connected to the Asia-Pacific region. The nation is more active in its use of mobile and digital technology, more urbanised and more unequal in wealth and income.</p>
<p>These broader shifts, and the political responses to them, increasingly place education in a vice. It faces mounting pressure to achieve better outcomes for more people, while simultaneously being expected to innovate and solve wider problems of society. And this is all to be done in a context of growing fiscal austerity.</p>
<h2>Lots of change, but very little impact</h2>
<p>Despite significant reforms over the past decade, there is unfortunately very little sign of positive impacts or outcomes. For example:</p>
<ul>
<li><p>The percentage of Australian students successfully completing Year 12 is <a href="http://www.mitchellinstitute.org.au/fact-sheets/senior-school-years-school-completion-uneven-across-australia/">not improving</a>.</p></li>
<li><p>State and federal school funding policies are still <a href="http://research.acer.edu.au/aer/14/">reproducing a status quo</a> that entrenches sectoral division and elitism.</p></li>
<li><p>New evidence-informed methods, such as <a href="http://education.unimelb.edu.au/about_us/clinical-teaching">clinical</a> and <a href="https://grattan.edu.au/report/targeted-teaching-how-better-use-of-data-can-improve-student-learning/">targeted</a> teaching models (which focus on careful monitoring and evaluation of individual student progress and teaching impact), are being taken up very slowly in teacher education degrees and schools.</p></li>
<li><p>The <a href="https://www.ncver.edu.au/publications/publications/all-publications/entry-to-vocations-strengthening-vet-in-schools#">status and efficacy of vocational learning</a> have shown little meaningful improvement.</p></li>
<li><p>NAPLAN and My School have not led to improvements in literacy and numeracy, with <a href="http://www.nap.edu.au/docs/default-source/default-document-library/2016-naplan-national-report.pdf?sfvrsn=2">2016 data</a> showing either stagnation or decline.</p></li>
<li><p>The performance of Australian students in international assessments of maths, science and literacy skills has <a href="https://theconversation.com/australias-pisa-slump-is-big-news-but-whats-the-real-story-20964">steadily declined</a>.</p></li>
</ul>
<h2>Replicating a failing system</h2>
<p>The national reforms since the mid-2000s were designed to address many of these persistent issues. </p>
<p>Yet somehow, despite hard-fought political battles and reforms, and the daily efforts of system leaders, teachers, parents and students across the nation, we continue to replicate a system in which key indicators of impact and equity are stagnating or going backwards.</p>
<p>The school funding impasse exemplifies this problem. </p>
<p>The policy area is continuously bedevilled by the difficulties of achieving effective collaboration between governments and school sectors in our federal system.</p>
<p>It also remains hamstrung by highly inequitable funding settlements, established over many decades. These continue to entrench privilege in elite schools, while consistently failing to provide “needs-based” funding to schools and young people who need the most support.</p>
<p>As a result, educational opportunities and outcomes become further polarised. Young people from privileged backgrounds are accruing further advantage. Those from disadvantaged backgrounds are increasingly <a href="https://www.sprc.unsw.edu.au/media/SPRCFile/Unpacking_Youth_Unemployment__Final_report.pdf">locked out of competitive education and job markets</a>. </p>
<p>The global growth of identity politics, fostering conflict over class, race, gender and migration, puts these trends in stark context.</p>
<h2>So what are we doing wrong?</h2>
<p>In <a href="https://www.mup.com.au/items/165663">Educating Australia: Challenges for the Decade Ahead</a>, we tackle this question and seek to create a more innovative and productive interaction between ideas, evidence, policy and practice in education.</p>
<p>The scholars, practitioners and policy thinkers involved in the book examine key issues in education and canvass opportunities for improving outcomes on a wide scale. This includes areas like teaching, assessment, curriculum, funding and system-wide collaboration.</p>
<p>Across all these areas, it is clear that huge value would be created in Australia if the ways of framing and delivering teaching, learning and community engagement were adjusted to reflect new methods and perspectives arising from innovative practice and research.</p>
<p>Yet this is easier said than done. And despite many commentators claiming so, there are no silver-bullet solutions.</p>
<p>Over the past decade, the policy landscape has become riddled with reform “solutions”. These subject students, teachers, administrators and policymakers to mounting levels of pressure and stress. The short-term cyclical churn of today’s politics and media clearly exacerbates these problems.</p>
<p>There have, however, been some important and substantive reforms that prove not all political change is superficial. And not all aspects of national reform have failed to generate positive impacts.</p>
<p>For example, the Gonski reforms have channelled powerful resources to some schools. And My School has allowed us to see clearly where inequalities lie and interventions must be targeted.</p>
<p>Policy interventions, however, rarely achieve their objectives in isolation, or in predictable or linear ways, when they encounter complex systems and realities.</p>
<p>That is why we need to rethink the purposes of education as we go. We need to align these with the workings of curriculum, assessment, regulation and funding, along with the daily efforts of teachers, students and other community members.</p>
<p>Discussions about purposes will not thrive if separated or abstracted from the practices and politics of education: the places and spaces where policies are implemented, where students experience schooling, where professional identities are formed and challenged.</p>
<p>As such, far greater attention and skill are needed to craft and build the institutional capabilities that render goals achievable, ensure fairness and foster innovation and systemic learning in the public interest.</p>
<p>Practical lessons arising from recent innovations in teacher education, professional learning, curriculum alignment and inter-school collaboration can help here.</p>
<p>We also need to move beyond a fascination with divisions between governments in Australia’s federal system. We must focus instead on harnessing the potential of networks and collaborations across systems. </p>
<p>That is why a coherent reform “narrative” that genuinely reflects evidence about the nature of effective learning and teaching matters so much.</p>
<p>Ultimately, the future success of Australian school education hinges on whether powerful ideas can be realised in practice, across tens of thousands of classrooms and communities.</p>
<p>If we want reforms to be effective, their design must be grounded in wide-ranging dialogue about the nature of the problems and evidence about what will help to solve them.</p>
<p class="fine-print"><em><span>Glenn C. Savage receives funding from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Tom Bentley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Despite significant reform agendas over the past decade, no real progress in outcomes has been achieved.Tom Bentley, Principal Adviser to the Vice Chancellor, RMIT UniversityGlenn C Savage, Senior Lecturer in Public Policy and Sociology of Education, and ARC DECRA Fellow (2016-19), The University of Western AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/708532017-01-05T08:49:23Z2017-01-05T08:49:23ZWhy caution is called for when analysing South Africa’s matric results<figure><img src="https://images.theconversation.com/files/151797/original/image-20170105-18650-1kp9fgr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">There are big problems in South Africa's school systems. These aren't often discussed when matric results are released.</span> <span class="attribution"><span class="source">REUTERS/Siphiwe Sibeko</span></span></figcaption></figure><p><em>South Africans are poring over the latest set of <a href="http://www.enca.com/south-africa/2016-matric-results-by-numbers">matric results</a> which show how the country’s school leavers performed in their final exams after 12 years of formal schooling. Nearly 718 000 people wrote the exams and 72.5% of them passed – a small increase on last year.</em></p>
<p><em>The results always generate a great deal of debate – and often anger. The Conversation Africa’s education editor Natasha Joseph asked Associate Professor Elizabeth Walton to explain the results and why it’s crucial to remember the young people behind the numbers.</em></p>
<p><strong>There’s a huge focus on matric results every year, particularly on the national pass rate. Is this a useful obsession?</strong></p>
<p>I am not convinced that this annual obsession with matric results is productive. The national pass rate is a very blunt instrument with which to dissect South Africa’s very complex educational problems. The national pass rate obscures important differences in provincial achievements, the urban/rural divide and the unequal outcomes for learners in poorer schools. </p>
<p>It also does not tell us much about the quality of the passes, nor about the subjects taken. The national pass rate also reflects only the learners who sat the exam. It does not take into account the <a href="http://www.education.gov.za/Portals/0/Documents/Publications/General%20Household%20Survey%202013.pdf?ver=2015-07-07-111309-287">numbers of early school leavers</a> who did not make it to matric. </p>
<p>This year the <a href="http://www.education.gov.za/Newsroom/Speeches/tabid/950/ctl/Details/mid/3816/ItemID/4238/Default.aspx">announcement</a> by Angie Motshekga, the Minister of Basic Education, showed 828 020 candidates registered for the examinations. But only 717 971 – full time and part time – actually wrote the exams. This means that more than 100 000 learners made it to grade 12, but fell before the final hurdle.</p>
<p><strong>Is a final set of exams at the end of 12 years of schooling the best way for South Africa to judge pupils’ readiness for entering the world of work or continuing on to tertiary education? What other options exist?</strong></p>
<p>Many education systems around the world combine a school-based assessment component with some external standardised assessment as a school leaving qualification. But it seems to me that we should not be looking at a major change at this stage. The system needs to settle and mature. I do think, though, it would be good to revisit South African academic Professor Stephanie Allais’ <a href="https://theconversation.com/south-africa-should-scrap-simple-pass-or-fail-exam-results-for-school-leavers-34928">proposal</a> that the current pass or fail system be scrapped.</p>
<p>She suggests that learners should instead be allowed to complete grade 12 with a basket of subjects and results which could then be presented to an employer or institution of higher learning. This would shift the focus from the national pass rate to the enrolment and results of individual subjects. It might also mean that schools could be less concerned with an overall school pass rate and rather focus on subject-level improvement over time. </p>
<p>It is possible to improve a school’s pass rate without actually improving teaching and learning; for example by finding ways to exclude learners who may compromise a school’s results, or by not offering subjects that are perceived to be difficult, like maths and physical science. </p>
<p>I also think we need to be realistic in terms of what we expect a matric qualification to signal. The minister of basic education has <a href="http://www.education.gov.za/Newsroom/Speeches/tabid/950/ctl/Details/mid/3816/ItemID/4238/Default.aspx">noted</a> that it is an exit qualification and not primarily a tool for evaluating the progress of the system.</p>
<p>For those who are not looking to pursue further education, a matric certificate is expected to provide proof of preparation for the world of work. Others expect it to provide evidence of the foundations of academic literacy and subject competence that will enable success in higher learning. These expectations are not always compatible with what South Africans regard as “basic education”. </p>
<p>To address this “one-size-fits-all” matric, the Department of Basic Education has <a href="http://www.gov.za/speeches/basic-education-department-briefs-portfolio-committee-skills-revolution-15-mar-2016-0000">proposed</a> a three stream education system with an Academic Stream, a Technical Vocational Stream and a Technical Occupational Stream. This is expected to address the problem of early school leaving and prepare learners for the world of work. </p>
<p><strong>Maths and science results often get the most attention. They are obviously important “canaries in the coal mine” that point to the system’s overall health. But are there subjects that deserve more attention and whose results can paint a picture of what’s going wrong – or right?</strong></p>
<p>I think it is vital that maths and science retain our attention, for several reasons. These are gateway subjects for the science, technology, engineering and maths occupations South Africa <a href="http://www.dhet.gov.za/Gazette/Government%20Gazette%20No%2039604,%2019%20January%202016.%20List%20of%20Occupations%20in%20High%20Demand%202015.pdf">urgently needs</a> to develop. They’re also subjects that bear huge scars of apartheid’s legacy.</p>
<p>They also build sequentially: poor foundations are not easily addressed by late interventions. Having said that I do think that languages, particularly indigenous African languages, also need our focus to secure their growth and development. The introduction of <a href="http://www.education.gov.za/LinkClick.aspx?fileticket=RMnJX-XwQYE%3d&tabid=420&portalid=0&mid=2373">South African Sign Language</a> as a home language examined at matric level is a definite success story.</p>
<p><strong>What if you’re a young person who’s failed matric? What’s your best option?</strong></p>
<p>This is an important question, because any analysis of the matric results must hold in tension the system and the individual. We cannot ignore the fact that there are real young people with hopes and dreams behind all the numbers. Failure is devastating – particularly in the face of a trend that sees South Africans celebrating individual “top achievers” in newspapers and at prestigious events.</p>
<p>I think we should be wary of this. It assumes that success at school is purely the result of individual effort and ability. Those who don’t succeed are <a href="http://randburgsun.co.za/311238/update-public-shocked-by-lowered-maths-pass-rate/">presumed to be lazy</a> and uninterested in education. These celebrations convey the message that everyone is equally positioned to succeed in a meritocratic process. </p>
<p>In fact, educational success in South Africa has much to do with <a href="http://ci.org.za/depts/ci/pubs/pdf/general/gauge2015/Child_Gauge_2015-Schooling.pdf">household income</a>, the <a href="http://www.ekon.sun.ac.za/wpapers/2007/wp162007/wp-16-2007.pdf">location of the school </a> and good early childhood and foundation phase <a href="http://www.scielo.org.za/scielo.php?script=sci_arttext&pid=S2223-76822015000200003">education opportunities</a>.</p>
<p>Some learners will be upset because they expected to do well; this sometimes happens when the demands of school-based assessment have not been as rigorous as the <a href="http://www.education.gov.za/Curriculum/NationalSeniorCertificate(NSC)Examinations.aspx">National Senior Certificate</a> exams set by the Department of Basic Education. </p>
<p>There are opportunities to rewrite through the department’s <a href="http://www.education.gov.za/Curriculum/NationalSeniorCertificate(NSC)Examinations/SecondChanceProgramme/tabid/956/Default.aspx">Second Chance Programme </a>. Learners should also seek <a href="http://lifelinesa.co.za/">counselling support</a> for persistent feelings of hopelessness.</p><img src="https://counter.theconversation.com/content/70853/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Elizabeth Walton does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>South Africa’s annual matric pass rate obscures important differences in provincial achievements, the rural and urban divide and the unequal outcomes for learners in poorer schools.Elizabeth Walton, Associate professor, University of the WitwatersrandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/703312016-12-14T03:11:56Z2016-12-14T03:11:56ZNAPLAN results: moving beyond our obsession with numbers<figure><img src="https://images.theconversation.com/files/149777/original/image-20161213-25506-1s5uoms.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How can we use data from international tests to improve student learning?</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>The <a href="https://theconversation.com/naplan-results-reveal-little-change-in-literacy-and-numeracy-performance-here-are-some-key-takeaway-findings-70208">latest national report</a> from the National Assessment Program – Literacy and Numeracy (NAPLAN) has led to yet another round of calls for the <a href="https://theconversation.com/for-australia-to-improve-in-maths-policymakers-need-to-make-a-plan-and-stick-to-it-69892">need to reform</a> schooling practices.</p>
<p>This is typically construed as a debate between advocates of <a href="https://theconversation.com/does-more-money-for-schools-improve-educational-outcomes-57656">increased funding</a> on the one hand, and those who decry how such increases have <a href="http://www.skynews.com.au/news/politics/federal/2016/04/05/school-funding-shift-a--historic-mistake-.html">resulted in very little return</a>. </p>
<p>This debate is neither edifying nor educative. </p>
<p>Poor results are generally <a href="https://www.equalitytrust.org.uk/resources/the-spirit-level">associated with socioeconomic disadvantage</a>, and resources are necessary to help redress such limitations. </p>
<p>At the same time, these resources need to be deployed carefully to maximise students’ learning. </p>
<p>It is also clear that <a href="https://theconversation.com/focusing-on-tests-and-invalid-assessments-is-the-wrong-way-to-measure-teacher-quality-63931">quality teaching matters</a>.</p>
<p>Instead of being enamoured of, or despondent about, standardised test results, we should look closely at how the data generated <a href="https://theconversation.com/australian-schools-continue-to-fall-behind-other-countries-in-maths-and-science-69341">through these national and international tests</a> is developed, how it is actually deployed, and how various test results <a href="https://theconversation.com/governments-need-to-look-beyond-education-rankings-and-focus-on-inequities-in-the-system-69715">can be drawn on</a> to make improvements. </p>
<h2>How should we respond to test results?</h2>
<p>This deployment of data needs to be undertaken cautiously. </p>
<p>Various forms of testing that reduce students’ knowledge, capacities, understandings and skills to a single number – or series of numbers – cannot of themselves help inform improvement. </p>
<p>They also cannot be used in isolation from actual teaching and learning practices. </p>
<p>The cultivation of professional learning on the part of teachers is essential. </p>
<p>This may be challenging at times and require a much more long-term approach than is associated with <a href="https://theconversation.com/gonski-model-was-corrupted-but-labor-and-coalition-are-both-to-blame-65875">more typical educational policy cycles</a>.</p>
<p>At present, there is a strong focus on – and <a href="https://pdfs.semanticscholar.org/aaa2/5d59159013df9c2d3d21662de1d7c9c2a347.pdf">history of</a> – governing education through numbers, particularly standardised tests. </p>
<p>There is a tendency to rely on these numbers as the sole measurement for achievement, without taking into consideration the range of other influences (such as socioeconomic or geographical factors) that impact on educational attainment.</p>
<p>Teachers’ work and the effectiveness of their teaching is also measured through these numbers. This can result in a reliance on standardised test results to show how a teacher is performing, which <a href="http://files.eric.ed.gov/fulltext/EJ1016271.pdf">minimises professional judgement</a> and doesn’t really present a true picture of what is happening in the classroom.</p>
<h2>When to use standardised testing</h2>
<p>Standardised testing can be used much more productively. But for this to happen, it needs to be done in a context that values teacher judgement about their practice. </p>
<p>This involves fostering the conditions in which teachers can develop:</p>
<ul>
<li>subject matter knowledge</li>
<li><a href="https://people.ucsc.edu/%7Ektellez/shulman.pdf">understanding of how children best learn</a></li>
<li>deep understandings of students’ existing knowledge</li>
<li>perceptions of students’ own learning abilities</li>
<li>understandings of culture</li>
<li>ways of <a href="http://www.ascd.org/publications/educational-leadership/feb98/vol55/num05/Teacher-Learning-That-Supports-Student-Learning.aspx">assessing students</a> </li>
</ul>
<p>It is also useful for teachers to collaborate with colleagues and talk about their students’ learning. This is done well by using <a href="http://www.ascd.org/publications/educational-leadership/feb98/vol55/num05/Teacher-Learning-That-Supports-Student-Learning.aspx">specific samples of student work</a> as evidence of this learning, and through developing an <a href="http://www-personal.umich.edu/%7Edkcohen/downloads/developingpractice.pdf">inquiry-oriented approach</a> to better research their practice.</p>
<p>These approaches need to be continuous and ongoing. There needs to be support from bodies that are external to schools. Teachers also need to be supported so that they can <a href="http://www.aitsl.edu.au/docs/default-source/default-document-library/professional_learning_an_introduction_to_research_literature">develop a theoretical understanding</a> of their practice. </p>
<p>Intensive and sustained professional development that focuses on the content of the subject that teachers teach is seen as being most effective – especially when it involves active learning and collective participation. </p>
<p>Such approaches value and <a href="http://www.tandfonline.com/doi/abs/10.1080/09650792.2015.1012175?journalCode=reac20">validate teachers’ learning</a> and can genuinely change education.</p>
<p>Of course, none of this is easy, but it is much more likely to lead to improved engagement and better student learning outcomes on a range of measures, including NAPLAN.</p><img src="https://counter.theconversation.com/content/70331/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ian Hardy receives funding from the Australian Research Council. </span></em></p>Various forms of testing that reduce students’ knowledge, capacities and skills to a single number cannot of themselves help inform improvement.Ian Hardy, Senior Lecturer in Educational Policy and Practice, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/697822016-12-07T04:23:35Z2016-12-07T04:23:35ZAustralia is very average when it comes to maths and science performance – here’s what needs to change<figure><img src="https://images.theconversation.com/files/149009/original/image-20161207-13648-12sbsq3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Australia is one of only three countries with significantly decreased maths and science scores in the latest round of PISA.</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>As a school student, I awaited the arrival of the end-of-year report with a bracing mix of hope and fear.</p>
<p>Now, as Australia’s Chief Scientist, I’m worried once again about school reports.</p>
<p>Our proudly first-class country, with a prosperous economy and an egalitarian spirit, must not be fair-to-middling when it comes to science and maths in schools. On the <a href="https://theconversation.com/australian-schools-continue-to-fall-behind-other-countries-in-maths-and-science-69341">evidence before me</a>, we are.</p>
<p>Do I believe that international testing can capture everything of importance in Australian education? No.</p>
<p>But do I take these findings seriously? Yes, I do.</p>
<p>Be it the international studies <a href="https://theconversation.com/pisa-results-dont-look-good-but-before-we-panic-lets-look-at-what-we-can-learn-from-the-latest-test-69470">Programme for International Student Assessment (PISA)</a> and <a href="https://theconversation.com/australian-schools-continue-to-fall-behind-other-countries-in-maths-and-science-69341">Trends in International Mathematics and Science Study (TIMSS)</a>, or the national scheme National Assessment Program – Literacy and Numeracy (NAPLAN), the message is clear.</p>
<p>Our performance in absolute terms is stalling, or in decline, and our position in global rankings continues to fall.</p>
<h2>International comparisons</h2>
<p>Canada now scores significantly higher across all PISA and Year 8 TIMSS domains. England has improved its TIMSS performance, while also decreasing the proportion of low-performing students.</p>
<p>Australia, by contrast, is one of only three countries with significantly decreased maths and science scores in this round of PISA. And the difference between children in Australia’s highest and lowest socioeconomic quartiles recorded by PISA is the equivalent of three full years of school.</p>
<p>While we demand to be top 10 in sport, we are barely scraping the top 20 in schools. </p>
<p>In PISA maths, we have fallen as low as 25th. How much lower are we prepared to go?</p>
<p>My concern is not the temporary wound to national pride. It is the enduring harm we do when students leave school with malnourished potential – or worse, no interest at all – in disciplines that they require to navigate their world. We need to improve.</p>
<p>Let’s start by defining the aim: the best possible education in maths and science (and literacy) for every child, irrespective of gender, region, income or incoming ability. </p>
<p>In the 21st century, we can no more write off a child because “he’s not into numbers” than we would accept that “she’s not keen on the alphabet”.</p>
<p>Maths is not just the language of science and technology, but the foundation of commerce, the core of engineering, and the bread and butter of every trade from cooking to construction.</p>
<p>How can we hold governments to account if journalists can’t interpret data and citizens can’t make sense of charts?</p>
<p>How can we resist the prophets of the post-truth world? When everything we value is at stake, surely nothing less than our utmost will do.</p>
<p>So with that aim in mind, let’s agree to share the task: yes, we do bear individual responsibility; but, no, we cannot lay the blame solely on individuals, be they principals, teachers, parents or students.</p>
<p>There is no point in exhorting individuals to aim high unless we help them to make the leap. If we want excellence, we have to provide a system with the incentives, enablers and rewards for improvement built in.</p>
<h2>Policy responses</h2>
<p>For me, that comes down to a new three Rs for education.</p>
<p><strong>Restore maths prerequisites for courses</strong></p>
<p>Restore meaningful <a href="https://theconversation.com/universities-should-require-science-engineering-and-commerce-students-to-know-their-maths-56423">maths prerequisites</a> for all university courses that undeniably need numbers. </p>
<p>This would reverse the exodus from advanced maths courses and set students up for success – in commerce and accounting, as well as science and engineering. Just as importantly, it would give principals a reason to make the quality of their maths programs a priority all the way from kindergarten to Year 12. </p>
<p><strong>Respect teaching</strong></p>
<p>The single most important factor in the classroom is the human up the front. The education system must be engineered around that fundamental premise, so that high-achieving students become highly qualified teachers with well-targeted professional development.</p>
<p>Crucially, teacher training and development need a strong discipline-specific focus. It should be expected that our <a href="https://theconversation.com/why-is-it-so-hard-to-recruit-good-maths-and-science-teachers-55697">science and maths teachers are experts</a> in their fields, with both the technical and pedagogical knowledge to teach them well. </p>
<p>The Commonwealth Science Council strongly endorsed this principle at its <a href="http://www.chiefscientist.gov.au/2016/09/commonwealth-science-council-fourth-meeting/">last meeting in September</a>, and requested the Department of Education to investigate options to bring it about.</p>
<p><strong>Recognise the influence of school leaders</strong></p>
<p>Principals set the tone in their schools and, with the right strategic focus, they can drive a culture of constant improvement. Without that senior leadership, it is simply too hard for individual teachers to keep the bar consistently high – another reality the Commonwealth Science Council has acknowledged.</p>
<p>Of course, ambitious aims have investment pathways attached. But money spent is not a proxy for effort invested, and it is certainly not a reliable predictor of success. </p>
<p>As a businessman, I learned that no project delivers what you want unless the how comes before the how much.</p>
<p>Face the hard truths, aim high, be strategic – and we might just receive a school report we can be proud to display.</p><img src="https://counter.theconversation.com/content/69782/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alan Finkel does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If we want excellence in our schools, we have to provide a system with the incentives, enablers and rewards for improvement built in.Alan Finkel, Chief Scientist for Australia, Office of the Chief ScientistLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/697152016-12-04T21:44:02Z2016-12-04T21:44:02ZGovernments need to look beyond education rankings and focus on inequities in the system<figure><img src="https://images.theconversation.com/files/148220/original/image-20161201-17786-u2xgln.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Should we base education reforms solely on Australia's international ranking?</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>The latest Programme of International Student Assessment (PISA) results will be released around the world on December 6. And as usual, there will be a flurry in the news media. </p>
<p>Australia will likely have dropped further in the rankings of these tests, which assess 15-year-olds in reading, maths and scientific literacy. If so, it will be in keeping with the trend over the <a href="https://theconversation.com/new-pisa-results-show-education-decline-its-time-to-stop-the-slide-21054">last several cycles</a>.</p>
<p>Some hand-wringing will occur over what our students are not able to do, and how far behind Shanghai and Korea – to name a few – they are. </p>
<p>The rankings, which are based on the average performance of all the test-takers in Australia, attract the most attention. </p>
<p>But the real stories on which we should focus are the within-country variations that are obscured by global rankings.</p>
<h2>The real story: inequity in our education system</h2>
<p>Australia’s rankings conceal wide variations in performance. </p>
<p>Some states and territories perform much better than others. In <a href="http://www.oecd.org/pisa/keyfindings/pisa-2012-results.htm">PISA 2012</a> – the last reported PISA cycle – the Australian Capital Territory, Western Australia, New South Wales and Queensland scored significantly higher than the OECD average in maths literacy, but Tasmania and the Northern Territory performed significantly below the OECD average. </p>
<p>The difference in scores between the highest and lowest performing states represents a significant <a href="https://www.acer.edu.au/documents/PISA-2012-Report.pdf">1.5 years of schooling</a>. Similar differences exist in reading and in scientific literacy. </p>
<p>The average score for Indigenous students in maths literacy was 417 points in PISA 2012, compared with the non-Indigenous average of 507, pointing to serious inequity. This difference represents 2.5 years of schooling. The gap is again similar in reading and in scientific literacy. </p>
<p>Another story that does not always make it to the headlines is that of difference between types of school. </p>
<p>In PISA 2012, students in independent schools scored significantly higher than students in government schools. Students in Catholic schools also scored higher than government school students. </p>
<p>Outcomes were also lower for students in remote and rural schools.</p>
<p>These important differences are obscured when we only look at Australia’s ranking on global league tables.</p>
<h2>The mismatch between the evidence and the policies</h2>
<p>When we look beyond rankings, the evidence does not point to a widespread, national crisis in Australian education, as the <a href="http://www.abc.net.au/news/2016-11-30/australia-declines-in-global-education-report/8077474">media often report</a>. </p>
<p>The sustained variations in performance, with some states, schools and groups of students performing significantly better than others, point to the need for a targeted, focused and strategic policy approach to tackle inequity.</p>
<p>Over the past decade, however, significant and expensive reforms have been at a national scale rather than focused and targeted initiatives to reduce inequity. </p>
<p>The most significant of these have been:</p>
<ul>
<li><p>The introduction in 2008 of the National Assessment Program – Literacy and Numeracy (NAPLAN), replacing statewide tests that previously tracked student progress</p></li>
<li><p>The introduction of the My School website in 2010 to provide comparative information on schools nationwide</p></li>
<li><p>The introduction, in 2014, of the Australian Institute for Teaching and School Leadership which has developed national professional standards for teachers.</p></li>
</ul>
<p>Many of these reforms have their origins in the Rudd-Gillard government’s “<a href="http://apo.org.au/files/Resource/deewar_quality-education_2008.pdf">Education Revolution</a>” of 2008, which placed education at the heart of the “productivity agenda”. </p>
<p>Although Gillard expressed a desire to reduce inequity, the vision at the heart of the Education Revolution was for,</p>
<blockquote>
<p><a href="http://walabor.org.au/download/now/education_revolution.pdf">“Australia to become the most educated country, the most skilled economy and the best trained workforce in the world”</a>. </p>
</blockquote>
<p>Introducing the Education Revolution, <a href="http://australianpolitics.com/2007/01/23/rudd-calls-for-an-education-revolution.html">Kevin Rudd cited a study</a> that found that,</p>
<blockquote>
<p>“countries able to achieve literacy scores 1% higher than the international average will increase their living standards by a factor of 1.5% of GDP per capita”. </p>
</blockquote>
<p>This view of education as being the key to winning a global economic race has made rankings on international league tables an obsession in Australian politics.</p>
<p>And Australia’s declining ranking on these league tables has only served to heighten this obsession. </p>
<h2>Policies not working</h2>
<p>These sweeping, costly national reforms do not appear to be working. </p>
<p>Australia’s performance on PISA has been <a href="http://www.news.com.au/national/pisa-report-finds-australian-teenagers-education-worse-than-10-years-ago/story-fncynjr2-1226774541525">declining since 2003</a>, and has made <a href="https://theconversation.com/australian-schools-continue-to-fall-behind-other-countries-in-maths-and-science-69341">no gains in Trends in Mathematics and Science Studies (TIMSS)</a> – the other major international assessment in which it participates. </p>
<p>There have been no sustained improvements in performance on
<a href="http://www.abc.net.au/news/2016-08-03/naplan-results-show-literacy-numeracy-skills-have-stalled/7683244">national tests either</a>. </p>
<p>Indeed, there is evidence that the widespread reforms across the nation – particularly the controversial NAPLAN and My School – have likely contributed to a range of <a href="https://theconversation.com/naplan-testing-does-more-harm-than-good-26923">negative consequences</a>, not least of which are the de-professionalisation of teachers and the <a href="https://theconversation.com/teachers-are-leaving-the-profession-heres-how-to-make-them-stay-52697">high attrition rates</a> in teaching. </p>
<p>And significantly, <a href="https://theconversation.com/australian-schools-engines-of-inequality-23979">inequities continue to exist</a>.</p>
<h2>Looking beyond the rankings headlines</h2>
<p>Instead of the national rhetoric of “plummeting performance” that is likely to dominate the media tomorrow, we should be celebrating how well some of the states perform, and resolve to focus seriously on remedying the inequities across them. </p>
<p>The inequitable outcomes for Indigenous students, the gap in performance between urban schools and remote and rural schools, the variation in performance and reported confidence between males and females, and the differences between sectors are all issues that deserve focused policy attention.</p>
<p>At best international rankings are a distraction – but basing policies on the rankings while ignoring the more important evidence these international surveys present is unlikely to address the key issue of inequity in our system.</p><img src="https://counter.theconversation.com/content/69715/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Radhika Gorur has previously been supported by the Collaborative Research Network (Australian Government) and has recently been awarded a grant by the Australian Research Council. </span></em></p>The furore over Australia’s international ranking in science, maths and English obscures what we should really be focusing on.Radhika Gorur, Senior Lecturer In Education (Pedagogy & Curriculum), Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/676472016-11-30T19:19:04Z2016-11-30T19:19:04ZShould we do away with exams altogether? No, but we need to rethink their design and purpose<figure><img src="https://images.theconversation.com/files/143213/original/image-20161026-4729-cs15zv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Some exam questions are poorly designed and written – this needs to change. </span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p><em>In our five-part series, <a href="https://theconversation.com/au/topics/making-sense-of-exams-32567">Making Sense of Exams</a>, we’ll discuss the purpose of exams, whether they can be done online, overcoming exam anxiety, and effective revision techniques.</em></p>
<hr>
<p>Over the past two decades there have been frequent calls to <a href="https://theconversation.com/why-we-should-abolish-the-university-exam-1329">abandon exams</a>. </p>
<p>The major criticisms of exams in schools and universities tend to relate to either the <a href="https://theconversation.com/why-we-should-abolish-the-university-exam-1329">misuse</a> or overuse of exams, and not to the sensible use of exams in <a href="http://www.tandfonline.com/doi/pdf/10.1080/09695940701478321">partnership</a> with other assessment tasks such as presentations, research reports, creative responses, essays and reflective journals.</p>
<p>Rethinking the way in which some exams are delivered does not require us to abandon all exams in favour of other assessment tasks. This is akin to throwing the baby out with the bathwater.</p>
<p>Exams allow students to demonstrate their breadth of knowledge across a particular subject. This is more difficult to achieve with other forms of assessment. </p>
<p>Students also demonstrate their ability to retrieve and apply knowledge on the spot: a skill necessary in many professions.</p>
<p>But we need to look at what the evidence tells us about when exams are effective – and when other types of assessment are more suitable. </p>
<p>In debates about exams, the same myths are often brought up again and again. Here’s what the research tells us about three of the most common exam myths: </p>
<h2>Myth 1: exams only test for the recall of facts</h2>
<p>One of the most common arguments offered against exams is that they test for rote recall only <a href="https://theconversation.com/why-we-should-abolish-the-university-exam-1329">and not</a> for deeper understanding. </p>
<p>Like others, we have experienced the frustration of sitting an exam that focuses almost exclusively on the recall of isolated facts. Research shows that such exams are more common when teachers either write questions quickly or rely on published tests from testing banks. In both cases, the teacher has <a href="https://books.google.com.au/books?hl=en&lr=&id=AFIxeGsV6SMC&oi=fnd&pg=PA1&dq=SOLO%2Bexam+questions%2Bhigh+order+thinking&ots=W6gnZHeeVa&sig=HWUOPuK1scTZ24qwdUqU7kD5sVU#v=onepage&q=SOLO%2Bexam%20questions%2Bhigh%20order%20thinking&f=false">less opportunity to review</a> whether the questions require deep understanding and higher-order thinking – skills that require the learner both to hold a strong body of <a href="http://link.springer.com/article/10.1007/BF02300500">disciplinary knowledge</a> and to be capable of applying it.</p>
<p>The solution is not to abandon exams, but to improve the way exam questions are written. </p>
<p>A well-designed exam <a href="http://www.learningsolutionsmag.com/articles/804/writing-multiple-choice-questions-for-higher-level-thinking">will</a> assess the application of knowledge to real-world scenarios, the synthesis of knowledge across sub-topics, the ability to think critically, or to solve well-defined problems within a discipline. </p>
<p>These higher-order processes depend entirely on the question being asked. According to research, even quite short professional development programs for teachers <a href="https://books.google.com.au/books?hl=en&lr=&id=AFIxeGsV6SMC&oi=fnd&pg=PA1&dq=SOLO%2Bexam+questions%2Bhigh+order+thinking&ots=W6gnZHeeVa&sig=HWUOPuK1scTZ24qwdUqU7kD5sVU#v=onepage&q=SOLO%2Bexam%20questions%2Bhigh%20order%20thinking&f=false">are effective</a> in changing the way they write exam questions. </p>
<p>Exams should not be used to assess the recall of meaningless facts: this is a misuse of the format.</p>
<h2>Myth 2: Google renders exams irrelevant</h2>
<p>A second argument sometimes offered against exams is that everything can be found on Google anyway. </p>
<p>The implication, of course, is that <a href="https://www.theguardian.com/lifeandstyle/2016/aug/28/inner-life-does-knowledge-matter-in-the-age-of-google">we no longer need knowledge</a> in our brains when we have phones in our pockets. </p>
<p>A variant of this argument is that internet access should always be permitted during <a href="https://theconversation.com/outdated-exams-are-holding-children-back-not-computers-in-the-classroom-47810">exams</a> as this mirrors our experiences in real life. </p>
<p>These arguments are problematic for two reasons. </p>
<p>First, research shows that people without knowledge in a particular field are <a href="https://www.hachettebookgroup.com/titles/william-poundstone/head-in-the-cloud/9780316256537/">surprisingly poor</a> at finding accurate information on Google. They are more likely to find and believe conspiracy theories, for example, less likely to know what search terms to use, and less likely to reason logically about the information they find. </p>
<p>Second, looking up information on Google is not the same as accessing a pre-existing network of knowledge in the brain. </p>
<p>Pre-existing knowledge is critical because it guides the way in which we interpret new information and <a href="http://www.aft.org/sites/default/files/periodicals/Crit_Thinking.pdf">underpins critical thinking and problem solving</a>.</p>
<p>Even if a student is taught generic skills in critical thinking and analysis, a wide breadth of knowledge is also needed to know what arguments are relevant in a particular domain and how they might be applied. This breadth of knowledge cannot be obtained simply by Googling.</p>
<p>It is precisely because our teachers, surgeons, scientists and building engineers have an established network of knowledge in their fields, held in <a href="https://theconversation.com/revising-for-exams-why-cramming-the-night-before-rarely-works-67459">long-term memory</a>, that they are able to instantaneously apply this knowledge in the workplace, <a href="http://ctl.ok.ubc.ca/__shared/assets/ct-conceptualize45378.pdf">critically assess</a> the validity of incoming information, and <a href="https://books.google.com.au/books?id=SSLdo1MLIywC&pg=PA20&lpg=PA20&dq=expertise+long+term+memory+de+groot&source=bl&ots=uTAdM0U4Du&sig=xj6UglGk-vBgvhzK_kZaYlKhnT8&hl=en&sa=X&ved=0ahUKEwif5t3vnPPPAhUHilQKHdh9AeMQ6AEITjAI#v=onepage&q=expertise%20long%20term%20memory%20de%20groot&f=false">solve emerging problems</a> on the run.</p>
<h2>Myth 3: exam study does not enhance learning</h2>
<p>Exams do not just assess learning, they <a href="https://aps.psychologicalscience.org/publications/observer/2006/march-06/test-enhanced-learning-2.html">promote learning</a> in several ways: </p>
<ul>
<li><p>Organising yourself to study promotes self-regulation and metacognition (that is, your understanding and control of your own learning processes). </p></li>
<li><p>Re-organising and <a href="http://makeitstick.net/chapter_8.php">elaborating</a> on the to-be-tested material during study enables deeper understanding of the material.</p></li>
<li><p>The process of actively <a href="http://cdp.sagepub.com/content/21/3/157.short">retrieving</a> and applying that material multiple times during study is one of the best possible ways to strengthen knowledge. Just as practice helps muscles grow stronger during exercise, so too does it make connections in the brain grow stronger during study.</p></li>
</ul>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/147835/original/image-20161128-22751-aqqe5k.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Far from being superficial, well-designed exams and proper study enhance memory and learning.</span>
<span class="attribution"><span class="source">from www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>Of course, some study techniques <a href="http://www.theatlantic.com/education/archive/2013/09/when-memorization-gets-in-the-way-of-learning/279425/">are better</a> than others. </p>
<p>Research shows that study in which students mentally manipulate the material – perhaps by forming their own questions, or by considering how different topics relate to one another – is <a href="http://users.ugent.be/%7Emvalcke/CV/Lecture_questions.pdf">more effective</a> than study in which students passively scan their notes. </p>
<p>These techniques are a form of <a href="https://books.google.com.au/books?hl=en&lr=&id=w2GLAwAAQBAJ&oi=fnd&pg=PA89&dq=shallow+deep+encoding+elaboration&ots=oa0ZagllPJ&sig=MZgRJyvV7WOR0AU0AhstWZPbIw4#v=onepage&q=shallow%20deep%20encoding%20elaboration&f=false">“deep encoding”</a>, in which the student is required to actively negotiate meaning and to make decisions about what goes with what. </p>
<p>Research also shows that spacing out study over time is more <a href="http://tdlc.ucsd.edu/educators/educators_ask_the_scientist_kang.html">effective</a> for retaining information than <a href="https://theconversation.com/revising-for-exams-why-cramming-the-night-before-rarely-works-67459">cramming the night before</a>. </p>
<p>With this knowledge, teachers can support students to study in the most effective ways possible.</p>
<h2>Exams should be used within a balanced assessment program</h2>
<p>The goal of any assessment program is to enable students to demonstrate what they <a href="https://books.google.com.au/books?hl=en&lr=&id=xMlxAgAAQBAJ&oi=fnd&pg=PT15&dq=Knowing+what+Students+Know++The+Science+and+Design+of+Educational+Assessment&ots=HL4phqT5Rv&sig=u4qV2ksoiuyVENxUrhdZvc6Hsck%23v=onepage&q=Knowing%20what%20Students%20Know%25#v=onepage&q=Knowing%2520what%2520Students%2520Know%25&f=false">know and can do</a>. Within this program, exams have specific advantages. </p>
<p>Exams should not be used in all assessments (or even in all disciplines). Some types of assessments are clearly <a href="http://www.tandfonline.com/doi/pdf/10.1080/09695940701478321">better suited</a> to particular kinds of knowledge and skills than others. </p>
<p>Where research skills are important, a research proposal or report may be more appropriate. </p>
<p>Where oral communication skills are important, a presentation task may be more appropriate. </p>
<p>And where depth of knowledge of a single topic is important – either because of the specific topic itself or because a more focused investigation will allow the student to practise and refine particular learning skills – then an essay, class debate, or similar assessment may be more appropriate. </p>
<p>But arguing that exams cannot do everything is not the same as arguing they can do nothing. In nearly all school and university courses there are multiple goals, so a balanced assessment program is <a href="http://www.ccsso.org/Documents/Balanced%20Assessment%20Systems%20GONG.pdf">critical</a>. </p>
<h2>When considering the purpose of exams</h2>
<p>We need to be careful when considering the use of exams in schools and universities. </p>
<p>We need to know that they are appropriate to the knowledge and skills being assessed, and that they form part of a balanced assessment program with a range of different assessment tasks. </p>
<p>We also must be aware of the unintended consequences that emerge in specific testing circumstances. </p>
<p>This is true for national testing programs such as the National Assessment Program – Literacy and Numeracy (NAPLAN), for example, where the potential to publicly rank schools has led to concerns about “<a href="http://www.theaustralian.com.au/news/naplan-puts-focus-more-on-passing-tests-than-teaching/story-e6frg6n6-1226523826536">teaching to the test</a>” and narrowing the curriculum. These unintended consequences must be addressed. </p>
<p>When used well, however, exams offer several advantages for learning.</p>
<p><em>• <a href="https://theconversation.com/au/topics/making-sense-of-exams-32567">Read more</a> from the series.</em></p><img src="https://counter.theconversation.com/content/67647/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Penny Van Bergen has previously received funding from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Rod Lane does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Exams do have a purpose, but they shouldn’t be used to assess the recall of meaningless facts.Penny Van Bergen, Senior Lecturer in Educational Psychology, Macquarie UniversityRod Lane, Senior Lecturer in Educational Assessment, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/687492016-11-20T19:04:35Z2016-11-20T19:04:35ZSchools will teach ‘soft skills’ from 2017, but assessing them presents a challenge<figure><img src="https://images.theconversation.com/files/145914/original/image-20161115-15965-1kkr4ze.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">From 2017, students will be assessed on skills such as problem solving. </span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>When students go back to school in January 2017 there will be some significant changes to their timetables. As well as learning areas like English, maths and science, there will be some new things to grapple with called “<a href="http://www.australiancurriculum.edu.au/generalcapabilities/overview/introduction">capabilities</a>”. </p>
<p>The Australian curriculum will be focusing not just on the 3Rs – reading, writing and arithmetic – but also on the kinds of “soft” skills young people will need if they are to be successful throughout their lives.</p>
<p>The new capabilities are: </p>
<ul>
<li>Information and communication technology - using technology to access information, create products and solve problems</li>
<li>Critical and creative thinking - learning how to think and find ways to approach problems</li>
<li>Personal and social - recognising others’ emotions, supporting diversity and working together</li>
<li>Ethical - understanding values and concepts that underpin views</li>
<li>Intercultural - learning about your own and others’ cultures and beliefs. </li>
</ul>
<p>From 2017 teachers will be expected to teach and assess these capabilities, although state and territory education authorities can determine whether the capabilities will be assessed. </p>
<p>In <a href="http://www.education.vic.gov.au/about/educationstate/Pages/targets.aspx">Victoria</a> schools will be required to assess progress in the development of students’ capabilities, and there will be a specific focus on improving critical and creative thinking.</p>
<h2>Why “soft” skills are important</h2>
<p>To call these skills “soft” is actually unhelpful. It implies they are not as important or demanding as the so-called hard stuff like the 3Rs. They are. They include attributes such as collaboration, perseverance, problem solving, empathy and self-reflection. The rationale for their inclusion in any curriculum is sound. </p>
<p>Economists like <a href="http://www.nber.org/papers/w18121.pdf">James Heckman</a> have made the case in terms of improved life outcomes such as higher employment rates and lower rates of crime.</p>
<p><a href="https://www.sas.upenn.edu/%7Educkwort/images/publications/DuckworthSeligman_2005_Self-DisciplinePredictsAcademicAchievement.pdf">Psychologists</a> such as Angela Duckworth and Martin Seligman have shown how capabilities predict success in education more powerfully than conventional measures such as IQ. For example, students with greater self discipline apply themselves more to their schoolwork and are less likely to be distracted.</p>
<p><a href="http://www.cbi.org.uk/first-steps/3._Change_is_possible__but__1.html">Employers</a> the world over acknowledge that these skills are vital to future prosperity. In a global world, people need to understand different cultures to collaborate across borders.</p>
<p>And the globally regarded <a href="https://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf">Programme for International Student Assessment (PISA)</a> added collaborative problem-solving to its 2015 tests, alongside English, maths and science – a sure indication that this capability is both important and assessable.</p>
<h2>But how do you assess these skills?</h2>
<p>At the school level, just how do you assess these more generic capabilities? And when you get beneath the surface, what exactly is being assessed? </p>
<p><a href="https://asiasociety.org/files/gcen-measuring21cskills.pdf">Work on assessing capabilities</a> is underway in Asia and North America. A <a href="http://www.oecd.org/edu/ceri/assessingprogressionincreativeandcriticalthinkingskillsineducation.htm">major study</a> by the Organisation for Economic Co-operation and Development (OECD) into the assessment of creative and critical thinking is taking place in fourteen countries including Wales, France and Brazil.</p>
<p>From work like this, we know that we need to think about teaching methods (how useful assessment is for learners); practicalities (how doable it is for teachers in busy classrooms); and various technical issues of assessments (being sure results are reliable, valid and fair).</p>
<p>Most of all, we cannot helpfully assess any capability unless we can precisely define it for students, teachers, parents and employers. </p>
<p>The Victorian Curriculum and Assessment Authority (VCAA) has helped by selecting four key capabilities and mapping out in detail likely <a href="http://victoriancurriculum.vcaa.vic.edu.au/critical-and-creative-thinking/introduction/scope-and-sequence">progression</a> from Foundation through to Year 10. </p>
<p>Elsewhere in the state there is a trial being supported by the <a href="http://www.mitchellinstitute.org.au/our-work/capabilities-in-action/">Mitchell Institute</a> involving eleven schools. </p>
<p>Early learning from these schools suggests teachers need to change the way they teach to encourage more rigorous group work, better project planning, more effective feedback and the use of well-framed questions to drive authentic enquiries into real-world problems.</p>
<p>Assessing capabilities is harder than assessing subjects – and the evidence base is much less well-formed. </p>
<p>Knowing that a student achieved a level 8b in critical and creative thinking is not particularly useful. </p>
<p>But from the trial we are finding that students need to become more critically reflective and develop digital portfolios of evidence. </p>
<p>Digital portfolios are collections of student work that demonstrate their achievements either to the school or also, as in the case of <a href="https://openbadges.org/">open badges</a>, publicly.</p>
<h2>More nuanced assessment needed</h2>
<p>Teachers have to use progress criteria more reliably. Experts from outside the school can provide authentic feedback. </p>
<p>Online tests developed by VCAA are now available for some aspects of capabilities. </p>
<p>In our own work for the <a href="http://repository.winchester.ac.uk/303/1/Lucas_A%20Five%20Dimensional%20Model%20of%20Creativity%20Gold%20OA.pdf">OECD</a> we have discovered that assessment needs to be more nuanced than the production of simple grades, producing feedback specifically designed to improve learners’ progress.</p>
<p>Across the world there are initiatives which demonstrate that capabilities can be both developed and assessed. </p>
<p>These include <a href="https://www.buildinglearningpower.com/">Building Learning Power</a>, the <a href="http://www.p21.org/">Partnership for 21st Century Learning</a>, <a href="http://123userdocs.s3.amazonaws.com/d/e7/32/285697108886368999/0581de98-65af-49f3-8065-0692cc176966/Assessment%20and%20Teaching%20of%2021st%20Century%20Skills.pdf">AC21S</a>, and <a href="http://npdl.global/">New Pedagogies for Deeper Learning</a> which is now being tried in more than 70 Victorian schools.</p>
<p>For more than a hundred years we have focused on teaching and assessing disciplinary knowledge in schools. Now we need to <a href="http://www.educatingruby.org/">focus on capabilities</a> as well. While it will not necessarily come naturally to all teachers, it is vital work.</p><img src="https://counter.theconversation.com/content/68749/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bill Lucas does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Teaching students skills such as creative thinking and problem solving will become part of the curriculum from 2017. But in order to assess these capabilities, teaching styles will have to change.Bill Lucas, International adviser, Mitchell Institute, Victoria UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/640702016-08-18T16:18:26Z2016-08-18T16:18:26ZThe slow death of the AS-Level<p>A <a href="https://www.gov.uk/government/news/a-level-results-day">record number</a> of 420,000 students had already secured a place at university on August 18, the day the A-Level results came out. But amid the joy and disappointment on display, the results show how a short-lived stalwart of the examination landscape is slowly fading away. The number of students taking AS-Levels – a one-year qualification that used to count towards 50% of a full A-Level – has fallen sharply by 13.7% between 2015 and 2016. </p>
<p>AS-Levels allowed students, teachers and universities to get an insight into eventual A-Level performance. Students could also indulge in broadening their post-16 diet, taking, say mathematics, biology, chemistry and French. And they had something to show for their first year’s efforts. Those options are now closing to young people.</p>
<h2>Teething problems</h2>
<p>The backdrop to this is an exam system in England, Wales and Northern Ireland that has been under seemingly continual reform in the first decade of the 21st century. It began with a set of reforms to school-leaving exams called <a href="https://www.theguardian.com/education/2003/dec/24/schools.uk">Curriculum 2000</a>, introduced by the then-Labour government in 2000-2002. The aim was to encourage more 16- to 19-year-olds to study more subjects, shifting from an average of two A-Levels to three, and adding a fourth subject by way of an additional exam taken in their penultimate year of school – the AS-level. </p>
<p>This was part of a plan to encourage more students to aim for higher education as well as to discourage 17-year-olds from dropping out at the end of the first year of sixth form. But if they did drop out, at least they would leave with some meaningful qualifications, because AS-Levels were one-year courses – a qualification in their own right. AS and A-Levels also became fully modular, with assessments done at the end of individual units, rather than all or nothing in the last year of school. </p>
<p>Once Curriculum 2000 was in place, a majority of students took at least four AS subjects. At first, some teachers were unsure of what the AS standard was since exams at the end of the first year were new. Teaching time seemed truncated because schools concentrated on exam preparation after only two terms and the introduction of units meant that students could be sitting as many as 12 exams during their first year. </p>
<p>Soon teachers felt more confident about the AS – in a 2003 survey (which is no longer accessible) run by the Qualifications and Curriculum Authority, 81% of teachers said they were confident that they knew the standard required for the AS, although many of them remained concerned by the amount of external assessment. New AS and A-Levels introduced in 2008 largely fixed that problem as the A-Level qualification now had only four units (and the AS two).</p>
<h2>AS-Levels on the decline</h2>
<p>Now, in the summer of 2016, we are witnessing all of those things change. The Coalition and Conservative governments <a href="https://theconversation.com/abolishing-as-levels-will-make-it-harder-to-get-into-university-30547">have abolished</a> qualifications examined in units in what they claim to be an effort to increase rigour and raise standards. New, reformed A-Levels <a href="https://theconversation.com/the-gove-generation-first-pupils-to-live-through-a-level-reforms-wait-for-results-45532">are being phased in</a>, and some will be examined for the first time in 2017. </p>
<p>Although AS-Levels will still exist and are also being phased in – some were taught and examined this academic year – they will no longer count towards the overall A-Level, which will be examined only at the end of the second year. The reformed AS is an entirely separate qualification and fewer students are taking it, as this summer’s results and the graph below show. </p>
<iframe id="datawrapper-chart-Ule9k" src="https://datawrapper.dwcdn.net/Ule9k/1/" frameborder="0" allowtransparency="true" allowfullscreen="allowfullscreen" webkitallowfullscreen="webkitallowfullscreen" mozallowfullscreen="mozallowfullscreen" oallowfullscreen="oallowfullscreen" msallowfullscreen="msallowfullscreen" width="100%" height="300"></iframe>
<p>Overall entries for biology have decreased by almost 17%, while English is down by 23%. The decreases are even higher among those 17 years of age and under (excluding older learners who are taking AS-Levels), with a 30% decrease in entries for English for this cohort. </p>
<p>While we may not have a definitive answer as to why the decreases have happened, I would venture to guess that if AS outcomes no longer count toward A-Level outcomes, offering these qualifications becomes a luxury for schools and colleges, who have to concentrate their efforts on preparing students for the full A-Level.</p>
<p>I would not be surprised if a few years down the line the AS disappears almost entirely along with insight into how students are progressing with their studies, a broader curriculum and, yes, a fallback if things don’t work out at the end of year one.</p><img src="https://counter.theconversation.com/content/64070/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tina Isaacs is a member of the Ofqual Standards Advisory Committee. </span></em></p>Why we should lament the sharp drop in the number of teenagers taking the one-year qualifications.Tina Isaacs, Programme Leader, MA in Educational Assessment, UCLLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/530072016-02-22T10:45:16Z2016-02-22T10:45:16ZThe importance of play: what universities can learn from preschools<figure><img src="https://images.theconversation.com/files/110965/original/image-20160210-12170-jo10ss.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Back to basics. </span> <span class="attribution"><span class="source">Sergey Novikov/www.shutterstock.com</span></span></figcaption></figure><p>Almost as soon as they begin school, children start getting tested. With the introduction of <a href="https://theconversation.com/why-testing-four-year-olds-as-they-start-school-is-a-bad-idea-24929">tests for four-year-olds</a> and the explicit link between <a href="https://www.gov.uk/school-performance-tables">test results and school performance</a>, education policies of successive governments have led to an increased emphasis on results at all levels of schooling. </p>
<p>This focus has led to a stigmatisation of failure, even though it is <a href="http://www.theguardian.com/teacher-network/2012/aug/16/a-level-student-success-failure">fundamental to the learning process</a> from preschool all the way to university. </p>
<p>This ill-prepares learners for real life, which does not provide set answers to problems with neat scores to gauge progress. The real world is messy and diverse, and young people need to be creative, resourceful and resilient to succeed in it. One of the best ways to achieve this is through play. </p>
<p>The best kind of learning is <a href="http://psychology.about.com/od/motivation/f/difference-between-extrinsic-and-intrinsic-motivation.htm">“intrinsically motivated”</a>, where students want to learn because it is interesting, purposeful and personally relevant, not because it is assessed. Learning takes place through action, failure, reflection, and practice. But while making mistakes is an inevitable part of this process, our school system fails to recognise this. </p>
<p>Exam grades are often seen as more important than fostering a love of learning – and as a result schools are overlooking the value of learning that does not fit into a specified curriculum. </p>
<p>When students reach university, most have learned that grades (and their impact on job opportunities) are of prime importance. For many, the magic of learning out of interest and passion has been eclipsed. The <a href="http://www.newstatesman.com/politics/2015/02/tuition-fees-turn-students-customers-thats-bad-news-learning">introduction of tuition</a> fees has only increased the expectation that the role of university is to provide qualifications rather than focus on the intrinsic value of education. </p>
<p>This shift in expectation is hardly surprising given that students have to consider their personal investments and the returns they are likely to receive. This makes perfect sense for an individual student, but does not take into account what is best for society, which needs people to be creative and take risks, not simply focus on scoring highly in a test.</p>
<h2>The need to fail</h2>
<p>While many students fail university modules and drop out of courses, this is often seen as a last resort and universities are becoming increasingly <a href="http://www.communitycare.co.uk/2015/08/18/resigned-lecturer-university-fail-social-work-students/">averse to failing their students</a>. A focus on one-shot assessments does not give students opportunities to fail regularly on a less catastrophic level. </p>
<p>The ability to manage failure, both emotionally and practically, increases the ability to manage risk. It is only by taking risks that we can explore new possibilities and ways of thinking. We are in danger of creating a generation of <a href="http://www.thetimes.co.uk/tto/education/article4605416.ece">risk-averse students</a>. The possibility of failure can also actually increase a person’s intrinsic motivation: if success is certain, there is little challenge and so little motivation.</p>
<p>One way to develop a generation who can take risks is through playful learning. <a href="https://www.psychologytoday.com/blog/freedom-learn/200811/the-value-play-i-the-definition-play-gives-insights">Play</a> supports socialisation and decreases stress, develops imagination and creativity, enables learners to have new experiences, and learn from their mistakes. </p>
<p>While it is integral to early years education, a focus on assessment has all but driven play out of schools. The relative flexibility of higher education curricula and teaching approaches provide opportunities to give learners chances to play, experiment, experience, and fail – and, most importantly, learn from those failures. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=419&fit=crop&dpr=1 600w, https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=419&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=419&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=527&fit=crop&dpr=1 754w, https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=527&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/110969/original/image-20160210-12175-sxqapz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=527&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Make it worth their while.</span>
<span class="attribution"><span class="source">JHershPhoto/www.shutterstock.com</span></span>
</figcaption>
</figure>
<h2>Playtime at university</h2>
<p>Several UK universities are already embracing elements of playful learning. For example, the University of Portsmouth uses <a href="https://moerg.wordpress.com/tag/pervasive-learning-activity/">“pervasive learning” activities</a>, where courses are taught through playful, detailed simulations in which students work together to solve problems and make mistakes away from the real consequences of assessment. </p>
<p>The <a href="http://www.slideserve.com/mohammad-carlson/the-great-history-conundrum">Great History Conundrum</a> at the University of Leicester, which runs every year for first-year students, uses an online puzzle-solving card game to teach critical historical literacy. Students play as long as they like to collect enough points to pass the course: if they fail on one puzzle they can move on to the next. </p>
<p>Students at Manchester Metropolitan University play the <a href="http://www.stayingthecourse.mmu.ac.uk/supportindex.php">Staying the Course</a> game during induction to highlight the range of university support available. The University of Brighton has also used <a href="http://www.slideshare.net/katiepiatt/university-of-brighton-studentquest-1-who-is-herring-hale">alternate reality games</a> during induction, which allow students to work together to solve online and physical puzzles, and <a href="http://katiepiatt.blogspot.co.uk/2009/05/never-ending-uni-quizhttpwwwbloggercomi.html">large-scale multi-player quizzes</a> to engage new students and orientate them to university life in novel ways. </p>
<p>These kinds of approaches do not work in every context, and will inevitably meet resistance from some students and academics. We have to make the case that, far from trivialising education, playful learning makes it richer, more purposeful, and more useful for life after education. </p>
<p>Playful learning is not an easy option. It is more academically challenging, making students less reliant on rote learning and established ideas. To <a href="http://conference.playthinklearn.net/blog/">embrace playful learning</a>, we need to create more opportunities for students to fail safely and focus on the development of intrinsic motivation, passion and curiosity. Crucially, we must radically rethink how, and why, we assess our students.</p>
<p class="fine-print"><em><span>Nicola Whitton receives funding from AHRC, EEF, Wellcome Trust, EU.</span></em></p>
<p class="fine-print"><em>Testing takes the magic out of education – playful learning may be the answer. Nicola Whitton, Professor in Education, Manchester Metropolitan University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>We’ve tested seven-year-olds in schools before – here’s why we stopped</h1>
<p class="fine-print"><em>Published 2015-11-05</em></p>
<figure><img src="https://images.theconversation.com/files/100797/original/image-20151104-29070-j43ip9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Could seven-year-olds be sitting SATs again?</span> <span class="attribution"><span class="source">Monkey Business Images/www.shutterstock.com</span></span></figcaption></figure>
<p>Nicky Morgan’s <a href="https://www.gov.uk/government/speeches/nicky-morgan-one-nation-education">announcement</a> that seven-year-olds at state-maintained schools could have to sit national, standardised tests could re-open a can of worms that was shut when such tests were phased out in 2005. But the move is not surprising and echoes the current political rhetoric of school accountability and Whitehall’s micro-management of teachers. </p>
<p>Some tests are useful and valuable. For example, I want to know how my child is progressing as a reader or I want to be sure that the pilot who is flying me on holiday is capable and appropriately trained. So I have no principled objection to testing per se; rather my concern about this new proposal is the legitimacy of its goals. </p>
<p>Schools are still required to provide assessment of their pupils at the end of <a href="https://www.gov.uk/national-curriculum/key-stage-1-and-2">Key Stage 1</a>, when children are seven years old, but teachers administer the process and schools choose when it is carried out. Yet Morgan seems to be suggesting that this is not enough – or not good enough – and will consult on the reintroduction of national standardised tests. In a speech at the Policy Exchange think-tank she <a href="https://www.gov.uk/government/speeches/nicky-morgan-one-nation-education">said</a>:</p>
<blockquote>
<p>To be really confident that students are progressing well through primary school, we will be looking at the assessment of pupils at age seven to make sure it is as robust and rigorous as it needs to be.</p>
</blockquote>
<p>She argues that tests administered by an external body will drive up standards, yet there is very little evidence to support this idea. It is more likely that such test results will be used primarily as an accountability stick rather than as a carrot to improve achievement. </p>
<p>The tests themselves are not a bad thing – it is the way that the results are used that could undermine their validity.</p>
<h2>Tests first introduced in the 1980s</h2>
<p>Tests for seven-year-olds were introduced as the National Curriculum began to take shape in the late 1980s. These were the first national, standardised tests (commonly known as SATs) in primary schools and were introduced for seven-year-olds because Key Stage 1 is the shortest key stage, lasting from age five to seven. But it was acknowledged at the time that <a href="http://www.tandfonline.com/doi/abs/10.1080/00131880902891222">this was also a problem</a>, as the large-scale testing of children at this age was unknown territory. </p>
<p>While teachers were adept at assessing pupils and reporting to parents on a relatively informal basis, the new system meant negotiating a compulsory and universal testing regime. Development and administration of the tests was run by three external consortia, comprising universities (including the Institute of Education), publishers and local authorities.</p>
<p>The original intention was a positive one, to provide a broad system that allowed an individual child to “<a href="http://www.tandfonline.com/doi/abs/10.1080/00131880902891222">demonstrate their best”</a> – but this proved to be too challenging in such a large-scale testing situation. </p>
<p>The process of developing these tests provided unique data sets on infant classroom assessment and it changed many teachers’ perceptions of how to assess children. But problems with managing them, and the increasing concern about the effects of the tests on <a href="http://onlinelibrary.wiley.com/doi/10.1080/0141192990250305/abstract">children</a> and their <a href="http://onlinelibrary.wiley.com/doi/10.1080/01411920600775225/abstract">teachers</a> ultimately precipitated their demise. </p>
<p>The pressure from the testing regimes resulted in some resistance and notable demoralisation among the teaching profession, with unions declaring that the new systems reframed the role of teachers as coaches whose job it was to ensure their pupils passed standardised tests. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/100800/original/image-20151104-29079-1tudnd1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Testing culture came under pressure.</span>
<span class="attribution"><span class="source">Mukhina Viktoriia/www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>The impact of this was <a href="https://books.google.co.uk/books/about/Changing_English_primary_schools.html?id=hRlKAAAAYAAJ&redir_esc=y">clearly described in reports</a> from the early 1990s that found stressed teachers, stressed pupils and what was supposed to be a broad national curriculum being narrowed to ensure pupils were adequately prepared for the subjects deemed most important: English and mathematics. The time allocated to preparing pupils for testing in these two subjects increased, reducing the time available for other subjects.</p>
<h2>Teachers back in charge</h2>
<p>Public debates about the nature of school assessments in the 2000s revealed growing concern about the effects of testing in schools, and this was deemed important enough for the Qualifications and Curriculum Authority to commission <a href="http://dera.ioe.ac.uk/4980/">research</a> into new assessments for seven-year-olds in 2004. </p>
<p>The results of trials for different assessment practices reversed the system, putting the emphasis back on teacher assessment rather than national tests. This proved successful, so from 2005 this “new” system was <a href="http://www.nuffieldfoundation.org/sites/default/files/files/The-role-of-teachers-in-the-assessment-of-learning.pdf">used in all primary schools</a>. This was not simply a return to how it had been in the early 1980s; rather, schools had to use approved tests and procedures as part of the assessment process, which was conducted by teachers. It has continued until <a href="https://www.gov.uk/government/publications/key-stage-1-assessment-and-reporting-arrangements-ara/end-of-key-stage-1-assessment-arrangements">today</a>.</p>
<p>Bearing in mind the fraught history of national standardised testing at a tender age, it seems perplexing that we may well see something similar reintroduced into state-maintained schools for seven-year-olds. While there is some evidence that the test results at the end of Key Stage 2, when children are 11, can provide <a href="http://www.publications.parliament.uk/pa/cm201213/cmselect/cmeduc/writev/gcse/Q39%20-%20PREDICTING%20GCSE%20OUTCOMES%20Paper.pdf">a reasonable indicator</a> for future academic success at GCSE, there is no such evidence that advocates such a regime at the end of Key Stage 1.</p>
<p>Morgan is <a href="https://theconversation.com/squad-of-super-teachers-is-an-uncertain-cure-for-englands-failing-schools-50159">keen to recruit</a> more excellent teachers at a time when unprecedented numbers are <a href="https://theconversation.com/are-teachers-suffering-from-a-crisis-of-motivation-48637">thinking of leaving the profession</a>, many because they feel de-professionalised. If yet more layers of accountability and public judgement are inserted into our primary schools via tests for seven-year-olds, then teaching may become an even less appetising career choice. </p>
<p>It will also narrow the learning experience for our youngest pupils at the very time that it should be becoming broader. The bottom line is that teaching for a test is not a strong model for nurturing a love of learning.</p>
<p class="fine-print"><em><span>Mary Richardson receives funding from the Economic and Social Research Council.</span></em></p>
<p class="fine-print"><em>A decade after they were phased out, the government could reintroduce national tests for seven-year-olds. Mary Richardson, Senior Lecturer in Educational Assessment, UCL. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Young people must be consulted on reforms to A-levels and GCSEs</h1>
<p class="fine-print"><em>Published 2015-09-17</em></p>
<p>In a society where exams play such a huge part in the lives of young people, it’s surprising that substantial reforms to qualifications in the UK are taking place without consulting them. </p>
<p>In a presentation at the <a href="https://www.bera.ac.uk/beraconference-2015">British Educational Research Association</a> annual conference in Belfast, I <a href="http://assessment2025blog.aqa.org.uk/2014/06/09/can-and-should-young-people-play-a-role-in-designing-assessments/">argued</a> that this lack of student consultation on reforms to qualifications is a grave omission. Young people – both those who have already done exams and those about to sit them – can and should be asked for their views before changes are rolled out. </p>
<p>While there is a proliferation of <a href="https://www.gov.uk/government/collections/reform-of-gcse-qualifications-by-ofqual">government consultations</a> on <a href="https://www.gov.uk/government/collections/reform-of-as-and-a-level-qualifications-by-ofqual">reforms to examinations</a>, young people’s views are omitted as a matter of course. </p>
<p>In mid-September, the qualifications regulator Ofqual announced a new <a href="https://www.gov.uk/government/news/ofqual-launches-further-consultation-on-reformed-qualifcations-for-2017">consultation</a> seeking views on a second phase of changes to GCSEs and A-level subjects including statistics, media studies and film studies. The views of young people have not been specifically sought out on these subjects – nor were they for subjects reformed in the first phase, some of which are already being taught in schools. This lack of participation by young people in the policy development is a missed opportunity. </p>
<h2>Changes set in motion</h2>
<p><a href="https://theconversation.com/the-gove-generation-first-pupils-to-live-through-a-level-reforms-wait-for-results-45532">Initial reforms</a> of the GCSE and A-level curriculum, begun by the former education secretary Michael Gove, mean that some students who sat exams in summer 2015 already took very different exams to their peers a few years ahead of them. </p>
<p>Now, a cohort of students have just started GCSE and A-level courses this September with new content and new rules. These changes are certain to <a href="https://www.gov.uk/government/publications/gcse-changes-a-summary">have major ramifications</a> for young people’s future educational and employment opportunities.</p>
<p>There are new specifications for exams, based on revised subject content and assessment objectives in key subjects such as maths and English. These qualifications are now linear, assessed solely by examinations with no modules and, in some subjects, no coursework assessment. At A-level, there will be a <a href="https://theconversation.com/the-gove-generation-first-pupils-to-live-through-a-level-reforms-wait-for-results-45532">“de-coupling” of the AS-level</a> exams pupils sit in Year 12 from the final A2 exams in Year 13, and a reduction in resit opportunities. <a href="https://theconversation.com/removing-practicals-from-a-level-sciences-will-leave-students-poorly-equipped-25563">Practical science assessment</a> will no longer count towards students’ final grades.</p>
<p>At GCSE, a <a href="http://www.ocr.org.uk/qualifications/gcse-and-a-level-reform/gcse-reform/">new 9-1 grading scale</a> will replace the former A* to U system (where nine is the top mark). And <a href="http://www.theguardian.com/education/2013/aug/29/gcse-english-speaking-listening-drop">speaking and listening</a> has been removed from students’ overall grades. </p>
<h2>Young people’s views ignored</h2>
<p>Students have no history of any meaningful input into what reforms of these qualifications might look like. The <a href="http://webarchive.nationalarchives.gov.uk/20130401151715/http://www.education.gov.uk/publications/eOrderingDownload/DeliveringReform.pdf">last Labour government</a> saw the promise of “student voice” as a crucial dimension to the successful implementation of many of its 14–19 initiatives. </p>
<p>But in the present landscape of reform, there is no direct policy that drives the government to carry out consultations with students. Many <a href="http://onlinelibrary.wiley.com/doi/10.1111/j.1099-0860.2012.00442.x/abstract">in-roads</a> that were made into treating young people as equal decision-makers with regard to education policy have stalled. </p>
<p>These are worrying developments, specifically in terms of children’s rights. The <a href="http://www.unicef.org/crc/files/Rights_overview.pdf">UN Convention on the Rights of the Child</a>, to which the UK is a signatory, stipulates that children and young people are rights holders and are entitled to engage in processes that affect them directly. This includes the development of policies and services (in this instance educational ones) through research and consultation. </p>
<p>But looking at the impact of assessment on young people in terms of rights is rare – and any real effort to enforce <a href="http://www.tandfonline.com/doi/abs/10.1080/02671522.2010.498150">compliance</a> with international children’s rights standards in the development of qualifications systems is rarer still. </p>
<h2>Worried about their future</h2>
<p>Through national research <a href="http://qub.library.ingentaconnect.com/content/ioep/clre/2013/00000011/00000002/art00002?crawler=true&mimetype=application/pdf">that has engaged with young people</a> who were just about to do exams, we are beginning to know a great deal more about what they think about reforms. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=902&fit=crop&dpr=1 600w, https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=902&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=902&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1134&fit=crop&dpr=1 754w, https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1134&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/94917/original/image-20150915-29620-10h9pm6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1134&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">High stakes.</span>
<span class="attribution"><span class="source">Teen boy via eurobanks/www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>In my 2012 research, nearly 250 students <a href="http://www.tandfonline.com/doi/abs/10.1080/0305764X.2012.733347#.Vfg1LxFVhBc">from across England</a> told us that examinations structured through modules (and re-sits) allow mistakes to be put right and take the stress off having to do everything in one sitting. Students thought that it was only fair to have a mixture of examinations and coursework because: “we don’t all like the same things”. </p>
<p>They felt insulted at the annual circus of debates in the media around falling <a href="http://www.telegraph.co.uk/education/educationnews/9366785/Exam-standards-fall-in-race-to-the-bottom-MPs-will-say.html">exam standards</a>, which they saw as degrading their own achievements. They were also concerned that changes to examinations are introduced “live”, rather than being piloted in advance, and felt their future successes might be “messed up” as a result. All of these changes could have a considerable impact on their final grades, and they argued this is too high a price to pay. </p>
<p>There are a number of ways that young people could be listened to more effectively. Qualification awarding bodies, the Department for Education and Ofqual could set up panels with young people so that their views can be fed directly into assessment design and implementation. Education officials and politicians could attend focused policy briefings with young people in order to obtain input into current debates. And there could be an attempt to reach out directly to students via social media to gauge their opinions on reforms. </p>
<p>But we should also ask young people what they think is the most effective way to engage with them directly, and change our practice accordingly. When it comes to reforming exams that form such an important step in any young person’s life, it’s vital that all students have their voices heard.</p>
<p class="fine-print"><em><span>Jannette Elwood is a Trustee of AQA Education. She has received funding from various educational charities and government departments, recently the Schools Examination Commission, Ireland.</span></em></p>
<p class="fine-print"><em>New look GCSEs and A-levels will be sat by young people – but they haven’t been asked about the reforms. Jannette Elwood, Professor of Education, Queen's University Belfast. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Policing won’t be enough to prevent pay-for plagiarism</h1>
<p class="fine-print"><em>Published 2015-06-11</em></p>
<figure><img src="https://images.theconversation.com/files/84654/original/image-20150611-9352-1l2vsc9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's impossible to compare student work against a database of sources because each pay-for plagiarised assignment is a bespoke creation.</span> <span class="attribution"><span class="source">AAP/Alan Porritt</span></span></figcaption></figure><p>Buying and selling high-stakes assessments is bad for education. It undermines community confidence because we can’t be sure if a grade was earned or bought. Plagiarism hurts plagiarists too, because they miss out on the learning opportunities that the assessment was supposed to provide. Tensions around plagiarism may be part of a <a href="http://www.tandfonline.com/doi/abs/10.1080/02602930801895786">culture of distrust</a> between teachers and students.</p>
<p>Recently, it was revealed that high school students in NSW are buying essays made-to-order online for <a href="http://www.smh.com.au/national/education/cheating-endemic-in-nsw-high-schools-20150607-ggw8h9">little more than A$100</a>. University assignments can be more expensive, costing <a href="http://www.smh.com.au/nsw/mymaster-essay-cheating-scandal-more-than-70-university-students-face-suspension-20150318-1425oe.html">up to $1000</a> from the controversial (and now-defunct) MyMaster website.</p>
<p>With the recent media attention, we could be fooled into thinking pay-for plagiarism is a modern, high-tech invention. However, the internet merely supports the logistics. Pay-for plagiarism is much older than computers – many of your favourite books were <a href="http://julieannamos.hubpages.com/hub/Ghostwriting-Exposed---The-Top-50-Ghostwritten-Books">“ghostwritten”</a>.</p>
<h2>The difficulties in policing</h2>
<p>The problem is that pay-for plagiarism is very difficult to police. Unlike “copy-paste” plagiarism or using an assignment that a previous student submitted, each pay-for assignment is made-to-order. We can’t just compare student work against a database of sources because each assignment is a bespoke creation. </p>
<p>Identifying exactly who wrote a particular piece of text is a hard problem. Disputes about authorship date back to biblical times – the Bible itself has <a href="https://en.wikipedia.org/wiki/Authorship_of_the_Pauline_epistles">books with disputed authorship</a>. New technology may help discern if a student wrote a particular piece, but it is <a href="http://llc.oxfordjournals.org/content/27/2/183">far from perfect</a>, and far from application in a mass education context. </p>
<p>As anti-plagiarism enforcement gets smarter, so do the plagiarists. While we may be able to spot a ghostwritten university-level essay submitted by a struggling high school student, this is a rookie pay-for plagiarism mistake. Smart plagiarists rework the essays they pay for, or even employ techniques like <a href="http://www.tandfonline.com/doi/abs/10.1080/02602938.2014.950553">“back-translation”</a> by running plagiarised text through tools like Google Translate. </p>
<p>Some high-end services will even produce a tailored assignment just for you, based on analysis of your previous writing style. Techniques like these make it difficult to detect plagiarised work.</p>
<h2>The possible way forward</h2>
<p>Policing pay-for plagiarism may work to some extent, but it won’t completely solve the problem. So, what are our alternatives? How can we complement an enforcement approach?</p>
<p>NSW Teachers Federation president <a href="http://www.smh.com.au/national/education/teachers-demand-change-to-combat-endemic-cheating-20150608-ghiqik">Maurie Mulheron</a> favours requiring students to complete all assessments in class. Students can’t pay for someone else to do their work for them if the teacher is watching. </p>
<p>However, this approach creates further problems. The classroom environment is not an <a href="http://www.tandfonline.com/doi/abs/10.1080/02602938.2013.819566">“authentic”</a> environment for some of the tasks teachers set students. Consider an in-class essay versus a take-home essay assignment. Even in disciplines like history where an essay might be a true representation of what professional practitioners do, a stressful classroom and time limit can lead to students producing different work. </p>
<p>Mulheron’s approach would tell us much about what students are capable of within a classroom environment, but surely we want to know what they can do in the real world too.</p>
<p>Clever assessment design may be another part of the solution. Assessment that builds on the student’s own experiences, classwork, prior drafts and feedback is more challenging to ghostwrite. We can also build sequences of tasks that have a small mandatory supervised component. This is commonly implemented at universities as an exam that needs to be passed to pass a unit.</p>
<p>Above all else, we should examine the root causes of pay-for plagiarism. One <a href="http://www.tandfonline.com/doi/abs/10.1080/07294360701310805">study</a> into the reasons higher education students plagiarise – the study was not restricted to pay-for plagiarism – found a variety of factors that we can learn from. One of these factors was pressure: time pressure, stress, pressure from family, and pressure from society.</p>
<p>This may be a factor for students paying for HSC assignments as well. For example, students at one school were apparently told they would be <a href="http://www.smh.com.au/national/education/teachers-demand-change-to-combat-endemic-cheating-20150608-ghiqik">kicked out</a> if their work was not good enough. Perceptions that poor performance will be punished, rather than addressed with support, may make pay-for plagiarism an attractive option.</p>
<p>Other issues in the study included teaching and learning issues (ranging from workload to bad teaching), laziness or convenience, and – my favourite – “pride in plagiarising”. Better detection of ghostwriting will not completely address these issues.</p>
<p>Solving the pay-for plagiarism problem requires us to understand why paying $1000 seems like a better choice than completing a particular assignment. Cheating students are definitely in the wrong, but when placed in a high-stakes, high-stress environment, they may feel like they have few other options. We need to change this.</p>
<hr>
<p>Read more of The Conversation’s coverage of <a href="https://theconversation.com/au/topics/academic-dishonesty-in-australia">Academic dishonesty in Australia</a> here.</p>
<p class="fine-print"><em><span>Phillip Dawson receives funding from the Office for Learning and Teaching.</span></em></p>
<p class="fine-print"><em>We could be fooled into thinking pay-for plagiarism is a modern, high-tech invention. However, the internet merely supports the logistics. Phillip Dawson, Associate Professor and Associate Director, Centre for Research in Assessment and Digital Learning, Deakin University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>We must scrap new baseline tests for primary school children</h1>
<p class="fine-print"><em>Published 2015-01-21</em></p>
<p>From September 2016, within five weeks of starting primary school, all children in England will receive an assessment that will stay on their record. But we have been here before and it didn’t work. </p>
<p>The planned reintroduction of baseline assessments means that another deeply flawed process is set to be brought back against the best judgement of many professionals who know about young children’s learning, <a href="https://www.change.org/p/nicky-morgan-mp-scrap-baseline-assessment-in-reception-classes-and-keep-the-eyfs-profile-as-the-measure-of-children-s-progress-at-the-end-of-the-early-years-foundation-stage-eyfs">many concerned parents</a>, and against a recent lesson of history. </p>
<p>A baseline assessment of children as they start primary school was first introduced in 1997 and abandoned in 2002 because it was an ineffective and damaging policy. The testing of children’s <a href="https://theconversation.com/why-testing-four-year-olds-as-they-start-school-is-a-bad-idea-24929">knowledge at just four years old</a>, when they are only just becoming accustomed to the school routine, is not a reliable way to identify young children’s learning and development. </p>
<p>Paradoxically, this is the only test for which it might be in a school’s interest for their pupils to perform badly. That’s because this “baseline” will later be used to identify progress made on other targets over the years – and allow the school to make claims about how well they support children’s learning. </p>
<h2>A waste of time</h2>
<p>But baseline assessment does not support learning; in fact, it takes teachers away from teaching and so wastes learning time. It is not in the interests of young children, whose learning and other developmental needs <a href="http://books.google.co.uk/books/about/Contemporary_Issues_in_the_Early_Years.html?id=m8hCu8mYHjsC">are better identified</a> – over time – by well-qualified early years practitioners who <a href="http://books.google.co.uk/books/about/Thinking_Children.html?id=Ue4-OQAACAAJ">observe and interact</a> with young children as they play. </p>
<p>Through their work with young children, teachers and other early childhood educators make judgements about children’s ideas, what children know, their motivation, their abilities, their thinking, their disposition – and how their interests and ideas might further be developed. </p>
<h2>Don’t fix what isn’t broken</h2>
<p>The present system in England supports this process of assessing young children’s learning through observation and interaction, with judgements about each child’s learning – and learning needs – based upon what children do and say, rather than formal tests. </p>
<p>At the moment, such ongoing assessment begins with careful observation, and focuses on the all-round development of each individual child. Known as the <a href="http://www.educationengland.org.uk/documents/pdfs/2013-eyfs-profile-handbook.pdf">Early Years Foundation Stage (EYFS) Profile</a>, this statutory framework was published by the Department for Education as recently as 2012. The framework requires that the EYFS profile is carried out in the final term of the year in which the child reaches age five, using practitioner observation, professional knowledge and parental contributions. </p>
<p>Despite the fact that a new EYFS profile was only introduced in 2012, the intention is to replace the present system with a baseline assessment to be administered to four-year-olds in the first few weeks of their starting school. From September 2016, schools will have the option of continuing to use the EYFS profile – but it will no longer be a requirement – and they can also choose to use one of a number of approved baseline tests. The Standards and Testing Agency is <a href="https://www.gov.uk/reception-baseline-approval-process-for-assessments">set to publish</a> a list of approved tests soon. </p>
<p>This will move early assessment from a single profile which is developed over time for all children, to an unnecessary (and costly) diversification of approaches. It will mean that rather than being assessed at the end of their first year in school, children will be tested within five weeks of starting. There is also a danger of inappropriate testing practices, which can be stressful for young children, their parents and their teachers. </p>
<p>There are many reasons why this is wrong – and <a href="http://tactyc.org.uk/wp-content/uploads/2014/10/TACTYC-Baseline-position-paper-1.pdf">a briefing by TACTYC</a>, the association for professional development in the early years, makes clear the problems that are looming. </p>
<h2>We’ve been here before</h2>
<p>The baseline assessment system was replaced in 2002 because it did not (and could not) yield the data on school performance that it was introduced to provide. This was because, as is planned now, there was a choice about which “baseline” to use, so it was not a case of comparing like with like. <a href="http://www.tandfonline.com/doi/abs/10.1080/0957514980190107#.VL-rZi7eKmc">Issues of reliability and validity</a> across the different tests meant that the value-added element could not be calculated. At the same time, the tests offered little information that teachers did not already know about children in their classes. </p>
<p>The <a href="http://www.educationengland.org.uk/documents/wp1997/excellence-in-schools.html">National Framework of Baseline Assessment</a> was introduced in September 1998, requiring all schools to carry out a baseline assessment of children within the first half-term of their beginning compulsory schooling – regardless of whether or not the children were, themselves, of compulsory school age. </p>
<p>Strong protests and professional dissatisfaction eventually led to its withdrawal in favour of a more holistic and formative assessment process for three to five-year-olds. This was introduced in the form of the <a href="http://www.qca.org.uk/160.html">Foundation Stage Profile</a> in 2002, which was <a href="http://webarchive.nationalarchives.gov.uk/20130401151715/http://www.education.gov.uk/publications/standard/publicationDetail/Page1/DCSF-00261-2008">revised in 2008</a> and again in <a href="http://www.educationengland.org.uk/documents/pdfs/2013-eyfs-profile-handbook.pdf">2012</a>. Considerable effort and investment have gone into the assessment of children under five since 1997 – and we are now about to return to a system that was agreed to be flawed and ineffective in 2002. </p>
<h2>Baseline assessment won’t enhance learning</h2>
<p>Part of the problem lies in a lack of understanding about what assessment is for. It is important to distinguish between assessment for learning and assessment for school management and accountability. No one instrument can be fit for both purposes. Assessment for learning is ongoing and informs the teaching and learning process. It extends children’s learning because it enhances teaching and tells each child’s individual learning story. This is best done by education professionals making ongoing observations, working with children, talking with parents and the children themselves. </p>
<p>All other forms of assessment, including baseline assessment, serve as checks on whether or not learning has occurred, not as a means – in themselves – of bringing about learning. Driven by management and accountability, these kinds of assessments elevate scores over narrative accounts of children’s learning, in a format that can allow the “value added” by the school to be calculated. </p>
<p>Assessment is different from testing and measurement. The present system of assessment in the early years means that young children’s learning is identified and recorded as part of an ongoing process. It identifies what children can do, helps those who work with them to decide what next steps might be useful, and helps practitioners to plan new learning opportunities. </p>
<p>Baseline assessment is a single-point assessment, more like a test, taking no account of individuals – and so it is not very useful to the practitioners who will be required to carry it out. <a href="https://www.gov.uk/government/consultations/new-national-curriculum-primary-assessment-and-accountability">The government consultation</a> in 2014 resulted in a response urging the rejection of a formal baseline at the start of school. </p>
<p>An <a href="https://www.change.org/p/nicky-morgan-mp-scrap-baseline-assessment-in-reception-classes-and-keep-the-eyfs-profile-as-the-measure-of-children-s-progress-at-the-end-of-the-early-years-foundation-stage-eyfs">online petition</a> against the re-introduction of the baseline test launched on January 9 has so far gathered more than 1,600 signatories. Parents, practitioners and others interested in how we assess young children say “don’t do this”. Recent history says “don’t do this”. If ever there was a time when the evidence is overwhelmingly against the reintroduction of baseline assessment, it’s now.</p>
<p class="fine-print"><em><span>Cathy Nutbrown was the author of the 2012 Nutbrown Review into early education and childcare qualifications for the government. She has previously received funding from the Economic and Social Research Council. </span></em></p>From September 2016, within five weeks of starting primary school, all children in England will receive an assessment that will stay on their record. But we have been here before and it didn’t work. The…Cathy Nutbrown, Professor of Education and Head of the School of Education, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.