<h1>How shift to computer-based tests could shake up PISA education rankings</h1><figure><img src="https://images.theconversation.com/files/111793/original/image-20160217-19250-1ez3vmv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Testing times: online exams produce different results to paper ones. </span> <span class="attribution"><span class="source">NarongchaiHlaw/www.shutterstock.com</span></span></figcaption></figure><p>The world’s most important examination is moving online. Since the Organisation for Economic Cooperation and Development launched the <a href="http://www.oecd.org/pisa/">Programme for International Student Assessment</a> (PISA) in 2000, it has provided an influential and timely update every three years of how 15-year-old school children’s mathematics, science and reading skills compare across the globe. </p>
<p>Poor performance has “shocked” a number of national governments into action, and they have embarked on a range of extensive <a href="https://theconversation.com/reforms-based-on-pisa-tests-alone-wont-fix-gcse-standards-25251">reforms</a> to their school systems.</p>
<p>Whereas each of the five test cycles between 2000 and 2012 was completed on paper, 58 of the 72 economies that participated in PISA 2015 between November and December last year administered the PISA test using computers – including the UK.</p>
<p>My new <a href="http://johnjerrim.com/papers/">research</a> starts to show that this shift is likely to influence the results of PISA 2015, which are due to be published towards the end of this year. </p>
<p>I drew upon data from 32 countries that completed both a paper and a computer mathematics test as part of PISA 2012 – the last round of this important global assessment.</p>
<p>There are some striking results. Average PISA paper and computer scores differ by more than ten test points in around a third of countries. The <a href="http://dx.doi.org/10.1787/9789264096660-en">OECD has previously</a> suggested that differences of such a magnitude are substantial.</p>
<h2>Shifts in results</h2>
<p>Shanghai is a particularly striking example, where average PISA scores under computer assessment fell by 50 PISA test points – equivalent to more than an entire year of schooling. In contrast, young people in the US saw their performance improve significantly – by, on average, 17 test points.</p>
<iframe src="https://datawrapper.dwcdn.net/iEnqF/2/" frameborder="0" allowtransparency="true" allowfullscreen="allowfullscreen" webkitallowfullscreen="webkitallowfullscreen" mozallowfullscreen="mozallowfullscreen" oallowfullscreen="oallowfullscreen" msallowfullscreen="msallowfullscreen" width="100%" height="400"></iframe>
<p>There are also differences between sub-groups of students who took the test. Somewhat against my expectations, the difference in test scores between rich and poorer pupils was actually smaller by an average of around five PISA test points across countries when the PISA test was taken on computer, compared to paper. </p>
<p>Meanwhile, in Sweden and Russia, boys were suddenly better at mathematics than girls on the computer-based test – despite no gender differences being observed in these countries on the paper-based version of the test.</p>
<h2>Why computer tests change things</h2>
<p>There are several possible explanations for these stark differences in results. <a href="http://www.scientificamerican.com/article/reading-paper-screens/">Previous research</a> has suggested that performing tasks on paper and computer require different cognitive processes. </p>
<p>Important test-taking strategies, such as leaving the most challenging questions to one side to tackle at the end, are no longer possible on screen. Students cannot move on to the next part of a question, or the next question, until they have finished the previous one. Computer tests can also be more engaging, particularly when answering a question correctly involves the use of on-screen interactive tools. On the flip side, we have all felt the frustration of slow operating systems and computer crashes.</p>
<p>The findings highlight some issues that will be important when it comes to interpreting the results of the forthcoming PISA 2015. Will we be able to compare results to previous cycles in order to monitor trends over time? Will results between countries taking paper and computer versions of the 2015 PISA test be comparable? Should we expect to see a fall in differences in PISA test scores between socio-economic groups? And what will this mean for international comparisons of differences in educational achievement between boys and girls?</p>
<p>Although we must wait to find out the answer to these questions, it is nevertheless clear that when children take paper and computer versions of similar tests, it can lead to quite notable differences in the results.</p>
<p class="fine-print"><em><span>John Jerrim does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.</span></em></p>
<p class="fine-print"><em>John Jerrim, Lecturer in Economics and Social Statistics, UCL. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Tailored online NAPLAN better for monitoring high and low achievers</h1><p>Australia’s national literacy and numeracy tests (NAPLAN) will be available online from 2017. While there is little difference in test scores between typical paper-and-pencil tests and computerised versions of the same test, some are concerned about the proposal to use “computerised adaptive testing”.</p>
<p>Computerised adaptive tests start with an item that is known to be of medium difficulty for the population to which a student belongs. If he/she solves it correctly, the next item will be harder. If he/she answers the item incorrectly, the following item will be easier. </p>
<p>Thus, the difficulty level of each subsequent item is adjusted depending on the answers provided to previous items. Individual students will no longer get the traditional, “one-size-fits-all”, tests as a set of chosen items are adjusted to fit each student’s ability level. NAPLAN developers refer to this approach as tailored testing.</p>
<p>The actual computerised NAPLAN tests will not employ individual items. Instead, a set of items (called “testlets”) varying in difficulty level will be used.</p>
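<p>The selection rule described above can be sketched in a few lines of Python. This is a minimal illustration assuming a single numeric difficulty per item and a simple shrinking-step ability update; it is not NAPLAN’s actual algorithm, which administers testlets rather than individual items.</p>

```python
def run_adaptive_test(answers_correct, item_bank, start_difficulty=0.0, step=1.0):
    """Toy computerised adaptive test.

    answers_correct: the student's sequence of right/wrong responses.
    item_bank: list of unique item difficulties (e.g. on a logit-like scale).
    Each round picks the unused item closest to the current target difficulty,
    then moves the target up after a correct answer and down after an
    incorrect one, with the step shrinking so the estimate settles.
    """
    difficulty = start_difficulty
    administered = []
    for i, correct in enumerate(answers_correct):
        # choose the unused item nearest the current difficulty estimate
        remaining = [d for d in item_bank if d not in administered]
        item = min(remaining, key=lambda d: abs(d - difficulty))
        administered.append(item)
        # harder next item after a correct answer, easier after an error
        difficulty += step / (i + 1) if correct else -step / (i + 1)
    return administered, difficulty
```

<p>Starting from a medium-difficulty item, two correct answers push the student towards harder items, and a subsequent error nudges the running estimate back down – mirroring the harder/easier progression the paragraph above describes.</p>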
<h2>One-size-fits-all vs tailored testing</h2>
<p>Tailored testing has a long history. The idea was <a href="http://onlinelibrary.wiley.com/doi/10.1002/j.2333-8504.1968.tb00562.x/pdf">first put forward in the late 1960s</a>, widely <a href="http://apm.sagepub.com/content/1/1/95.full.pdf">researched in the 1980s</a>, and implemented in large-scale testing in the <a href="http://eric.ed.gov/?id=EJ738891">mid-1990s</a>. The main selling point is the use of a smaller number of items and, consequently, a shorter time needed for testing. </p>
<p>A <a href="http://www.nap.edu.au/verve/_resources/Tailored_test_design_study_2013_summary_research_report.pdf">general consensus in the research community</a> is that tailored testing provides an equally valid assessment of students’ abilities as do the traditional tests. A suit tailored specifically for you may look similar to one from a department store, but you can feel the difference because it fits you better. Similarly, the “one-size-fits-all” and tailored tests will produce similar scores but a tailored test will fit each individual student better. The items are more carefully selected.</p>
<p>An additional advantage is that test scores of students at the ends of the ability spectrum (either low or high) are likely to be more precisely measured. By contrast, the typical tests consist mostly of items of medium difficulty with a few easy and difficult items added.</p>
<p>In addition, the long reporting time – which is one of the most heated issues in NAPLAN testing – may be resolved. In tailored tests a student’s score will be calculated and produced online at the end of a testing session.</p>
<h2>Some tailored myths</h2>
<p>Proponents of tailored testing sometimes make statements that cannot be supported by the evidence. One of these is the claim that since test items will be of appropriate difficulty for a student, he/she will experience less anxiety. This claim has no empirical support. </p>
<p>The <a href="http://ncme.org/linkservid/477753F3-1320-5CAE-6E252080748AE491/showMeta/0/">main causes of student anxiety over NAPLAN</a> are likely to be externally generated and have little to do with a student’s experience during the testing session itself. Perhaps it can be said that students will be better challenged and less frustrated during the testing.</p>
<p>Another myth is that tailored testing will provide teachers, parents and other stakeholders with “greater insight” into students’ abilities. In general, tailored testing does not provide any new or additional information or “descriptions” about individual students above what a traditional test can produce.</p>
<h2>Further work needed</h2>
<p><a href="https://theconversation.com/naplan-online-will-australian-schools-and-students-be-ready-25327">Several issues</a> may need to be addressed prior to the launch of NAPLAN online. On the technical side, there are still some remaining questions about the effects of using different devices – computers, tablets and smartphones – for test administration.</p>
<p>Another issue is the assessment of students with different kinds of disabilities. Computer administration provides for the possibility of developing test items that are different from those employed in paper-and-pencil tests. Sounds, moving pictures, sequential presentation of the elements of the tasks and many other options become available.</p>
<p>And last but not least is the possibility of open-ended items requiring short or even extended written responses that can be scored by machine. However, computers may not be able to understand jokes and other persuasive writing techniques.</p>
<p>Overall, attempts to introduce computerised large-scale assessments have been <a href="http://www.smarterbalanced.org/resources-events/publications-resources/technology-and-computer-adaptive-testing/">successful in other parts of the world</a>. Australia will be one of the first countries in the world to employ it at the national level.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.</span></em></p>
<p class="fine-print"><em>Jihyun Lee, Senior Lecturer, MEd in Assessment and Evaluation, UNSW Sydney, and Lazar Stankov, Professor, Institute for Positive Psychology and Education, Australian Catholic University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>NAPLAN online: will Australian schools and students be ready?</h1><figure><img src="https://images.theconversation.com/files/45938/original/k4mnnbpw-1397020569.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Paper and pencil NAPLAN testing will go online in 2016, but will schools be ready? </span> <span class="attribution"><span class="source">bibiphoto / Shutterstock.com</span></span></figcaption></figure><p>The Australian Government <a href="http://www.nap.edu.au/online-assessment/naplan-online/naplan-online.html">plans</a> to conduct the National Assessment Program – Literacy and Numeracy (NAPLAN) online from 2016. This presents a significant challenge for Australia’s 9,500 <a href="http://www.myschool.edu.au/">schools</a>.</p>
<p>Conducting NAPLAN online has many potential benefits. As the Australian Curriculum, Assessment and Reporting Authority has <a href="http://www.acara.edu.au/assessment/research.html">indicated</a>, this will enable “tailored testing” and will provide more timely marking, feedback and results. A greater range of skills can be tested once assessment is no longer restricted to paper and pencil, including students’ ability to read, understand and apply digital texts.</p>
<p>However, the Australian Curriculum, Assessment and Reporting Authority acknowledges that NAPLAN online will present challenges for schools’ digital capabilities. Plans need to be developed to manage this massive task.</p>
<p>So how do Australian schools and students get ready for NAPLAN Online? There are some important considerations relating to the physical implementation of NAPLAN online, as well as how the scope of NAPLAN can be broadened once it’s in an online format.</p>
<h2>Understand the scale of the task</h2>
<p>The scale of implementing NAPLAN online in all Australian schools is considerable. When the nation-wide roll-out occurs, some students are likely to be unable to sit the test because of technical and infrastructure issues.</p>
<p>Pilot research and trialling studies have already been undertaken. It would be wise to conduct further trials in various sites and school contexts before scaling up to nationwide implementation.</p>
<h2>Ensure school infrastructure is ready</h2>
<p>The Australian Curriculum, Assessment and Reporting Authority already understands that most schools do not have enough computers or internet bandwidth to enable all Year 3, 5, 7 and 9 students to sit NAPLAN online at the same time. There are many gaps in digital technologies and infrastructure within and between schools. There are serious doubts that all schools will be technologically ready for their students to do NAPLAN online by 2016.</p>
<p>Audits on each school’s readiness need to be conducted. Having the infrastructure capability for NAPLAN online is essential. If this is done well, then more timely feedback can be provided to students, parents, and schools.</p>
<h2>Recognise the importance of school leadership</h2>
<p>School leaders will need to design and implement strategies to prepare schools for NAPLAN online. This requires more than preparing the technological infrastructure: it calls for leadership vision, strategies and tactics to build online capabilities in the teachers and students themselves.</p>
<p>The strategies used to prepare students for the paper-based NAPLAN test will no longer be sufficient. School leadership will be needed to seriously shift attention from paper-based learning, teaching and assessment practices to online learning, teaching and assessment approaches. </p>
<h2>Understand that many students use a range of devices at school and at home</h2>
<p>While the focus will likely be on enforcing strict conditions for NAPLAN online to ensure the integrity of the testing, greater focus should be placed on enabling students to demonstrate what they know about how to use digital technologies. Many students already use a range of devices and applications at school and at home, so the tests should be designed to let students demonstrate what they know about technology, and what they can do with that knowledge, across a range of devices.</p>
<p>As we move into a post-PC world, with many schools moving to a phenomenon known as “bring your own device”, this presents a significant challenge for the Australian Curriculum, Assessment and Reporting Authority. The authority must design and implement a test that understands the array of new and emerging technologies available.</p>
<p class="fine-print"><em><span>There are no conflicts of interest.</span></em></p>
<p class="fine-print"><em>Glenn Finger, Professor of Education, Griffith University. Licensed as Creative Commons – attribution, no derivatives.</em></p>