<h1>How reading fiction can help you improve yourself and your relationship to others</h1><figure><img src="https://images.theconversation.com/files/199162/original/file-20171214-27593-1iajaa8.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Reading fiction can make you happier, nicer towards others and better focused in your activities.</span> <span class="attribution"><a class="source" href="https://www.pexels.com/photo/woman-reading-a-book-256546/">Pixabay/Pexels</a></span></figcaption></figure><p>There is a fair chance that someone in your close circle of friends and family will be using a smartphone, e-reader or tablet computer to read the latest bestseller over the holidays this year. Since <a href="https://gizmodo.com/5844662/the-history-of-amazons-kindle-so-far/">the introduction of the Kindle</a> in 2007 and the iPad in 2010, such devices have changed the way that people <a href="https://www.theguardian.com/commentisfree/2010/oct/14/digital-reading-ebook-kindle-ipad">engage with books</a>. Most newspapers, including the 166-year-old <em>New York Times</em>, have completed their digital transition, and some are now exclusively online. In academia, journal articles are increasingly published first in digital form, and sometimes exclusively so. But when it comes to books, the paper form has shown unexpected resilience.</p>
<h2>Ground-breaking inventions</h2>
<p>The digital or screen revolution we are living through today is on the same scale as two other major events that radically changed humanity: the invention of writing 6,000 years ago in the form of <a href="https://www.metmuseum.org/toah/hd/wrtg/hd_wrtg.htm">clay inscriptions in Mesopotamia</a> and <a href="https://owlcation.com/humanities/Johannes-Gutenberg-and-the-Printing-Press-Revolution">Gutenberg’s invention of printing with moveable type in the 15th century</a>.</p>
<p>These ground-breaking inventions were regarded with suspicion by many contemporaries. Plato considered writing a <a href="https://books.google.co.in/books?id=7w2tAgAAQBAJ&pg=PA54&lpg=PA54&dq=Plato+considered+writing+a+threat+to+human+memory&source=bl&ots=-LpFFdtUCF&sig=aI75lZgBPbkcuGFw0jAhJQHoKlA&hl=en&sa=X&ved=0ahUKEwir6ubdvIjYAhXDQ48KHdvbDaAQ6AEIJjAA#v=onepage&q=Plato%20considered%20writing%20a%20threat%20to%20human%20memory&f=false">threat to human memory</a>, and monks agonised over the demise of their way of life. In 1492, for example, the abbot Johannes Trithemius wrote <a href="http://daten.digitale-sammlungen.de/%7Edb/0003/bsb00037424/images/"><em>De laude scriptorum manualium</em></a>, which he had printed in 1494 for better effect. In both cases the contemporary sceptics were right.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=563&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=563&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=563&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=707&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=707&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199141/original/file-20171214-27583-1hfjwx9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=707&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Scriptorium monk at work.</span>
<span class="attribution"><a class="source" href="https://upload.wikimedia.org/wikipedia/commons/7/7f/Scriptorium-monk-at-work.jpg">William Blades/Wikimedia</a></span>
</figcaption>
</figure>
<p>Human powers of memory are nowhere near what they were before the advent of writing, and printing spelled the end of the <a href="https://sites.dartmouth.edu/ancientbooks/2016/05/24/medieval-book-production-and-monastic-life/">monastic scriptorium</a>.</p>
<p>Not surprisingly, the screen revolution, too, has its detractors, and once again their criticisms frequently have a basis in truth. In <a href="http://ereadcost.eu/">the E-READ research network</a>, to which we belong, we are trying to understand the function of reading in the digital age, especially at a time when research points to the negative impact of screens.</p>
<h2>Screen addiction</h2>
<p>Smartphone usage among teenagers has been compared <a href="https://www.nytimes.com/2017/03/13/health/teenagers-drugs-smartphones.html">to drug addiction</a>. Worldwide surveys show <a href="http://www.abc.net.au/news/2017-09-20/teens-smartphones-resilience-adulthood/8960618">that a whole generation</a> has grown up constantly online, checking their smartphones up to 75 times a day. These so-called <a href="https://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf">“digital natives”</a> are, according to a recent Italian study, <a href="http://www.dipendenze.com/associazione">less autonomous and less happy</a> than their predecessors. They face new social anxieties like “FOMO” (fear of missing out) and “vamping” <a href="https://www.nytimes.com/2014/07/06/fashion/vamping-teenagers-are-up-all-night-texting.html">(late-night texting)</a>.</p>
<p>The device as a medium is not to blame, but its online nature encourages 24/7 connectivity, inducing greater distraction and more fragmented reading habits. The prime victim here is <a href="https://www.npr.org/templates/story/story.php?storyId=129348373">deep or immersive reading</a>, whether of literature or of all sorts of narrative and argumentative texts, including academic ones.</p>
<p>Is it possible for us to counter these <a href="https://www.theatlantic.com/magazine/archive/2017/09/has-the-smartphone-destroyed-a-generation/534198">unwelcome side effects</a> of the digital revolution? The good news is yes. Part of the solution lies in changing our habits by reading fiction and enjoying solitude.</p>
<h2>Experience solitude</h2>
<p><a href="https://www.sas.rochester.edu/psy/people/gradstudents/nguyen_thuy-vy/index.html">Thuy-vy Nguyen</a>, along with his colleagues from Rochester University <a href="http://journals.sagepub.com/doi/abs/10.1177/0146167217733073">found that solitude can lead to relaxation and stress reduction</a>. The researchers define solitude as “as being alone for a period of time with no access to devices, personal interactions, external stimuli, or activities.” In all four studies, the solitude lasted 15 minutes and the subjects were instructed to sit alone and not engage in any activities or to carry out an activity alone as to think either positive or neutral thoughts.</p>
<p>In one part of the experiment, subjects were given a short recreational read titled <a href="https://longreads.com/2015/02/10/glamorous-crossing-how-pan-am-airways-dominated-international-travel-in-the-1930s/">“Glamorous Crossing: How Pan-Am Airways Dominated International Travel in the 1930s”</a>. The results were similar to those of the other activities carried out in solitude: people were more relaxed and calm.</p>
<p>Reading carried out in <a href="http://www.hup.harvard.edu/catalog.php?content=reviews&isbn=9780674634633">“fertile solitude”</a> fosters readers’ resilience and a greater impermeability to social pressures and expectations, such as those encountered on social media. It may of course also encourage people to do their reading on paper, simply because they find it more relaxing than using an online device. But if reading is good for you, reading fiction is even better.</p>
<h2>Find the good in you</h2>
<p>Scholars have indeed recently provided empirical evidence for claims that literary reading positively affects <a href="https://benjamins.com/#catalog/journals/ssol.6.1.04kid/details">social cognition</a>, social skills and <a href="https://repub.eur.nl/pub/98670">empathy</a>. Psychologist Raymond Mar and his colleagues found that the more fiction people read – of any kind – the <a href="http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=753778B351C73B7B8950A3A352E5DD88?doi=10.1.1.502.9329&rep=rep1&type=pdf">better they scored</a> on tests measuring <a href="http://onlinelibrary.wiley.com/doi/10.1111/1469-7610.00715/%20%20abstract">a form of empathy</a>.</p>
<p>In another experiment, professor of psychology <a href="https://benjamins.com/#catalog/journals/ssol.3.1.08joh/details">Dan Johnson found</a> that participants who read an excerpt from a novel about the plight of an Arab-Muslim woman showed a significant increase in empathy for Arab-Muslims and in their intrinsic motivation to reduce prejudice.</p>
<p>Literary readers could actually counterbalance the unhealthy tide of hatred and indifference that is rife on the Internet. In experiments related to an ongoing project at the <a href="http://www.paris-iea.fr/fr/liste-des-residents/massimo-salgaro">Institute for Advanced Study in Paris</a>, we were able to show that literary readers feel compassion for morally positive characters but not for morally bad ones. At the Max Planck Institute for <a href="https://www.aesthetics.mpg.de/en/research/department-of-language-and-literature/aesthetic-emotions/projects/sympathy-devil.html">Empirical Aesthetics in Frankfurt</a>, we ran an experiment manipulating a literary text.</p>
<p>In one version, the protagonist is a doctor volunteering in Africa; in the other, he is a Nazi who fled to South Africa. Between the two versions we changed only four sentences, those concerning the moral nature of the protagonist, and preserved the rest of the text almost entirely in form and content. A total of 120 German subjects read either the text with the morally positive protagonist or the one with the morally negative protagonist, and afterwards rated the aesthetic and moral value of the text and answered several empathy- and sympathy-related questions. While the results have not yet been published, they clearly show that the sympathy measures were affected by the moral nature of the protagonist.</p>
<p>Thus, literature can be considered a moral laboratory which enhances our <a href="https://www.youtube.com/watch?time_continue=14&v=bmgv7VcwNjs">prosocial traits</a>.</p>
<p>Now how do we get young people – our so-called digital natives – to read without being constantly distracted by their various social networks and other communication tools? Here are a few more tips.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=380&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=380&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=380&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=477&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=477&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199166/original/file-20171214-27572-1oyxa6b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=477&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Reading a good story without any distraction: do we really need anything else?</span>
<span class="attribution"><a class="source" href="https://www.pexels.com/photo/background-bench-blur-book-346735/">Porapak Apichodilok/Pexels</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Teach people how to (really) read</h2>
<p>To ensure that literature regains (or gains) a central place in people’s reading aspirations, the didactics of literature in the digital age must be drastically overhauled. While text- and author-oriented approaches are still dominant in European schools, <a href="http://as.wiley.com/WileyCDA/WileyTitle/productCd-0631226249.html">new studies</a> show the need to implement “experiential approaches” in which the focus is on the addressee of the text – for example, students. <a href="https://books.google.co.in/books?id=sT-EDAAAQBAJ&pg=PA12&lpg=PA12&dq=Experiencing+or+Interpreting+Literature:+Wording+Instructions.+In+M.+Burke,+S&source=bl&ots=c5XCjvtnH1&sig=jj5VLpGYV9witff78GlJ2nIcHoI&hl=en&sa=X&ved=0ahUKEwiMg6XjiIrYAhUlSo8KHeRdBhQQ6AEIKzAB#v=onepage&q=Experiencing%20or%20Interpreting%20Literature%3A%20Wording%20Instructions.%20In%20M.%20Burke%2C%20S&f=false">Research suggests</a> that listening to student preferences (instead of imposing texts) and helping them choose the right book for a particular moment in their lives produces much greater engagement.</p>
<p>That way we can make literary fiction mean a great deal more to adolescent readers and enhance what they can gain from it for their social lives and <a href="https://l1.publication-archive.com/publication/1/1592">for their personal development</a>.</p>
<p>Incidentally, it may be that paper itself is an invaluable ally here. <a href="https://www.psychologytoday.com/blog/nature-brain-and-culture/201102/the-problem-the-web-and-e-books-is-there-s-no-space-them">Research has shown</a> that its material properties suit our memory better. <a href="https://www.ncbi.nlm.nih.gov/pubmed/21443378">Psychologist Rakefet Ackerman</a>, also a member of the E-READ network, has found that despite immense technological advances, learners still prefer studying texts on paper rather than on computer screens. While learning performance on digital devices is not necessarily worse than on paper, it is, surprisingly, our metacognitive capacity that fails us: when reading an article on screen, a person is less able to evaluate how much she has understood or, in the case of studying, memorised.</p>
<p>What our own experiments on the <a href="http://ereadcost.eu/the-aura-study/">appeal of paper books showed</a> is that readers of works on paper engage more deeply, while those who use digital devices tend to read more shallowly.</p>
<p>So do we really want to draw a sharp contrast between digital and paper reading, and do we oppose reading from screens? No. We need to adapt our tools to our needs and to develop them deliberately, so as to make reading a profound component of our social and cultural habits. The more we understand about digital reading, the more we can salvage from the precious past we inherited.</p>
<p class="fine-print"><em><span>Massimo Salgaro receives funding from Institute for Advanced Study, Paris. </span></em></p><p class="fine-print"><em><span>A. van der Weel receives funding from the COST European framework programme. He is affiliated with the Dr. P.A. Tiele-Stichting, a Dutch foundation for research on textual media in digital, printed and manuscript form.</span></em></p>To counter the unbalanced effects of the digital age, reading literature is the key.Massimo Salgaro, RFIEA Fellows 2017-2018, IEA Paris, Researcher in Literary theory, University of VeronaAdriaan van der Weel, Researcher Book and Digital media studies, Leiden UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/854522017-10-16T18:48:09Z2017-10-16T18:48:09ZThree strategies to help students navigate dodgy online content<figure><img src="https://images.theconversation.com/files/190297/original/file-20171016-27757-1w0159r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Readers should cast a more critical eye over information they use from the web, to make sure the knowledge built from it is trustworthy and accurate.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A recent <a href="https://www.scribd.com/document/331996442/Stanford-History-Education-Group-Evaluating-Information-The-Cornerstone-of-Civic-Online-Reasoning">Stanford University Report</a> revealed that students’ abilities to distinguish between questionable and valid online content needed work. </p>
<p>In one example cited in the report, researchers set high school and university students the task of evaluating the credibility of information found on the <em><a href="https://www.minimumwage.com/">MinimumWage.com</a></em> site. Only 9% of high school students and 6% of university students could identify that the site was actually a front for a right-wing think tank.</p>
<p>The lack of critical judgement displayed by high school and university students in this example is, as the report’s authors identified, a challenge that’s <a href="https://www.aft.org/ae/fall2017/mcgrew_ortega_breakstone_wineburg">bigger than fake news</a>.</p>
<p>It doesn’t just affect young people, either. The problem is not so much how we educate people to identify hoax sites, as these are generally rare and quickly identifiable. The real challenge is how we educate people, both young and old, to critically evaluate the perspectives, aims and purposes of a website. In short, how do we help people distinguish between fact and opinion?</p>
<p>Here are three strategies based on the findings of the <a href="https://www.scribd.com/document/331996442/Stanford-History-Education-Group-Evaluating-Information-The-Cornerstone-of-Civic-Online-Reasoning">Stanford Report</a> to help navigate the online information minefield.</p>
<h2>1. Get off the website</h2>
<p>A traditional approach to educating about these challenges has been conducting “website evaluations” using a checklist. This usually involves judging the reliability of a site based on the information it contains, such as a named author, the publication date, domain name, and so on.</p>
<p>However, this approach underestimates how sophisticated and deceptive the internet has become. Instead of a vertical checklist approach, web users need to read laterally. That involves getting off the website and searching for other information that can provide clues to the validity and balance of the information it contains. For example, thoroughly researching a site’s authors may reveal their political alignments and whether they are funded by a person or organisation with a particular agenda. Accurate answers to such questions will most likely only be found off the website.</p>
<h2>2. Use a site’s reference list</h2>
<p>Another good strategy is to go straight to the site’s reference list, if one is available. If no reference list is provided, that may well be a good reason to dig deeper.</p>
<h2>3. Identify adjectives</h2>
<p>Adjectives describe how something feels, looks, sounds and acts. They indicate the tone or mood of the message and suggest to readers how they should respond to the content of the site. A savvy web user can identify adjectives, think critically about how these encourage them to view the content of the site, and then evaluate the compatibility between the message itself and the effect of how the message is communicated.</p>
<p>These are just a few practical tips. Above all, readers should cast a more critical eye over information they use from the web, to make sure the knowledge built from it is trustworthy and accurate.</p>
<h2>The myth of the “digital native”</h2>
<p>“Digital native” was a buzz term of the early 2000s, used to define young people born into a digital world. According to the architect of the <a href="http://marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf">“digital native” narrative, Marc Prensky</a>, if you were born before 1980, you were a <a href="http://marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf">“digital immigrant”</a>. Digital immigrants allegedly struggle with the technical domain that digital natives find so natural.</p>
<p>However, this narrative promoted an “us” versus “them” divide and did little to further our understanding of how young people interact with online information. The native generation may well be good at <a href="https://www.aft.org/ae/fall2017/mcgrew_ortega_breakstone_wineburg">flicking between Facebook, Twitter and Instagram whilst texting</a> their best friend about what’s happening on those sites, but the acts of “liking” or “friending” seldom involve making critical judgements.</p>
<p>It could be argued that young people’s saturated use of social media actually works against building critical-thinking capabilities, as their interaction with information is generally at a low level, such as (re)tweeting or simply registering a positive or negative response.</p>
<p>We also need to remember that people born pre-1980 are not necessarily bad with technology. Time has exposed the “digital native versus digital immigrant” narrative as little more than popular folklore. Even Prensky has backed away from the debate, and now considers that we should concentrate on building something he calls <a href="http://www.marcprensky.com/writing/Prensky-Intro_to_From_DN_to_DW.pdf">“digital wisdom”</a>.</p>
<p>Building “digital wisdom”, the ability to select accurate and balanced online information and use it productively to construct robust and well-informed perspectives and knowledge, should be the goal for education at all levels.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A report has discovered that while students born after 1980 have good digital skills, they need to think more critically about what they read online.Kim Wilson, Lecturer in History Education, Macquarie UniversityGarry Falloon, Professor of Digital Learning, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/765972017-05-11T14:46:24Z2017-05-11T14:46:24ZWhen I grow up, I want to be a researcher…<figure><img src="https://images.theconversation.com/files/168978/original/file-20170511-32618-121i8zo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">In the molecular-chemistry laboratory of the Ecole Polytechnique at the Université Paris-Saclay.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/117994717@N06/16276695722/in/photolist-qNjjyS-RJf5zq-SydeGo-SKK7PN-SydeMU-kdfJux-pghoEs-SydeJs-qhnsqh-kdgcRK-qi4GAe-S7aSMt-qpiVVY-q4hbzf-RRxdv7-RRxdJJ-RRxdNb-r43uzP-rmdhWN-kdgvpc-v2FyEj-kdiBGh-m19BYT-RRxdrj-q4dAL6-kdjpUN-r3SNj2-saTpbX-TkGfyT-TkGfo2-moun8B-rRBt2D-pQYXn2-q4hDxm-rdWrjw-kdhMKp-t7M5qf-kdhMBP-kXDGXt-q4cWB2-qkvStn-rgNhHq-r8CMvT-qdNA1m-kdfLnv-puzvgM-kdhPP5-qKjy9g-kdhTSA-kdhTqo">Ecole polytechnique, Université Paris-Saclay/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><blockquote>
<p>“So what’s your PhD topic again?”…</p>
</blockquote>
<p>Nowadays, this is the question most commonly asked of early-career researchers, and the answer is becoming more and more complex. While an interdisciplinary approach is favoured in the English-speaking world, the French academic system often keeps doctoral students within strict methodological limits.</p>
<p>So why maintain such an inflexible, discipline-focused system? How can young researchers make their fields, and the scientific foundation on which they build their work, their own?</p>
<h2>Breaking down boundaries: the end of labels</h2>
<p>A basic trend: the longstanding boundaries between classic disciplines are breaking down, or at least being blurred, and many academics feel disoriented. One explanation for this radical shift is probably the development of new media. While “traditionalists” try to hold on to their “specialties”, open-minded researchers use new technologies to break down the walls between disciplines. Indeed, since the 1980s the English-speaking research world has witnessed the birth of new fields of multidisciplinary research – a model from which France and other countries have begun to draw inspiration over the last decade.</p>
<p>For 21st-century PhD students – the first generation of “digital natives” – the web has been a fact of life for their entire lives. They tend to refuse labels and, unlike their predecessors, early-career researchers do not want to choose between specialties, methodologies, schools of thought or countries. They want to embrace them all.</p>
<p>And why would they have to choose, anyway? Thanks to the Internet, they have access to almost unlimited knowledge through MOOCs, TED talks, online publications… in a nutshell, open sources. Many doctoral candidates have graduated with two or three master’s degrees and have already followed several transdisciplinary pathways. They are thus entitled to diversify their experience, and they wish to keep this privilege, and even cultivate it, when writing their thesis.</p>
<h2>“Y generation” researchers</h2>
<p>The training of the current generation of researchers – strewn with pitfalls and migrations – is not so “unusual” anymore. In a sense, their professional lives will be that way as well. For “Y generation” researchers, a certain volatility becomes necessary, if not essential, to fully comprehend new nomadic objects of research. Why not dissect a Latin text in the same way we examine DNA? Could a philosopher learn something from an examination of African tax systems?</p>
<p>At the 2016 <a href="http://jijc2016.event.univ-lorraine.fr/jijc_accueil.php">Early Career Researchers conference</a> at the University of Lorraine, PhD students discussed their take on interdisciplinarity and its potential benefits. The 2017 edition, taking place on June 16, returns with a new theme: <a href="https://jijc2017.event.univ-lorraine.fr/?forward-action=index&forward-controller=index&lang=e">“Which questions for what research? The Humanities at the crossroads of disciplines”</a>.</p>
<h2>Asking the right questions</h2>
<p>In 2017 we need to discover what kinds of questions are being asked in research. What are the purposes of research? Which questions best correspond to which types of research? What is our take on fundamental research? How should we distinguish between social-science research, applied research and interventionist research? With the multiplication of ground-breaking concepts, should research fields be restructured?</p>
<p>How particular disciplines are mastered is clearly defined by French institutions, such as the National Council of Universities or the competitive exams for secondary-school teaching in the French national education system. We should therefore question how new fields of research are legitimised within a given academic institutional system. Cultural studies, for example, have often been strongly criticised in France, whereas their popularity within the English-speaking world is easily understandable considering their interdisciplinary nature.</p>
<p>Certain disciplines taught at universities also have their equivalent in the secondary-school system, and many research departments limit their recruitment of lecturers to candidates who have passed the secondary-school exam. Notwithstanding the many differences between teaching in secondary school and conducting research at university, should academics in France continue this historical mode of recruitment? Can research fields be as easily delimited as the disciplinary knowledge one needs to teach in secondary schools? This issue is all the more pressing as new technologies bolster the constant evolution of research questions. Can they enable the Y generation of researchers to free themselves from the ancient methods of “mastering” disciplines and go beyond the more “traditional” fields of research?</p>
<h2>Towards enhanced research</h2>
<p>While fields of research are constantly changing, should they all intersect and perfectly match taught disciplines, or could they be much more enriched and flexible? A good example is gender studies, which combines history, psychology, sociology and even medicine. Similarly, shouldn’t we consider research fields within the context and needs of society? It is only logical to question the axiological positioning of the researcher with regard to political militancy or societal debates, especially when their research deals with current affairs.</p>
<p>Moreover, an increasing number of companies and other organisations are now proposing collaborations and partnerships with researchers. Industrial agreements for training through research (for example, the French <a href="http://www.anrt.asso.fr/fr/espace_cifre/accueil.jsp">CIFRE program</a>) establish a partnership between a partner – most often a firm – a research department, and a PhD student. What methodologies can be applied in such collaborations? How do we reconcile the researcher’s methods and the partners’ expectations? We also need to question the uses and the limits of such cooperative efforts. Concisely put: how do we distinguish between disciplines? Should we talk about a disciplinary area, or replace it with the definition of a research domain? Is the creation of inter- and/or trans-disciplinary research teams always necessary? Are they really beneficial?</p>
<p>Facing the multiplication of such questions, early-career researchers need to develop innovative research practices and find ways to address the position of today’s researchers.</p>
<h2>Novel practices, new questions</h2>
<p>For its 2017 edition, the organisers of the Early Career Researchers conference invite PhD students of all <strong>disciplinary backgrounds</strong> to reflect on the following axes:</p>
<ul>
<li><p>What epistemological and deontological approaches should researchers adopt today?</p></li>
<li><p>What are the implications of the human factors behind research?</p></li>
<li><p>Which methodolog(y/ies) for what research: where are the boundaries between disciplines?</p></li>
<li><p>Inter/transdisciplinarity and the contributions of research and new technologies to society: how can different perspectives be reconciled?</p></li>
<li><p>How does research interact with its foundations?</p></li>
</ul>
<p>Far from being isolated initiatives, these considerations are beginning to be tackled at international congresses. These include the 2017 PhD colloquium of the <a href="http://congres2017.saesfrance.org/texte-de-cadrage-en/">French Society for the Study of English (SAES)</a>, to be held June 1-3 in Reims, and <a href="https://studies.hypotheses.org/">“Designations of Disciplines and Their Content: The Paradigm of Studies”</a>, which took place at Paris 13–USPC in January 2017.</p>
<hr>
<p><em>It is with this in mind that the early-career researcher conference will be held June 16, 2017, in Metz, France. For more information, visit the <a href="https://jijc2017.event.univ-lorraine.fr/?forward-action=index&forward-controller=index&lang=en">conference website</a>.</em></p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=136&fit=crop&dpr=1 600w, https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=136&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=136&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=171&fit=crop&dpr=1 754w, https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=171&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/163936/original/image-20170404-5702-10f21uu.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=171&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
</figcaption>
</figure><img src="https://counter.theconversation.com/content/76597/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jérémy Filet is a member of the steering comitee of the Early Career Researchers International Conference 2017 (JIJC2017). He has been awarded a Doctoral contract to write his PhD thesis in the Research Lab "Interdisciplinarity in English Studies" (IDEA), and he teaches English at the University of Lorraine (France).
</span></em></p><p class="fine-print"><em><span>Lisa Jeanson est membre du comité de pilotage de la Journée Internationale des Jeunes Chercheurs 2017 (JIJC2017). Elle effectue une thèse en Cifre chez le Groupe PSA au sein du laboratoire PErSEUs de l'Université de Lorraine spécialisé dans l'expérience utilisateur. </span></em></p>How do we and should we work with the first generation “digital native” doctoral researchers?Jérémy Filet, Doctorant en civilisation Britannique du XVIIIème siècle, Université de LorraineLisa Jeanson, Doctorante en Ergonomie Cognitive, Université de LorraineLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/399232015-05-06T19:48:22Z2015-05-06T19:48:22ZHow ‘digital natives’ are killing the ‘sage on the stage’<figure><img src="https://images.theconversation.com/files/80388/original/image-20150505-8382-fbla0o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Digital technology, and those who have grown up with it, are forcing the venerable lecture to adapt to the times.</span> <span class="attribution"><span class="source">uniinnsbruck/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>The idea that <a href="http://theconversation.com/ignore-the-fads-teachers-should-teach-and-students-should-listen-39634">teachers should teach and students should listen</a> presumes that teachers know more than their students.</p>
<p>While this was generally true back when textbooks were a rarity, and may have been partly true since the invention of the public library, it is most likely untrue for many students in this era of the “<a href="http://www1.umn.edu/ohr/teachlearn/tutorials/active/what/">active learner</a>” (AKA the “<a href="http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf">digital native</a>”).</p>
<p>After all, with a smartphone in every student’s pocket and Google only a tap away, how can the humble sage expect to compete with the font of all online knowledge?</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/CZ5Vy9BgSeY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>The world is a stage</h2>
<p>The <a href="https://www.youtube.com/watch?v=Ly9BPvFJfqo">lecture was born</a> in medieval times, when books were difficult to make and experts were few and far between. Back in those days, the best way to record knowledge was for a monk to stand at the front of the room and recite passages from a manuscript or book, while the novices below him hurriedly wrote down exactly what was said.</p>
<p>As universities emerged, this tradition continued, with the expert at the pulpit and the juniors in the audience. Hence was born the “sage on the stage”: the expert providing their knowledge to others so that they could learn from this font of all wisdom.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=434&fit=crop&dpr=1 600w, https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=434&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=434&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=545&fit=crop&dpr=1 754w, https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=545&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/80386/original/image-20150505-8411-lyayfu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=545&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Lectures haven’t changed a great deal since Michael Faraday delivered a Christmas lecture at the Royal Institution in 1856.</span>
<span class="attribution"><a class="source" href="http://commons.wikimedia.org/wiki/File:Faraday_Michael_Christmas_lecture.jpg">Wikimedia</a></span>
</figcaption>
</figure>
<p>Since then the role has evolved, but the basic principle has remained the same. Throughout the decades leading towards the end of the 20th century, models were extended with tutorials, laboratories and workshops. But the academic remained the expert, providing their knowledge to (sometimes eager) students. </p>
<p>As part of this role, it’s the academic’s job to entertain, and we have all known academics who take this part of the role very seriously, dressing up for class, using props or even planning out a performance with costumes and masks in advance.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/kDgsTjEAdf0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>These academics are embracing the “stage” part of the job, in line with the recent article noted above on <a href="https://theconversation.com/ignore-the-fads-teachers-should-teach-and-students-should-listen-39634">explicit teaching</a>, but the core idea still remains: they are the font of knowledge, the single basin from which students should ‘drink’, building their knowledge of the subject matter through contact with an expert.</p>
<h2>The 21st century: when it all changes</h2>
<p>But something happened around the turn of the millennium. With the rise of the internet and the beginnings of search engines such as Google, no longer was the expert (or the public library) the only place to acquire knowledge. </p>
<p>All of a sudden, if you were out to dinner and somebody asked you who directed The Lord of the Rings movies, it was a quick tap and a search for you to yell out “Peter Jackson”. Pub quizzes changed forever, and we found ourselves with a wealth of knowledge at our fingertips. Even worse, the answer you read in (or copied faithfully from) a book several years ago may no longer be the answer now.</p>
<p>This change flowed to academia. But as with much in academia, it took some time to take root. While students were already starting to bring their mobile phones into the classroom (to the <a href="http://www.theguardian.com/education/2012/nov/27/should-mobiles-be-banned-schools">chagrin of some academics</a>), academia was struggling to <a href="https://theconversation.com/online-vs-face-to-face-learning-why-cant-we-have-both-34135">move away from tradition</a>. </p>
<p>By and large, lectures still existed. But they were supplemented by blended learning, flipped classrooms and Massive Open Online Courses (<a href="https://theconversation.com/au/topics/massive-open-online-courses">MOOCs</a>). All of these approaches retained the “sage on the stage” mentality but supplemented it with other resources, so that the internet could support the expert at the pulpit.</p>
<p>But we’ve started to notice something over the past couple of years. All of a sudden, students don’t think lectures are as important as they once were. We already know students sometimes don’t attend their timetabled lectures, but what has changed is the reason.</p>
<p>Rather than sleeping in or being too busy with homework, the common reason we now hear from our undergraduates is that there is no need to come to the lecture. Why come and listen when they can access YouTube videos on the subject, or read a host of web pages where experts lay it out step-by-step? </p>
<p>And yes, they can even do this from their iPad after they roll over in bed after a big night out! </p>
<p>No longer is the academic the sole expert at the pulpit, the sole basin from which students can drink. We are now just one of many possible fonts from which a student can sate their thirst for knowledge.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/80380/original/image-20150505-8426-rysgpf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">When students are standing up and recording a lecture on their phone, you know you’re doing something right.</span>
<span class="attribution"><span class="source">University of Denver/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<h2>The lecture as a performance piece</h2>
<p>So, what is the humble sage to do in this new paradigm? How do we deal with the fact that our stage is gone, replaced by a garden of different fonts of knowledge?</p>
<p>One option could be to embrace the performance art aspect of the role even more.</p>
<p>Talk to any creative type and they will tell you that the real impact of their work is not just the performance, but how it makes the audience change. How it makes them think deeply about the subject. </p>
<p>A creator has really done their job when a movie such as The Imitation Game is not only entertaining, but encourages the viewer to read more about <a href="https://theconversation.com/au/topics/alan-turing">Alan Turing</a> or the Enigma machine. Or perhaps even to contemplate attitudes to homosexuality in the early 20th century and now. The performance serves as a launching point for investigating the area, and for “moving the furniture” in the mind.</p>
<p>Perhaps the academic needs to aim for the same? Make the lecture an entertaining performance piece that causes students to look into the area more deeply. Recognise that students can get information from many places, and embrace this by aiming for the lecture to be a highlight reel and a teaser rather than a sermon from the pulpit.</p>
<p>Yes, this means every lecture should be a special occasion, but is that really a bad thing? If it gets our students thinking, then hasn’t it done its job?</p>
<p>If academics begin to do this, then maybe we can reclaim the role of “sage on a stage” in a different way. We can move from our old-fashioned pulpit to a digital stage, providing a highlight reel of our discipline and becoming a truly digital sage for the active learner.</p>
<p>If this happens, then maybe the measure of success will be how many students are using a mobile phone in the classroom rather than how many are putting it away!</p>
<p class="fine-print"><em><span>Cris Brack has received a cash prize (Carrick Institute) for teaching using digital educational technology. He is a Senior Fellow of the Higher Education Academy.</span></em></p><p class="fine-print"><em><span>Michael Cowling does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Lectures and lecturers will have to adapt to modern times in order to stay relevant.Michael Cowling, Senior Lecturer & Discipline Leader, Mobile Computing & Applications, CQUniversity AustraliaCris Brack, Assoc Professor Forest measurement & management, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.