<h1>We created a VR tool to test brain function. It could one day help diagnose dementia</h1>
<figure><img src="https://images.theconversation.com/files/583308/original/file-20240321-22-fi7z9f.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5756%2C3842&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.pexels.com/photo/photograph-of-a-man-in-a-red-sweatshirt-holding-a-virtual-reality-headset-6667710/">Kampus Production/Pexels</a></span></figcaption></figure><p>If you or a loved one have noticed changes in your memory or thinking as you’ve grown older, this could reflect typical changes that occur with ageing. In some cases though, it might suggest something more, such as the onset of dementia.</p>
<p>The best thing to do if you have concerns is to make an appointment with your GP, who will probably run some tests. Assessment is important because if there is something more going on, <a href="https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=09da7f8f782d61bb23411c18ba0af0faae918cdc">early diagnosis</a> can enable prompt access to the right <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/gps.2191?casa_token=4xa6QPERgQkAAAAA:znhnzCjFlILbkI3ffikVOJAVx5vtCe2qFb9DydvjbFOwlvrYTcNHrKhG7hpDQY-yyRviyUTWhaW7DU27">interventions</a>, supports and care. </p>
<p>But current methods of dementia screening have <a href="https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2301149">limitations</a>, and testing can be daunting for patients.</p>
<p>Our research suggests virtual reality (VR) could be a useful cognitive screening tool that mitigates some of the challenges associated with current testing methods, opening up the possibility it may one day play a role in dementia diagnosis.</p>
<h2>Where current testing falls short</h2>
<p>If someone is worried about their memory and thinking, their GP might ask them to complete a series of quick tasks that check things like the ability to follow simple instructions, basic arithmetic, memory and orientation.</p>
<p>These sorts of screening tools are really good at confirming cognitive problems that may already be very apparent. But <a href="https://www.cochranelibrary.com/cdsr/doi/10.1002/14651858.CD010783.pub2/full">commonly used screening tests</a> are <a href="https://link.springer.com/article/10.1186/s13195-019-0474-3">not always so good</a> at detecting early and more subtle difficulties with memory and thinking, meaning such changes could be missed until they get worse. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/these-12-things-can-reduce-your-dementia-risk-but-many-australians-dont-know-them-all-191504">These 12 things can reduce your dementia risk – but many Australians don't know them all</a>
</strong>
</em>
</p>
<hr>
<p>A <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9492323/">clinical neuropsychological assessment</a> is better equipped to <a href="https://onlinelibrary.wiley.com/doi/full/10.1111/ene.12488?casa_token=PUj3o1rEfrQAAAAA%3A_fa1cOudFpdvoGx_0u6QJRG2gzuRVWJ8h6x5qrQOKc2J7hwPjYox20DcEhRaRqyZdRXXHEkBIXuRgIH5nw">detect early changes</a>. This involves a comprehensive review of a patient’s personal and medical history, and detailed assessment of cognitive functions, including attention, language, memory, executive functioning, mood factors and more. However, this can be costly and the testing can take several hours.</p>
<p>Testing is also somewhat removed from everyday experience, not directly tapping into activities of daily living.</p>
<h2>Enter virtual reality</h2>
<p>VR technology uses computer-generated environments to create immersive experiences that feel like real life. While VR is often used for entertainment, it has increasingly found applications in health care, including in <a href="https://link.springer.com/article/10.1007/s10055-020-00495-x">rehabilitation</a> and <a href="https://journals.sagepub.com/doi/full/10.1177/0269215517694677?casa_token=-T4Vh6ZsSXYAAAAA:2S7tM5qS25Oe0YQLCqdd0wPlspOIZPv9exKcc6InL5Wn4nfyetfzQOJxgBjb-6F0LGJPWggozMEoJQ">falls prevention</a>. </p>
<p>Using VR for cognitive screening is still a new area. VR-based cognitive tests generally create a scenario such as shopping at a supermarket or driving around a city to ascertain how a person would perform in these situations.</p>
<p>Notably, they engage various senses and cognitive processes such as sight, sound and spatial awareness in immersive ways. All this may reveal subtle impairments which can be missed by standard methods.</p>
<p>VR assessments are also often more engaging and enjoyable, potentially reducing anxiety for those who may feel uneasy in traditional testing environments, and improving compliance compared to standard assessments.</p>
<figure class="align-center ">
<img alt="A senior woman sitting on a bed with her hand to her face." src="https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/583309/original/file-20240321-28-p3dtg4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Millions of people around the world have dementia.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/senior-woman-suffering-headache-2138485783">pikselstock/Shutterstock</a></span>
</figcaption>
</figure>
<p>Most studies of VR-based cognitive tests have explored their capacity to pick up <a href="https://www.frontiersin.org/articles/10.3389/fnhum.2021.628818/full">impairments in spatial memory</a> (the ability to remember where something is located and how to get there), and the results have been promising.</p>
<p>Given that VR’s potential for assisting with diagnosis of cognitive impairment and dementia remains largely untapped, our team developed an online computerised game (a form of semi-immersive VR) to see how well a person can remember, recall and complete everyday tasks. In our VR game, which lasts about 20 minutes, the user role-plays a waiter in a cafe and receives a score on their performance.</p>
<p>To assess its potential, we enlisted more than 140 people to play the game and provide feedback. The results of this research are published across three recent papers.</p>
<h2>Testing our VR tool</h2>
<p>In our <a href="https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-024-02478-3">most recently published study</a>, we wanted to verify the accuracy and sensitivity of our VR game to assess cognitive abilities.</p>
<p>We compared our test to an existing screening tool (called the <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2913129/">TICS-M</a>) in more than 130 adults. We found our VR task was able to capture meaningful aspects of cognitive function, including recalling food items and spatial memory.</p>
<p>We also found younger adults performed better in the game than older adults, which echoes the pattern commonly seen in regular memory tests.</p>
<figure class="align-center ">
<img alt="A senior man sitting outdoors using a laptop." src="https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/583311/original/file-20240321-18-smy2uf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Adults of a range of ages tried our computerised game.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/senior-man-working-on-laptop-garden-1488244298">pikselstock/Shutterstock</a></span>
</figcaption>
</figure>
<p>In a <a href="https://bmcgeriatr.biomedcentral.com/articles/10.1186/s12877-024-04767-y">separate study</a>, we followed ten adults aged over 65 while they completed the game, and interviewed them afterwards. We wanted to understand how this group – whom the tool would target – perceived the task.</p>
<p>These seniors told us they found the game user-friendly and believed it was a promising tool for screening memory. They described the game as engaging and immersive, expressing enthusiasm to continue playing. They didn’t find the task created anxiety.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/we-gave-palliative-care-patients-vr-therapy-more-than-50-said-it-helped-reduce-pain-and-depression-symptoms-223186">We gave palliative care patients VR therapy. More than 50% said it helped reduce pain and depression symptoms</a>
</strong>
</em>
</p>
<hr>
<p>For a third study, we spoke to <a href="https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-023-02413-y">seven health-care professionals</a> about the tool. Overall they gave positive feedback, and noted its dynamic approach to age-old diagnostic challenges.</p>
<p>However, they did flag some concerns and potential barriers to implementing this sort of tool. These included resource constraints in clinical practice (such as time and space to carry out the assessment) and whether it would be accessible for people with limited technological skills. There was also some scepticism about whether the tool would be an accurate method to assist with dementia diagnosis. </p>
<p>While our initial research suggests this tool could be a promising way to assess cognitive performance, this is not the same as diagnosing dementia. To improve the test’s ability to accurately detect those who likely have dementia, we’ll need to make it more specific for that purpose, and carry out further research to validate its effectiveness.</p>
<p>We’ll be conducting more testing of the game soon. Anyone interested in giving it a go to help with our research can register on <a href="https://brainhealthhub.com.au/projects/leaf-cafe-virtual-reality/">our team’s website</a>.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Current methods of screening for dementia have a range of limitations. Using virtual reality for cognitive screening is still a new area, but it’s showing promise.Joyce Siette, Research Theme Fellow in Health and Wellbeing, Western Sydney UniversityPaul Strutt, Senior Lecturer in Psychology, Western Sydney UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2230092024-03-07T11:12:21Z2024-03-07T11:12:21ZOur brains take rhythmic snapshots of the world as we walk – and we never knew<figure><img src="https://images.theconversation.com/files/577820/original/file-20240226-16-psaujb.jpg?ixlib=rb-1.1.0&rect=52%2C9%2C3089%2C2123&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/woman-hiking-mountains-adventure-exercising-legs-105847466">Blazej Lyjak/Shutterstock</a></span></figcaption></figure><p>For decades, psychology departments around the world have studied human behaviour in darkened laboratories that restrict natural movement.</p>
<p>Our new study, <a href="https://www.nature.com/articles/s41467-024-45780-4">published today in Nature Communications</a>, challenges the wisdom of this approach. With the help of virtual reality (VR), we have revealed previously hidden aspects of perception that happen during a simple everyday action – walking. </p>
<p>We found the rhythmic movement of walking changes how sensitive we are to the surrounding environment. With every step we take, our perception cycles through “good” and “bad” phases. </p>
<p>This means your smooth, continuous experience of an afternoon stroll is deceptive. Instead, it’s as if your brain takes rhythmic snapshots of the world – and they are synchronised with the rhythm of your footfall.</p>
<h2>The next step in studies of human perception</h2>
<p>In psychology, the study of visual perception refers to how our brains use information from our eyes to create our experience of the world.</p>
<p>Typical psychology experiments that investigate visual perception involve darkened laboratory rooms where participants are asked to sit motionless in front of a computer screen.</p>
<p>Often, their heads will be fixed in position with a chin rest, and they will be asked to respond to any changes they might see on the screen. </p>
<p>This approach has been invaluable in building our knowledge of human perception, and the foundations of how our brains make sense of the world. But these scenarios are a far cry from how we experience the world every day.</p>
<p>This means we might not be able to <a href="https://www.sciencedirect.com/science/article/pii/S1571064523000830">generalise</a> the results we discover in these highly restricted settings to the real world. It would be a bit like trying to understand fish behaviour by only studying fish in an aquarium.</p>
<p>Instead, we went out on a limb. Motivated by the fact our brains have evolved to support action, we set out to test vision during walking – one of our most frequent and everyday behaviours.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A row of students in a uni computer lab looking at screens." src="https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/579126/original/file-20240301-16-354ica.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Doing tests in a lab isn’t quite the same as seeing and interacting with things in the real world.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/side-view-students-using-computer-lab-122284963">sirtravelalot/Shutterstock</a></span>
</figcaption>
</figure>
<h2>A walk in a (virtual) forest</h2>
<p>Our key innovation was to use a wireless VR environment to test vision continuously while walking. </p>
<p>Several previous studies have examined the effects of light exercise on perception, but used <a href="https://doi.org/10.3389/fnhum.2010.00202">treadmills</a> or <a href="https://doi.org/10.1162/jocn_a_01082">exercise bikes</a>. While these methods are better than sitting still, they <a href="https://doi.org/10.1152/japplphysiol.01380.2006">don’t match the ways</a> we naturally move through the world.</p>
<p>Instead, we simulated an open forest. Our participants were free to roam, yet unknown to them, we were carefully tracking their head movement with every step they took. </p>
<figure>
<iframe src="https://player.vimeo.com/video/917787370" width="500" height="281" frameborder="0" webkitallowfullscreen="" mozallowfullscreen="" allowfullscreen=""></iframe>
<figcaption><span class="caption">Participants walked in a virtual forest while trying to detect brief visual ‘flashes’ in the moving white circle.</span></figcaption>
</figure>
<p>We tracked head movement because as you walk, your head bobs up and down. Your head is lowest when both feet are on the ground and highest when swinging your leg in-between steps. We used these changes in head height to mark the phases of each participant’s “step-cycle”.</p>
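<p>The step-cycle labelling described above can be sketched in a few lines of code. This is a hypothetical illustration of the general idea, not the study’s actual analysis pipeline: it assumes head height is sampled at regular intervals, treats each local minimum (trough) of the height trace as the start of a step-cycle, and assigns every sample a phase between 0 and 1.</p>

```python
def step_phases(head_heights):
    """Assign each head-height sample a phase in [0, 1) of the step-cycle.

    Troughs (local minima, when both feet are on the ground) mark phase 0;
    phase then increases linearly until the next trough, so mid-swing sits
    near 0.5. Hypothetical sketch only -- real gait data would need
    smoothing before picking troughs.
    """
    # Find indices of local minima (troughs) in the height trace.
    troughs = [i for i in range(1, len(head_heights) - 1)
               if head_heights[i - 1] > head_heights[i] <= head_heights[i + 1]]
    phases = [None] * len(head_heights)  # None = outside any full cycle
    # Interpolate phase linearly between consecutive troughs.
    for a, b in zip(troughs, troughs[1:]):
        for i in range(a, b):
            phases[i] = (i - a) / (b - a)
    return phases
```

<p>With phases labelled this way, aligning performance on a visual task to the step-cycle simply means grouping each trial by the phase value at the moment the stimulus appeared.</p>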
<p>Participants also completed our visual task while they walked, which required looking for brief visual “flashes” they needed to detect as quickly as possible.</p>
<p>By aligning performance on our visual task to the phases of the step-cycle, we found visual perception was not consistent.</p>
<p>Instead, it oscillated like the ripples of a pond, cycling through good and bad periods with every step. We found that depending on the phases of their step-cycle, participants were more likely to sense changes in their environment, had faster reaction times, and were more likely to make decisions.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/everything-we-see-is-a-mash-up-of-the-brains-last-15-seconds-of-visual-information-175577">Everything we see is a mash-up of the brain's last 15 seconds of visual information</a>
</strong>
</em>
</p>
<hr>
<h2>Oscillations in nature, oscillations in vision</h2>
<p>Oscillations in vision have been <a href="https://doi.org/10.1016/j.tics.2016.07.006">shown before</a>, but this is the first time they have been linked to walking.</p>
<p>Our key new finding is these oscillations slowed or sped up to match the rhythm of a person’s step-cycle. On average, perception was best when swinging between steps, but the timing of these rhythms varied between participants. This new link between the body and mind offers clues as to how our brains coordinate perception and action during everyday behaviour. </p>
<p>Next, we want to investigate how these rhythms impact different populations. For example, certain psychiatric disorders can lead to people having <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2922365/">abnormalities</a> in their gait.</p>
<p>There are further questions we want to answer: are slips and falls more common for those with stronger oscillations in vision? Do similar oscillations occur for our perception of sound? What is the optimal timing for presenting information and responding to it when a person is moving?</p>
<p>Our findings also hint at broader questions about the nature of perception itself. How does the brain stitch together these rhythms in perception to give us our seamless experience of an evening stroll?</p>
<p>These questions were once the domain of philosophers, but we may be able to answer them, as we combine technology with action to better understand natural behaviour.</p>
<p class="fine-print"><em><span>Matthew Davidson does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Psychology researchers have used virtual reality to find our brains oscillate with each step – an intriguing finding to better understand how we see the world.Matthew Davidson, Postdoctoral research fellow, lecturer, University of SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2231862024-02-22T00:12:48Z2024-02-22T00:12:48ZWe gave palliative care patients VR therapy. More than 50% said it helped reduce pain and depression symptoms<figure><img src="https://images.theconversation.com/files/576355/original/file-20240219-28-18om6k.jpg?ixlib=rb-1.1.0&rect=0%2C10%2C7008%2C4647&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Unai Huizi Photography/Shutterstock</span></span></figcaption></figure><p>People in palliative care are dealing with serious, non-curable illness. Every day can be filled with severe physical, psychological and emotional pain.</p>
<p>Palliative care staff work hard to help make patients as comfortable as possible and provide strong emotional support. Meaningful activities can help but patients often aren’t well enough to do the things they really love, such as travel. We wondered whether virtual reality (VR) could help.</p>
<p>To find out, we supported 16 palliative care patients in an acute ward to do three 20-minute VR sessions, and asked them how they felt before and after each one.</p>
<p><a href="http://dx.doi.org/10.1136/spcare-2024-004815">Our study</a>, published this week in the journal <a href="https://spcare.bmj.com/">BMJ Supportive & Palliative Care</a>, found more than 50% of patients experienced clinically meaningful reductions in symptoms such as pain and depression immediately after a 20-minute VR session.</p>
<p>Importantly, though, some also told us it didn’t help or that they felt unwell after using it. This shows taking a nuanced approach to using VR in palliative care is crucial.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="An older woman in bed uses a VR gaming headset." src="https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/574581/original/file-20240209-16-t34gok.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">VR involves using a headset to allow the user to have an immersive experience that feels 3D.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/female-doctor-checking-on-elderly-patient-1859594494">Newman Studio/Shutterstock</a></span>
</figcaption>
</figure>
<h2>What we did</h2>
<p>VR involves using a headset to create an immersive experience that feels 3D, often accompanied by music or realistic sound effects. This computer-generated environment can feel incredibly close to reality. </p>
<p><a href="https://journals.sagepub.com/doi/full/10.1177/20552076231207574">Previous research</a> has looked at VR use in palliative care but we were especially interested in finding out if <em>personalised</em> VR sessions were associated with meaningful changes in pain and depression symptoms.</p>
<p>Personalised VR means each person experiences content that is meaningful to that individual. So rather than asking patients to choose, for example, between a rainforest and a beach VR experience, we interviewed the patients before their sessions to gauge their interests and create a VR session tailored to them.</p>
<p>For example, one person said they wanted a VR experience that allowed them to explore Paris again. Others had migrated to Australia from the UK so they asked for VR experiences that brought them back to the country where they were born. One person was a big fan of Star Wars, so we provided a VR Star Wars game.</p>
<p>For our study, we asked 16 palliative care patients from an acute ward in a South Australian hospital to participate in three VR sessions using a headset that is now known as Meta Quest 2. The participants, who ranged in age from 48 to 87 years old, used the headset for around 20 minutes per session. The primary VR applications we used were Wander and YouTube VR. </p>
<p>We asked each participant about their emotional and physical symptoms before and after each session.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/1xZHsQJlqbA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">One of the apps used in our study was Wander.</span></figcaption>
</figure>
<h2>What we found</h2>
<p>We found just 20 minutes of VR immersion could immediately reduce the participants’ subjective feelings of both physical pain and emotional pain (such as depression). At least half of the participants reported significant relief after a single session, and two out of three reported relief after at least one of their sessions. </p>
<p>One person told us:</p>
<blockquote>
<p>When the service is finished you feel like you’re floating. [It takes a] weight off your shoulders.</p>
</blockquote>
<p>Another said:</p>
<blockquote>
<p>Well, I’d rather lie here thinking about a fish swimming [or] a Willy Nelson concert than be dying […] I enjoyed it.</p>
</blockquote>
<p>One participant told us:</p>
<blockquote>
<p>Oh, it’s just amazing, it was nothing like I expected […] it takes you from this world into another beautiful world.</p>
</blockquote>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="An older woman in a wheelchair uses VR." src="https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/576357/original/file-20240219-16-eq6rm7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">While the benefits of VR were profound for some, they were not universal.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/retired-woman-using-vr-glasses-nurse-2073593054">DC Studio/Shutterstock</a></span>
</figcaption>
</figure>
<p>A different person said:</p>
<blockquote>
<p>[…] by the time you get to where I am, there’s things you think of, ‘I wish I’d done this, I wish I’d had the chance to have been able to do that’ and then this offers you that experience to have just about feel like you’ve been there.</p>
</blockquote>
<p>While the benefits of VR were profound for some, they were not universal. </p>
<p>Some participants reported feeling worse after the VR sessions.</p>
<p>One person said the headset felt too heavy on their cheekbone, another said they experienced nausea after using the VR.</p>
<h2>Where to from here?</h2>
<p>We and others have now collected good evidence VR can be a helpful palliative care therapy for some patients – but not all. It is not a universal remedy.</p>
<p>More research is needed to better understand which patients will benefit the most from VR and how we can best use it. It’s also worth remembering skilled staff need to be on hand to support a patient to use VR; it’s no good just buying a VR set and expecting patients to use it on their own.</p>
<p>Our study, while limited, shows VR therapy may in some cases have a role to play to help palliative care patients experience moments of joy and comfort despite the seriousness of their illness.</p>
<p class="fine-print"><em><span>Tobias Loetscher received funding from the Breakthrough Mental Health Research Foundation for this project.</span></em></p><p class="fine-print"><em><span>Gregory Crawford has received funding from the NHMRC, the MRFF and Cancer Australia. </span></em></p>One person said they wanted a VR experience that allowed them to explore Paris again.Tobias Loetscher, Associate Professor, University of South AustraliaGregory Crawford, Professor in Palliative Medicine, University of AdelaideLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2216082024-01-31T19:08:15Z2024-01-31T19:08:15ZVirtual reality grooming is an increasing danger. How can parents keep children safe?<figure><img src="https://images.theconversation.com/files/572092/original/file-20240130-27-o3plxl.jpg?ixlib=rb-1.1.0&rect=1208%2C906%2C4920%2C3286&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/girl-7-yo-experiencing-vr-headset-1391525618">SOK Studio/Shutterstock</a></span></figcaption></figure><p>Virtual reality (VR) headsets are increasingly popular among adults and children. They are part of extended reality environments, which “<a href="https://onlinelibrary.wiley.com/doi/10.1002/9781119636113.ch30">enable ever more realistic and immersive experiences</a>”.</p>
<p>VR provides entry into computer-generated 3D worlds and games with different environments and interactions. Sometimes this is loosely referred to as the “metaverse”.</p>
<p>Most VR headsets have a <a href="https://www.nytimes.com/2023/06/16/technology/meta-virtual-reality-headset-children-safety.html">lower age limit of 10–13 years</a> due to <a href="https://www.unicef.org/globalinsight/stories/metaverse-and-children">safety concerns</a> about extended reality technologies in general and VR headsets in particular.</p>
<p>But VR is increasingly used by young children, even of <a href="https://www.sciencedirect.com/science/article/abs/pii/S0747563222003661">preschool age</a>. These immersive technologies make it difficult to monitor children’s physical and emotional experiences and with whom they interact. So what are the dangers, and what can we do to keep the kids safe?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">What is the metaverse, and what can we do there?</a>
</strong>
</em>
</p>
<hr>
<h2>The good and the bad</h2>
<p>VR allows children to dive into a digital world where they can take on different characters (avatars). Thanks to the richness of the stimuli, VR can give the illusion of actually being in the virtual location – this is called “<a href="https://www.techtarget.com/whatis/definition/virtual-presence">virtual presence</a>”.</p>
<p>If children then interact with other people in the virtual world, the psychological realism is enhanced. These experiences can be fun and rewarding.</p>
<p>However, they can also have negative impacts. Children tend to have <a href="https://www.unicef.org/globalinsight/media/3056/file/UNICEF-Innocenti-Rapid-Analysis-Metaverse-XR-and-children-2023.pdf.pdf">difficulty distinguishing</a> between what occurs within VR and in the real world.</p>
<p>As children <a href="https://link.springer.com/article/10.1007/s10055-021-00563-w">identify with their avatars</a>, the boundary between them and the <a href="https://learning.nspcc.org.uk/research-resources/2023/child-safeguarding-immersive-technologies">VR device is blurred</a> when playing in the metaverse.</p>
<p>Children can even develop traumatic memories when playing in virtual worlds. Due to the immersive nature of VR, the sense of presence makes it feel as if the child’s avatar is actually “real”. </p>
<p>Research is still emerging, but it is known children can form memories from virtual experiences, which means sexual abuse that occurs virtually could turn into a <a href="https://learning.nspcc.org.uk/research-resources/2023/child-safeguarding-immersive-technologies">real-world traumatic memory</a>.</p>
<h2>The rise of ‘cyber grooming’</h2>
<p>Research has found that online predators use different grooming strategies to <a href="https://learning.nspcc.org.uk/media/ezjg0pjb/online-risks-children-evidence-review-main-report.pdf">manipulate children into sexual interactions</a>. This sometimes leads to offline encounters without the knowledge of parents.</p>
<p>Non-threatening grooming strategies that build relationships are common. Perpetrators may use friendship strategies to develop a relationship with children and to build trust. The child then views the person as a trusted friend rather than a stranger. As a result, the prevention messages about strangers learned through education programs are ineffective in protecting children.</p>
<p>A recent <a href="https://journals.sagepub.com/doi/pdf/10.1177/15248380231194072?casa_token=1nA9ftvuB_AAAAAA:QlhaRr0-ori1SG5SdwUnV9XwK8JUhmwHW3hzgZyJluTDbFPi5KlxCZe2tfvsmHPRXpMuGLeiONiWVxg">meta-analysis</a> found that online sex offenders are usually acquaintances. Unsurprisingly, a proportion of adult predators pretend to be peers (that is, other children or teens).</p>
<p>Sexual approaches by adults occur more commonly on platforms that are widely used by children. “Sexual communication with a child” offences, according to police statistics from the United Kingdom, <a href="https://learning.nspcc.org.uk/media/ezjg0pjb/online-risks-children-evidence-review-main-report.pdf">increased by 84%</a> between 2017–18 and 2021–22.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/children-have-been-interacting-in-the-metaverse-for-years-what-parents-need-to-know-about-keeping-them-safe-202303">Children have been interacting in the metaverse for years – what parents need to know about keeping them safe</a>
</strong>
</em>
</p>
<hr>
<p>Due to the hidden nature of cyber grooming, it is difficult to know the true prevalence of this issue. Some <a href="https://learning.nspcc.org.uk/research-resources/2023/online-risks-to-children-evidence-review">police reports in Europe indicate</a> that approximately 20% of children have experienced online sexual solicitation, and up to 25% of children reported sexual interaction with an adult online.</p>
<p>Concerning reports by Europol indicate that <a href="https://doi.org/10.2813/81062">children have been drawn into erotic role play</a> online. In interviews with researchers, some parents <a href="https://www.researchgate.net/publication/339228857_2020_Educated_women_as_gatekeeprs_to_prevent_sexual_exploitation_of_children#fullTextFileContent">have also shared anecdotal experiences</a> of their children being exposed to explicit sex acts on social online gaming platforms <a href="https://blog.hootsuite.com/what-is-roblox/">such as Roblox</a>.</p>
<p>Such encounters have the potential to create memories as if the virtual experience had happened in real life.</p>
<p>For parents it is important to know that cyber groomers are well versed in the use of extremely popular virtual worlds. These provide predators with anonymity and easy access to children, where they can lure them into sexual engagement.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Young boy in VR headset stands in his bedroom and uses wireless controllers in his hands" src="https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=316&fit=crop&dpr=1 600w, https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=316&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=316&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=397&fit=crop&dpr=1 754w, https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=397&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/572095/original/file-20240130-27-xcu9gi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=397&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Children can immerse themselves in virtual worlds, where interacting with others is fun, but potentially confusing.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/young-boy-vr-headset-plays-augmented-2321840963">Frame Stock Footage/Shutterstock</a></span>
</figcaption>
</figure>
<h2>Parents must try VR themselves</h2>
<p>A recent report from the <a href="https://www.iwf.org.uk/news-media/news/under-10s-groomed-online-like-never-before-as-hotline-discovers-record-amount-of-child-sexual-abuse/">Internet Watch Foundation charity</a> found that a record number of young children have been manipulated into performing sexual acts online.</p>
<p>Through the metaverse, a sexual offender can be virtually brought into a child’s bedroom and engage in sexual behaviours through the child’s VR device. As VR worlds become more immersive, the danger for children only increases.</p>
<p>Grooming occurs where parents least expect it to happen. To mitigate this danger, parents need to be aware of <a href="https://icmec.org.au/blog/the-new-stranger-danger-tactics-used-in-the-online-grooming-of-children/">online grooming patterns</a> – such as isolating the child, developing their trust and asking them to hide a relationship.</p>
<p>Recognising the signs early can prevent the abuse from happening. But this can be difficult if parents aren’t familiar with the technology their child is using.</p>
<p>To help them understand what their children experience in extended reality environments, parents must familiarise themselves with VR and the metaverse.</p>
<p>If parents experience and experiment with the VR technology themselves, they can have conversations with their children about their experiences and understand with whom the child might interact.</p>
<p>This will allow parents to make informed decisions and put tailored safeguarding measures in place. These safeguards include reviewing the parental controls and safety features on each platform, and actively learning what their children are playing and whom they are interacting with.</p>
<p>With such safeguards in place, parents can allow their children to have fun with VR headsets while keeping them protected.</p>
<hr>
<p><em>If you believe your child is targeted by grooming or exploitation, or you come across exploitation material, you can report it via <a href="https://www.thinkuknow.org.au/">ThinkuKnow</a> or contact your local police.</em></p>
<p><em>If you are a child, teen or young adult who needs help and support, call the Kids Helpline on 1800 55 1800.</em></p>
<p><em>If you are an adult who experienced abuse as a child, call the Blue Knot Helpline on 1300 657 380 or <a href="https://blueknot.org.au/survivors/blue-knot-helpline-redress-support-service/">visit their website</a>.</em></p>
<p class="fine-print"><em><span>Marika Guggisberg does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p>Online dangers lurk in virtual worlds for children. As more preschoolers immerse themselves in virtual reality, we must manage the risks and keep them safe.</p>
<p>Marika Guggisberg, Senior Lecturer, Domestic and Family Violence, CQUniversity Australia. Licensed as Creative Commons – attribution, no derivatives.</p>
<h2>Editing memories, spying on our bodies, normalising weird goggles: Apple’s new Vision Pro has big ambitions</h2>
<p>Published 2024-01-29.</p>
<figure><img src="https://images.theconversation.com/files/571783/original/file-20240128-25-8hsbjk.png?ixlib=rb-1.1.0&rect=7%2C3%2C2541%2C1425&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Apple</span></span></figcaption></figure><p>Apple Vision Pro is a mixed-reality headset – which the company hopes is a “<a href="https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/">revolutionary spatial computer</a> that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” – that begins shipping to the public (in the United States) later this week. </p>
<p>Critics have <a href="https://www.wired.com/story/apple-vision-pro-doomed/">doubted the appeal</a> of the face-worn computer, which “seamlessly blends digital content with the physical world”, but Apple has pre-sold <a href="https://www.engadget.com/apple-might-have-sold-up-to-180000-vision-pro-headsets-over-pre-order-weekend-081727344.html">as many as 180,000</a> of the US$3,500 gizmos.</p>
<p>What does Apple think people will do with these pricey peripherals? While uses will evolve, Apple is focusing attention on watching TV and movies, editing and reliving “memories”, and – perhaps most importantly for the product’s success – having its customers not look like total weirdos.</p>
<p>Apple hopes the new device will redefine personal computing, like the iPhone did 16 years ago, and Macintosh did 40 years ago. But if it succeeds, it will also redefine concerns about privacy, as it captures enormous amounts of data about users and their environments, creating an unprecedented kind of “<a href="https://journals.sagepub.com/doi/abs/10.1177/1354856521989514">biospatial surveillance</a>”.</p>
<h2>Spatial computing</h2>
<p>Apple is careful about its brand and how it packages and describes its products. In an extensive set of <a href="https://developer.apple.com/visionos/submit/#:%7E:text=Don%27t%20refer%20to%20Apple,first%20word%20in%20a%20sentence.">rules for developers</a>, the company insists the new headset is not to be referred to as a “headset”. What’s more, the Apple Vision Pro does not do “augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR)” – it is a gateway to “spatial computing”.</p>
<p>Spatial computing, as sketched out in the <a href="https://acg.media.mit.edu/people/simong/thesis/SpatialComputing.pdf">2003 PhD thesis</a> of US software engineer Simon Greenwold, is: “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces”. In other words, the computer can interact with things in the user’s physical surroundings in real time to provide new types of experiences.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A CGI dinosaur stands on a rocky field." src="https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=328&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=328&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=328&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=412&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=412&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571805/original/file-20240129-25-4y9k16.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=412&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Vision Pro comes with an app that lets users get up close and personal with dinosaurs.</span>
<span class="attribution"><a class="source" href="https://www.apple.com/tv-pr/news/2024/01/apple-tv-unveils-groundbreaking-immersive-originals-from-todays-biggest-storytellers-set-to-debut-on-apple-vision-pro/">Apple</a></span>
</figcaption>
</figure>
<p>The Vision Pro has big shoes to fill for new user experiences. The iPhone’s initial “killer apps” were <a href="https://www.macworld.com/article/183052/liveupdate-15.html">clear</a>: the internet in your pocket (including portable access to Google Maps), all your music on a touch screen, and “<a href="https://www.youtube.com/watch?v=c3j03bOOBwY">visual voicemail</a>”. </p>
<p>Sixteen years later, all three of these seem unremarkable. Apple has sold billions of iPhones, and some <a href="https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/">80% of humans</a> now use a smartphone. Their success has all but killed off earlier tools like paper maps and music CDs (and the ubiquity of text, image and video messaging has largely done away with voicemail itself).</p>
<h2>Killer apps</h2>
<p>We don’t yet know what the killer apps of spatial computing might be – if any – but here is where Apple is pointing our attention.</p>
<p>The first is entertainment: the Vision Pro promises “<a href="https://www.apple.com/newsroom/2024/01/apple-previews-new-entertainment-experiences-launching-with-apple-vision-pro/">the ultimate personal theatre</a>”.</p>
<p>The second is an attempt to solve the social problem of walking around with a weird headset covering half your face. An external screen on the goggles shows a constantly updated representation of your eyes to <a href="https://cavrn.org/the-identity-emotion-and-gaze-behind-apples-vision-pro/">offer important social cues about your gaze</a> to those around you. Admittedly, this looks weird. But Apple hopes it is less weird and more useful than trying to interact with humans wearing blank aluminium ski goggles.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man sitting on a couch wearing a headset while an image of children playing floats in the air in front of him." src="https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=325&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=325&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=325&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=408&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=408&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571806/original/file-20240129-27-kmnd7y.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=408&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Reliving ‘memories’ with the Apple Vision Pro.</span>
<span class="attribution"><a class="source" href="https://www.apple.com/apple-vision-pro/">Apple</a></span>
</figcaption>
</figure>
<p>The third is the ability to capture and relive “memories”: recording and playback of 3D visual and audio from real events. Reviewers have found it striking:</p>
<blockquote>
<p>this was <a href="https://www.cnet.com/tech/computing/i-saw-my-iphone-spatial-movies-in-apple-vision-pro/">stuff from my own life</a>, my own memories. I was playing back experiences I had already lived.</p>
</blockquote>
<p>Apple has <a href="https://www.patentlyapple.com/2023/10/a-new-vision-pro-patent-describes-its-3d-camera-allowing-users-to-relive-memories-add-notes-commentary-about-that-moment.html">patented</a> tools to select, store, and annotate digital “memories”. These memories are files, and potentially products, to be shared in “spatial videos” <a href="https://www.apple.com/au/newsroom/2023/12/apple-introduces-spatial-video-capture-on-iphone-15-pro/">recorded on the latest iPhones</a>. </p>
<h2>Biospatial surveillance</h2>
<p>There is already a large infrastructure devoted to helping tech companies track our behaviour in order to sell us things. Recent <a href="https://www.consumerreports.org/electronics/privacy/each-facebook-user-is-monitored-by-thousands-of-companies-a5824207467/">research</a> found Facebook, for example, receives data from an average of around 2,300 companies on each individual user. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/explainer-what-is-surveillance-capitalism-and-how-does-it-shape-our-economy-119158">Explainer: what is surveillance capitalism and how does it shape our economy?</a>
</strong>
</em>
</p>
<hr>
<p>Spatial computing offers a step change to this tracking. In order to function, spatial computing records and uses vast amounts of intimate data about our bodies and surroundings. </p>
<p>One <a href="https://www.slideshare.net/kentbye/towards-a-framework-for-xr-ethics-kent-bye-awe-november-11-2021">study on headset design</a> noted no fewer than 64 different streams of biometric and physiological data, from eye tracking and pupil response to subtle changes in the body’s electromagnetic field. </p>
<h2>Your face tomorrow</h2>
<p>This is not “consumer” data like the brand of toothpaste you buy. It is more akin to medical data. </p>
<p>For instance, <a href="http://www.mkhamis.com/data/papers/abraham2022nordichi.pdf">analysing a person’s unconscious movements</a> can reveal their emotional state or even predict neurodegenerative disease. This is called “<a href="https://xrsi.org/definition/biometrically-inferred-data-bid">biometrically inferred data</a>” as users are unaware their bodies are giving it up.</p>
<p>Apple suggests it won’t share this type of data with anyone, and it has proven better than most companies on privacy. But biospatial surveillance nonetheless draws ever more of our bodies and surroundings into spatial computing, and that reach is only expanding.</p>
<p>It starts simply enough in the pre-order process, where you need to scan your facial features with your iPhone (to ensure a snug fit). But that’s not the end of it.</p>
<p>Apple’s <a href="https://patents.google.com/patent/WO2023196257A1/en?oq=WO2023196257">patent about memories</a> is also about how to “guide and direct a user with attention, memory, and cognition” through feedback loops that monitor “facial recognition, eye tracking, user mood detection, user emotion detection, voice detection, etc. [from a] bio-sensor for tracking biometric characteristics, such as health and activity metrics […] and other health-related information”. </p>
<h2>Social questions</h2>
<p>Biospatial surveillance is also the key to Apple’s attempt to solve the social problems created by wearing a headset in public. The external screen showing a simulated approximation of the user’s gaze relies on constant measurement of the user’s expression and eye movement with multiple sensors.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man wearing goggles with a screen that shows his eyes" src="https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=312&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=312&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=312&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=393&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=393&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571865/original/file-20240129-21-5qnrow.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=393&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An external screen shows a representation of the user’s eyes.</span>
<span class="attribution"><a class="source" href="https://youtu.be/IY4x85zqoJM?feature=shared&t=57">Apple</a></span>
</figcaption>
</figure>
<p>Your face is constantly mapped so others can see it – or rather see Apple’s vision of it. Likewise, as passersby come into range of the Apple Vision Pro’s sensors, Apple’s vision of them is automagically rendered into your experience, whether they like it or not. </p>
<p>Apple’s new vision of us – and those around us – shows how the requirements and benefits of spatial computing will pose new privacy concerns and social questions. The extensive biospatial surveillance that captures intimate biometric and environmental data redefines what personal data and social interactions are open to exploitation.</p>
<p class="fine-print"><em><span>Luke Heemsbergen does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p>Apple’s plan for ‘spatial computing’ may redefine personal computing – and also facilitate troubling new kinds of surveillance.</p>
<p>Luke Heemsbergen, Senior Lecturer, Digital, Political, Media, Deakin University. Licensed as Creative Commons – attribution, no derivatives.</p>
<h2>Four ways AI will impact music, from Elvis holograms to interactive soundscapes</h2>
<p>Published 2024-01-22.</p>
<p>In the heart of London, a <a href="https://www.theguardian.com/music/2024/jan/04/elvis-evolution-hologram-show-london-premiere">new kind of show</a> is unfolding. Elvis Presley, the king of rock ‘n’ roll, is to take to the stage once more – not in flesh and blood of course, but as a <a href="https://www.respeecher.com/blog/holograms-real-life-technology-works-industry-use-cases">hologram</a>. This spectacle, titled <a href="https://elvis.layeredreality.com/">Elvis Evolution</a>, is more than just a concert and offers a distinct experience from the likes of Abba’s digital avatars: it’s a testament to how artificial intelligence (AI) is reshaping our experience of music and performance.</p>
<p>Unlike the <a href="https://abbavoyage.com/">Abba Voyage</a> hologram show, which primarily focuses on delivering a high-tech concert experience, Elvis Evolution aims to provide an <a href="https://news.sky.com/story/ai-elvis-presley-to-star-on-uk-stage-for-first-time-with-never-seen-before-performances-13041602">immersive journey</a> into Elvis’s life. It will feature interactive sets and multi-sensory elements to transport the audience back in time.</p>
<p>While Abba Voyage uses <a href="https://www.huffingtonpost.co.uk/entry/abba-voyage-behind-the-scenes-secrets_uk_6290de90e4b0cda85dbdf503">motion-capture technology</a> for highly detailed avatars, Elvis Evolution will employ AI-generated animation for a more flexible and dynamic performance, potentially featuring different eras of Presley’s career.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/iEikjzZO2N8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Crucially, this AI-powered Elvis offers the possibility of real-time interaction with the audience, unlike the prerecorded nature of Abba Voyage. This means that while both shows are remarkable in their own right, Elvis Evolution will offer a broader exploration of an artist’s life and career – a unique, multi-sensory holographic experience.</p>
<p>AI-driven holographic projection is not a recent development; it has been in use for some years. A notable example is the DJ Eric Prydz, who has been <a href="https://www.livedesignonline.com/news/eric-prydz-holo-stuns-holographic-3d-fx-powered-by-avolites-ai">incorporating AI-driven holographic projections</a> into his electronic dance music performances for more than 15 years.</p>
<p>But the Elvis hologram show serves as an impressive demonstration of AI’s capability to resurrect iconic artists more and more realistically. Utilising AI in conjunction with holographic projection, technicians and artists are able to forge an almost tangible representation of revered entertainers who are now long gone.</p>
<p>This method involves a detailed analysis of thousands of photographs and videos, enabling a recreation that captures the true spirit of the artist. This isn’t just a trip down memory lane – it’s a leap into a new era where technology bridges the gap between past and present, allowing fans to relive concerts that were once just a fleeting moment in time.</p>
<p>But how else will the rapidly accelerating technology of AI affect music and performance? </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/3siwZmnxIDo?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>1. AI in music production</h2>
<p>Transitioning from the stage to the studio, AI’s role in music production is equally ground-breaking. As an <a href="https://www.siliconindia.com/news/general/ai-scientist-somdip-dey-aka-intelidey-breaks-ground-with-his-first-melodic-deep-house-release-warning-nid-225637-cid-1.html">AI music producer</a>, I have experienced firsthand how these algorithms can compose, create unique sounds and even foresee music trends.</p>
<p>AI tools analyse vast amounts of music data to learn patterns and styles, enabling them to generate compositions in any genre. This <a href="https://timesofindia.indiatimes.com/blogs/a-window-to-the-tech-world/generative-ais-crescendo-in-music-production/">technology</a> is not just a tool, it’s a collaborator, opening doors to new soundscapes and musical possibilities.</p>
<h2>2. The future of live performance</h2>
<p>Beyond holograms, AI is poised to transform live performances in ways we’ve only begun to imagine. Picture a concert where the music adapts in real-time to the mood of the audience, or where immersive soundscapes change based on real-time interactions. </p>
<p>These AI-driven experiences promise to make live shows more dynamic and responsive, offering a level of personalisation that goes beyond a one-size-fits-all performance.</p>
<h2>3. Ethical and creative implications</h2>
<p>The use of AI to resurrect artists for posthumous performances sparks a profound ethical debate. This dilemma centres on the question: is it ethical to “resurrect” artists for performances they never consented to? </p>
<p>On the one hand, these technological marvels allow fans to relive the magic of legendary performers, creating new experiences with historical figures. On the other, it raises concerns about consent, authenticity and the moral rights of those who are no longer here to voice their opinions.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/uJE8pfPfVRo?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Addressing these ethical dilemmas involves a multifaceted approach. First, the consent of the artist’s estate is critical. This includes respecting the wishes of the family and the legal entities that manage the artist’s legacy. However, legal consent is just one aspect. There’s also a moral responsibility to stay true to the artist’s style, ethos and message. This means not just recreating an artist’s likeness but capturing the essence of their artistry in a way that honours their legacy.</p>
<p>Another layer to this debate is the authenticity of the experience. While AI and holographic technology can create visually and sonically accurate representations, they cannot encapsulate the spontaneous, human nuances that defined many great performers. Preserving the integrity of the original performances becomes essential. It’s about striking a balance between innovation and respect, ensuring these recreations do not distort or oversimplify the artist’s contributions to their art.</p>
<p>Crucially, there’s also a creative responsibility that comes with using AI in this context. It should be about more than just replicating past performances. These concerts should also explore how these artists might have evolved or collaborated with contemporary talents. </p>
<p>This approach not only pays homage to the artists’ past works but also imagines their potential future contributions, blending historical influence with modern creativity.</p>
<h2>4. Audience engagement</h2>
<p>Finally, AI is revolutionising how audiences engage with music. From virtual reality concerts offering a 360-degree sensory experience to <a href="https://techcrunch.com/2023/12/14/spotify-confirms-test-of-prompt-based-ai-playlists-feature/">AI-curated playlists</a> that understand our preferences better than we do, the future of music is not just about listening, it’s about experiencing. We are moving towards a world where music is not just heard but felt and lived in ways that transcend traditional boundaries.</p>
<p>As we stand at the crossroads of technology and creativity, the possibilities are as limitless as our imagination. The Elvis hologram show is just the beginning of the future of AI-led concerts. In this new landscape, AI is not just a tool – it’s a canvas, a stage and a new voice in the chorus of musical innovation.</p>
<hr>
<p class="fine-print"><em><span>Somdip Dey does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Thanks to AI, the future of music is not just about listening, it’s about experiencing.Somdip Dey, Embedded Artificial Intelligence Scientist & AI Music Producer, University of EssexLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2126992023-11-12T19:15:26Z2023-11-12T19:15:26ZA 360 camera, 1°C weather and an ambitious VR documentary: what I learnt as cinematographer on Sorella’s Story<figure><img src="https://images.theconversation.com/files/548189/original/file-20230913-23-58a3d9.jpg?ixlib=rb-1.1.0&rect=2%2C0%2C1747%2C2043&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Author provided</span></span></figcaption></figure><p>How does one successfully navigate obstacles such as extreme weather, a tight deadline and a spontaneous shot list in a foreign country as a solo cinematographer on a 360 project? </p>
<p>In December 2019, I was part of a group of Griffith Film School master’s degree students who travelled to Hungary and Latvia to create an immersive short documentary film using 360 virtual reality (VR) technology. </p>
<p>Sorella’s Story, written and directed by Peter Hegedus, associate professor and filmmaker at Griffith University, showcases re-enactments based on photos of atrocities committed against Jewish people during the Holocaust in Latvia.</p>
<p>The shot schedule was ambitious. We had five exterior scenes to be shot in only a few hours because of the limited daylight. We had a crew of about ten people. </p>
<p>I was director of photography and the only cameraperson. A daunting task in any filmmaking situation, it was made tenfold more challenging by being a 360 project that no one on the crew had experience working with.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/a327ndvC3gY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-air-we-breathe-how-i-have-been-observing-atmospheric-change-through-art-and-science-187985">The air we breathe: how I have been observing atmospheric change through art and science</a>
</strong>
</em>
</p>
<hr>
<h2>New technology brings new challenges</h2>
<p>Viewed through virtual reality lenses, 360-degree films offer the viewer an opportunity to watch a video from all angles.</p>
<p>Unlike traditional cameras with a single lens, our 360 camera looks like a soccer ball, with six small lenses placed throughout the body. </p>
<p>It was a new technology for me and I was curious to see how it would change our approach. For example, the six lenses film simultaneously, so the operator and crew need to find a safe spot to hide to avoid being caught in the frame. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Split screen: women in white, two people in coats." src="https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=406&fit=crop&dpr=1 600w, https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=406&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=406&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=510&fit=crop&dpr=1 754w, https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=510&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/548184/original/file-20230913-29-8j8p81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=510&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The documentary film featured re-enactments based on photographs.</span>
<span class="attribution"><span class="source">Author provided.</span></span>
</figcaption>
</figure>
<p>The distance actors appear from the lens is especially important in 360 filming. This is because the images are “stitched” together in post-production. If the subjects are too close to the lenses, the images can’t be combined to create the appearance of a single shot. </p>
<p>After our test shoots, we gave the actors marks to hit when entering and exiting the frame, along with the maximum and minimum distances they could be from the camera. These adjustments enabled us to capture the action from every angle. </p>
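<p>As a rough illustration, the constraint we gave the actors can be expressed as a simple check. The minimum and maximum distances below are hypothetical placeholders rather than the actual figures from our shoot, which depend on the specific camera and stitching software:</p>

```python
# Illustrative sketch of a "stitch-safe" distance check for 360 filming.
# MIN_DIST_M and MAX_DIST_M are hypothetical values; real limits depend
# on the camera's lens spacing and the stitching software used.

MIN_DIST_M = 1.5   # closer than this, parallax between the lenses breaks the stitch
MAX_DIST_M = 30.0  # farther than this, the subject reads too small in the frame

def stitch_safe(distance_m: float) -> bool:
    """True if a subject at this distance can be stitched into a clean single image."""
    return MIN_DIST_M <= distance_m <= MAX_DIST_M

print(stitch_safe(0.8))  # an actor brushing past the camera
print(stitch_safe(4.0))  # a typical mid-ground position
```

<p>In practice we worked the other way round: we measured the safe range during test shoots, then marked it on the ground for the actors.</p>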
<p>We needed precise blocking and rehearsed co-ordination between actors and crew to capture the entire scene. Every time a scene was recorded, the director would call action, and the sound and camera crew would have a few seconds to run and hide out of frame. Only then would the actors begin to move. </p>
<p>Filming in 360 inherently brings technical challenges, but Sorella’s Story had the compounding issues of harsh weather, a remote location, a cinematographer working without a camera crew, and limited time to learn the technology. </p>
<h2>Shooting plan</h2>
<p>Prior to filming a conventional project, directors and cinematographers break down the script into a shot list – a written breakdown of every shot that will be undertaken – and a storyboard, which visualises those shots through illustrations or sketches. </p>
<p>Both tools help the filmmaking process and ensure the creative vision is realised on set. </p>
<p>Storyboards are less important in 360: you aren’t considering how different angles will be used in a shoot, and there is much more spontaneity in the actors’ movement. There is so much action to capture at once that storyboards would just confuse the issue. </p>
<p>Instead, we followed a shot list and script in some moments, but used them only as a guide.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A film set." src="https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=565&fit=crop&dpr=1 754w, https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=565&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/548186/original/file-20230913-21-k6aatf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=565&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The cast and crew faced cold and icy conditions.</span>
<span class="attribution"><span class="source">Author provided.</span></span>
</figcaption>
</figure>
<h2>Obstacles and problem-solving</h2>
<p>December is one of the coldest months of the year in Budapest, Hungary, with average temperatures of no more than 1°C. At this time of year the days are short, the nights are long, and icy weather conditions are expected. Those conditions brought another challenge: the battery life of electronic devices.</p>
<p>I quickly learned cold weather drains the battery. I tried to reduce the camera’s exposure to the cold by covering it with my beanie, with limited success. Battery life that was usually two hours dropped to 20 minutes. </p>
<p>Because of the limited budget, we had only two batteries for each device. Ideally, we would have one battery in the camera and the other plugged into the charger. </p>
<p>However, we had no power supply on set. Every time a battery ran out, it was a 10-minute trip to the nearest power supply, plus at least 30 minutes to recharge.</p>
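<p>To see why this was so costly, the timings can be put into a small model. This is a minimal sketch using the approximate figures above (20-minute cold battery life, a 10-minute trip to power, 30 minutes to recharge), and it assumes we shoot and recharge strictly in sequence, which simplifies what actually happened on set:</p>

```python
# Rough model of our on-set battery logistics in the cold.
# All figures are approximate values from the shoot; this is an
# illustrative sketch, not production planning software.

COLD_BATTERY_MIN = 20   # battery life in the cold (usually ~120 minutes)
TRIP_TO_POWER_MIN = 10  # walk to the nearest power supply
RECHARGE_MIN = 30       # minimum time on the charger

def shooting_time(batteries: int, window_min: int) -> int:
    """Minutes of actual filming possible in a daylight window,
    draining each charged battery in turn and, once all are flat,
    making one trip to recharge a single battery."""
    filming = 0
    charged = batteries  # batteries currently ready to use
    elapsed = 0
    while elapsed < window_min and charged > 0:
        use = min(COLD_BATTERY_MIN, window_min - elapsed)
        filming += use
        elapsed += use
        charged -= 1
        trip = TRIP_TO_POWER_MIN + RECHARGE_MIN
        if charged == 0 and elapsed + trip <= window_min:
            elapsed += trip
            charged = 1
    return filming

print(shooting_time(batteries=2, window_min=240))  # → 100
```

<p>Under these assumptions, a four-hour daylight window yields only about 100 minutes of actual filming time.</p>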
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A beanie on a camera." src="https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=337&fit=crop&dpr=1 600w, https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=337&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=337&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=423&fit=crop&dpr=1 754w, https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=423&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/548185/original/file-20230913-27-37qwo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=423&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Gilberto Roque protecting the camera from the snow.</span>
<span class="attribution"><span class="source">Jemma Potgieter</span></span>
</figcaption>
</figure>
<p>Shooting in this cold climate, ensuring I was invisible on set and maintaining the delicate balance of the distance of actors from the camera demanded a complete re-evaluation of my filmmaking approach. It forced me to be agile in my workflow and engage in real-time problem-solving. </p>
<p>Despite the inherent challenges, working on this project provided me with invaluable experience in this cutting-edge technology. With the current interest in immersive experiences, 360 cinematography has a part to play in cinema’s future.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/google-earth-is-an-illusion-how-i-am-using-art-to-explore-the-problematic-nature-of-western-maps-and-the-myth-of-terra-nullius-187921">Google Earth is an illusion: how I am using art to explore the problematic nature of western maps and the myth of 'terra nullius'</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/212699/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Gilberto Roque was a master's student at Griffith Film School when working on Sorella's Story.</span></em></p>How does one successfully navigate obstacles such as extreme weather, a tight deadline and a spontaneous shot list in a foreign country on a 360 project as a solo cinematographer?Gilberto Roque, Lecturer, Filmmaker and Cinematographer, School of Creative Arts, University of Southern QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2134172023-09-28T15:52:30Z2023-09-28T15:52:30ZVirtual reality can help emergency services navigate the complexities of real-life crises<figure><img src="https://images.theconversation.com/files/550453/original/file-20230926-29-i6fpq7.jpeg?ixlib=rb-1.1.0&rect=0%2C0%2C2363%2C1569&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Brandon May</span>, <span class="license">Author provided</span></span></figcaption></figure><p>The UK has experienced several terrorist attacks, from the 2005 London bombings, to the devastating events at Manchester Arena and London Bridge in 2017. These tragic incidents not only resulted in the loss of innocent lives but were also <a href="https://www.emerald.com/insight/content/doi/10.1108/JFP-03-2018-0007/full/html?utm_campaign=Emerald_Health_PPV_Dec22_RoN">immensely challenging for emergency response teams</a>.</p>
<p>Each of these events required coordination between several different emergency services: the police, fire services and medical teams. Combining expertise across emergency response teams is <a href="https://bpspsychub.onlinelibrary.wiley.com/doi/10.1111/joop.12349">extremely demanding</a>. </p>
<p>For instance, the ability to <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1100274/full">collaborate effectively and make ethical choices</a> is crucial for minimising harm and saving lives. However, there’s often little time in which to do so. Responders are required to make split-second decisions that can mean the difference between life and death.</p>
<p>Therefore, decision making by emergency response teams is a high stakes game that can <a href="https://link.springer.com/article/10.3758/s13421-020-01056-y">influence the outcomes of different crises</a>. Emergency personnel are often required to make quick, well-informed decisions <a href="https://militaryhealth.bmj.com/content/166/2/72.abstract">under extreme stress and with limited resources</a>. </p>
<p>We have been part of a team from the universities of Portsmouth and Winchester investigating how virtual reality (VR) can help prepare responders for these scenarios. We’ve also been looking at how to make the technology more affordable.</p>
<p>There are already <a href="https://bpspsychub.onlinelibrary.wiley.com/doi/abs/10.1111/joop.12159">training scenarios</a> designed to improve what’s known as “situational awareness” – the ability to quickly grasp and understand the risks and dynamics of an incident. However, effectively implementing them necessitates competent decision making. </p>
<p>A notable obstacle is <a href="https://bpspsychub.onlinelibrary.wiley.com/doi/abs/10.1111/joop.12108?casa_token=9vv_LVS-WZAAAAAA:jtwDqJv8aihy08zypWK3Sq7h8PeVrm51VaT6hU61AXKJsYzbO3tcHC1lR_cQGXpV5J5zBdsljEi1x9J1">decision inertia</a> – the inclination to stick with an existing plan, even when better alternatives emerge or the situation evolves. This inertia can have psychological origins, <a href="https://bpspsychub.onlinelibrary.wiley.com/doi/abs/10.1111/joop.12309">including the fear of making mistakes or a reluctance to stray from established protocols</a>. </p>
<p>In emergencies, this can lead to less-than-desirable outcomes, such as <a href="https://www.kerslakearenareview.co.uk/media/1022/kerslake_arena_review_printed_final.pdf">delayed action</a> and a failure to distribute the right resources <a href="https://bpspsychub.onlinelibrary.wiley.com/doi/pdfdirect/10.1111/joop.12217">where they’re most needed</a>.</p>
<h2>VR and decision making</h2>
<p>To better understand and address these challenges, researchers and those on the front line are increasingly turning to immersive technologies, <a href="https://journals.sagepub.com/doi/full/10.1177/2041386620926037">such as VR</a>. VR technology can provide users with realistic 3D simulations that closely mirror real world scenarios. This makes it an invaluable tool for studying decision making under high stress. </p>
<p>For emergency responders, there are many benefits to working with VR. Firstly, it can act as a training simulator, allowing first responders to practise critical decision making during a crisis scenario played out in a simulated environment, viewed <a href="https://library.imaging.org/ei/articles/34/12/ERVR-299">with a VR headset</a>. Secondly, it can <a href="https://www.sciencedirect.com/science/article/pii/S0002961007000712?casa_token=L3YT6ZxCeocAAAAA:J_2a-DpYmVs28CJPwQQH4gwOPu7or6L6HzPhsO_MOZLoHw-SY_mfzfVpycl8d0aJHqYlexPDPEWL">significantly reduce errors in the real world</a>. Given these advantages, the integration of VR in emergency training could be revolutionary.</p>
<p>For example, in real world emergencies, decision makers can swiftly identify patterns and cues, such as how best to respond to an unattended bag in a crowded area. These cues can in turn activate what are called mental scripts – predefined sequences of actions or solutions that have <a href="https://www.academia.edu/83266254/Imagination_and_expectation_The_effect_of_imagining_behavioral_scripts_on_personal_influences">proven effective in similar situations in the past</a>. This can minimise harm to both personnel and the public. </p>
<p>High fidelity (highly realistic) simulations offer the most authentic and immersive training experiences. However, despite the numerous advantages of VR in training employees and supporting real time decisions, its widespread adoption faces a major hurdle – a substantial price tag. </p>
<p>Industry solutions such as the VR development tool Unreal Engine, which forms the basis of many virtual environments, have been incredibly useful for training organisations. By practising their response strategies in <a href="https://www.theseus.fi/bitstream/handle/10024/120282/Tiira_Ville.pdf?sequence=1">virtual environments</a>, response teams improve their resilience to unexpected situations and their <a href="https://isprs-annals.copernicus.org/articles/X-3-W2-2022/9/2022/isprs-annals-X-3-W2-2022-9-2022.html">efficiency when faced with extreme events</a>. However, the financial burden of such advanced systems can be prohibitive for many emergency response organisations. Some of these organisations operate on limited financing, such as <a href="https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1100274/full">austerity budgets</a>.</p>
<p>Motivated by the need for more cost effective options, we are working with other researchers to investigate more affordable solutions that don’t sacrifice training quality. We have been developing proof of concept designs for VR systems that can fit on a desktop, using a laptop for example. Similar designs have been shown to be effective in <a href="https://www.tandfonline.com/doi/abs/10.1080/01639625.2017.1407104">recreating other types of event</a>.</p>
<p>By harnessing 360-degree cameras to create immersive simulations based around simulated terrorist events, desktop VR can closely <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/emp2.12903">mirror real world emergency scenarios</a>.</p>
<h2>Outperforming older methods</h2>
<p>We have recently produced our preliminary findings. They show that desktop VR can be a particularly valuable tool in training: its realism is pivotal in influencing users’ decision making. For example, VR-based training, especially the interactive kind that responds to a user’s input, has been <a href="https://www.sciencedirect.com/science/article/pii/S0925753523001170">shown to outperform</a> traditional video-based training methods. </p>
<p>When viewed through the lens of decision inertia, this suggests that immersive VR effectively emulates the high-stress, complex conditions under which emergency responders operate.</p>
<p>These immersive simulations can also recreate the <a href="https://econtent.hogrefe.com/doi/abs/10.1027/1016-9040/a000320?journalCode=epp">psychological processes that contribute to decision inertia</a> – a major challenge in real world crises. This applies to the cheaper, <a href="https://link.springer.com/article/10.1007/s11606-022-07557-7">desktop versions</a> of the simulations too. With this in mind, the value of VR emerges not just as a training tool, but also as an instrumental medium for exploring and understanding decision making processes.</p>
<p>Using these immersive technologies provides an alternative, realistic platform with <a href="http://hcilab.uniud.it/images/stories/publications/2014-09/FearArousal_VRST2014.pdf">similar performance outcomes</a> to conventional in-person or video-based training.</p>
<p>By leveraging more cost-effective VR technologies, such as desktop VR, emergency response organisations can provide high quality training that enhances the decision making skills of their employees. Immersive simulations can also create recognisable events that activate their mental scripts – the predefined actions that have proven useful in similar, past events.</p>
<p>This opens up possibilities for the wider adoption of VR in emergency response training, making it more accessible for agencies and organisations with different budgets.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>VR can help responders hone the decision making skills they’ll need in emergency scenarios.Brandon May, Lecturer in Criminology, University of WinchesterSelina Robinson, Senior Lecturer in Forensic Investigation, University of WinchesterLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2105322023-08-08T12:29:26Z2023-08-08T12:29:26ZVirtual reality has negative side effects – new research shows that can be a problem in the workplace<figure><img src="https://images.theconversation.com/files/540888/original/file-20230802-23-rmf25w.jpg?ixlib=rb-1.1.0&rect=171%2C208%2C6377%2C4451&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Using virtual reality headsets can have negative side effects, like dizziness, headaches and nausea.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/woman-hold-vr-headset-suffer-from-motion-sickness-royalty-free-image/1428085442?adppopup=true">Charnsitr/iStock via Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>Some employers are excited about swapping out computer monitors for virtual reality headsets, but the side effects of using VR are not completely understood. In a <a href="https://doi.org/10.1007/s10055-022-00672-0">recent study</a>, my colleagues and I propose 90 factors that could influence VR side effects in the workplace. In <a href="https://doi.org/10.3389/fpsyg.2023.1161932">another study</a>, we suggest guidelines to reduce these negative symptoms.</p>
<p>Our analysis considers over 350 studies to identify a range of VR side effects. Some negative symptoms of VR use – like headaches, tiredness, <a href="http://dx.doi.org/10.1136/bmjophth-2018-000146">eyestrain</a> and neck and shoulder pain – are familiar to those workers who sit at a computer all day. </p>
<p>But the nature of VR introduces new avenues for discomfort, such as disorientation, dizziness, nausea and increased <a href="https://doi.org/10.1152/physrev.2001.81.4.1725">muscle fatigue</a>. Users can be <a href="https://doi.org/10.1080/00140139.2014.956151">overwhelmed with too much information</a>, and sudden or intense <a href="http://public.ebookcentral.proquest.com/choice/publicfullrecord.aspx?p=1127743">sources of stress</a> – like unexpected noises when talking in front of a virtual audience – can diminish attention and memory.</p>
<p>There are many factors that can affect the frequency and severity of these side effects. Some of these characteristics relate to the virtual environment content – for example, how complicated the scene is or the way VR reproduces user movements. Others have more to do with the user, such as age or how long they’re immersed in the VR simulation.</p>
<p>Although more research is needed to identify the exact relationship between side effects and their contributing factors, <a href="https://doi.org/10.3389/fpsyg.2023.1161932">our study</a> suggests several guidelines to mitigate side effects. Each individual’s risk level is unique, but there are basic things anyone can do, like taking regular breaks, not using VR for more than 30 minutes at a time, and stopping use immediately when any symptoms start.</p>
<h2>Why it matters</h2>
<p>Studies have found that <a href="https://doi.org/10.1002/9781119636113.ch30">80% of VR users</a> report mild to severe short-term side effects. Symptoms can make it harder to <a href="https://doi.org/10.1109/TVCG.2022.3203103">efficiently do basic tasks</a> like reading and writing emails.</p>
<p>Nonetheless, several tech giants, <a href="https://theconversation.com/is-the-metaverse-really-the-future-of-work-192633">like Meta and Microsoft</a>, are promoting VR technology as the future of the workplace. But to safeguard workers, employers need a better understanding of the negative side effects of VR. </p>
<h2>What’s next</h2>
<p>Some government organizations, both <a href="https://www.fda.gov/medical-devices/digital-health-center-excellence/augmented-reality-and-virtual-reality-medical-devices">in the U.S.</a> and <a href="https://osha.europa.eu/en/publications/digitalisation-and-occupational-safety-and-health-eu-osha-research-programme">abroad</a>, have already begun to identify safety concerns and <a href="https://www.anses.fr/en/content/what-are-risks-virtual-reality-and-augmented-reality-and-what-good-practices-does-anses#:%7E:text=Exposure%20to%20virtual%20reality%20can,term%20%22virtual%20reality%20sickness%22.">propose guidelines</a> for mitigating the side effects of VR. While in line with our study’s findings, these safety guidelines are often very broad, and some <a href="https://www.iso.org/standard/81847.html">have yet to be finalized</a>.</p>
<p>More research is needed to improve the quality of evidence. One way to gather more data is to use physiological sensors and machine learning models to detect VR side effects and better link each factor to a given effect.</p>
<p>Although researchers can identify influential factors, we don’t yet fully understand which ones are linked to specific side effects – or how strong those connections are. Researchers believe some factors contribute to several VR side effects at once, and the reported symptoms themselves overlap considerably.</p>
<p>At best, following suggested guidelines could reduce the risks of using VR. With the current level of evidence, it’s difficult to assess how high these risks are. Most assessments about VR side effects are short term. Long-term studies are just starting to be launched or published. Additional research is crucial to ensuring that VR helps workers rather than harming them.</p>
<p class="fine-print"><em><span>Alexis Souchet receives funding from European Commission (H2020 program) and the Fulbright Commission. Alexis Souchet is a member of the AFXR - French association of all professionals of immersive technologies and their uses, and Boavizta, which evaluates the environmental impact of digital technologies across organizations. He is also a working group member at the Shift project relative to the Metaverse.
</span></em></p>Trading in PC monitors for VR headsets can cause workers to experience dizziness, headaches and nausea. Researchers are beginning to understand why and what can be done to minimize the effects.Alexis Souchet, Postdoctoral Researcher in Cognitive Ergonomics, University of Southern CaliforniaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2073072023-06-09T03:20:29Z2023-06-09T03:20:29ZThe Apple Vision Pro hasn’t really impressed consumers, but that isn’t the goal – for now<figure><img src="https://images.theconversation.com/files/531060/original/file-20230609-28-hnfwyk.jpeg?ixlib=rb-1.1.0&rect=43%2C54%2C7186%2C4758&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Jeff Chiu/AP</span></span></figcaption></figure><p>Apple’s new Vision Pro mixed reality headset has generated a significant amount of buzz. Announcing it at this year’s <a href="https://developer.apple.com/wwdc23/">Worldwide Developers Conference</a>, chief executive <a href="https://www.bbc.com/news/technology-65809408">Tim Cook said</a> the virtual and augmented reality headset will <a href="https://www.youtube.com/watch?v=TX9qSaGXFyg">allow users to</a> “see, hear and interact with digital content just like it’s in your physical space […] seamlessly blending the real and virtual worlds”.</p>
<p>The Vision Pro is the first new product category Apple has introduced since the Apple Watch in 2014. It marks the company’s foray into <a href="https://www.apple.com/au/newsroom/2023/06/introducing-apple-vision-pro/">spatial computing</a>. <a href="https://www.wsj.com/articles/apple-is-breaking-its-own-rules-with-a-new-headset-80c9b36c">Analysts</a>, <a href="https://www.google.com/finance/quote/AAPL:NASDAQ?sa=X&ved=2ahUKEwj8ztzBybL_AhWFd94KHUKbCv0Q3ecFegQIKhAh">markets</a> and <a href="https://www.gizmodo.com.au/2023/06/early-testers-vision-pro-apple/">consumers</a> have been quick to react – and not all positively. </p>
<p>On one hand, the headset has been <a href="https://techcrunch.com/2023/06/05/first-impressions-yes-apple-vision-pro-works-and-yes-its-good/">lauded for</a> its <a href="https://www.theverge.com/2023/6/5/23750003/apple-vision-pro-hands-on-the-best-headset-demo-ever">technical features</a>. It’s less clunky than competitors’ offerings and has a range of advanced capabilities, including hand and eye tracking, and the seamless combination of <a href="https://www.wsj.com/video/series/joanna-stern-personal-technology/apple-vision-pro-headset-first-look-impressive-immersive-and-heavy/776FE781-C5F2-4048-9161-08563DA7364E">virtual and augmented reality</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=337&fit=crop&dpr=1 600w, https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=337&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=337&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/531062/original/file-20230609-22-b9kivr.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The mixed reality headset has had a mixed reception, although it has generally impressed on the technical front.</span>
<span class="attribution"><span class="source">Jeff Chiu/AP</span></span>
</figcaption>
</figure>
<p>However, others can’t help but point out the hefty price tag of US$3,500 – and the fact that the general public has <a href="https://www.wired.com/story/do-people-actually-want-to-wear-vr-headsets/">simply not</a> <a href="https://www.wired.com/story/apple-vision-pro-doomed/">embraced</a> mixed reality headsets.</p>
<p>Globally, the demand for these headsets has been slowing. Fewer than <a href="https://www.idc.com/promo/arvr">nine million units</a> were shipped in 2022 (mostly by <a href="https://forwork.meta.com/quest/quest-pro/?ref=AVsRL14V5xl7CziavTtRzDrQ8O5yjImi3boLHA-qhBivxUIybXgQgnTdAcluip9cjZjrA7c67zPnPs5o6hzHTzHhbEBZ9MYq98Y7CrFJcfq2-vlUkmOQWpH81Ve59PH6QRUSjSXFlwQ1ejW5iuueSbY7DHyjlTvaROakHgFhkimSBnVniKx8eBZ_zYlTzI3Cktg9jAlUj3RfA2exKTP5Ps3WH1IndddONEOFiKsrIUPfQFsYk_qPBBw8VOTATf6p_jfNrHzpnFSqu8I4nHc02nw0BgZr3aaQzBvbaxkMnK5SutEvjTUiGtqbUB3VJjeUA-EF4BSkodpCwrtrBJa0zDGpk3yzgc6KwY16h9SP5Fr4en9VGIAt6jPTQSfpYa3Kb9UqVG">Meta</a>, Apple’s biggest competitor in this category).</p>
<p>Meta sees spatial computing as a big part of the tech future, despite <a href="https://www.theguardian.com/technology/2022/oct/27/metas-shares-dip-is-proof-metaverse-plan-never-really-had-legs-facebook">market analysts and critics</a> calling for the <a href="https://www.wired.com/story/what-is-the-metaverse/">metaverse</a> to be abandoned. Last week it unveiled the Quest 3 at a relatively low price of US$499. Meanwhile, with continued heavy spending on the metaverse, Reality Labs – the Meta division behind the Quest headsets – recorded an operating loss of <a href="https://www.cnbc.com/2023/06/01/meta-quest-3-unveiled-ahead-of-apples-planned-vr-headset-debut.html">US$3.99 billion</a> in the first quarter of 2023. </p>
<h2>So if there is no demand, who is Apple targeting?</h2>
<p>While Meta’s recent history might seem like a cautionary tale, timing and strategy are critical when it comes to technological innovation. And compared to Meta, Apple’s strategy seems prudent.</p>
<p>Apple is likely betting the app developer community will provide it with the <a href="https://appleinsider.com/articles/23/06/06/even-with-so-many-demonstrated-use-cases-apple-vision-pro-might-not-yet-have-a-purpose">use cases</a> it needs to turn the Vision Pro (and subsequent iterations) into its next big income generator – and perhaps change how we interact with this technology forever.</p>
<p>Getting developers to build exciting complementary offerings, such as apps and device add-ons, would give Apple a springboard to convince users of the Vision Pro’s <a href="https://www.smartinsights.com/manage-digital-transformation/digital-transformation-strategy/digital-marketing-models-technology-acceptance-model/">value</a>. But this won’t work without developers’ buy-in, which leads us to believe the Vision Pro is (at least for now) aimed at Apple’s <a href="https://appleinsider.com/articles/22/06/06/apple-now-has-over-34-million-registered-developers">34 million</a> registered <a href="https://developer.apple.com/wwdc23/">app developers</a>, rather than the broader user market. </p>
<p>It’s expected many of the apps on the App Store will work on <a href="https://kanebridgenews.com/apple-releases-vision-pro-headset-first-major-new-product-in-a-decade/">Vision OS</a>, the Vision Pro’s operating system, by the time the product is launched. Apple is already <a href="https://arstechnica.com/gadgets/2023/06/vision-pro-developer-kits-will-help-devs-get-their-apps-ready-before-launch/">supporting developers</a> with programs and tools to redesign apps for compatibility with the Vision Pro, and create new ones. </p>
<p>Users are attracted to a product that provides more app variety, and their migration to it further piques developers’ interest. Typically, this becomes a self-reinforcing cycle. Such a multiplication of value for consumers, coupled with Apple’s manufacturing capabilities, could allow the Vision Pro to rise to dominance. </p>
<p>And this isn’t just speculation; Apple has used this approach before.</p>
<h2>Leveraging an app-driven ecosystem</h2>
<p>Apple has a history of leveraging its app-driven <a href="https://www.investopedia.com/articles/personal-finance/042815/story-behind-apples-success.asp">ecosystem business model</a> to give its products the upper hand. One early example of this was the iPod and iTunes, wherein the Apple Music store, cloud connectivity and massive storage capacity (at the time) created an environment that locked users in. </p>
<p>More importantly, with the sophistication of the hardware and software, the ease of use and the novelty of the experience, users were happy to be locked in. </p>
<p>This approach has been repeated time and again with other Apple products, such as the <a href="https://theconversation.com/latest-updates-apple-is-trying-to-reclaim-its-major-innovator-status-by-making-you-wash-your-hands-141293">Apple Watch</a>. Once more, Apple drove innovation by linking the hardware to other devices and systems, introducing unique features and providing high-quality apps to generate interest. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-iphone-turns-15-a-look-at-the-past-and-future-of-one-of-the-21st-centurys-most-influential-devices-183137">The iPhone turns 15: a look at the past (and future) of one of the 21st century's most influential devices</a>
</strong>
</em>
</p>
<hr>
<h2>Competition heats up</h2>
<p>Ultimately, users will judge the value of the Vision Pro through a combination of objective and subjective information. According to initial reviews, the Vision Pro operates well, and Apple is using branding and marketing tactics to further create a perception of value.</p>
<p>All things considered, Apple’s entry into the mixed reality market represents a big threat to competitors. It has a track record of building hardware at scale and at progressively more affordable prices. And let’s not forget its base of some two billion active devices to which the Vision Pro can link. </p>
<p>Apple’s massive ecosystem – built on devices, apps, developers and manufacturing partners – won’t be running dry anytime soon. And by the very fact of its existence, the Vision Pro has a shot at success.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-new-iphone-se-is-the-cheapest-yet-smart-move-or-a-premium-tech-brand-losing-its-way-136507">The new iPhone SE is the cheapest yet: smart move, or a premium tech brand losing its way?</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Martie-Louise Verreynne receives funding from the ARC and NHMRC. </span></em></p><p class="fine-print"><em><span>Margarietha de Villiers Scheepers has received funding from State and Local Governments for specific research projects.</span></em></p>Many point to Meta’s failings to make a case for mixed reality headsets having no future. But Apple’s approach is arguably much more strategic.Martie-Louise Verreynne, Professor in Innovation and Associate Dean (Research), The University of QueenslandMargarietha de Villiers Scheepers, Associate professor, University of the Sunshine CoastLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2071972023-06-08T17:00:56Z2023-06-08T17:00:56ZApple’s new Vision Pro mixed-reality headset could bring the metaverse back to life<figure><img src="https://images.theconversation.com/files/530701/original/file-20230607-16537-he81j2.jpg?ixlib=rb-1.1.0&rect=0%2C8%2C5548%2C3685&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The Apple Vision Pro headset is displayed in a showroom on the Apple campus on June 5, 2023, in Cupertino, Calif.</span> <span class="attribution"><span class="source">(AP Photo/Jeff Chiu)</span></span></figcaption></figure><iframe style="width: 100%; height: 100px; border: none; position: relative; z-index: 1;" allowtransparency="" allow="clipboard-read; clipboard-write" src="https://narrations.ad-auris.com/widget/the-conversation-canada/apple-s-new-vision-pro-mixed-reality-headset-could-bring-the-metaverse-back-to-life" width="100%" height="400"></iframe>
<p>The metaverse — <a href="https://www.newscientist.com/article/2286778-what-is-a-metaverse-and-why-is-everyone-talking-about-it/">a shared online space incorporating 3D graphics where users can interact virtually</a> — has been the subject of increased interest and the ambitious goal of big tech companies for the past few years.</p>
<p><a href="https://theconversation.com/facebook-relaunches-itself-as-meta-in-a-clear-bid-to-dominate-the-metaverse-170543">Facebook’s rebranding to Meta</a> is the clearest example of this interest. However, <a href="https://www.cnbc.com/2023/02/01/meta-lost-13point7-billion-on-reality-labs-in-2022-after-metaverse-pivot.html">despite the billions of dollars that have been invested</a> in the industry, the metaverse has yet to go mainstream. </p>
<p>After the struggles <a href="https://www.nytimes.com/2022/10/09/technology/meta-zuckerberg-metaverse.html">Meta has faced in driving user engagement</a>, many have written off the metaverse as a viable technology for the near future. But the technological landscape is a rapidly evolving one and new advancements can change perceptions and realities quickly. </p>
<p><a href="https://www.nytimes.com/wirecutter/blog/apple-wwdc-2023/">Apple’s recent announcement of the Vision Pro mixed-reality headset</a> at its annual Worldwide Developers Conference — the company’s largest launch since the Apple Watch was released in 2015 — could be the lifeline the metaverse needs.</p>
<h2>About the Vision Pro headset</h2>
<p>The Vision Pro headset is a spatial computing device that allows users to interact with apps and other digital content using their hands, eyes and voice, all while maintaining a sense of physical presence. It supports 3D object viewing and spatial video recording and photography. </p>
<p>The Vision Pro is a mixed-reality headset, meaning it combines elements of augmented reality (AR) and virtual reality (VR). While VR creates a completely immersive environment, <a href="https://doi.org/10.1007/0-387-30038-4_10">AR overlays virtual elements onto the real world</a>. Users are able to control how immersed they are while using the Vision Pro.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/TX9qSaGXFyg?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A video from Apple introducing the Vision Pro headset.</span></figcaption>
</figure>
<p>From a technological standpoint, <a href="https://www.apple.com/apple-vision-pro/?mtid=2092506t42625&aosid=p238&mnid=sKiK0nq00-dc_mtid_2092506t42625_pcrid_661145521361_pgrid_149209007039_pexid__&cid=wwa-ca-kwgo-avalanche-slid----Announce">the Vision Pro uses two kinds of microchips:</a> the M2 chip, which is currently used in Macs, and the new R1 chip. </p>
<p>The new R1 chip processes input from 12 cameras, five sensors and six microphones, minimising input lag and thereby reducing the likelihood of motion sickness.</p>
<p>The Vision Pro display system also packs a whopping 23 million pixels, meaning it should be able to deliver an almost real-time, lag-free view of the world.</p>
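To put that pixel count in context, here is a quick back-of-the-envelope comparison with a standard 4K panel. Apple quotes only a combined total of 23 million pixels across both displays, so the even per-eye split used below is an assumption:

```python
# Rough comparison of the Vision Pro's quoted pixel count with a 4K TV panel.
# Assumption: the 23 million pixels are split evenly between the two displays
# (Apple publishes only the combined figure).
pixels_4k = 3840 * 2160          # a 4K UHD panel: about 8.3 million pixels
vision_pro_total = 23_000_000    # combined total across both eyes
per_eye = vision_pro_total // 2  # assumed even split: 11.5 million per eye

print(f"4K panel: {pixels_4k:,} pixels")
print(f"Vision Pro, per eye (assumed): {per_eye:,} pixels")
print(f"Per-eye advantage over 4K: {per_eye / pixels_4k:.2f}x")
```

Under that assumption, each eye sees roughly 1.4 times the pixels of a full 4K panel, consistent with Apple's "more pixels than a 4K TV to each eye" claim quoted later in this feed.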
<h2>Why do people use new tech?</h2>
<p>To gain a better understanding of why Apple’s Vision Pro may throw the metaverse a lifeline, we first need to understand what drives people to accept and use technology. From there, we can make an informed prediction about the future of this new technology.</p>
<p>The first factor driving the adoption of technology is how easy it is to use, <a href="https://www.jstor.org/stable/2632151">along with its perceived usefulness</a>. Consumers need to believe a technology will add value to their lives in order to find it useful.</p>
<p>The second factor that drives the acceptance and use of technology is <a href="https://doi.org/10.3390/jtaer17020036">social circles</a>. People usually look to their family, friends and peers for cues on what is trendy or useful.</p>
<p>The third factor is <a href="https://doi.org/10.1080/13683500.2023.2165483">the level of expected enjoyment of a piece of technology</a>. This is especially important for immersive technologies. Many factors contribute to enjoyment such as system quality, immersion experiences and interactive environment.</p>
<p>The last factor that drives mainstream adoption is <a href="https://doi.org/10.1016/j.jbusres.2019.01.017">affordability</a>. More important, however, is the value derived from new technology — the benefits a user expects to gain, minus costs.</p>
<h2>Can Apple save the metaverse?</h2>
<p>The launch of the Vision Pro seems to indicate Apple has an understanding of the factors that drive the adoption of new technology.</p>
<figure class="align-center ">
<img alt="A white man in a polo shirt poses in front of a displayed VR headset" src="https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/530723/original/file-20230607-21-mjk31r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Apple CEO Tim Cook poses for photos in front of a pair of the company’s new Apple Vision Pro headsets in a showroom on the Apple campus on June 5, 2023, in Cupertino, Calif.</span>
<span class="attribution"><span class="source">(AP Photo/Jeff Chiu)</span></span>
</figcaption>
</figure>
<p>When it comes to ease of use, the Vision Pro offers an intuitive hand-tracking capability that allows users to interact with simple hand gestures and an impressive eye-tracking technology. Users will have the ability to select virtual items just by looking at them.</p>
<p>The Vision Pro also addresses another crucial metaverse challenge: the digital persona. One of the most compelling features of the metaverse is the ability for users to connect virtually with one another, but <a href="https://www.cnn.com/2022/08/25/tech/vr-avatars/index.html">many find it challenging to connect with cartoon-like avatars</a>. </p>
<p>The Vision Pro is attempting to circumvent this issue by allowing users to create hyper-realistic digital personas. Users will be able to scan their faces to create digital versions of themselves for the metaverse.</p>
<p>The seamless integration of the Vision Pro into the rest of <a href="https://medium.com/swlh/the-irresistible-lure-of-the-apple-ecosystem-81bf8d66294a">the Apple ecosystem</a> will also likely be a selling point for customers.</p>
<p>Lastly, the power of the so-called “Apple effect” is another key factor that could contribute to the Vision Pro’s success. Apple has built <a href="https://www.statista.com/statistics/267966/brand-values-of-the-most-valuable-technology-brands-in-the-world/">an extremely loyal customer base over the years</a> by establishing trust and credibility. There’s a good chance customers will be open to trying this new technology because of this.</p>
<h2>Privacy and pricing</h2>
<p>While Apple seems poised to take on the metaverse, there are still some key factors the company needs to consider. </p>
<p>By its very nature, the metaverse <a href="https://www.thedrum.com/opinion/2022/07/21/the-metaverse-and-consumer-data-here-s-what-you-need-know">requires a wealth of personal data collection</a> to function effectively. This is because the metaverse is designed to offer personalized experiences for users. The way those experiences are created is by collecting data.</p>
<p>Users will need assurances from Apple that their personal data and interactions with Vision Pro are secure and protected. <a href="https://www.cnbc.com/2021/06/07/apple-is-turning-privacy-into-a-business-advantage.html">Apple’s past record of prioritizing data security may be an advantage</a>, but there needs to be continuous effort in this area to avoid loss of trust and consumer confidence.</p>
<p>Price-wise, the Vision Pro costs a whopping US$3,499. This will undoubtedly pose a barrier for users and may prevent widespread adoption of the technology. Apple needs to consider strategies to increase the accessibility of this technology to a broader audience.</p>
<p>As we look to the future of this industry, it’s clear competition in the metaverse will be fierce. While Apple brings cutting-edge technology and a loyal customer base, Meta is still one of the original players in this space and its <a href="https://www.meta.com/ca/quest/quest-3/">products are significantly more affordable</a>. In other words, the metaverse is very much alive.</p>
<p class="fine-print"><em><span>Omar H. Fares does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>After struggling to go mainstream, Apple’s recent announcement of the Vision Pro mixed-reality headset could be the lifeline the metaverse needs.Omar H. Fares, Lecturer in the Ted Rogers School of Retail Management, Toronto Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2071252023-06-06T18:03:00Z2023-06-06T18:03:00ZApple Vision Pro headset: what does it do and will it deliver?<figure><img src="https://images.theconversation.com/files/530291/original/file-20230606-15-6s4g00.jpg?ixlib=rb-1.1.0&rect=17%2C8%2C5964%2C3961&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The headset can blur the lines between virtual reality and augmented reality.</span> <span class="attribution"><a class="source" href="https://epaimages.com/search.pp?flush=1&multikeyword=apple&startdate=&enddate=&autocomplete_City=&metadatafield5=&autocomplete_Country=&metadatafield44=">JOHN G. MABANGLO / EPA IMAGES</a></span></figcaption></figure><p>Apple recently unveiled its <a href="https://www.apple.com/apple-vision-pro/">Vision Pro headset</a> at the Worldwide Developers Conference in California. With it, Apple is venturing into a market of head-mounted devices (HMDs) – which are usually just displays, but in this case is more of a complete computer attached to your head – as well as the worlds of virtual reality (VR), augmented reality (AR) and mixed reality (MR). </p>
<p>The new Apple product will fuel the hopes of many working on these technologies that they will some day be routinely used by the public, just as the iPhone, iPad and Apple Watch helped bring smartphones, tablets and wearable tech into mainstream use.</p>
<p>But what does the Vision Pro actually do, and how much mass appeal will it have?</p>
<p>VR immerses users in an entirely computer-generated world, isolating them to a large degree from their physical surroundings. AR superimposes computer-generated elements onto the real world while the latter remains visible, with the purpose of enhancing the context of our physical surroundings.</p>
<p>A term often used interchangeably with AR is mixed reality, referring to a set of immersive technologies including AR, that provide <a href="https://arxiv.org/abs/1804.08386">different “blends” of physical and virtual worlds</a>. These three technologies are often <a href="https://www.w3.org/immersive-web/">collectively referred to as XR</a>.</p>
<p>The blending of VR and AR seems to be a key part of Apple’s thinking, with the <a href="https://www.apple.com/apple-vision-pro/">Vision Pro</a> allowing users to adjust their level of immersion by deciding how much of the real world they can see. This transitioning between the two experiences will probably be a trend for future HMDs.</p>
<p>The physical world is “seen” through an array of 12 cameras located behind a ski-goggle-like glass fascia, acting as a lens. When the Vision Pro is in VR mode, people approaching you in the real world are automatically detected and displayed as they get close. </p>
<p>A feature called EyeSight also displays the wearer’s eyes through the glass lens when needed, to enable more natural interaction with people around them – a challenge for many HMDs.</p>
<p>In terms of technical specifications, the Vision Pro is impressive. It uses a combination of the M2 microchip and a new chip called the R1. The M2 runs <a href="https://developer.apple.com/visionos/">visionOS</a>, which Apple calls its first spatial operating system, along with computer vision algorithms and computer graphics generation. </p>
<p>R1 processes information from the cameras, an array of microphones and a LiDAR scanner – which uses a laser to measure distances to different objects – in order to make the headset aware of its surroundings.</p>
<p>More importantly, the Vision Pro boasts an impressive display system with “more pixels than a 4K TV to each eye”. Its ability to track where the wearer’s eyes are looking allows users to interact with graphical elements just by looking at them. The headset can receive gesture and voice commands and features a form of 360-degree sound called spatial audio. The quoted unplugged operating time is two hours.</p>
<h2>Wearable ‘ecosystem’</h2>
<p>Packed, in typical Apple fashion, in curved aluminum and glass, the headset has an eye-watering price of US$3,499 (£2,819) and represents a collection of many premium features. But Apple has a history of developing products with increasingly versatile capabilities to sense what’s going on in their real-world surroundings.</p>
<figure class="align-center ">
<img alt="Tim Cook (L) and Apple Senior VP of Software Engineering Craig Federighi speak during the conference keynote address." src="https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=377&fit=crop&dpr=1 600w, https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=377&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=377&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=473&fit=crop&dpr=1 754w, https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=473&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/530338/original/file-20230606-15-sfqgzo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=473&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Tim Cook (L) and Apple Senior VP of Software Engineering Craig Federighi speak during the conference keynote address.</span>
<span class="attribution"><a class="source" href="https://epaimages.com/search.pp?flush=1&multikeyword=apple&startdate=&enddate=&autocomplete_City=&metadatafield5=&autocomplete_Country=&metadatafield44=">JOHN G. MABANGLO / EPA IMAGES</a></span>
</figcaption>
</figure>
<p>Apple also focuses on making its devices interoperable – meaning they work easily with other Apple devices – forming a wearable “ecosystem”. This is what really promises to be disruptive about the Vision Pro. It is also akin to what had been promised and hoped for by pioneers in the idea of <a href="https://www.media.mit.edu/wearables/">wearable computing back in the 1990s</a>.</p>
<p>Combining the headset with the iPhone, which still forms the backbone of Apple’s ecosystem, and the Apple Watch could help create new uses for augmented reality. Likewise, linking the headset to many programming tools demonstrates the company’s desire to tap into an existing community of developers of augmented reality applications.</p>
<p>Many questions remain, however. For example, will it be able to access mixed reality applications via a web browser? What will it be like to use from an ergonomic point of view?</p>
<p>It’s also unclear when the Vision Pro will be available outside the US, or whether there will be a non-Pro version – as the “Pro” part of the name implies a more “expert”, or developer, market.</p>
<p>The Vision Pro is a gamble, as XR is often seen as something that promises but rarely delivers. Yet, companies such as Apple and those that are probably its primary competitors in the XR domain, Meta and Microsoft, have the clout to make XR popular for the general public.</p>
<p>More importantly, devices such as the Vision Pro and its ecosystem, as well as its competitors could provide the foundation for developing <a href="https://theconversation.com/metaverse-five-things-to-know-and-what-it-could-mean-for-you-171061">the metaverse</a>. This is an immersive world, facilitated by headsets, that aims for social interaction that’s more natural than with previous products.</p>
<p>Sceptics will say that the Vision Pro and EyeSight make you look like a scuba diver in your living room. But this could finally be the time to dive into the deep waters of XR.</p>
<p class="fine-print"><em><span>Panagiotis Ritsos receives funding from the DSP Centre, Bangor University, which has been partly funded by the European Regional Development Fund through Welsh Government and also by the North Wales Growth Deal through Ambition North Wales, Welsh Government and UK Government.
</span></em></p><p class="fine-print"><em><span>Dr. Peter W. S. Butcher receives funding from the DSP Centre, Bangor University, which has been partly funded by the European Regional Development Fund through Welsh Government and also by the North Wales Growth Deal through Ambition North Wales, Welsh Government and UK Government.</span></em></p>Will Apple’s Vision Pro set the new standard for the future of virtual reality?Panagiotis Ritsos, Senior Lecturer in Visualisation, Bangor UniversityPeter Butcher, Lecturer in Human Computer Interaction, Bangor UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1902822023-05-31T09:48:29Z2023-05-31T09:48:29ZTechnology is radically changing sleep as we know it<figure><img src="https://images.theconversation.com/files/527283/original/file-20230519-25-h1921w.jpg?ixlib=rb-1.1.0&rect=26%2C66%2C8844%2C4707&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/ai-artificial-intelligence-concept-deep-learning-1734250598">metamorworks/Shutterstock</a></span></figcaption></figure><p>From sleep trackers to <a href="https://go.drugbank.com/categories/DBCAT001105">wakefulness drugs</a>, the 21st century has seen an influx of new technology that could radically alter the way we sleep. </p>
<p>Many of these new technologies chase the dream of optimised slumber. They promise to help tailor our sleep schedules to fit around our social lives, help us sleep for longer or even skip a night’s sleep altogether. </p>
<p>Here’s how technology is permeating our sleep, and what the future holds. </p>
<h2>Time to wake up</h2>
<p>Sleeping pills have recently been joined by a wave of wakefulness drugs, purportedly safer and more powerful alternatives to caffeine. It seems that they work best on people who are already sleep deprived and don’t have a huge effect on those who are already well rested. </p>
<p>Modafinil is touted for its <a href="https://www.ox.ac.uk/news/2015-08-20-review-%E2%80%98smart-drug%E2%80%99-shows-modafinil-does-enhance-cognition">cognition enhancing effects</a> (especially in sleep-deprived people) and can supposedly keep people awake and alert for several days at a time. <a href="https://doi.org/10.1016/j.phrs.2010.04.002">Some scientific studies</a> suggest this may indeed be the case, although results are mixed, with other research finding effects similar to caffeine’s. </p>
<p>The drug was developed to help people with narcolepsy but some have started using it for its focus-enhancing effects. It is a controlled drug (prescription only) in most countries. People who use it for cognitive enhancement or wakefulness are buying it on the black market or getting it from friends who have a prescription.</p>
<p>Modafinil is popular with students – in 2020, Loughborough University researchers found that, of 506 students surveyed at 54 UK universities, <a href="https://www.researchgate.net/publication/341139217_Working_smart_the_use_of_%27cognitive_enhancers%27_by_UK_university_students">19% had taken cognitive enhancement substances</a>. </p>
<p>But people who take them for non-medical purposes are <a href="https://doi.org/10.1080/09687637.2019.1618025">risking their health</a>. Studies of the safety of these drugs do not consider this type of use. We don’t know what using these drugs to stay awake for long periods of time does to people’s bodies. But we do know that disrupting your sleep pattern (for example, shift work) is <a href="https://www.webmd.com/sleep-disorders/features/shift-work">linked to health problems</a> such as diabetes and cardiovascular disease.</p>
<p>Recent studies suggest some people are combining sleep and wakefulness pills to <a href="https://doi.org/10.1080/09687637.2018.1555231">manage their body rhythms</a> and <a href="https://doi.org/10.1080/09687637.2019.1585760">optimise their sleep</a> or unwind after a day of hard work. The effects of taking wakefulness pills alongside other drugs are largely unknown. </p>
<p>In the UK, the sale or supply of a prescription-only or unlicensed medicine is a criminal offence, whereas in the US even possession of stimulants without a prescription is a crime.</p>
<h2>Smart sleep</h2>
<p>Many people already use smart watches, smart jewellery and fitness bands to track their sleep, with features such as alarms that wake people up at the optimal point in their sleep cycle and motion-sensor apps <a href="https://onlinelibrary.wiley.com/doi/10.1111/jsr.12270">that analyse sleep patterns</a>. </p>
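The "optimal point" these alarms look for is typically a phase of light sleep, which consumer devices infer from movement. A minimal sketch of the idea in Python, with made-up sample data and a simple movement-as-light-sleep proxy rather than any vendor's actual algorithm:

```python
from datetime import datetime, timedelta

def smart_wake_time(samples, alarm, window_minutes=30):
    """Pick a wake time within the window before `alarm` where movement
    (a rough proxy for light sleep) is highest.

    samples: list of (datetime, movement_level) pairs, movement in [0, 1].
    Falls back to the hard alarm time if no samples land in the window.
    """
    window_start = alarm - timedelta(minutes=window_minutes)
    in_window = [(t, m) for t, m in samples if window_start <= t <= alarm]
    if not in_window:
        return alarm
    # Wake at the moment of greatest stirring, i.e. lightest sleep.
    return max(in_window, key=lambda tm: tm[1])[0]
```

A real tracker fuses heart rate and respiratory signals as well, but the shape of the decision - scan a pre-alarm window and trigger early when sleep looks lightest - is the same.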
<figure class="align-center ">
<img alt="Adult reaches for smart phone with lit screen on bedside table" src="https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/527284/original/file-20230519-17-wnyj1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Technology is disrupting our sleep.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/woman-sleeping-her-bed-receiving-phone-2116032359">Stokkete/Shutterstock</a></span>
</figcaption>
</figure>
<p>New ways of tracking sleep could soon include donning a <a href="https://www.newscientist.com/article/2198157-smart-pyjamas-could-detect-why-youre-not-sleeping-well/">pair of pyjamas</a> embedded with sensors to track changes in posture, respiratory and heart rate, or hugging a robot pillow, whose algorithm creates a breathing pattern to mimic and help you fall asleep.</p>
<p>Meanwhile, care robots have already been <a href="https://doi.org/10.1016/j.techsoc.2020.101318">trialled</a> in Japan to test whether they could help older people sleep better. Designed to watch over residents at night in care homes, they give staff information on how well the residents are sleeping and let them know if anyone goes for a nocturnal wander.</p>
<h2>In your dreams</h2>
<p>Dream management technologies are in much earlier stages of development. <a href="https://www.media.mit.edu/publications/dream-engineering-simulating-worlds-through-sensory-stimulation/">Scientists believe</a> that sensory stimulation technologies and devices, such as virtual reality visors, could be used for sleep engineering. This <a href="https://www.cardiff.ac.uk/research/explore/research-units/neuroscience-and-psychology-of-sleep-lab-naps">new science </a> involves exposing the sleeper to sensory stimuli, such as clicking sounds and vibrations, at specific times in the sleep cycle. The aim would be to improve sleep quality, enhance memory and even treat post-traumatic stress disorder (PTSD). </p>
<p>As for the prospects of “reading” our dreams, progress is being made on this front too. Scientists have taken the first steps towards <a href="https://doi.org/10.1126/science.1234330">dream interpretation</a> by measuring brain activity during sleep and using AI to decode visual imagery. Participants in a 2013 study were asked to report the imagery from their dreams after sleeping inside an MRI scanner. Researchers compared these scans with those from people viewing the same types of images while awake, and the results showed matching patterns of brain activity. </p>
<h2>Nightmare technology</h2>
<p>But there is a dystopian side to this story. The technology we already have – electric light, smart phones, streaming services – can be disastrous for our sleep. </p>
<p>For example, a recent <a href="https://doi.org/10.1080/07448481.2018.1499655">study</a> in the US found that college students often sleep with their mobile phone in bed with them, which means a call, software update or app notification can disturb them. Watching TV or playing video games in bed and staring at our tablets and mobile phone screens into the night <a href="https://doi.org/10.1016/j.socscimed.2015.11.037">has become the norm for many</a>. It can lead to <a href="https://doi.org/10.1016/j.sleep.2013.09.007">poor sleep</a> and knock our sleep cycles off kilter.</p>
<p>Growing numbers of people are <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5263088/">seeking treatments</a> for new sleep conditions such as <a href="https://www.sleepfoundation.org/orthosomnia#:%7E:text=Orthosomnia%20is%20the%20proposed%20term,disorders%20from%20sleep%20tracker%20data.">orthosomnia</a> – the obsessive quest for perfect sleep, similar to an unhealthy preoccupation with nutrition. Some people become so concerned about improving their sleep metrics that it is actually <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9875581/">giving them insomnia</a>. </p>
<p>There’s still so much we don’t know about sleep, and new technology is changing our sleep faster than scientists can keep up with. One thing seems almost certain: sleep and technology in western society are becoming entangled like never before.</p><img src="https://counter.theconversation.com/content/190282/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Wakefulness drugs like Modafinil point to a new frontier of technosleep driven by personal goals rather than medical need.Catherine Coveney, Senior Lecturer in Sociology, Loughborough UniversityEric L Hsu, Lecturer in Sociology, University of South AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1962582023-05-15T11:47:10Z2023-05-15T11:47:10ZWe’re using VR to help find the next generation of basketball stars<figure><img src="https://images.theconversation.com/files/502212/original/file-20221220-23-ygrw7y.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">blank</span> </figcaption></figure><p>Golden State Warriors point guard Steph Curry is one of the world’s leading basketball players and unquestionably the <a href="https://www.skysports.com/nba/news/36244/12492411/five-key-reasons-stephen-curry-became-the-greatest-nba-shooter-of-all-time-and-revolutionised-basketball-with-his-shooting">greatest shooter of all time</a>. The video below may look like it’s on a loop, but it’s actually Curry sinking 105 three point shots in a row – that’s five minutes of the same precise and highly-skilled action, without a single miss:</p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1342975296799031296"}"></div></p>
<p>Curry is 6'2" (188cm). In the real world he is tall, fast and strong, but in the NBA, where the average player height is <a href="https://www.thehoopsgeek.com/average-nba-height/">about 6'6"</a> (198cm), he is on the small side. Many previous basketball superstars were first scouted as freakishly tall teenagers, but his game instead relies on clever movement, smooth dribbling and that famous pinpoint shooting. So how do you identify such attributes in order to spot the next Steph Curry?</p>
<p>Talent identification and development is one of the key challenges for many elite sports. To stay ahead of the game, coaches are switching from subjective to objective methods and using increasingly sophisticated processes, including virtual reality (VR). VR systems can detect and map athletes’ strengths and weaknesses relative to several determinants of performance, including the sorts of “perceptual-motor” skills that someone like Curry would excel at.</p>
<p>In my academic research, I use digital technologies like these to mimic real-world sports performance. For instance, I recently used motion capture to assess <a href="https://theconversation.com/var-i-used-motion-capture-technology-to-show-why-the-premier-league-gets-tight-offside-decisions-wrong-189223">offside decisions in football</a>, showing that people on average judge the offside moment later than it actually occurs.</p>
<p>My <a href="https://onlinelibrary.wiley.com/doi/10.1111/sms.14250">latest research</a> looks at basketball throwing. I asked 22 players with different levels of expertise, from beginner to professional, to naturally throw a basketball in a simulator developed by Antoine Morice and colleagues from the Institute of Movement Sciences at Aix-Marseille University. </p>
<p>We placed electronics inside the ball and markers on the players’ bodies to monitor their movements while throwing in the simulator. Players wore stereoscopic glasses and watched a virtual twin of a real basketball court on a huge screen. The virtual scene was constantly updated with the players’ point of view to make them feel they were inside the game.</p>
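Keeping the scene locked to the player's point of view on a fixed screen is usually done with an off-axis ("generalised") perspective projection: each frame, the viewing frustum is rebuilt from the tracked eye position relative to the physical screen. Below is a minimal Python sketch of the standard construction (Robert Kooima's generalised perspective projection); the screen corners and eye positions in the usage are placeholder values, not parameters of the Aix-Marseille simulator:

```python
def off_axis_frustum(pa, pb, pc, pe, near):
    """Frustum extents (left, right, bottom, top) at the near plane for an
    eye at `pe` viewing a screen with lower-left corner `pa`, lower-right
    corner `pb` and upper-left corner `pc` (all 3D points)."""
    def sub(u, v): return tuple(a - b for a, b in zip(u, v))
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def norm(u):
        n = dot(u, u) ** 0.5
        return tuple(a / n for a in u)

    vr = norm(sub(pb, pa))   # screen-space right axis
    vu = norm(sub(pc, pa))   # screen-space up axis
    vn = (vr[1]*vu[2] - vr[2]*vu[1],   # screen normal, towards the viewer
          vr[2]*vu[0] - vr[0]*vu[2],
          vr[0]*vu[1] - vr[1]*vu[0])
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(va, vn)         # perpendicular eye-to-screen distance
    return (dot(vr, va) * near / d, dot(vr, vb) * near / d,
            dot(vu, va) * near / d, dot(vu, vc) * near / d)
```

With the eye centred in front of the screen the frustum is symmetric; as the tracked head moves sideways it skews, which is what produces the parallax that makes players feel they are inside the game.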
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1131555265688555520"}"></div></p>
<h2>Experience counts – even in simulators</h2>
<p>Experienced players had a higher success rate in the simulator than novice players, just as you would expect on the real court. Both experienced and novice players found it harder to score when the virtual distance was increased, but the more experienced players were better able to adjust their throwing motion based on visual information suggesting the basket was further away.</p>
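The adjustment those experienced players were making is, at bottom, a physics problem: for a given release angle, a longer shot demands a faster release. A back-of-the-envelope projectile model in Python makes the relationship concrete (the release angle and the 1.05m rise from release point to rim are illustrative numbers, not measurements from the study, and air drag and spin are ignored):

```python
import math

def release_speed(distance, angle_deg, rise=1.05, g=9.81):
    """Ball speed (m/s) needed for a shot released at `angle_deg` degrees
    to pass through a rim `rise` metres above the release point and
    `distance` metres away horizontally. Simple drag-free projectile model."""
    theta = math.radians(angle_deg)
    drop = distance * math.tan(theta) - rise
    if drop <= 0:
        raise ValueError("angle too flat to reach the rim at this distance")
    return math.sqrt(g * distance**2 / (2 * math.cos(theta)**2 * drop))
```

On this simplified model, at a 52-degree release a 4.6m free throw needs roughly 7.5 m/s while a 6.75m three-pointer needs closer to 8.8 m/s, so a player who misjudges the apparent distance will systematically miss short or long.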
<p>Male players also released the ball to the basket at a lower angle compared with female players, even after adjusting for their height. </p>
<p>Finally, players switched from a “free throw” style at short distances – where both feet remain on the ground as the ball is thrown – to jump-shots further from the basket, and their movement patterns differed between expert and novice players.</p>
<p>Probably the most important feature of sports simulators is the way they can manipulate visual information such as where the floor markings are or how far the player is from the basket. These manipulations are often difficult to replicate in the real world.</p>
<p>In our study, we kept the floor-marking information constant but changed how far away the basket appeared. We observed that experienced players could use this information to adjust their throw. </p>
<p>These adjustments were also evident within players’ movements. For example, experienced players used the free-throw style more often, which increases stability when the ball is released and therefore leads to more successful shots.</p>
<p>Researchers in this field talk about “fidelity”, which is the ability of VR simulators to mimic real-world sports performance. Fidelity is particularly tough to achieve when it comes to simulating fine and skilful actions such as free throws in basketball. </p>
<p>If the simulator is going to be used for training, coaches need it to elicit realistic behaviour from all players, and that means providing visual cues for things like the distance from the basket so that players can adjust their movements.</p>
<p>Manipulating things like how far away the basket appears might help identify a future star player since talent is not only about physical characteristics but the way a player can adapt to different situations. </p>
<p>If Curry or someone similarly talented were to throw a basketball for the first time while in the VR simulation, we might expect them to not only score well in terms of body position and so on, but also to adapt to the sorts of changing circumstances that arise constantly in competitive games. </p>
<p>And this is one of the key advantages of a VR simulation: we can assess how players adapt to a change in one specific condition while everything else remains the same. </p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1616247798265188352"}"></div></p>
<p>In the clip above, Curry gains possession deep in his own half. There is no time to dribble or pass as there is just one second remaining before half time. He shoots immediately, and scores. </p>
<p>Yes, the best players have repeatable and almost flawless technique, as Curry demonstrated in the first clip. But they’re also able to respond to changing circumstances – in this case, a need to shoot suddenly from extreme distance – and VR can be used to identify both skills.</p><img src="https://counter.theconversation.com/content/196258/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>This work was supported by the “Investissements d'avenir,” a French Government Programme under Grant number ANR-11-IDEX-0001-02; Carnot Institute “STAR” under Grant Cybershoot (2017); Défi Instrumentation aux Limites program of the CNRS under Grant Virtushoot (2017); and the Digital Economy Next Stage CAMERA 2.0 (EP/T022523/1). Pooya Soltani also received funding from Collège de France (PAUSE program 2018).</span></em></p>When we tweaked the simulator, talented players naturally adapted.Pooya Soltani, Senior Lecturer in Games Design, Staffordshire UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2028932023-04-18T11:43:22Z2023-04-18T11:43:22ZThe latest trends in video games from the 2023 global Game Developers Conference<p>San Francisco’s <a href="https://gdconf.com/">Game Developers Conference</a> (GDC) – the global gathering of the greatest creative minds in the games industry – opened its doors for the second time since the pandemic in March 2023.</p>
<p>Each year a number of key trends stand out. For 2023 it was the application of artificial intelligence (AI) to <a href="https://www.gamedeveloper.com/design/style-and-substance-in-game-design">game development</a>, with the future shape of the gaming experience – with and without virtual reality (VR) – high on the agenda. </p>
<p>Following last year’s tentative steps back on to the expo floor – interacting with devices was off the menu because of COVID fears – <a href="https://about.meta.com/uk/metaverse/">Meta</a> (formerly Facebook) was busy sanitising display models of its popular Quest 2 headset to encourage visitors to try it out. The company was also keen to push its heavily discounted Quest Pro headset, which features “colour pass-through”, meaning someone wearing the headset can see an <a href="https://www.techtarget.com/whatis/definition/augmented-reality-AR">augmented view</a> of the world around them.</p>
<p>Chinese manufacturer <a href="https://www.pico.net/">Pico</a>, maker of the Pico 4 headset, which has striking similarities to the Quest, had an even larger stand and was generating similar levels of interest. Clearly the time was right to start interacting with shared devices again, as long as good hygiene prevailed.</p>
<h2>AI developments and issues</h2>
<p>Before the conference, <a href="https://www.cnbc.com/2023/02/15/elon-musk-co-founder-of-chatgpt-creator-openai-warns-of-ai-society-risk.html">considerable buzz</a> had been building around <a href="https://openai.com/blog/chatgpt">ChatGPT</a>, the chatbot that can provide convincing, detailed written answers to users’ questions.</p>
<p>The latest updated version was released at the event, offering several improvements. Most noticeable was the reduction in the repetition of key phrases in the chatbot’s answers, which had alerted people to the fact the content was AI-generated.</p>
<p>For games developers the interest in this type of AI relates to speeding up and easing game development, though concerns were also raised about jobs being replaced by AI. The broadly positive consensus was that humans are still best placed to ask the right questions to generate game content, even if the content itself is created by AI.</p>
<p>Adobe announced its <a href="https://news.adobe.com/news/news-details/2023/Adobe-Unveils-Firefly-a-Family-of-new-Creative-Generative-AI/default.aspx">Firefly AI tool</a> that can generate both images and 3D models. This technology might also assist with generating “substances” – <a href="https://www.gamedesigning.org/learn/digital-texturing/#:%7E:text=over%20a%20material.-,What%20are%20Materials%3F,%2C%20shading%2C%20and%20so%20on.">materials</a> that can be applied to game models and scenes. The company indicated its tool was intended to “enhance the creative process, rather than replace it”.</p>
<p>Adobe drew a cheer from developers for promising “clean and safe” content, meaning anything the AI created would be based on what it had learned from Adobe stock and public domain images rather than sources simply scraped from anywhere on the internet. This should avoid the potential ethical and legal issues of unintentionally publishing content learned from a copyrighted source, a key issue for this kind of AI.</p>
<p>Some developers demonstrated AI adding content to a game scene in response to a typed sentence. For example, key in “Create a scene with 10 boulders” and huge rocks were inserted into the scene without the usual manual placement of objects. I met one developer who had created a tool to search the internet for public-domain game models and use AI to rig them (that is, prepare them for animation) automatically.</p>
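The non-generative half of such a demo is straightforward: parse the count and noun out of the sentence, then scatter that many instances through the scene. It is the asset generation and sensible placement that the AI contributes. A toy Python sketch of the parsing-and-scattering step, with an invented command format and no real engine API behind it:

```python
import random
import re

def place_from_command(command, scene_size=100.0, seed=None):
    """Parse a command like 'Create a scene with 10 boulders' and return
    the object noun plus random (x, z) ground positions for that many
    objects, within a square scene `scene_size` metres on a side."""
    match = re.search(r"with (\d+) (\w+)", command)
    if not match:
        raise ValueError("could not parse an object count from the command")
    count, noun = int(match.group(1)), match.group(2)
    rng = random.Random(seed)
    positions = [(rng.uniform(0, scene_size), rng.uniform(0, scene_size))
                 for _ in range(count)]
    return noun, positions
```

The tools shown at GDC replace the brittle regex with a language model and feed the resulting placements into the engine's scene graph, but the division of labour - language in, structured edits out - is the same.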
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/j4r2Y9hNkNc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Welcome to the metaverse</h2>
<p>The other major theme, which has figured much in public awareness since Facebook changed its name to Meta, is that of the <a href="https://mention.com/en/blog/metaverse-multiverse-omniverse/#:%7E:text=Omniverse%20is%20a%20concept%20that,elements%20of%20Multiverses%20and%20Metaveverses.">metaverse</a> – a network of virtual worlds through which people can access many functions of the internet including games. We know Meta has already gambled big in this area, and clearly companies like Pico are happy to sell devices to experience it. </p>
<p>So it wasn’t surprising that several tool and game engine providers were positioning themselves as a useful resource when creating content for the metaverse. The previous generation of game engines (the software frameworks developers use to create games) such as <a href="https://www.unrealengine.com/en-US/?utm_source=GoogleSearch&utm_medium=Performance&utm_campaign=%7Bcampaigname%7D&utm_id=17086214830&sub_campaign=&utm_content=&utm_term=unrealengine">Unreal Engine 4</a> and <a href="https://unity.com/">Unity</a> have made significant efforts to engage the film and television industries. It’s clear from this year’s GDC conference that the goal for the next generation of game engines is to prepare for digital spaces like the metaverse.</p>
<p>Meanwhile, <a href="https://www.epicgames.com/site/en-US/home">Epic Games</a>, developer of the hugely successful <a href="https://www.fortnite.com/">Fortnite</a>, is pitching that the metaverse doesn’t have to be about putting a VR headset on and disconnecting from those around you. Epic believes it is actually already here via games (like Fortnite) played on a 2D screen using the PCs, consoles and handheld devices people already have access to.</p>
<p>To accelerate the creation of content for this Epic vision of the metaverse, the company announced the launch of its <a href="https://store.epicgames.com/en-US/p/fortnite--uefn?utm_campaign=%5B%5BFNBR_RT_UEFN_Google-Search_Exact/Phrase_UK%5D%5D&utm_source=Google-Search">Unreal Engine for Fortnite</a> (UEFN), which would allow content creators more advanced control than ever before of gameplay and assets – the graphics and things a game relies upon. It also announced a new computer language called <a href="https://dev.epicgames.com/documentation/en-us/uefn/verse-language-reference#:%7E:text=What%20Is%20Verse%3F,as%20a%20first%2Dtime%20programmer.">Verse</a>, which the company hopes will be easy to understand while capable of powering the metaverse future.</p>
<p>Epic is no stranger to creating languages to allow customisation of the behaviour of elements in a game, and has suggested that Verse could become the standard language across a range of metaverse/omniverse implementations, not just its own. If a major partner such as Microsoft were to come on board, there would be more confident industry take-up of Verse. </p>
<p>Epic is proposing the (Fortnite-powered) metaverse as a place to leap easily from one connected experience to another. Transitioning from a first-person shooter game to a racing game, <a href="https://www.game.co.uk/webapp/wcs/stores/servlet/HubArticleView?hubId=2339764&articleId=2339765&catalogId=10201&langId=&storeId=10151">Ready Player One</a>-style, should be achievable with current technology.</p>
<p>But it’s the console manufacturers who are best placed to create devices with supporting operating systems to enable simple user function across gaming and other experiences. So maybe our attention should be on what Microsoft, Sony or even Nintendo do next. Welcome to the Mario-verse?</p><img src="https://counter.theconversation.com/content/202893/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Gavin Wade does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Meta and Pico lead the field with their VR headsets, ChatGPT continues its inexorable rise and new engine developments are pushing the boundaries of the video game experience.Gavin Wade, Senior Lecturer in Computer Games Technology, University of PortsmouthLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2001342023-02-26T15:06:28Z2023-02-26T15:06:28ZBillions have been sunk into virtual reality. To make it worth it, the industry needs to grow beyond its walled gardens<figure><img src="https://images.theconversation.com/files/511843/original/file-20230222-14-d9y1sq.jpg?ixlib=rb-1.1.0&rect=29%2C7%2C4962%2C3315&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">While VR is still used primarily as a gaming device, it has the potential to move beyond the industry and revolutionize the way people interact with one another in the metaverse.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><iframe style="width: 100%; height: 100px; border: none; position: relative; z-index: 1;" allowtransparency="" allow="clipboard-read; clipboard-write" src="https://narrations.ad-auris.com/widget/the-conversation-canada/billions-have-been-sunk-into-virtual-reality--to-make-it-worth-it--the-industry-needs-to-grow-beyond-its-walled-gardens" width="100%" height="400"></iframe>
<p>Despite recent <a href="https://www.cnbc.com/2023/01/18/tech-layoffs-microsoft-amazon-meta-others-have-cut-more-than-60000.html">waves of Big Tech layoffs</a>, <a href="https://www.roadtovr.com/bytedance-pico-consumer-vr-us-jobs/">billions of dollars</a> <a href="https://venturebeat.com/business/hp-moves-into-vr-and-ar-with-investment-in-venture-reality-fund/">have been</a> <a href="https://venturebeat.com/business/hp-moves-into-vr-and-ar-with-investment-in-venture-reality-fund/">sunk into virtual reality (VR)</a> hardware and software over the past few years. </p>
<p>For this investment to be worthwhile, the VR industry needs to achieve sustainability and growth. To do this, it will have to explore many different applications of VR technology, including <a href="https://www.hp.com/us-en/workstations/learning-hub/vr-leading-manufacturing.html">manufacturing</a> and <a href="https://www.forbes.com/sites/cathyhackl/2020/08/30/social-vr-facebook-horizon--the-future-of-social-media-marketing">social VR</a>. Social VR is a type of virtual reality experience where users can meet and interact with one another in a virtual world.</p>
<p>As a <a href="https://www.utm.utoronto.ca/">University of Toronto Mississauga (UTM)</a> <a href="https://www.utm.utoronto.ca/iccit/people/bree-mcewan">associate professor</a> who researches social VR and teaches classes on virtual environments, I am often faced with the question of what will drive the adoption of social VR by broader society. </p>
<p>As the UTM lead of the University of Toronto’s <a href="https://datasciences.utoronto.ca/dsi-utm/">Responsible Data Science initiative</a>, I am also interested in the data collection, retention and deployment that is needed to build an <a href="https://datasciences.utoronto.ca/data-and-the-metaverse/">efficient and ethical metaverse</a>. </p>
<h2>Walled gardens</h2>
<figure class="align-right ">
<img alt="A book cover of Snow Crash by Neal Stephenson. It had a red sword against a navy swirly background dotted with yellow, red, blue and white circles." src="https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=900&fit=crop&dpr=1 600w, https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=900&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=900&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1131&fit=crop&dpr=1 754w, https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1131&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/511842/original/file-20230222-18-wteifr.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1131&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Author Neal Stephenson coined the term ‘metaverse’ in his 1992 science-fiction novel <em>Snow Crash</em>.</span>
<span class="attribution"><span class="source">(Penguin Random House)</span></span>
</figcaption>
</figure>
<p>At the present moment, our cultural imagination of the metaverse surpasses the real thing. In books about the metaverse, you can speed across the <a href="https://www.penguinrandomhouse.com/books/172832/snow-crash-by-neal-stephenson/">world on a motorcycle</a> with katana in-hand, or <a href="https://www.penguinrandomhouse.ca/books/293994/neuromancer-by-william-gibson/9780441007462">slip in and out of cyberspace</a> on a mission for artificial intelligences.</p>
<p>In films and television shows about it, you can leave behind your everyday life <a href="https://www.warnerbroscanada.com/movies/ready-player-one">to embark on a scavenger hunt</a> through ‘80s nostalgia or <a href="https://www.warnerbros.com/movies/matrix">save the world</a> while bending your body around the trajectory of bullets. Or you can <a href="https://collider.com/best-star-trek-holodeck-episodes/">walk through a door in your workplace</a> and find yourself in Sherlock Holmes’s London or the wild west. In all these versions of the metaverse, we imagine leaving the physical world and entering a new, fully formed digital universe.</p>
<p>However, this is not the current state of VR technologies. Rather, we seem to be stuck in the <a href="https://www.cnbc.com/2022/08/18/web3-is-in-chaos-metaverses-in-walled-gardens-randi-zuckerberg.html">walled garden phase</a> of this potentially revolutionary interactive technology. Until the VR industry figures out how to move beyond these walled gardens, the metaverse may never live up to the hype.</p>
<p>A <a href="https://www.pcmag.com/encyclopedia/term/walled-garden">walled garden</a> is a mediated environment that restricts users to specific content within a website or social media platform. This is how the early internet worked — providers like <a href="https://www.wsj.com/articles/SB968104011203980910">AOL, CompuServe and Prodigy</a> kept users on affiliated sites.</p>
<p>This later changed when the true potential of the internet was realized and users began freely traversing sites and platforms. Users connected and drew on information from many different sources.</p>
<p>Today, information, memes, images, celebrity gossip and cultural moments all diffuse across the internet and are accessible from many different hardware devices, including cellphones, tablets and computers. </p>
<p>Today’s VR more closely resembles a <a href="https://doi.org/10.1109/MC.2021.3130480">walled garden environment</a> than the interconnected internet. There are only a handful of social software programs that are accessible from different headsets. </p>
<p>Software developers may find it difficult to <a href="https://doi.org/10.1007/978-3-030-23528-4_59">program for multiple headsets</a> at once, in part due to a lack of a <a href="https://xrbootcamp.com/the-best-5-vr-sdk-for-interactions/#headline-91-640">standard software development kit</a> across VR hardware devices. This leaves the current virtual reality market, despite the potential for immersive, interactive, social experiences, more similar to the gaming console market than a communication channel.</p>
<p>For VR to become the next widely adopted communication channel, the industry needs to move beyond the walled garden phase. To do this, VR needs to increase its interoperability — the ability for programs and applications to be able to integrate and for software to run across VR hardware.</p>
<p>Interoperability raises important questions about the data infrastructure of VR hardware and software, the sharing of consumer and corporate data and our ability to traverse to different parts of the metaverse.</p>
<h2>The tipping point</h2>
<p>Virtual reality adoption is often talked about as if it’s just about to take off. In 1992, VR visionary <a href="http://www.jaronlanier.com/">Jaron Lanier</a> predicted the possibility of home VR <a href="https://doi.org/10.1111/j.1460-2466.1992.tb00816.x">by the turn of the century</a>. </p>
<p>Researchers <a href="https://doi.org/10.1177/1461444820924623">Tony Liao and Andrew Iliadis found something similar in their research</a> on the <a href="https://dynamics.microsoft.com/en-us/mixed-reality/guides/what-is-augmented-reality-ar/">augmented reality</a> industry. Augmented reality was consistently talked about as if widespread adoption was just another five to 10 years out.</p>
<p>Yet, as author and researcher <a href="https://www.wired.com/story/virtual-reality-rich-white-kid-of-technology/">Dave Karpf succinctly lays out in WIRED</a>, while both augmented and virtual reality technologies keep advancing, they have yet to reach the tipping point necessary for widespread social adoption. </p>
<p>The technology, Karpf argues, is always “about to turn a corner, about to be more than just a gaming device, about to revolutionize other fields.” Yet, the primary use of virtual reality <a href="https://www.meta.com/blog/quest/best-of-quest-2022/">remains gaming</a>. </p>
<p>Leaning into VR as a gaming platform could work for the industry — the <a href="https://www.roadtovr.com/oculus-quest-store-revenue-1-billion-milestone-growth-meta/">usage of VR as a gaming device is increasing</a> and gamers are used to buying consoles that can only run specific titles created for that console — but it misses the potential of virtual reality. VR has the ability to bring communicators together into shared spaces to engage, interact and share human social experiences. </p>
<figure class="align-center ">
<img alt="A person wearing a virtual reality headset standing in the middle of a virtual world surrounded by avatars" src="https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/511845/original/file-20230222-24-ru4v63.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Social VR is a type of virtual reality experience where users can meet and interact with one another in a virtual world.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>The creation of these shared VR spaces will likely require movement towards interoperable social spaces where users can move easily and freely from one social VR space to another. </p>
<p>Interoperability, in turn, requires open software standards and data sharing between entities that have traditionally kept a close hold on their data collection and analysis processes. Consumers deserve to have confidence in the safety and protection of the data generated by their social interactions. </p>
<h2>The future of VR</h2>
<p>If the VR industry is to experience the kind of growth that will make it worthy of the billions of dollars that have been invested in it, we need to view the metaverse as public infrastructure, much like the internet is. </p>
<p>Those of us in both the VR industry and the VR research community must turn our attention to how data can contribute to interoperability while protecting individual instances of social interaction from surveillance and commodification. </p>
<p>Striking a balance between the openness needed for interoperability and the protections necessary to maintain consumer confidence will be tough. Yet, without this balance, widely adopted social VR will continue to <a href="https://www.forbes.com/sites/qai/2023/01/06/vr-headset-sales-underperform-expectations-what-does-it-mean-for-the-metaverse-in-2023/">remain out of reach</a>.</p><img src="https://counter.theconversation.com/content/200134/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bree McEwan is affiliated with the University of Toronto, University of Toronto - Mississauga, and the UofT Data Sciences Institute. </span></em></p>If the VR industry is to experience the kind of growth that will make it worthy of the billions of dollars that have been invested in it, we need to view the metaverse as public infrastructure.Bree McEwan, Associate Professor, Institute of Communication, Culture, Information and Technology, University of TorontoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1996892023-02-21T18:11:47Z2023-02-21T18:11:47ZFive emerging trends that could change our lives online<figure><img src="https://images.theconversation.com/files/509720/original/file-20230213-22-56qt5a.jpg?ixlib=rb-1.1.0&rect=22%2C11%2C7271%2C4803&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">VR headsets are key to realising the Metaverse</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/concept-technologygamingentertainment-people-african-man-enjoying-562748395">Shutterstock / SFIO CRACHO</a></span></figcaption></figure><p>The way we live our lives online is rapidly changing. Artificial intelligence (AI), virtual reality and innovations such as blockchain – a kind of digital record for transactions — are set to transform the online world, affecting everything from social media to how people and businesses make money from their creativity.</p>
<p>If you’re feeling confused by the pace of change, here’s what you need to know about five trends on the cusp of making a major impact.</p>
<h2>1. Generative AI</h2>
<p>AI and the more specific field of machine learning (where software improves at a task with experience) are already used to <a href="https://www.europarl.europa.eu/news/en/headlines/society/20200827STO85804/what-is-artificial-intelligence-and-how-is-it-used">personalise the recommendations</a> we get when we shop online, in digital assistants like Alexa and for automated translation of text. The uses for this technology are <a href="https://www.forbes.com/sites/forbesbusinesscouncil/2022/05/05/the-future-of-ai-5-things-to-expect-in-the-next-10-years/">only likely to grow</a>. There are some innovative uses of AI by businesses that may point to how people will be using the technology in future.</p>
<p>The AI-powered chatbot ChatGPT is a high-profile example. <a href="https://www.ft.com/content/a6d71785-b994-48d8-8af2-a07d24f661c5">Microsoft recently made a US$10 billion (£8.2 billion) investment in the chatbot’s parent company</a>, showing how seriously these online tools are being taken.</p>
<p>It was seen by some journalists as the start of an <a href="https://www.businessinsider.com/google-microsoft-ai-search-war-will-add-to-carbon-emissions-2023-2">“AI war”</a> between Microsoft and Google. The latter company has been <a href="https://www.technologyreview.com/2022/09/22/1059922/deepminds-new-chatbot-uses-google-searches-plus-humans-to-give-better-answers/">incorporating AI into its search engine</a> to improve the answers people get. <a href="https://www.jasper.ai/demo">Jasper.ai</a> is another forward-thinking use of AI. This online service generates written content for blogs, social media posts and letters. </p>
<p>Meanwhile, Meta, the company that owns Facebook, is working on AI-powered software that can generate video from a text prompt, such as <a href="https://arstechnica.com/information-technology/2022/09/write-%20text-get-video-meta-announces-ai-video-generator/">“teddy bear painting a portrait”</a>. This is regarded as the next step on from online tools that generate images from text, such as <a href="https://openai.com/dall-e-2/">DALL-E</a> and <a href="https://stability.ai/blog/stable-diffusion-public-release">Stable Diffusion</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-chatgpt-chatbot-is-blowing-people-away-with-its-writing-skills-an-expert-explains-why-its-so-impressive-195908">The ChatGPT chatbot is blowing people away with its writing skills. An expert explains why it's so impressive</a>
</strong>
</em>
</p>
<hr>
<h2>2. The metaverse</h2>
<p>The <a href="https://en.wikipedia.org/wiki/Metaverse">“metaverse”</a> is intended to make the online world more like the real one, through the use of <a href="https://en.wikipedia.org/wiki/Virtual_reality">virtual reality</a> (VR) headsets. Instead of interacting with a two-dimensional profile on social media, you would don your VR headset to be represented by an <a href="https://uk.pcmag.com/vr-1/142134/enter-the-metaverse-how-to-create-a-virtual-avatar">avatar in a 3D virtual world</a>. Your avatar would be able to communicate with others in a space modelled on the real world. Online shops could take the form of 3D virtual spaces so customers could browse in much the same way they would in their everyday lives. </p>
<p>A new wave of advanced VR headsets could help facilitate the metaverse. These could include advanced features such as eye tracking — which can make interactions with 3D worlds more instant and realistic — and <a href="https://www.youtube.com/watch?v=fBLwshNHBRo">facial expression detection</a>, which would ensure 3D avatars replicate their users’ demeanours. <a href="https://www.tomsguide.com/news/apple-vr-and-mixed-reality-headset-release-date-price-specs-and-leaks">Apple</a> and Qualcomm are developing new VR headsets that could launch in 2023, but details of their features are being kept under wraps.</p>
<p><a href="https://vr.youtube.com">YouTube</a> and <a href="https://creator.oculus.com/manage/mediastudio/?locale=en_GB">Meta</a> are both building libraries of <a href="https://www.youtube.com/watch?v=eQOglqUJQZI">360-degree video</a> and images, as well as computer-generated objects and <a href="https://theconversation.com/real-estate-in-the-metaverse-is-booming-is-it-really-such-a-crazy-idea-174021">backgrounds</a> that can be used to build the 3D environments that your avatar would explore in these virtual worlds.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">What is the metaverse, and what can we do there?</a>
</strong>
</em>
</p>
<hr>
<h2>3. Digital certificates</h2>
<p>The owners of 360-degree video and computer-generated landscapes designed for use in the metaverse will want to sell their digital creations. To prevent unauthorised use, a kind of token called an NFT can provide these items of digital content with certificates of authenticity and ownership. </p>
<p>These <a href="https://www.britannica.com/topic/non-fungible-token">non-fungible tokens</a> allow the content to be bought and sold with confidence, something that’s increasingly happening with the use of cryptocurrency. In 2022, YouTube, Facebook, Instagram and Twitter all introduced NFTs to their user and advertiser bases. <a href="https://usa.visa.com/partner-with-us/info-for-partners/visa-creator-program.html">Visa</a> and <a href="https://www.mastercard.com/news/perspectives/2022/simple-nft-purchasing-on-nft-marketplaces/">Mastercard</a> have also made buying NFTs possible with their credit and debit cards. </p>
<p>Despite a recent <a href="https://www.theverge.com/2022/11/14/23458863/nike-nfts-happen-dot-swoosh-sneakers-crypto">drop in the NFT market</a>, <a href="https://usa.visa.com/visa-everywhere/blog/bdp/2021/08/18/nfts-mark-a-1629328216374.html">forecasts by the US stock exchange Nasdaq</a> suggest the tokens could perform well in 2023. </p>
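<p>The certificate-of-ownership idea behind these tokens can be illustrated with a toy ledger. This is a hypothetical Python sketch, not the ERC-721 standard or any real marketplace: each token id maps to a current owner, and only that owner can transfer it.</p>

```python
import hashlib

class TokenRegistry:
    """Toy ownership ledger illustrating the NFT idea (not a real standard)."""

    def __init__(self):
        self.owners = {}  # token_id -> current owner

    def mint(self, content, creator):
        # Derive a unique token id from the content itself
        token_id = hashlib.sha256(content.encode()).hexdigest()[:16]
        if token_id in self.owners:
            raise ValueError("token already minted")
        self.owners[token_id] = creator
        return token_id

    def transfer(self, token_id, seller, buyer):
        # Only the recorded owner may sell the token on
        if self.owners.get(token_id) != seller:
            raise PermissionError("seller does not own this token")
        self.owners[token_id] = buyer

registry = TokenRegistry()
tid = registry.mint("360-degree beach video", "alice")
registry.transfer(tid, "alice", "bob")
print(registry.owners[tid])  # prints "bob"
```

<p>Real NFT platforms record this ownership map on a public blockchain rather than in one program’s memory, which is what lets buyers and sellers trust the record without trusting each other.</p>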
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/nfts-in-the-art-world-a-revolution-or-ripoff-191299">NFTs in the art world: A revolution or ripoff?</a>
</strong>
</em>
</p>
<hr>
<h2>4. Blockchain</h2>
<p>A kind of digital record, or ledger, called a <a href="https://en.wikipedia.org/wiki/Blockchain">blockchain</a> could help underpin private networks of people online, providing a safe space for them free from trolls, stalkers and fraud. Permission to view information can be restricted to a small number of people and the record of activity provided by blockchain can’t be changed. This means any unauthorised activity on the network is instantly traceable.</p>
<p>And because information is stored across a network of computers rather than a single server, it is more difficult to hack. An example of an emerging type of online community that could make use of blockchain is a <a href="https://www.investopedia.com/tech/what-dao/">DAO (decentralised autonomous organisation)</a>. These networks have discarded the top-down management used elsewhere in favour of a more democratic form of governance with no central authority.</p>
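<p>The tamper-evidence described above can be sketched in a few lines of Python. This is an illustrative toy, not any production blockchain: each block stores a hash of the previous one, so altering any earlier record invalidates every later link in the chain.</p>

```python
import hashlib
import json

def block_hash(payload):
    # Canonical JSON serialisation, then SHA-256
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def add_block(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev": prev}
    block["hash"] = block_hash({"record": record, "prev": prev})
    chain.append(block)

def verify(chain):
    # Any edit to an earlier record breaks every later link
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash({"record": b["record"], "prev": b["prev"]}):
            return False
        prev = b["hash"]
    return True

chain = []
add_block(chain, "alice joins network")
add_block(chain, "alice posts message")
print(verify(chain))               # prints True
chain[0]["record"] = "mallory joins network"  # tamper with history
print(verify(chain))               # prints False
```

<p>Because every participant holds a copy of the chain, a tampered copy fails verification against everyone else’s, which is why unauthorised changes are so easy to detect.</p>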
<p>A social platform called Mastodon shares many aspects with DAOs. It was recently in the news when <a href="https://www.theguardian.com/news/datablog/2023/jan/08/elon-musk-drove-more-than-a-million-people-to-mastodon-but-many-arent-sticking-around">more than a million users</a> fled Twitter to the platform in the wake of Elon Musk’s takeover.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/cryptocurrency-funded-groups-called-daos-are-becoming-charities-here-are-some-issues-to-watch-174763">Cryptocurrency-funded groups called DAOs are becoming charities – here are some issues to watch</a>
</strong>
</em>
</p>
<hr>
<h2>5. ‘Workfluencers’</h2>
<p>Businesses have taken note of the rise of social media influencers and are adopting their approach to reach target audiences. They are making use of <a href="https://theconversation.com/linkedin-at-20-how-a-new-breed-of-influencer-is-transforming-the-business-networking-giant-196413">what’s called an employee advocate, or “workfluencer”</a>. Companies have realised that employees’ social media profiles and posts may better convey the brand than corporate accounts.</p>
<p>When crafted thoughtfully, social media posts by employees can seem significantly more authentic to other users than corporate PR. People have grown more honest about day-to-day work life, rather than only producing stories on professional milestones and achievements.</p>
<p>Organisations are likely to build procedures to encourage teams and employees to communicate and distribute material on the company’s behalf.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/linkedin-at-20-how-a-new-breed-of-influencer-is-transforming-the-business-networking-giant-196413">LinkedIn at 20: how a new breed of influencer is transforming the business networking giant</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/199689/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Theo Tzanidis does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Here are the trends on the cusp of transforming the online world.Theo Tzanidis, Senior Lecturer in Digital Marketing, University of the West of ScotlandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1982472023-02-14T13:41:41Z2023-02-14T13:41:41ZWhat is Mondiacult? 6 take-aways from the world’s biggest cultural policy gathering<figure><img src="https://images.theconversation.com/files/508369/original/file-20230206-21-8bineg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">hadynyah/Getty Images</span></span></figcaption></figure><p>Culture’s status in global society got a major boost in 2022 when it was recommended to become its own sustainable development goal. This happened at the Unesco World Conference on Cultural Policies and Sustainable Development – called <a href="https://www.unesco.org/en/mondiacult2022?TSPD_101_R0=080713870fab2000a09e1f590eee9236224b49b65b35b132c616cc394977b0a02ac2e62027c474a20861e610e0143000765baf2107ff32468755177504b7b9a252592c01e65570cbe751e36ef19eb1605e90d2f17ad9e80a512b2762ca6cb961">Mondiacult</a>. The world’s most important cultural policy gathering took place in Mexico City 40 years after its first edition in the same city. The 2022 meeting gathered 2,600 participants including 135 government ministers, 83 non-governmental organisations, 32 intergovernmental organisations and nine UN agencies. </p>
<p>Mondiacult is important because it’s a decision-making meeting that helps shape the world’s cultural policies and especially the relationship between culture and development. What was clear is that there is a shift in this relationship. Culture does not only contribute to sustainable development but is one of development’s components. </p>
<p>Culture aids <a href="https://sdgs.un.org/goals">sustainable development goals</a> in areas like health, education and environment. For example, local customs and traditional knowledge are relevant in promoting health programmes. Local and traditional products are useful for sustainable production. Indigenous knowledge helps develop environmental practices to fight climate change. </p>
<p><a href="https://sdgs.un.org/goals">Sustainable development goals</a> – like clean water and quality education – are the United Nations (UN) blueprint for a better future for all. At Mondiacult, culture was raised to the status of being its own sustainable development goal. A careful reading of the <a href="https://www.unesco.org/sites/default/files/medias/fichiers/2022/10/6.MONDIACULT_EN_DRAFT%20FINAL%20DECLARATION_FINAL_1.pdf">final declaration</a> offers several reasons why:</p>
<h2>1. Culture can fight climate change</h2>
<p>Culture can contribute to the reduction of climate change’s negative impact. Ecological organisations and other stakeholders are now interested in discovering the usefulness of cultural practices and other local know-how to preserve the environment. Ancient communities faced climate crises and developed their own resilient practices rooted in cultural heritage. That is why concepts like indigenous knowledge systems have emerged. </p>
<h2>2. Digital must be ethical</h2>
<p>The transition from analogue to digital has become an important aspect of the production, distribution and consumption of cultural and creative goods and services. The COVID-19 pandemic revealed the value of digital and online spaces. Augmented reality, for example, enables exploring museum collections from a phone or computer. Virtual reality enables the visiting of historical monuments. Blockchain technology and artificial intelligence have grown hugely, but bring new ethical concerns. That is why Unesco has adopted a set of <a href="https://unesdoc.unesco.org/ark:/48223/pf0000380455_fre.locale=fr">recommendations</a> on the ethics of artificial intelligence.</p>
<h2>3. Cultural diversity matters</h2>
<p>Our world is made of many different cultures. Acknowledging and accepting this cultural diversity is an ethical imperative, in Mondiacult 2022’s view. For the cultural ministers gathered in Mexico City, cultural diversity is the “founding principle of all of Unesco’s cultural conventions, recommendations and declarations. It cannot be separated from respect for human dignity and all fundamental human rights.” </p>
<h2>4. Cultural objects must be returned</h2>
<p>Another “ethical imperative” is the return of cultural assets to the countries they were looted from. The <a href="https://theconversation.com/benin-bronzes-what-is-the-significance-of-their-repatriation-to-nigeria-171444">Benin bronzes</a> case is a good example – ancient cultural objects that colonial forces stole from Nigeria and that are now slowly being returned. This restitution is crucial because it is supposed to “promote the right of peoples and communities to enjoy their cultural heritage … to strengthen social cohesion and the intergenerational transmission of cultural heritage”. It would be morally unfair to deny restitution, according to Mondiacult 2022. </p>
<h2>5. Culture is a global public good</h2>
<p>Culture is “our most powerful global public good”, <a href="https://unesdoc.unesco.org/ark:/48223/pf0000382082_eng">wrote</a> Unesco official Ernesto Ottone:</p>
<blockquote>
<p>Today, more than ever, we need to find meaning, we need universality, we need culture in all its diversity. </p>
</blockquote>
<p>Culture is reaffirmed as the “existential foundation” of humanity in this period of multiple crises on the planet. Now that a high-level meeting like Mondiacult has affirmed that culture is a public good, it must be preserved in the same way as the environment is.</p>
<h2>6. Culture is a development goal in itself</h2>
<p>Most significant is a new momentum to give culture a central place in the global development agenda. Before Mondiacult, Unesco’s aim was to convince the world’s policymakers that culture can <a href="https://unesdoc.unesco.org/ark:/48223/pf0000371557.locale=fr">contribute</a> significantly to achieving sustainable development goals. Now, Mondiacult 2022’s ambitious final <a href="https://www.unesco.org/en/articles/mondiacult-2022-states-adopt-historic-declaration-culture?TSPD_101_R0=080713870fab2000f74c4eb59493c567f3e18b1c8872e37ae64990e839cf3668f57e49286fb9f65f08249d61f71430003d79c69a210fba638ee45377843ff76e26f08becf03cf6dff247f25bfdb1b4b06649a8fba6fb9883fadb4106e6dc9543">declaration</a> affirms:</p>
<blockquote>
<p>We call on the UN secretary general to firmly anchor culture as a global public good and to integrate it as a specific goal in its own right in the development agenda beyond 2030.</p>
</blockquote>
<p>The cultural goal is to achieve “more harmony between peoples and communities”. This could involve the promotion of cultural diversity, the return of cultural assets, increased budgets for creative activities and other policies. </p>
<h2>Why this matters</h2>
<p>If the UN adopts this option of culture being a sustainable development goal, the post-2030 sustainable development agenda will have new content. This will change how development agencies deal with culture and how universities teach the relationship between culture and development. The result could be more funding for culture, which is increasingly underfunded by governments. </p>
<p>In addition, making cultural diversity an “ethical imperative” should play a role, if possible, in discussions about the commercialisation of cultural goods and services and the digital transition. </p>
<p>Next to come will be Mondiacult’s conditions of implementation. This is a follow-up action plan that should mobilise stakeholders to embrace Mondiacult’s outcomes ahead of the 2024 UN <a href="https://www.un.org/en/common-agenda/implementation">Summit of the Future</a>.</p><img src="https://counter.theconversation.com/content/198247/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ribio Nzeza Bunketi Buse does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The huge gathering of policymakers focused on culture’s crucial role in sustainable development.Ribio Nzeza Bunketi Buse, Associate Professor, University of Kinshasa Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1941582022-12-25T20:41:36Z2022-12-25T20:41:36Z5 great immersive experiences you can have this summer<figure><img src="https://images.theconversation.com/files/500605/original/file-20221213-1889-xrw7op.jpg?ixlib=rb-1.1.0&rect=0%2C13%2C4500%2C3119&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>What do you think of when you hear the word “immersive”?</p>
<p>It conjures up different things for different people. For some, it’s a simple feeling you get hitting the beach, the pool or even the floatation tank. </p>
<p>For others, it’s immersion through imagination – through books, theatre, exhibitions or the cinema. </p>
<p>For the more tech-savvy, immersion may involve picking up their phone, turning on a game console and grabbing a controller or strapping on a head-mounted display to enter a different reality.</p>
<p>All these interpretations are correct. Immersion is sensorial. It hits one or more of your senses – sight, hearing, touch, smell and taste. It makes you physically engage, interact and navigate with and through an experience.</p>
<p>Here are five different methods to immerse you this summer beyond jumping into the ocean.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-brief-history-of-immersion-centuries-before-vr-94835">A brief history of immersion, centuries before VR</a>
</strong>
</em>
</p>
<hr>
<h2>1. Augmented reality</h2>
<p>For the uninitiated, augmented reality is a way to interact with digital content superimposed and interacting with the real world, usually through your mobile device. </p>
<p>While augmented reality hasn’t had a considerable impact since the heady days of 2016 and Pokémon GO, the team behind that worldwide smash haven’t been resting on their laurels. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/_LXyYgMaUuE?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p><a href="https://www.ingress.com/">Ingress Prime</a> is an excellent option for those coming from the Pokémon experience, with a more adult open-ended story and elements of “capture the flag” mixed with old-fashioned <a href="https://en.wikipedia.org/wiki/Geocaching">geocaching</a>. </p>
<p>During game play, you pick a team and your phone is transformed into a “scanner”, and local landmarks are turned into “portals”. Two teams compete to claim ownership of these portals.</p>
<p>And their brand-new app <a href="https://playperidot.com/">Peridot</a>, currently in beta, will be familiar to Tamagotchi owners, here with a few twists. You get to raise, care for and even breed your virtual pet with other players’ pets in order to avoid extinction. </p>
<p>But unlike Tamagotchis of old, you can take these creatures for virtual walks, as you explore the actual, physical world around you. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/KFPaWNjZ9uY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-augmented-reality-anyway-99827">What is augmented reality, anyway?</a>
</strong>
</em>
</p>
<hr>
<h2>2. 3D movies</h2>
<p>See that fancy flatscreen television sitting in the corner of the lounge? Chances are that if it was purchased in the early to mid-2010s it may have been part of the push for 3D TVs and may have even come with a bunch of 3D glasses similar to the ones you might get at the cinema. </p>
<p>There are some great hidden 3D gems you can watch at home.</p>
<p><a href="https://youtu.be/C-Rs8cmZdtM">The Young and Prodigious T. S. Spivet</a> (2013) is a lovely example of a road movie, as our ten-year-old protagonist travels across the country to accept an award from the Smithsonian for inventing a perpetual motion machine.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/C-Rs8cmZdtM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Slightly more adult is <a href="https://youtu.be/0yPt3kQzxa8">Long Day’s Journey Into Night</a> (2019), which plays like a Lynchian dream for most of its running time and features an astonishing hour-long 3D sequence presented as a single take as the film’s protagonist wanders through town. </p>
<p>Finally, the 2018 Oscar winner for animated feature, <a href="https://youtu.be/g4Hbz2jLxvQ">Spider-Man: Into the Spider-Verse</a> is a downright trippy and loopy experience if you can find it in 3D.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/g4Hbz2jLxvQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>3. Escape rooms</h2>
<p>Immersive experiences don’t have to involve technology.</p>
<p>In an escape room, a small team bands together to solve a series of puzzles to “escape” from the “room” these puzzles are set in. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Ee0qrodm5kY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>You can find many different escape rooms in almost all Australia’s capital cities. My personal favourite is the <a href="https://www.cipherroom.com.au/">Cipher Room</a> in Sydney’s inner west and their monochrome, film noir-inspired Marlowe Hotel. </p>
<p>My advice is to dress up in black and white to make for a completely immersive experience, as you and your friends solve a series of clues in order to break into the hotel and retrieve some incriminating documents.</p>
<h2>4. Virtual reality games</h2>
<p>While embracing new tech, why not get your retro-gaming fix simultaneously? </p>
<p>Older gamers might remember the classic 1990s CD-Rom adventure Myst, where the player explores a mysterious island solving puzzles along the way (also serving as inspiration for thousands of escape rooms across the globe). </p>
<p>The game has now been <a href="https://store.steampowered.com/app/1255560/Myst/">re-imagined</a> for virtual reality as a free-roaming adventure and has never looked better. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/IJs2GcbzgP4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Fans of 2000s console systems will get a kick from <a href="https://store.steampowered.com/app/788690/Psychonauts_in_the_Rhombus_of_Ruin/">Psychonauts in the Rhombus of Ruin</a>, which continues the wacky Tim Burton-esque aesthetics of the classic Psychonauts (2005), picking up the story from the end of the first adventure and taking it into new dimensions and levels. </p>
<p>As per the original game you take on the role of Raz, as you use his psychic powers to solve a series of puzzles to escape the Rhombus of Ruin. Terrific for a bit of lazy afternoon casual gameplay.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/FYkNKO9F28s?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Fans of first-person shooters will dig the multi award-winning <a href="https://store.steampowered.com/app/546560/HalfLife_Alyx/">Half-Life: Alyx</a>. </p>
<p>This is the game to play if you want to sweat it out, as you run around fighting against aliens that have taken over the Earth. Alyx’s storyline serves as a prequel to Half-Life 2 (2004), and features some hilarious voice acting from Rhys Darby as the character Russell. Highly recommended. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/O2W0N3uKXmo?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/virtual-reality-can-combat-isolation-with-awe-and-empathy-on-earth-and-in-space-170189">Virtual reality can combat isolation with awe and empathy — on Earth and in space</a>
</strong>
</em>
</p>
<hr>
<h2>5. 4DX movies</h2>
<p>James Cameron has finally finished his sequel to 2009’s Avatar, and the best way to experience <a href="https://youtu.be/d9MyW72ELq0">Avatar: The Way of Water</a> will be the fully immersive 4DX format, available in most capital cities. The technology blends on-screen images with synchronised motion seats and environmental effects such as water, wind, fog, fragrance, snow and more.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/d9MyW72ELq0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Cameron’s film should be a perfect fit for 4DX’s synchronisation of 3D visuals, motion simulation and fog effects. Given the film’s flying and underwater sequences, the wind and water effects should make for a completely immersive experience over summer.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-future-of-tv-how-feely-vision-could-tickle-all-our-senses-54059">The future of TV? How feely-vision could tickle all our senses</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Gregory Ferris does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>From augmented reality to hitting an escape room, here’s how to keep yourself – and your senses – occupied this summer.Gregory Ferris, Senior Lecturer, Media Arts & Production, University of Technology SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1943172022-12-04T12:36:38Z2022-12-04T12:36:38ZThe metaverse offers challenges and possibilities for the future of the retail industry<figure><img src="https://images.theconversation.com/files/497760/original/file-20221128-20372-wpv8jq.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C7951%2C4999&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">As technology improves, the potential for retailers to make use of the metaverse will grow.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>In 1968, American computer scientist Ivan Sutherland predicted the future of augmented and virtual reality with his concept of the “<a href="https://doi.org/10.1145/1476589.1476686">Ultimate Display</a>”. The Ultimate Display relied on <a href="https://www.oxfordreference.com/view/10.1093/oi/authority.20110810105219843">the kinetic depth effect</a> to create two-dimensional images that moved with its users, giving the illusion of a three-dimensional display. </p>
<p>While the concept of <a href="https://doi.org/10.1016/j.jbusres.2019.04.023">virtual reality</a> only focuses on the creation of three-dimensional environments, the <a href="https://doi.org/10.3390/encyclopedia2010031">metaverse</a> — a term <a href="https://www.wired.com/story/plaintext-neal-stephenson-named-the-metaverse-now-hes-building-it/">coined by Neal Stephenson in his 1992 book <em>Snow Crash</em></a> — is a much broader concept that surpasses this. </p>
<p>While no official definition of the metaverse truly exists, science and technology reporter <a href="https://www.newscientist.com/article/2286778-what-is-a-metaverse-and-why-is-everyone-talking-about-it/">Matthew Sparkes provides a decent one</a>. He defines the metaverse as “a shared online space that incorporates 3D graphics, either on a screen or in virtual reality.”</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">What is the metaverse, and what can we do there?</a>
</strong>
</em>
</p>
<hr>
<p>Since the term was coined, the idea of the metaverse has remained more of a fictional concept than a scientific one. However, with technological advancements in recent years, the metaverse has become more tangible. Much of the recent hype happened after Mark Zuckerberg made the <a href="https://theconversation.com/facebook-relaunches-itself-as-meta-in-a-clear-bid-to-dominate-the-metaverse-170543">announcement to rename the Facebook brand to Meta</a>. Many retailers have since jumped aboard the metaverse train. </p>
<figure class="align-center ">
<img alt="A white man in a black long-sleeved shirt gestures while speaking" src="https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/497765/original/file-20221128-20372-r5t3i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Meta chief executive Mark Zuckerberg announced Facebook’s name change to Meta in 2021. He said the move reflected the company’s interest in broader technological ideas, like the metaverse.</span>
<span class="attribution"><span class="source">(AP Photo/Nick Wass)</span></span>
</figcaption>
</figure>
<p>Nike <a href="https://www.cnbc.com/2021/11/02/nike-is-quietly-preparing-for-the-metaverse-.html">recently filed multiple trademarks</a> allowing them to create and sell Nike shoes and apparel virtually. <a href="https://www.forbes.com/sites/ronshevlin/2022/02/16/jpmorgan-opens-a-bank-branch-in-the-metaverse-but-its-not-for-what-you-think-its-for/?sh=672a911a158d">JP Morgan opened their first virtual bank branch</a>. <a href="https://www.psfk.com/2022/02/samsung-galaxy-debuts-new-products-in-the-metaverse.html">Samsung recreated their New York City flagship store</a> in the virtual browser-based platform <a href="https://decentraland.org/">Decentraland</a>, where they are launching new products and creating events.</p>
<p>While many retailers are capitalizing on the metaverse early, there is still uncertainty about whether the metaverse really is the future of retailing or whether it will be a short-lived fad.</p>
<h2>Dispelling metaverse myths</h2>
<p>Much of that uncertainty around the metaverse stems from confusion about the technology. While examining the top keyword associations related to the metaverse on Google Trends, I found “what is metaverse” and “metaverse meaning” to be the top phrases customers searched for. To alleviate some of this confusion, it’s important to dispel commonly held myths about the metaverse.</p>
<p><strong>Myth 1: You need a VR headset to access the metaverse</strong></p>
<p>While an optimal experience in the metaverse can be achieved through VR headsets, anyone can access the metaverse through their personal computers. For instance, customers can create their avatars and access the metaverse in Decentraland on screen without a VR headset.</p>
<figure class="align-center ">
<img alt="A virtual avatar in a green shirt, black pants, and sneakers standing in a virtual world" src="https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=417&fit=crop&dpr=1 600w, https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=417&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=417&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=524&fit=crop&dpr=1 754w, https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=524&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/496248/original/file-20221119-14-i80nmz.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=524&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">My virtual avatar in Decentraland.</span>
<span class="attribution"><span class="source">(Decentraland Foundation)</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p><strong>Myth 2: The metaverse will replace real-life interactions</strong></p>
<p>Rather than replacing existing modes of communication, the metaverse provides a more interactive mode of communication. New technologies <a href="https://hbr.org/2022/08/the-metaverse-will-enhance-not-replace-companies-physical-locations">always bring about predictions of the end of physical interactions</a>. It’s helpful to compare the metaverse with the rise of smartphones. Smartphones enhance communication by allowing people to interact with their social networks, but they have not entirely replaced face-to-face interactions. The metaverse will likely follow the same pattern.</p>
<p><strong>Myth 3: The metaverse is just for gaming</strong></p>
<p>While gaming remains the dominant driver of user involvement with the metaverse (<a href="https://www.ey.com/en_us/tmt/what-s-possible-for-the-gaming-industry-in-the-next-dimension/chapter-3-insights-on-the-metaverse-and-the-future-of-gaming">97 per cent of gaming executives</a> believe that gaming is the centre of the metaverse today), it’s not the only activity people can take part in. </p>
<p>In a recent survey, McKinsey & Company <a href="https://www.mckinsey.com/industries/retail/our-insights/probing-reality-and-myth-in-the-metaverse">asked customers what their preferred activity on the metaverse would be</a> in the next five years. Shopping virtually ranked the highest, followed by attending <a href="https://www.cbc.ca/news/canada/hamilton/telehealth-khalid-1.5636540">telehealth appointments</a> and virtual synchronous courses.</p>
<h2>Keeping expectations realistic</h2>
<p>In its current form, the <a href="https://doi.org/10.1016/j.ijinfomgt.2022.102542">metaverse lacks the technological infrastructure</a> to deliver on market expectations. It may be appropriate to compare the metaverse with the <a href="https://finbold.com/guide/dot-com-bubble/">dot-com bubble between 1995 and 2000</a> that was caused by speculation in internet-based businesses.</p>
<p>Similarly, there appears to be tremendous hype and expectations around what the technology can deliver in its current form. <a href="https://www.talkdesk.com/resources/reports/connecting-in-the-metaverse/">A recent survey of 1,500 consumers</a> found that 51 per cent of people expect customer service to be better in the metaverse, 32 per cent expect less frustration and anxiety while dealing with customer service agents in the metaverse compared to phone interactions, and 27 per cent expect interactions with metaverse virtual avatar assistants to be more effective than online chat-bots.</p>
<p>While such expectations can appear reasonable, metaverse technology is still in its infancy, with the focus remaining on developing infrastructure and processes for the future. These unrealistic expectations may lead to a metaverse bubble as reality struggles to meet them.</p>
<h2>Challenges for retailers</h2>
<p>As with any emerging technology, retailers need to be prepared for challenges posed by the metaverse. Some of these challenges include the following:</p>
<ul>
<li><p><strong>Data security and privacy:</strong> With the novelty of metaverse technology and the wealth of personal data collected, the metaverse will be <a href="https://theconversation.com/we-need-to-anticipate-and-address-potential-fraud-in-the-metaverse-186188">an attractive target for cyber-hackers</a>. New approaches and methods need to be considered for a safe metaverse that customers can trust. </p></li>
<li><p><strong>Experienced talent:</strong> Having the right talent that can create, manage and support experiences in the metaverse needs to be at the forefront of engaging with the technology. However, due to the novelty of the technology, finding such talent will be a challenge.</p></li>
<li><p><strong>Regulations:</strong> <a href="https://doi.org/10.1145/3546607.3546611">With no clear jurisdictions and regulations in place</a>, the safety of virtual spaces in the metaverse may be compromised and end up pushing customers away. Retailers need to ensure these spaces are safe and protected.</p></li>
<li><p><strong>Managing customers’ expectations:</strong> Retailers need to educate their customers about what can currently be done in the metaverse, and what customers should expect from businesses in the metaverse.</p></li>
</ul>
<p>Despite these challenges, retailers will still be able to craft novel shopping experiences in the metaverse — it will just require appropriately skilled and qualified people to make it happen. With appropriate planning and preparation, retailers will be able to meet these challenges head-on.</p>
<figure class="align-center ">
<img alt="A woman wearing a VR headset standing in a shopping mall" src="https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/497772/original/file-20221128-20-v4ei81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The metaverse will have the potential to revolutionize the retail industry once the technology is advanced enough.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Opportunities for retailers</h2>
<p>As technology improves, the potential uses of the metaverse for retailers will grow. At the moment, the metaverse offers retailers three key opportunities for improving the online shopping experience. </p>
<p>The first is brand exposure. Retailers can expand their presence through virtual billboards and interactive advertisements with less noise than existing online and mobile channels. Cloud Nine, an IT services company, <a href="https://www.globenewswire.com/news-release/2022/03/01/2394387/0/en/Cloud-Nine-Publishes-Metaverse-Advertising-Billboards-for-Its-Limitless-VPN-Metaverse-Focused-VPN-Service.html">was one of the earliest companies to advertise its services on virtual billboards in Decentraland</a>. Virtual billboard advertising is something marketers should keep in mind.</p>
<p>Secondly, the metaverse offers unique experiences for customers to engage with brands through events, contests, and game-like features. Such experiences could increase loyalty and brand engagement. <a href="https://www.vogue.com/article/metaverse-fashion-week-decentraland">The Metaverse Fashion Week</a> is an example of how retailers can create unique brand engagement opportunities. Retailers including Tommy Hilfiger, Perry Ellis and Dolce & Gabbana all participated in the pilot experience, leading the wave for immersive and unique customer-brand interactions.</p>
<p>Lastly, the metaverse provides retailers the chance to personalize customer experiences. Similar to how retailers can <a href="https://hbr.org/2015/05/customer-data-designing-for-transparency-and-trust">customize customers’ online experiences through data collection</a>, retailers can tailor customer experiences in the virtual environment. In <a href="https://www.oculus.com/horizon-worlds/learn/?utm_source=gg&utm_medium=ps&utm_campaign=18478966989&utm_term=horizon%20world&utm_content=">Meta’s Horizon Worlds</a>, for example, users can create their own virtual worlds, invite friends and customize their own experiences.</p><img src="https://counter.theconversation.com/content/194317/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Omar H. Fares does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The metaverse offers novel opportunities for retailers and their customers, but retailers need to be adequately prepared to overcome the challenges of new technology.Omar H. Fares, Lecturer in the Ted Rogers School of Retail Management, Toronto Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1947282022-11-16T23:07:51Z2022-11-16T23:07:51ZAn entire Pacific country will upload itself to the metaverse. It’s a desperate plan – with a hidden message<figure><img src="https://images.theconversation.com/files/495556/original/file-20221116-21-2v3psz.jpg?ixlib=rb-1.1.0&rect=100%2C100%2C5505%2C3891&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What's the message between the lines of Tuvalu's proposal to move to the metaverse?</span> <span class="attribution"><a class="source" href="https://unsplash.com/s/photos/message-in-the-bottle">Scott Van Hoy/Unsplash</a>, <a class="license" href="http://artlibre.org/licence/lal/en">FAL</a></span></figcaption></figure><p>The Pacific nation of Tuvalu is planning to create a version of itself in the metaverse, as a response to the existential threat of rising sea levels. Tuvalu’s minister for justice, communication and foreign affairs, Simon Kofe, made the announcement via a chilling digital address to leaders at COP27. </p>
<p>He said the plan, which accounts for the “worst case scenario”, involves creating a <a href="https://theconversation.com/au/topics/digital-twin-89034">digital twin</a> of Tuvalu in the metaverse in order to replicate its beautiful islands and preserve its rich culture:</p>
<blockquote>
<p>The tragedy of this outcome cannot be overstated […] Tuvalu could be the first country in the world to exist solely in cyberspace – but if global warming continues unchecked, it won’t be the last.</p>
</blockquote>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/sJIlrAdky4Q?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Tuvalu turns to metaverse as rising seas threaten existence, 16 Nov 2022.</span></figcaption>
</figure>
<p>The idea is that the metaverse might allow Tuvalu to “fully function as a sovereign state” as its people are forced to live somewhere else. </p>
<p>There are two stories here. One is of a small island nation in the Pacific facing an existential threat and looking to preserve its nationhood through technology. </p>
<p>The other is that by far the preferred future for Tuvalu would be to avoid the worst effects of climate change and preserve itself as a terrestrial nation. In which case, this may be its way of getting the world’s attention. </p>
<h2>What is a metaverse nation?</h2>
<p>The <a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">metaverse</a> represents a burgeoning future in which augmented and virtual reality become part of everyday living. There are many visions of what the metaverse might look like, with the most well-known coming from Meta (previously Facebook) CEO Mark Zuckerberg.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">What is the metaverse, and what can we do there?</a>
</strong>
</em>
</p>
<hr>
<p>What most of these visions have in common is the idea that the metaverse is about interoperable and immersive 3D worlds. A persistent avatar moves from one virtual world to another, as easily as moving from one room to another in the physical world.</p>
<p>The aim is to blur the human ability to distinguish between the real and the virtual, for <a href="https://theconversation.com/what-is-the-metaverse-a-high-tech-plan-to-facebookify-the-world-165326">better or for worse</a>.</p>
<p>Kofe implies three aspects of Tuvalu’s nationhood could be recreated in the metaverse:</p>
<ol>
<li><p>territory – the recreation of the natural beauty of Tuvalu, which could be interacted with in different ways</p></li>
<li><p>culture – the ability for Tuvaluan people to interact with one another in ways that preserve their shared language, norms and customs, wherever they may be</p></li>
<li><p>sovereignty – if there were to be a loss of terrestrial land over which the government of Tuvalu has sovereignty (a tragedy beyond imagining, but which they have begun to imagine) then could they have sovereignty over virtual land instead?</p></li>
</ol>
<h2>Could it be done?</h2>
<p>If Tuvalu’s proposal is in fact a literal one, and not just symbolic of the dangers of climate change, what might it look like?</p>
<p>Technologically, it’s already easy enough to create beautiful, immersive and richly rendered recreations of Tuvalu’s territory. Moreover, thousands of different online communities and 3D worlds (such as <a href="https://secondlife.com/">Second Life</a>) demonstrate it’s possible to have entirely virtual interactive spaces that can maintain their own culture.</p>
<p>The idea of combining these technological capabilities with features of governance for a “<a href="https://theconversation.com/what-are-digital-twins-a-pair-of-computer-modeling-experts-explain-181829">digital twin</a>” of Tuvalu is feasible. </p>
<p>There have been prior experiments of governments taking location-based functions and creating virtual analogues of them. For example, Estonia’s <a href="https://en.wikipedia.org/wiki/E-Residency_of_Estonia">e-residency</a> is an online-only form of residency non-Estonians can obtain to access services such as company registration. Another example is countries setting up virtual embassies on the <a href="https://www.learntechlib.org/p/178165/">online platform Second Life</a>.</p>
<p>Yet there are significant technological and social challenges in bringing together and digitising the elements that define an entire nation. </p>
<p>Tuvalu has only about 12,000 citizens, but having even this many people interact in real time in an immersive virtual world is a technical challenge. There are <a href="https://www.matthewball.vc/all/networkingmetaverse">issues of bandwidth</a>, computing power, and the fact that many users have an aversion to headsets or suffer nausea.</p>
<p>Nobody has yet demonstrated that nation-states can be successfully translated to the virtual world. Even if they could be, others argue the digital world makes <a href="http://thestack.org/">nation-states redundant</a>.</p>
<p>Tuvalu’s proposal to create its digital twin in the metaverse is a message in a bottle – a desperate response to a tragic situation. Yet there is a coded message here too, for others who might consider retreat to the virtual as a response to loss from climate change.</p>
<h2>The metaverse is no refuge</h2>
<p>The metaverse is built on the physical infrastructure of servers, data centres, network routers, devices and head-mounted displays. All of this tech has a hidden carbon footprint and requires physical maintenance and energy. <a href="https://theconversation.com/the-internet-consumes-extraordinary-amounts-of-energy-heres-how-we-can-make-it-more-sustainable-160639">Research</a> published in Nature predicts the internet will consume about 20% of the world’s electricity by 2025. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-internet-consumes-extraordinary-amounts-of-energy-heres-how-we-can-make-it-more-sustainable-160639">The internet consumes extraordinary amounts of energy. Here's how we can make it more sustainable</a>
</strong>
</em>
</p>
<hr>
<p>The idea of the <em>metaverse nation</em> as a response to climate change is exactly the kind of thinking that got us here. The language that gets adopted around new technologies – such as “cloud computing”, “virtual reality” and “metaverse” – comes across as both clean and green. </p>
<p>Such terms are laden with “<a href="https://www.publicaffairsbooks.com/titles/evgeny-morozov/to-save-everything-click-here/9781610393706/">technological solutionism</a>” and “<a href="https://eprints.qut.edu.au/203186/">greenwashing</a>”. They hide the fact that technological responses to climate change often <a href="https://www.sciencedirect.com/science/article/abs/pii/S0921800905001084?via%3Dihub">exacerbate the problem</a> due to how energy and resource intensive they are.</p>
<h2>So where does that leave Tuvalu?</h2>
<p>Kofe is well aware the metaverse is not an answer to Tuvalu’s problems. He explicitly states we need to focus on reducing the impacts of climate change through initiatives such as a <a href="https://www.theguardian.com/environment/2022/nov/08/tuvalu-first-to-call-for-fossil-fuel-non-proliferation-treaty-at-cop27">fossil-fuel non-proliferation treaty</a>. </p>
<p>His video about Tuvalu moving to the metaverse is hugely successful as a provocation. It got worldwide press – just like his <a href="https://youtu.be/jBBsv0QyscE">moving plea</a> during COP26 while standing knee-deep in rising water.</p>
<p>Yet Kofe suggests:</p>
<blockquote>
<p>Without a global conscience and a global commitment to our shared wellbeing we may find the rest of the world joining us online as their lands disappear.</p>
</blockquote>
<p>It is dangerous to believe, even implicitly, that moving to the metaverse is a viable response to climate change. The metaverse can certainly assist in keeping heritage and culture alive <a href="https://eprints.qut.edu.au/131407/">as a virtual museum</a> and digital community. But it seems unlikely to work as an ersatz nation-state. </p>
<p>And, either way, it certainly won’t work without all of the land, infrastructure and energy that keeps the internet functioning.</p>
<p>It would be far better for us to direct international attention towards Tuvalu’s other initiatives described in the <a href="https://devpolicy.org/tuvalu-preparing-for-climate-change-in-the-worst-case-scenario-20211110/">same report</a>: </p>
<blockquote>
<p>The project’s first initiative promotes diplomacy based on Tuvaluan values of olaga fakafenua (communal living systems), kaitasi (shared responsibility) and fale-pili (being a good neighbour), in the hope that these values will motivate other nations to understand their shared responsibility to address climate change and sea level rise to achieve global wellbeing.</p>
</blockquote>
<p>The message in a bottle being sent out by Tuvalu is not really about the possibilities of metaverse nations at all. The message is clear: to support communal living systems, to take shared responsibility and to be a good neighbour.</p>
<p>The first of these can’t translate into the virtual world. The second requires us to <a href="https://theconversation.com/ending-the-climate-crisis-has-one-simple-solution-stop-using-fossil-fuels-194489">consume less</a>, and the third requires us to care.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ending-the-climate-crisis-has-one-simple-solution-stop-using-fossil-fuels-194489">Ending the climate crisis has one simple solution: Stop using fossil fuels</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Nick Kelly receives research funding from the Australian Research Council.</span></em></p><p class="fine-print"><em><span>Marcus Foth receives research funding from the Australian Research Council and the Future Food CRC. He is a member of the Queensland Greens.</span></em></p>Rising sea levels due to climate change are already having severe impacts on the nation of Tuvalu. It proposes to build a digital replica of itself in the metaverse. Could it be done?Nick Kelly, Senior Lecturer in Interaction Design, Queensland University of TechnologyMarcus Foth, Professor of Urban Informatics, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1934822022-11-04T18:08:44Z2022-11-04T18:08:44ZWhy Meta’s share price collapse is good news for the future of social media<figure><img src="https://images.theconversation.com/files/493309/original/file-20221103-15-nv3ite.jpg?ixlib=rb-1.1.0&rect=0%2C47%2C7956%2C5260&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/kazan-russia-oct-28-2021-facebook-2065574948">Sergei Elagin / Shutterstock</a></span></figcaption></figure><p>Facebook may not be the original social media platform but it has stood the test of time – until recently. Meta, the company that owns Facebook, Instagram and WhatsApp, saw its value plummet by <a href="https://www.ft.com/content/0f4c676c-56a6-4b5e-850f-ddb78f9feb40">around $80 billion</a> (£69 billion) in just one day at the end of October, after its <a href="https://www.theguardian.com/technology/2022/oct/26/meta-earnings-report-facebook-stocks">third-quarter profits halved</a> amid the global slowdown. Meta is now valued at <a href="https://www.cnbc.com/2022/10/27/meta-is-no-longer-one-of-the-20-biggest-us-companies.html">around $270 billion</a> compared with more than $1 trillion last year.</p>
<p>Several issues have caused <a href="https://www.theguardian.com/technology/2022/oct/27/metas-shares-dip-is-proof-metaverse-plan-never-really-had-legs-facebook">investors to turn away</a> from the social media giant, including falling advertising revenue, a <a href="https://www.theverge.com/2022/10/25/23423637/apple-app-store-tax-boosted-social-media-posts">conflict with Apple</a> over its app store charging policy, and <a href="https://www.washingtonpost.com/technology/interactive/2022/tiktok-popularity/">competition for younger audiences</a> from newer platforms such as TikTok. </p>
<p>Meta’s chief executive, Mark Zuckerberg, has also used his majority control to double down on his ambitions for the “metaverse”, a virtual reality project on which the company has already spent more than $36 billion – with <a href="https://www.businessinsider.com/memes-mark-zuckerberg-metaverse-meta-struck-a-nerve-2022-10">questionable results</a> according to initial investor and media reaction. Zuckerberg has promised <a href="https://www.ft.com/content/f24edcb8-74a1-45da-8eb9-108ecc0da9ae">even more</a> investment in the metaverse next year. </p>
<p>It’s <a href="https://www.dailymail.co.uk/news/article-11369273/How-Mark-Zuckerberg-pumped-36-BILLION-failing-Metaverse-lost-30-billion-it.html">tempting</a> to describe this spending spree as <a href="https://www.independent.co.uk/voices/mark-zuckerberg-facebook-meta-metaverse-b2213071.html">a billionaire’s “insane fantasy”</a>, but there is a simpler explanation. Dominant platforms are competing for a limited pool of advertising revenue, and regulation – particularly when it differs between countries or regions – has created space for more competitors. This is good news for new social media companies, but it also means the only way Meta is likely to keep its dominant position is by placing a massive bet on the technology of the future. Zuckerberg believes that means the metaverse, but this <a href="https://www.independent.co.uk/tech/tim-cook-metaverse-apple-meta-b2191705.html">remains to be seen</a>.</p>
<figure class="align-center ">
<img alt="Man wearing VR googles with the Meta logo" src="https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=410&fit=crop&dpr=1 600w, https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=410&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=410&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=516&fit=crop&dpr=1 754w, https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=516&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/493319/original/file-20221103-22-c0br4w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=516&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Meta has invested a lot in its vision for the metaverse, accessed using a VR headset.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/meta-written-on-googles-man-wearing-2071779797">Aleem Zahid Khan / Shutterstock</a></span>
</figcaption>
</figure>
<h2>Tech’s changing fortunes</h2>
<p>Even with its recent troubles, Meta owns <a href="https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/">the largest social network</a> in the world. Those recent results that caused investors to flee in droves <a href="https://www.ft.com/content/9e457f1a-ee9f-419d-b8aa-2b54f7809705">still showed</a> total revenues of $27 billion and profits of $4.4 billion. </p>
<p>To maintain its position as market leader in the past, Meta has typically bought its <a href="https://theconversation.com/tech-firms-face-more-regulation-after-moves-to-stop-killer-acquisitions-but-innovation-could-also-be-under-threat-187278">most promising competitors</a> as early as possible. <a href="https://theconversation.com/facebook-latest-court-case-shows-how-europe-is-clamping-down-on-big-tech-173100">Integrating</a> these newly acquired startups into the company’s ecosystem helped to <a href="https://onlinelibrary.wiley.com/doi/full/10.1111/1756-2171.12298">maximise advertising revenue</a> and preclude competition. </p>
<p>Research shows that digital markets are typically dominated by a single firm, but also that these firms tend to be <a href="https://onlinelibrary.wiley.com/doi/full/10.1002/smj.3365">much more specialised</a> than the major companies of the past. Meta is only active in social media and makes money almost exclusively by selling advertising.</p>
<p>Attempts by such firms to expand into other areas typically fail – know anyone with a Facebook <a href="https://www.cnet.com/tech/mobile/heres-why-the-facebook-phone-flopped/">phone</a>? And while you may not remember Google’s attempt at <a href="https://www.theverge.com/2019/4/2/18290637/google-plus-shutdown-consumer-personal-account-delete">social media</a>, iPhone users are probably at least aware of Apple’s <a href="https://www.cnet.com/tech/tech-industry/apple-ceo-we-are-extremely-sorry-for-maps-flap/">maps</a> app.</p>
<p>So Facebook relies on consumers using devices produced by other tech companies to make money. But as global social media advertising revenue <a href="https://www.ft.com/content/0abf4840-2f5a-4eae-8414-1dfda77750b0">slows down</a>, this is becoming more difficult. Apple has begun <a href="https://www.theverge.com/2022/10/25/23423637/apple-app-store-tax-boosted-social-media-posts">charging Meta</a> for the revenue it makes from iPhone users, for example. And research shows that, when two companies compete to make <a href="https://link.springer.com/article/10.1007/s11151-022-09872-z">money from the same captive source</a>, their successive markups not only push prices higher for consumers but also keep profits lower for both firms.</p>
<h2>Global domination fail</h2>
<p>Meta’s strategy has, until recently, allowed it to rule social media in western markets – but not in China, a country of more than <a href="https://www.mckinsey.com/%7E/media/McKinsey/Business%20Functions/Marketing%20and%20Sales/Our%20Insights/Understanding%20social%20media%20in%20China/Understanding%20social%20media%20in%20China.pdf">300 million social media users</a>. Since 2009, Facebook has been blocked by the country’s “<a href="https://www.theguardian.com/news/2018/jun/29/the-great-firewall-of-china-xi-jinpings-internet-shutdown">great firewall</a>”, the largest and most sophisticated system of censorship in the world. </p>
<p>Reported attempts to <a href="https://www.bbc.co.uk/news/technology-38073949">adapt Facebook</a> to suit Chinese government media control have never been successful. And so, Chinese company ByteDance was able to launch a news platform called <a href="https://d3.harvard.edu/platform-digit/submission/the-story-behind-toutiao-the-20-billion-news-aggregator-app/">Toutiao</a> in 2012 without having to compete with a dominant social network. In 2016, ByteDance launched Douyin, a social media platform for publishing short videos which was <a href="https://www.bbc.co.uk/news/technology-53640724">subsequently released</a> to the rest of the world in 2018 as TikTok. </p>
<p>Despite not being profitable, ByteDance’s market capitalisation is now estimated at around <a href="https://www.reuters.com/technology/bytedance-spend-up-3-bln-repurchase-shares-investors-2022-09-16/">$300 billion</a> – versus Meta’s current $270 billion valuation. It is also popular among <a href="https://www.washingtonpost.com/technology/interactive/2022/tiktok-popularity/">younger users</a>, who tend to use social media much more avidly.</p>
<p>Meta cannot simply buy TikTok: it is too big, <a href="https://www.scmp.com/tech/big-tech/article/3193932/tiktok-owner-bytedance-approves-us3-billion-share-buy-back-its-first">not publicly traded</a> and under <a href="https://www.theguardian.com/world/2021/nov/03/chairman-tiktok-owner-bytedance-steps-down-zhang-yiming-beijing-tightens-grip">tight control</a> by the Chinese government. Zuckerberg’s firm has instead tried to compete by launching <a href="https://techcrunch.com/2022/06/02/chasing-tiktok-meta-rolls-out-new-reels-features-and-expands-instagram-reels-to-90-seconds/">similar features on Instagram</a>. Ironically, the only large market where <a href="https://www.bloomberg.com/news/newsletters/2022-10-04/facebook-s-glitzy-parties-in-tiktok-free-india">this strategy is really working</a> is India, a country that banned TikTok in 2020 <a href="https://www.bbc.co.uk/newsround/53266068">due to a military conflict</a> with China. </p>
<figure class="align-center ">
<img alt="Younger person accessing Tik Tok on phone." src="https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/493312/original/file-20221103-22-zgx3er.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">TikTok tends to attract a younger audience than more established platforms like Facebook.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/2207-barnaul-russia-women-holding-phone-1795552729">diy13 / Shutterstock</a></span>
</figcaption>
</figure>
<h2>Fair competition</h2>
<p>At the same time that TikTok has been expanding beyond Meta’s reach, western regulators have also started to examine the impact of the lack of competition in digital markets on innovation. While research shows that the winner-take-all nature of highly innovative markets is typically <a href="https://academic.oup.com/jeea/article/8/5/1133/2295941">good for consumers</a>, this is only true when all companies get a fair chance <a href="https://www.sciencedirect.com/science/article/abs/pii/S0167718721000023">to become dominant</a>.</p>
<p>In addition to <a href="https://theconversation.com/google-loses-appeal-against-2-4-billion-fine-tech-giants-might-now-have-to-re-think-their-entire-business-models-171628">recent rulings</a> against tech company dominance by its highest court, the European Union also recently introduced the <a href="https://eur-lex.europa.eu/eli/reg/2022/1925/oj">Digital Markets Act</a>. This outlaws many practices used by dominant firms to preserve their status in a market. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/can-the-eus-digital-markets-act-rein-in-big-tech-192373">Can the EU's Digital Markets Act rein in big tech?</a>
</strong>
</em>
</p>
<hr>
<p>Similar legislation is expected <a href="https://www.ft.com/content/8c64e651-c073-449e-bc6b-ca55ed7165ca">from the US</a> after the November <a href="https://time.com/6214028/tech-antitrust-bill-senate-vote/">midterm elections</a>, while the UK has <a href="https://www.theguardian.com/technology/2022/oct/18/facebook-meta-sell-giphy-cma">forced Meta to sell</a> gif library Giphy to ensure it doesn’t <a href="https://www.gov.uk/government/news/cma-orders-meta-to-sell-giphy">decrease competition</a> in the online advertising sector.</p>
<p>All of this means that, for Facebook to remain dominant, Meta needs to invest in its own products. To be the market leader of tomorrow, the company cannot simply count on buying up promising startups.</p>
<p>But its metaverse is a <a href="https://www.ft.com/content/41c59dc9-6185-4142-95b6-8aba8d2d84b3">nebulous project</a> and an odd bet. After all, Google has already failed to drum up interest in <a href="https://www.wired.com/story/google-glass-reasonable-expectation-of-privacy/">Google Glass</a>, even though the <a href="https://www.theguardian.com/commentisfree/2017/jul/23/the-return-of-google-glass-surprising-merit-in-failure-enterprise-edition">technology</a> behind it was successful. What has changed to convince normal people to <a href="https://www.theverge.com/2022/10/6/23391895/meta-facebook-horizon-worlds-vr-social-network-too-buggy-leaked-memo">regularly wear</a> virtual reality headsets? </p>
<p>The only alternative for Meta may be to find a better idea in which to invest. In the meantime, regulation continues to protect potential competitors. This is great news for consumers and creators alike: now might be the best moment to launch an innovative social media format that can actually compete with giants like Meta to become the market leader.</p>
<p class="fine-print"><em><span>Renaud Foucart does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Meta’s focus on virtual reality might free up space for smaller social media players to compete.Renaud Foucart, Senior Lecturer in Economics, Lancaster University Management School, Lancaster UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1926332022-10-23T19:02:03Z2022-10-23T19:02:03ZIs the metaverse really the future of work?<figure><img src="https://images.theconversation.com/files/491001/original/file-20221021-27-fqxhat.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2868%2C1610&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Meta</span></span></figcaption></figure><p>According to Mark Zuckerberg, the “metaverse” – which the Meta founder <a href="https://www.theverge.com/22588022/mark-zuckerberg-facebook-ceo-metaverse-interview">describes</a> as “an embodied internet, where instead of just viewing content – you are in it” – will radically change our lives. </p>
<p>So far, Meta’s main metaverse product is a virtual reality playground called Horizon Worlds. When Zuckerberg announced his company’s metaverse push in October 2021, the prevailing sentiment was that it was something nobody had asked for, <a href="https://www.wired.com/story/metaverse-big-tech-land-grab-hype/">nor particularly wanted</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-the-metaverse-a-high-tech-plan-to-facebookify-the-world-165326">What is the metaverse? A high-tech plan to Facebookify the world</a>
</strong>
</em>
</p>
<hr>
<p>Many of us wondered what people would actually do in this new online realm. Last week, amid announcements of new hardware, software, and business deals, Zuckerberg presented an answer: the thing people will do in the metaverse is <a href="https://www.computerweekly.com/news/252525977/Meta-presents-vision-for-business-metaverse"><em>work</em></a>.</p>
<p>But who is this for? What are the implications of using these new technologies in the workplace? And will it all be as rosy as Meta promises?</p>
<h2>The future of work?</h2>
<p>The centrepiece of last week’s <a href="https://about.fb.com/news/2022/10/meta-quest-pro-social-vr-connect-2022/">Meta Connect</a> event was the announcement of the Quest Pro headset for virtual and augmented reality. Costing US$1,499 (~A$2,400), the device has new features including the ability to track the user’s eyes and face. </p>
<p>The Quest Pro will also use outward-facing cameras to let users see the real world around them (with digital add-ons).</p>
<p>Meta’s presentation showed this function in use for work. It depicted a user sitting among several large virtual screens – what it has previously dubbed “<a href="https://www.youtube.com/watch?v=5_bVkbG1ZCo&ab_channel=MetaQuest">Infinite Office</a>”. As Meta technical chief Andrew Bosworth <a href="https://www.youtube.com/watch?v=hvfV-iGwYX8&ab_channel=Meta">put it</a>, “Eventually, we think the Quest could be the only monitor you’ll need.”</p>
<p>Meta also announced it is working with Microsoft to make virtual versions of business software such as Office and Teams available. These will be incorporated into the <a href="https://www.meta.com/au/work/workrooms/">Horizon Workrooms</a> virtual office platform, which has been widely ridiculed for its low-quality graphics and floating, legless avatars.</p>
<h2>The Microsoft approach</h2>
<p>The partnership may well provide significant benefit for both companies. </p>
<p>Microsoft’s own mixed-reality headset, the HoloLens, has seen limited adoption. Meta dominates the augmented and virtual reality markets, so it makes sense for Microsoft to try to hitch a ride on Meta’s efforts.</p>
<p>For Meta, its project may gain credibility by association with Microsoft’s long history of producing trusted business software. Partnerships with other businesses in the tech sector and beyond are <a href="https://journals.sagepub.com/eprint/2NZHR4AAPZWJUKTIHCW7/full">a major way</a> that Meta seeks to materialise its metaverse ambitions.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A virtual reality office showing avatars sitting around a meeting table." src="https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=425&fit=crop&dpr=1 754w, https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=425&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/491002/original/file-20221021-26-dnas7u.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=425&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Meta Microsoft Teams in VR.</span>
<span class="attribution"><span class="source">Meta</span></span>
</figcaption>
</figure>
<p>Microsoft also represents an alternative approach to making a product successful. While several decades of efforts to sell VR technology to consumers have had limited success, Microsoft became a household name by selling to businesses and other enterprises.</p>
<p>By focusing on an enterprise market, firms can normalise emerging technologies in society. They might not be things that consumers <em>want</em> to use, but rather things that workers are <em>forced</em> to use. </p>
<p>Recent implementations of Microsoft’s Teams software in industry and government across Australia offer <a href="https://www.microsoft.com/en-au/business/industry/government/">models</a> for how the metaverse may arrive in offices.</p>
<h2>Enhanced bossware</h2>
<p>While proponents of work in the metaverse envisage a future in which technologies like AR and VR are frictionlessly incorporated into our work lives, bringing about prosperity and efficiency, there are a number of areas of concern.</p>
<p>For one, technologies like VR and AR threaten to institute new forms of worker surveillance and control. The rise of remote work throughout the COVID-19 pandemic led to a boom in “<a href="https://www.abc.net.au/news/science/2022-05-06/workers-returning-to-offices-covid-surveillance-software/101019128">bossware</a>” – software for employers to monitor every move of their remote workers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/3-ways-bossware-surveillance-technology-is-turning-back-the-management-clock-189070">3 ways 'bossware' surveillance technology is turning back the management clock</a>
</strong>
</em>
</p>
<hr>
<p>Technologies like VR and AR – which rely on the <a href="https://policyreview.info/articles/analysis/critical-questions-facebooks-virtual-reality-data-power-and-metaverse">capture and processing of vast amounts of data</a> about users and their environments to function – could well intensify such a dynamic. </p>
<p>Meta says such data will remain “<a href="https://www.theverge.com/23397187/mark-zuckerberg-quest-pro-metaverse-interview-decoder">on device</a>”. However, <a href="https://arxiv.org/abs/2106.05407">recent research</a> shows third-party Quest apps have been able to access and use more data than they strictly need.</p>
<h2>Privacy and safety</h2>
<p>Developers are learning about, and worrying over, <a href="https://dl.acm.org/doi/10.1145/3546155.3546691">the privacy and safety implications</a> of virtual and augmented reality devices and platforms. </p>
<p>In experimental settings, VR data are already used to track and measure biometric information about users with a high degree of <a href="https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2120845?casa_token=wh1pT6ou3OgAAAAA%3A4x6FA6UVnRAtJBTrE0kkFc2QXfYcA5zaAVdY9aVzPWEfVNGnaa71xks6xwYYRzKN-lsIf_Su504PSQ">accuracy</a>. VR data have also been used to measure things like <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3778085/">attention</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/companies-are-increasingly-tracking-eye-movements-but-is-it-ethical-191842">Companies are increasingly tracking eye movements — but is it ethical?</a>
</strong>
</em>
</p>
<hr>
<p>In a future where work happens in the metaverse, it’s not hard to imagine how things like gaze-tracking data might be used to determine the outcome of your next promotion. Or to imagine work spaces where certain activities are “programmed out”, such as anything deemed “unproductive”, or even things like union organising. </p>
<p>Microsoft’s 365 platform already monitors similar metrics about digital work processes – you can view your own <a href="http://myanalytics.microsoft.com">here</a>, if your organisation subscribes. Microsoft 365’s move into VR spaces will give it plenty of new data to analyse about your work habits.</p>
<p>Moderating content and behaviour in virtual spaces may also be an issue, which could lead to discrimination and inequity. Meta has so far <a href="https://lens.monash.edu/@kate-euphemia-clark/2022/10/13/1385033/sexual-assault-in-the-metaverse-isnt-a-glitch-that-can-be-fixed">given little</a> in the way of concrete protections for its users amid increasing claims of <a href="https://www.technologyreview.com/2021/12/16/1042516/the-metaverse-has-a-groping-problem/">harassment</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/im-a-black-woman-and-the-metaverse-scares-me-heres-how-to-make-the-next-iteration-of-the-internet-inclusive-173310">I'm a Black woman and the metaverse scares me – here’s how to make the next iteration of the internet inclusive</a>
</strong>
</em>
</p>
<hr>
<p>Earlier this year, a report by consumer advocacy group SumOfUs found many users in Horizon Worlds have <a href="https://www.sumofus.org/images/Metaverse_report_May_2022.pdf">been encouraged to turn off safety features</a>, such as “personal safety bubbles”, by other users. </p>
<p>The use of safety features in workplaces may likewise be seen as antisocial, or as not part of “the team”. This could have negative impacts for already marginalised workers.</p>
<p class="fine-print"><em><span>Ben Egliston has received a research award funded by Meta Australia.</span></em></p><p class="fine-print"><em><span>Kate Euphemia Clark and Luke Heemsbergen do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The virtual office promised by the metaverse could bring enhanced surveillance and the drawbacks of online culture to the workplace.Ben Egliston, Postdoctoral Research Fellow, Digital Media Research Centre, Queensland University of TechnologyKate Euphemia Clark, PhD student, Media, Monash UniversityLuke Heemsbergen, Senior Lecturer, Media and Politics, Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1877462022-10-03T12:06:27Z2022-10-03T12:06:27ZWhat is déjà vu? Psychologists are exploring this creepy feeling of having already lived through an experience before<figure><img src="https://images.theconversation.com/files/482853/original/file-20220906-22-6ddelb.jpg?ixlib=rb-1.1.0&rect=732%2C310%2C4526%2C3518&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How can someplace you've never been feel so familiar?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/girl-walking-through-door-royalty-free-image/646061622">mrs/Moment via Getty Images</a></span></figcaption></figure><figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=293&fit=crop&dpr=1 600w, https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=293&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=293&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=368&fit=crop&dpr=1 754w, https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=368&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/281719/original/file-20190628-76743-26slbc.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=368&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
</figcaption>
</figure>
<p><em><a href="https://theconversation.com/us/topics/curious-kids-us-74795">Curious Kids</a> is a series for children of all ages. If you have a question you’d like an expert to answer, send it to <a href="mailto:curiouskidsus@theconversation.com">curiouskidsus@theconversation.com</a>.</em></p>
<hr>
<blockquote>
<p><strong>Why do people experience déjà vu? – Atharva P., age 10, Bengaluru, India</strong></p>
</blockquote>
<hr>
<p>Have you ever had that weird feeling that you’ve <a href="http://onlyhumanaps.blogspot.com/2008/10/">experienced the same exact situation before</a>, even though that’s impossible? Sometimes it can even seem like you’re reliving something that already happened. This phenomenon, <a href="https://doi.org/10.1037/0033-2909.129.3.394">known as déjà vu</a>, has puzzled philosophers, <a href="https://doi.org/10.1001/archneurpsyc.1959.02340150001001">neurologists</a> and <a href="https://www.jstor.org/stable/25118382">writers</a> for a <a href="https://www.google.com/books/edition/The_Cavendish_Lecture/Dg41AQAAMAAJ?hl=en&gbpv=0">very long time</a>.</p>
<p>Starting in the late 1800s, <a href="https://www.routledge.com/The-Deja-Vu-Experience/Cleary-Brown/p/book/9780367273200">many theories began to emerge</a> regarding what might cause déjà vu, which means “already seen” in French. People thought maybe it stemmed from mental dysfunction or perhaps a type of brain problem. Or maybe it was a temporary hiccup in the otherwise normal operation of human memory. But the topic did not reach the realm of science until quite recently.</p>
<h2>Moving from the paranormal to the scientific</h2>
<p>Early in this millennium, a scientist named Alan Brown decided to conduct a <a href="https://doi.org/10.1037/0033-2909.129.3.394">review of everything researchers had written about déjà vu</a> until that point. Much of what he could find had a paranormal flavor, having to do with the supernatural – things like past lives or psychic abilities. But he also found studies that surveyed regular people about their déjà vu experiences. From all these papers, Brown was able to glean some basic findings on the déjà vu phenomenon.</p>
<p>For example, Brown determined that roughly two-thirds of people experience déjà vu at some point in their lives. He found that the most common trigger of déjà vu is a scene or place, and the next most common trigger is a conversation. He also reported hints, scattered across a century or so of medical literature, of a possible association between déjà vu and some types of seizure activity in the brain.</p>
<p>Brown’s review brought the topic of déjà vu into the realm of more mainstream science, because it appeared in both a scientific journal that scientists who study cognition tend to read, and also <a href="https://www.routledge.com/The-Deja-Vu-Experience/Brown/p/book/9781138006010">in a book</a> aimed at scientists. His work served as a catalyst for scientists to design experiments to investigate déjà vu.</p>
<h2>Testing déjà vu in the psychology lab</h2>
<p>Prompted by Brown’s work, my own research team began conducting experiments aimed at testing hypotheses about possible mechanisms of déjà vu. We <a href="https://www.routledge.com/The-Deja-Vu-Experience/Cleary-Brown/p/book/9780367273200">investigated a near century-old hypothesis</a> that suggested déjà vu can happen when there’s a spatial resemblance between a current scene and an unrecalled scene in your memory. Psychologists called this the Gestalt familiarity hypothesis.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="brightly lit area of a hospital with workers and patients" src="https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/482854/original/file-20220906-25-tky7ns.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Maybe the layout of a new place is very similar to somewhere else you’ve been that you aren’t consciously remembering.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/nurses-station-in-hospital-royalty-free-image/906005394">FS Productions/Tetra images via Getty Images</a></span>
</figcaption>
</figure>
<p>For example, imagine you’re passing the nursing station in a hospital unit on your way to visit a sick friend. Although you’ve never been to this hospital before, you are struck with a feeling that you have. The underlying cause for this experience of déjà vu could be that the layout of the scene – the placement of the furniture and the particular objects within the space – matches the layout of a different scene that you did experience in the past.</p>
<p>Maybe the way the nursing station is situated – the furniture, the items on the counter, the way it connects to the corners of the hallway – is the same as how a set of welcome tables was arranged relative to signs and furniture in a hallway at the entrance to a school event you attended a year earlier. According to the Gestalt familiarity hypothesis, if that previous situation with a similar layout to the current one doesn’t come to mind, you might be left only with a strong feeling of familiarity for the current one.</p>
<p>To investigate this idea in the laboratory, my team used virtual reality to place people within scenes. That way we could manipulate the environments people found themselves in – some scenes shared the same spatial layout while otherwise being distinct. As predicted, <a href="https://doi.org/10.1016/j.concog.2011.12.010">déjà vu was more likely to happen</a> when people were in a scene that contained the same spatial arrangement of elements as an earlier scene they viewed but didn’t recall.</p>
<p>This research suggests that one contributing factor to déjà vu can be spatial resemblance of a new scene to one in memory that fails to be consciously called to mind at the moment. However, it does not mean that spatial resemblance is the only cause of déjà vu. Very likely, many factors can contribute to what makes a scene or a situation feel familiar. More research is underway to investigate additional possible factors at play in this mysterious phenomenon.</p>
<hr>
<p><em>Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to <a href="mailto:curiouskidsus@theconversation.com">CuriousKidsUS@theconversation.com</a>. Please tell us your name, age and the city where you live.</em></p>
<p><em>And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.</em></p>
<p class="fine-print"><em><span>Anne Cleary is a member of the American Psychological Association Council of Representatives.</span></em></p>
<p><em>While people have wondered about déjà vu for a long time, only recently have scientists started experimentally investigating what might trigger it.</em></p>
<p><em>Anne Cleary, Professor of Cognitive Psychology, Colorado State University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<hr>
<h1>5 challenges of doing college in the metaverse</h1>
<p><em>Published 2022-09-13.</em></p>
<figure><img src="https://images.theconversation.com/files/483585/original/file-20220908-20-lxhmps.jpg?ixlib=rb-1.1.0&rect=0%2C69%2C4641%2C3487&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A student wears virtual reality goggles and headphones as part of a digital learning experience.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/connor-powelson-graduate-assistant-and-phd-candidate-news-photo/1240927332">Bill O'Leary/The Washington Post via Getty Images</a></span></figcaption></figure>
<p>More and more colleges are becoming “<a href="https://theconversation.com/six-benefits-that-the-metaverse-offers-to-colleges-and-universities-188950">metaversities</a>,” taking their physical campuses into a virtual online world, often called the “metaverse.” One initiative has <a href="https://www.victoryxr.com/our-partners/meta/">10 U.S. universities and colleges</a> working with Meta, the parent company of Facebook, and virtual reality company VictoryXR to create 3D online replicas – sometimes called “<a href="https://steve-grubbs.medium.com/the-advantages-of-a-digital-twin-virtual-reality-campus-563b77c951cc">digital twins</a>” – of their campuses that are updated live as people and items move through the real-world spaces.</p>
<p>Some classes are <a href="https://theconversation.com/six-benefits-that-the-metaverse-offers-to-colleges-and-universities-188950">already happening in the metaverse</a>. And VictoryXR says that by 2023, it plans to <a href="https://www.forbes.com/sites/emmawhitford/2022/09/03/metaversity-is-in-session-as-meta-and-iowas-victoryxr-open-10-virtual-campuses/?sh=606238016f25">build and operate 100 digital twin campuses</a>, which allow for a group setting with live instructors and real-time class interactions. </p>
<p>One metaversity builder, New Mexico State University, says it wants to offer degrees in which students can take all their classes in virtual reality, <a href="https://www.protocol.com/enterprise/metaverse-in-education-morehouse-meta">beginning in 2027</a>.</p>
<p>There are many <a href="https://theconversation.com/six-benefits-that-the-metaverse-offers-to-colleges-and-universities-188950">benefits to taking college classes in the metaverse</a>, such as 3D visual learning, more realistic interactivity and easier access for faraway students. But there are also potential problems. My recent <a href="https://scholar.google.com/citations?user=g-jALEoAAAAJ&hl=en&oi=ao">research</a> has focused on <a href="https://doi.org/10.1109/MITP.2022.3178509">ethical, social and practical</a> aspects of the metaverse and risks such as <a href="https://doi.org/10.1016/j.ijinfomgt.2022.102542">privacy violations and security breaches</a>. I see five challenges:</p>
<h2>1. Significant costs and time</h2>
<p>The metaverse <a href="https://theconversation.com/six-benefits-that-the-metaverse-offers-to-colleges-and-universities-188950">provides a low-cost learning alternative in some settings</a>. For instance, building a cadaver laboratory costs <a href="https://skarredghost.com/2021/08/04/victoryxr-fisk-vr-cadaver-lab/">several million dollars and requires a lot of space</a> and maintenance. A virtual cadaver lab has made scientific <a href="https://www.fisk.edu/featured/fisk-university-htc-vive-t-mobile-and-victoryxr-launch-5g-powered-vr-human-cadaver-lab/">learning affordable at Fisk University</a>.</p>
<p>However, licenses for virtual reality content, construction of digital twin campuses, virtual reality headsets and other investment expenses do <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">add costs for universities</a>.</p>
<p>A metaverse course license can cost universities <a href="https://www.insidehighered.com/news/2022/08/03/college-metaverse-here-higher-ed-ready">at least $20,000, and could go as high as $100,000 for a digital twin campus</a>. VictoryXR also charges a <a href="https://www.forbes.com/sites/emmawhitford/2022/09/03/metaversity-is-in-session-as-meta-and-iowas-victoryxr-open-10-virtual-campuses/?sh=3dbfa7cf6f25">yearly subscription fee of $200</a> per student to access its metaverse.</p>
<p>Additional costs are incurred for virtual reality headsets. While Meta is providing a <a href="https://www.insightintodiversity.com/metaversities-offer-new-possibilities-for-education-but-some-experts-urge-campuses-to-be-mindful-of-potential-risks/">limited number of its virtual reality headsets – the Meta Quest 2 – for free</a> for metaversities launched by Meta and VictoryXR, that covers only a fraction of the headsets that may be needed. The low-end 128GB version of the Meta Quest 2 <a href="https://store.facebook.com/quest/products/quest-2/">headset costs $399.99</a>. Managing and maintaining a large number of headsets, <a href="https://www.forbes.com/sites/forbestechcouncil/2022/08/24/the-accessibility-and-affordability-of-the-metaverse-in-education-right-now/?sh=1ea2ba5d7f8f">including keeping them fully charged</a>, involves additional operational costs and time. </p>
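<p>As a rough illustration of how quickly these figures add up, the sketch below combines the reported prices into a back-of-envelope first-year estimate. The license tier, class size and headset count are made-up assumptions for illustration, not reported data.</p>

```python
# Back-of-envelope first-year cost estimate for a hypothetical metaversity
# course. Prices come from the figures reported above; the class size,
# license tier and headset count are illustrative assumptions only.

COURSE_LICENSE = 20_000          # low end of the reported license range ($)
SUBSCRIPTION_PER_STUDENT = 200   # VictoryXR's reported yearly fee ($/student)
HEADSET_PRICE = 399.99           # 128GB Meta Quest 2 list price ($)

def first_year_cost(students: int, headsets: int) -> float:
    """Total first-year cost: course license + subscriptions + headsets."""
    return (COURSE_LICENSE
            + SUBSCRIPTION_PER_STUDENT * students
            + HEADSET_PRICE * headsets)

if __name__ == "__main__":
    # A hypothetical 50-student course with one headset per student:
    # $20,000 + 50 x $200 + 50 x $399.99, before training, content
    # creation and device-management costs.
    print(f"${first_year_cost(50, 50):,.2f}")
```

<p>Even under these conservative assumptions the first year approaches $50,000 for a single course, before any of the training, content-creation or device-management costs discussed below.</p>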
<p>Colleges also need to spend significant time and resources to <a href="https://www.insidehighered.com/news/2022/08/03/college-metaverse-here-higher-ed-ready">provide training to faculty to deliver metaverse courses</a>. Even more time will be required to deliver metaverse courses, many of which will need <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">all-new digital materials</a>.</p>
<p>Most educators don’t have the <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">capability to create their own metaverse teaching materials</a>, which can involve merging videos, still images and audio with text and interactivity elements into an <a href="https://roundtablelearning.com/how-to-create-original-vr-content-everything-you-need-to-know/">immersive online experience</a>.</p>
<h2>2. Data privacy, security and safety concerns</h2>
<p>Business models of companies developing metaverse technologies <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">rely on collecting users’ detailed personal data</a>. For instance, people who want to use Meta’s Oculus Quest 2 virtual reality headsets must have Facebook accounts.</p>
<p>The headsets can collect highly personal and sensitive data <a href="https://store.facebook.com/legal/quest/privacy-policy/">such as location, students’ physical features and movements, and voice recordings</a>. Meta has <a href="https://www.washingtonpost.com/technology/2022/01/13/privacy-vr-metaverse/">not promised to keep that data private or to limit access</a> that advertisers might have to it.</p>
<p>Meta is also working on a high-end virtual reality headset called <a href="https://techcrunch.com/2021/10/28/project-cambria-is-a-high-end-vr-headset-designed-for-facebooks-metaverse/">Project Cambria</a>, with more advanced capabilities. Sensors in the device will allow a virtual avatar to maintain eye contact and make facial expressions that mirror the user’s eye movements and face. That information <a href="https://www.washingtonpost.com/technology/2022/01/13/privacy-vr-metaverse/">can help advertisers measure users’ attention</a> and target them with personalized advertising.</p>
<p>Professors and students may not freely participate in class discussions if they know that all their moves, their speech and even their facial expressions are <a href="https://www.diverseeducation.com/institutions/article/15293003/what-could-the-metaverse-mean-for-higher-education">being watched by the university as well as a big technology company</a>.</p>
<p>The virtual environment and its equipment can also collect a wide range of user data, such as <a href="https://www.newsweek.com/metaverse-huge-opportunity-education-big-tech-must-not-ruin-it-opinion-1693962">physical movement, heart rate</a>, <a href="https://www.law.com/legaltechnews/2022/03/29/cybersecurity-privacy-and-constitutional-concerns-risks-to-know-before-entering-the-metaverse/?slreturn=20220714213359">pupil size, eye openness</a> and even signals of emotions. </p>
<p>Cyberattacks in the metaverse could even cause physical harm. Metaverse interfaces <a href="https://securityintelligence.com/articles/metaverse-security-challenges/">provide input directly into users’ senses</a>, so they effectively trick the user’s brain into believing the user is in a different environment. <a href="https://it-online.co.za/2022/02/08/meta-safety-meta-security-metaverse/">People who attack virtual reality systems</a> can influence the activities of immersed users, even inducing them to <a href="https://doi.org/10.1109/TDSC.2019.2907942">physically move into dangerous locations</a>, such as to the top of a staircase.</p>
<p>The metaverse can also <a href="https://www.emergingedtech.com/2022/04/where-is-edtech-heading-rise-of-metaverse-quick-guide/">expose students to inappropriate content</a>. For instance, Roblox has launched <a href="https://education.roblox.com/">Roblox Education</a> to bring 3D, interactive, virtual environments into physical and online classrooms. Roblox says it has <a href="https://www.connectsafely.org/roblox">strong protections to keep everyone safe</a>, but no protections are perfect, and its metaverse involves user-generated content and a chat feature, which could be <a href="https://www.familyzone.com/anz/families/blog/roblox-parents-review">infiltrated by predators</a> or people <a href="https://www.rollingstone.com/culture/culture-features/roblox-virtual-strip-clubs-condo-games-sex-1197237/">posting pornography</a> or other <a href="https://www.bark.us/blog/is-roblox-safe-for-kids/">illegal material</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/gOLI_OIV3nc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A Stanford University class took students on an exploration of world-merging virtual and physical elements.</span></figcaption>
</figure>
<h2>3. Lack of rural access to advanced infrastructure</h2>
<p>Many metaverse applications such as <a href="https://www.vodafone.com.au/red-wire/what-5g-networks-mean-for-the-future-of-vr-ar-technology">3D videos are bandwidth-intensive</a>. They require high-speed data networks to handle all of the <a href="https://www.fisk.edu/featured/fisk-university-htc-vive-t-mobile-and-victoryxr-launch-5g-powered-vr-human-cadaver-lab/">information flowing between sensors and users</a> across the virtual and physical space. </p>
<p>Many users, especially in rural areas, <a href="https://www.vodafone.com.au/red-wire/what-5g-networks-mean-for-the-future-of-vr-ar-technology">lack the infrastructure to support the streaming of high-quality metaverse content</a>. For instance, 97% of the population living in urban areas in the U.S. has <a href="https://www.weforum.org/agenda/2021/01/covid-digital-divide-learning-education/">access to a high-speed connection, compared to 65% in rural areas and 60%</a> in tribal lands.</p>
<h2>4. Adapting to a new environment</h2>
<p>Building and launching a metaversity requires drastic changes in a school’s approach to <a href="https://www.incitevr.com/about/company">teaching</a> and learning.
For instance, metaverse <a href="https://www.uoc.edu/portal/en/news/actualitat/2022/143-education-metavers.html">students aren’t just recipients of content</a> but active participants in virtual reality games and other activities.</p>
<p>The combination of advanced technologies such as <a href="https://www.incitevr.com/about/company">immersive game-based learning and virtual reality with artificial intelligence</a> can create personalized learning experiences that are not in real time but still experienced through the metaverse. Automatic systems that tailor the content and pace of learning to the ability and interest of the student can make learning in the metaverse <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">less structured</a>, with fewer set rules.</p>
<p>Those differences require significant <a href="https://www.uoc.edu/portal/en/news/actualitat/2022/143-education-metavers.html">modifications in assessment and monitoring processes</a>, such as quizzes and tests. Traditional measures such as <a href="https://scholar.harvard.edu/files/mcgivney/files/introductionlearningmetaverse-april2022-meridiantreehouse.pdf">multiple choice questions are inappropriate to assess</a> individualized and unstructured learning experiences offered by the metaverse.</p>
<h2>5. Amplifying biases</h2>
<p>Gender, racial and ideological biases are common in textbooks of <a href="https://verdemagazine.com/checking-the-source-scrutinizing-the-biases-in-our-curriculum">history, science</a> and <a href="https://doi.org/10.1080/15290824.2018.1532570">other subjects</a>, which influence how students understand certain events and topics. In some cases, those biases prevent the achievement of justice and other goals, such as <a href="https://files.adulteducation.at/voev_content/340-gender_books.pdf">gender equality</a>.</p>
<p>Biases’ effects can be even more powerful in rich media environments. <a href="https://dergipark.org.tr/en/pub/jsser/issue/19098/202639">Films</a> are <a href="https://www.insidehighered.com/news/2022/08/03/college-metaverse-here-higher-ed-ready">more powerful</a> at <a href="https://doi.org/10.1080/00377996.2011.616239">molding students’</a> views than textbooks. <a href="https://aber.apacsci.com/index.php/met/article/view/1804/2138">Metaverse content</a> has the potential to be <a href="https://www.insidehighered.com/news/2022/08/03/college-metaverse-here-higher-ed-ready">even more influential</a>. </p>
<p>To maximize the benefits of the metaverse for teaching and learning, universities – and their students – will have to wrestle with protecting users’ privacy, training teachers and the level of national investment in broadband networks.</p>
<p class="fine-print"><em><span>Nir Kshetri does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>There are benefits to taking college classes in the metaverse, but there are also potential problems.</em></p>
<p><em>Nir Kshetri, Professor of Management, University of North Carolina – Greensboro. Licensed as Creative Commons – attribution, no derivatives.</em></p>