<h1>Haptics – The Conversation</h1>
<h1>Consumers value a product viewed online more if they see it being virtually touched</h1><figure><img src="https://images.theconversation.com/files/435086/original/file-20211201-19-1abb59k.jpg?ixlib=rb-1.1.0&rect=94%2C120%2C5651%2C3566&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Apple reportedly has policies designed to encourage consumers to touch its products.</span> <span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/Apple/31da0252188f493d971a98d86de33f0c/photo?Query=apple%20store%20&mediaType=photo&sortBy=arrivaldatetime:desc&dateRange=Anytime&totalCount=3720&currentItemNo=4">AP Photo/Marcio Jose Sanchez</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>Consumers who see a product on sale being virtually touched are more engaged and willing to pay more than if the item is displayed on its own, according to a <a href="http://www.doi.org/10.1177/00222437211059540">2022 research paper I co-authored</a>. </p>
<p><a href="https://thedecisionlab.com/reference-guide/economics/the-endowment-effect/">Behavioral economists have previously shown</a> that people value objects more highly if they own them, a concept known as “the endowment effect.” Marketers have found that <a href="https://doi.org/10.1086/598614">this feeling of ownership can occur</a> even when a consumer merely touches something in a store.</p>
<p>With <a href="https://www.census.gov/retail/mrts/www/data/pdf/ec_current.pdf">Americans buying a record amount of stuff online</a>, I wondered whether virtual touch also influences how consumers perceive and value products. To find out, I teamed up with marketing researchers <a href="https://scholar.google.com/citations?user=44V54PcAAAAJ&hl=en&oi=ao">Joann Peck</a>, <a href="https://scholar.google.com/citations?user=CSGi0lQAAAAJ&hl=en&oi=ao">William Hedgcock</a> and <a href="https://www.linkedin.com/in/yixiangxu/">Yixiang Xu</a> and performed a series of studies. </p>
<p>In one, we examined 4,535 Instagram posts from four companies whose tangible products could be displayed in one’s hands. Some posts showed hands in contact with the product – for example, a hand grasping a Starbucks Pumpkin Spice Latte against a backdrop of autumn leaves, or hands unboxing the latest Samsung smartphone – while others showed the product without any touching. </p>
<p>Of the posts that contained a product, 43% portrayed hands in physical contact with it. These garnered significantly more engagement – receiving on average 65% more “likes” – than those that didn’t.</p>
<figure class="align-center ">
<img alt="Two images side by side show packages of yarn; the one on the left shows a hand touching it." src="https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=370&fit=crop&dpr=1 600w, https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=370&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=370&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=465&fit=crop&dpr=1 754w, https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=465&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/435392/original/file-20211202-15-q4rbd5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=465&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Which image makes you want to buy the yarn?</span>
<span class="attribution"><span class="source">We Are Knitters</span></span>
</figcaption>
</figure>
<p>To test this in an immersive environment, we recruited 144 students to a behavioral lab and asked them to wear a virtual reality headset that depicted them inside a sportswear store. Students could look 360 degrees around the virtual store, which mirrored a brick-and-mortar retail space with mannequins in the window and floor-to-ceiling clothing displays. </p>
<p>After about a minute, the headset simulated moving toward a red T-shirt hanging on a rack. One-third of the students then viewed their virtual hand reach out to touch the shirt, a second third saw a cursor appear over the product – and no hand – while the rest witnessed the hand grasp a pole on a nearby shelf. </p>
<p>Afterward, students completed a survey asking them to state how much they would pay for the T-shirt, up to $30. Those who saw their hand touching the shirt were willing to pay an average of 33% more than those who did not.</p>
<p>We tested across six additional studies using a variety of stimuli, including GIFs and videos. We varied the type of product being touched, the apparent gender and realness of the hands and their movement. We found consistent results showing an increased willingness to pay for the product when people “touched” it – even when we gave them a cartoonlike blue hand.</p>
<figure class="align-center ">
<img alt="A virtual scene shows a clothing store with clothes on racks, mannequins and shelves" src="https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/435393/original/file-20211202-13-64rlv9.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Students in the study were immersed in a VR sports store, which simulated reaching for the red T-shirt.</span>
<span class="attribution"><a class="source" href="http://www.doi.org/10.1177/00222437211059540">Luangrath, Peck, Hedgcock and Xu (2021)</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<h2>Why it matters</h2>
<p>Touch is a powerful tool for forming connections with products. </p>
<p><a href="https://doi.org/10.1086/598614">Companies have known this for years</a> and try to encourage consumers to touch products in their stores. Apple <a href="https://www.forbes.com/sites/carminegallo/2012/06/14/why-the-new-macbook-pro-is-tilted-70-degrees-in-an-apple-store/?sh=2f6e42625a98">reportedly tilts laptop screens</a> in its stores to a specific angle to force consumers to touch them to have a comfortable viewing angle.</p>
<p>As more sales occur online, companies are trying to replicate the in-store sensory experience, such as by making return policies lenient so people know they can still try before they buy. <a href="https://www.ama.org/2017/09/20/lenient-return-policies-can-increase-sales/">Studies have found this strategy can increase sales</a>.</p>
<p>Sellers are also <a href="https://www.immersion.com">experimenting with other ways</a> to mimic the sense of touch to get consumers to form these critical connections with their products. For example, <a href="https://www.marketingdive.com/ex/mobilemarketer/cms/news/software-technology/19457.html">companies are testing ways to use haptic</a> or touch technologies to allow consumers to get sensory feedback on their mobile phones when they watch ads. </p>
<p>Our research suggests that observing a product being touched establishes a connection to the hand on screen doing the touching. This may create the sensation that the virtual hand is one’s own, which increases the feeling of psychological ownership over the product.</p>
<h2>What still isn’t known</h2>
<p>We’ve studied how people perceive products that are being touched virtually, but we don’t know how this affects other consumer behaviors, such as returning a product. Seeing someone else touch a product may backfire by creating high expectations for how it feels – expectations that fall short when consumers actually hold the product in their hands.</p>
<p class="fine-print"><em><span>Andrea Luangrath does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>New research shows people experience the ‘endowment effect’ of valuing an object more when they can touch it, even in virtual settings.Andrea Luangrath, Assistant Professor of Marketing, University of IowaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1301052020-01-31T15:06:12Z2020-01-31T15:06:12ZVibration on the skin helps hearing-impaired people locate sounds<figure><img src="https://images.theconversation.com/files/313068/original/file-20200131-41485-sstbce.jpg?ixlib=rb-1.1.0&rect=12%2C228%2C4115%2C2500&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/new-york-city-nov-09-2011-568168327">Bumble Dee/Shutterstock</a></span></figcaption></figure><p>When we hear a car hurtling towards us, we usually immediately know where it is coming from so we can get out of the way. Our brains have an amazing ability to rapidly separate the sound of the car from background sounds and track its location in the world – an ability called spatial hearing. </p>
<p>To do this, our brains exploit the differences in the sounds arriving at our two ears. For example, if a sound comes from our left side, it is louder in our left ear and, because sound takes time to travel through the air, it arrives at our left ear first. Using these cues, our brain builds a continuously evolving picture of where sound-emitting things in the world are, which it uses to locate sound sources and avoid threats, such as that out-of-control car.</p>
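<p>A minimal Python sketch (with assumed values from a textbook spherical-head model, not any particular hearing device’s processing) shows the arithmetic behind these cues: the classic Woodworth formula predicts how much earlier a sound arrives at the nearer ear, and cross-correlating the two ear signals recovers that delay.</p>
<pre><code>import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
HEAD_RADIUS = 0.0875    # m; an average value used in spherical-head models

def itd_woodworth(azimuth_deg):
    """Predicted interaural time difference (seconds) for a source at the
    given azimuth, using the classic Woodworth spherical-head formula:
    ITD = (r / c) * (sin(theta) + theta)."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (np.sin(theta) + theta)

def estimate_itd(left, right, fs):
    """Estimate the ITD from two ear signals as the lag that maximises
    their cross-correlation; positive lag means the left signal lags."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    return lag / fs

# A source 45 degrees to one side arrives roughly 0.4 ms earlier at the
# nearer ear -- one of the cues the brain exploits for spatial hearing.
print(f"Predicted ITD at 45 degrees: {itd_woodworth(45) * 1e3:.2f} ms")
</code></pre>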
<p>Unfortunately, for many people with hearing impairments, spatial hearing is often severely limited. This is particularly true for people who use a cochlear implant, who often find locating and separating different sounds <a href="https://www.ncbi.nlm.nih.gov/pubmed/16151344">very difficult</a>. </p>
<p>We are trying a different approach. We take the information usually given by the difference in sound between the ears and present it through the sense of touch instead. The idea is that by providing the missing sound information through vibrations on the skin, the brain will be able to merge the two senses to improve perception.</p>
<p>Cochlear implants work by bypassing the damaged parts of the outer and middle ear and directly stimulating the auditory nerve. For thousands of people with severe hearing impairments, this technology has had an incredible impact on their lives, restoring some of their hearing and allowing them to follow conversations in quiet places similarly to people with normal hearing. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/313069/original/file-20200131-41527-16riux0.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Cochlear implants are usually just put in one of the user’s ears.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Cochlear_implant#/media/File:Blausen_0244_CochlearImplant_01.png">BruceBlaus/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Unfortunately, most users only have an implant in one ear, usually because of the additional expense and risk of a second implantation surgery. Having only one implant means that those tiny differences in loudness and timing of sounds between the ears cannot be used for spatial hearing. </p>
<p>Although one implant works very well in a quiet place, the lack of spatial hearing can make it difficult to cope in busy sound environments. Imagine trying to listen to your friend talk to you in a crowded restaurant – with clattering crockery and loud conversations – when you can’t tell which direction the different sounds are coming from.</p>
<h2>Once more, with feeling</h2>
<p>Researchers have previously tried several approaches to improve spatial hearing in cochlear implant users by improving implant technology, but with limited success. In a new study, <a href="https://www.nature.com/articles/s41598-020-58503-8">published in Scientific Reports</a>, we tested whether providing spatial hearing cues to the wrists as vibrations would help cochlear implant users locate sounds. </p>
<p>To conduct our experiment, we set up a ring of loudspeakers. The participants were asked to identify which loudspeaker played the sound of a voice saying: “Where am I speaking from?” As expected, many implant users struggled when they only had the audio. But we found that when they had haptic stimuli (vibrations) alongside the audio, they could identify the location of the sound far more accurately. </p>
<p>After around 15 minutes of training with audio and haptic cues together, participants were able to effectively combine the two signals. We found that they were performing better with combined audio and haptic stimulation than with either sense by itself. This suggests that their brains were able to rapidly merge the information arriving through the two senses to improve spatial hearing. </p>
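<p>One standard way to describe this merging – a textbook model from the multisensory-integration literature, not the specific analysis in our paper – is maximum-likelihood cue combination, in which each sense’s estimate is weighted by its reliability. A short Python sketch:</p>
<pre><code>def fuse_estimates(x_audio, var_audio, x_haptic, var_haptic):
    """Maximum-likelihood cue combination: weight each sense's location
    estimate by its reliability (the inverse of its variance). The fused
    estimate is always at least as reliable as the better single cue."""
    w_audio = (1 / var_audio) / (1 / var_audio + 1 / var_haptic)
    fused = w_audio * x_audio + (1 - w_audio) * x_haptic
    fused_var = 1 / (1 / var_audio + 1 / var_haptic)
    return fused, fused_var

# Hypothetical numbers: audio alone locates a talker with variance 900
# (deg^2), touch with variance 400; fusing gives variance ~277.
print(fuse_estimates(10.0, 900.0, 4.0, 400.0))
</code></pre>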
<p>These results highlight the huge potential of <a href="https://www.nature.com/articles/s41598-019-47718-z">using haptics to aid hearing</a>. In <a href="https://journals.sagepub.com/doi/full/10.1177/2331216518797838">other work</a>, we have already shown that haptics can improve speech perception <a href="https://theconversation.com/playing-sound-through-the-skin-improves-hearing-in-noisy-places-97033">in noisy environments</a> in cochlear implant users.</p>
<p>In future work, we want to use haptics to help hearing-impaired listeners identify multiple sounds coming from different locations. For example, you might want to hear your friend’s voice to one side, <a href="https://theconversation.com/heres-what-music-sounds-like-through-an-auditory-implant-112457">music</a> from the radio from another side, and the patter of rain against the windowpane. This could help us find new ways to help hearing-impaired people build a richer, more accurate picture of their world through hearing.</p>
<p class="fine-print"><em><span>Sean Mills has received PhD funding from the EPSRC.</span></em></p><p class="fine-print"><em><span>Mark Fletcher works for the University of Southampton (UK) and his Salary is funded by the Oticon Foundation.</span></em></p>Hearing-impaired listeners often struggle to locate the source of sounds - haptic technology could change that.Sean R Mills, Post-Graduate Researcher in Tactile Neuroscience, University of SouthamptonMark Fletcher, Research Fellow in Auditory Neuroscience, University of SouthamptonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1241452020-01-09T13:31:44Z2020-01-09T13:31:44ZMonkeys smashing nuts with stones hint at how human tool use evolved<figure><img src="https://images.theconversation.com/files/308857/original/file-20200107-123403-11eithu.jpg?ixlib=rb-1.1.0&rect=347%2C155%2C3245%2C2502&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A capuchin monkey in Brazil hoists a stone tool to crack open nuts.</span> <span class="attribution"><span class="source">Luca Antonio Marino</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>Human beings <a href="https://www.janegoodall.org.uk/chimpanzees/chimpanzee-central/15-chimpanzees/chimpanzee-central/19-toolmaking">used to be defined</a> as “the tool-maker” species. But the uniqueness of this description was challenged in the 1960s when Dr. Jane Goodall discovered that chimpanzees will pick and modify grass stems to use to collect termites. Her observations called into question <em>homo sapiens</em>‘ very place in the world.</p>
<p>Since then, scientists’ knowledge of animal tool use has expanded exponentially. We now know that <a href="https://doi.org/10.1002/ajp.20342">monkeys</a>, <a href="https://doi.org/10.1016/j.cub.2007.07.057">crows</a>, <a href="https://doi.org/10.1098/rsbl.2015.0861">parrots</a>, <a href="https://doi.org/10.1016/j.mambio.2019.08.003">pigs</a> and many other animals can use tools, and research on animal tool use has changed our understanding of how animals think and learn.</p>
<p>Studying animal <a href="https://doi.org/10.1016/bs.asb.2018.01.001">tooling</a> – defined as the process of using an object to achieve a mechanical outcome on a target – can also provide clues to the mysteries of human evolution.</p>
<p>Our human ancestors’ shift to making and using tools is linked to evolutionary changes in <a href="https://doi.org/10.1098/rstb.2012.0414">hand anatomy</a>, a <a href="https://doi.org/10.1007/s12052-010-0257-6">transition to walking on two rather than four feet</a> and <a href="https://doi.org/10.1016/B978-0-12-804042-3.00085-3">increased brain size</a>. But using found stones as pounding tools doesn’t require any of these advanced evolutionary traits; it likely came about before humans began to manufacture tools. By studying this percussive tool use in monkeys, researchers like my colleagues and I can infer how early human ancestors practiced the same skills before modern hands, posture and brains evolved.</p>
<h2>Monkeys using tools</h2>
<p>Understanding wild animals’ memory, thinking and problem-solving abilities is no easy task. In experimental research where animals are asked to perform a behavior or <a href="https://doi.org/10.1016/j.anbehav.2012.11.003">solve a problem</a>, there should be no distractions – like a predator popping up. But wild animals come and go as they please, over large spaces, and researchers cannot control what is happening around them.</p>
<p>However, some field sites provide a unique opportunity to test wild animals’ cognition. Fazenda Boa Vista in Piauí, Brazil, is one of those sites. Here, wild bearded capuchin monkeys (<em>Sapajus libidinosus</em>) naturally use stones and anvils to crack open nuts.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/308434/original/file-20200103-11924-5thx25.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Young capuchin monkey observes adult male monkey eating nuts cracked open using a stone tool.</span>
<span class="attribution"><span class="source">Luca Antonio Marino</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Along with fruit, insects, fungi, and tubers, the Fazenda Boa Vista capuchin monkeys <a href="https://doi.org/10.1016/j.anbehav.2012.03.002">opportunistically crack open nuts</a> as an additional food source. Although these monkeys only spend about <a href="https://doi.org/10.1016/j.anbehav.2012.03.002">2% of their time using tools to access foods</a>, the nuts they eat are an important secondary food item that is available year-round. The challenge is that these nuts have tough shells that <a href="https://doi.org/10.1002/ajp.20578">can’t be cracked open without a tool</a>. This population of monkeys has figured out how to crack nuts by placing them on a wood or stone anvil and then smashing them with rocks that <a href="https://doi.org/10.1016/j.jhevol.2011.02.010">weigh around 25-50% of their body weight</a>.</p>
<p>These bearded capuchin monkeys were the first South American primates that scientists ever observed using tools – only spotted in 2003. Since this discovery, researchers have been studying the <a href="https://doi.org/10.1016/j.anbehav.2010.04.018">decision-making</a> and <a href="https://doi.org/10.1371/journal.pone.0056182">strategies</a> involved in capuchins’ stone tool use.</p>
<p>Because using stones to pound open food looks remarkably like what anthropologists imagine one of the <a href="https://doi.org/10.1038/nature14464">earliest forms of human tool use</a> looked like, researchers study these monkeys as a way to understand our own evolutionary past. </p>
<h2>What happens with a new tool?</h2>
<p>My colleagues and I carried out an <a href="https://doi.org/10.1002/ajp.22958">experimental field study</a> that focused on understanding how these monkeys prepare to use their tools. Just as a person might move her hands around a box to decide how best to lift it, monkeys at this site feel their way through tool use.</p>
<p>First, we placed unfamiliar stones and palm nuts around naturally-occurring wood or stone anvils. Since the monkeys frequently use stones to crack open these tough nuts on the anvils, it was only a matter of time before they tried out the experimental stones.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/tCzLWALwy8E?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Slow-motion video allowed for careful observation of how monkeys used new tools.</span></figcaption>
</figure>
<p>We filmed slow-motion videos of 12 monkeys cracking nuts to understand how monkeys adjust to using an unfamiliar tool. The idea, stemming from <a href="https://www.routledge.com/The-Ecological-Approach-to-Visual-Perception-Classic-Edition/Gibson/p/book/9781848725782">perception-action theory</a>, is that monkeys may obtain helpful information about the tool – like how heavy it is and where they can hold it securely – by manipulating it before they use it. Like testing a hammer with a few light taps before you use it, this information may then help the monkeys to strike the nut forcefully and accurately.</p>
<p>Back in the U.S., we spent months carefully watching the slow-motion videos and recording the monkeys’ quick behaviors while using tools. The videos showed that for nut-cracking, monkeys grasp the sides of a stone, lift it to shoulder height, quickly move their hands to the top of the stone, then bring it down on the nut.</p>
<p>Given that the stones <a href="https://doi.org/10.1016/j.anbehav.2009.11.004">can weigh about half as much as an adult female monkey</a>, this is an impressive feat. But it’s not always done perfectly. If the monkey’s grip isn’t right, she might lose control of the stone, and if the stone comes down at an angle the nut is likely to fly off the anvil. When this happens, the monkeys lose precious time and energy trying to achieve their goal. </p>
<p>What we found, though, is that the monkeys might avoid these imperfect outcomes by spinning, flipping and doing partial lifts with the stones to test different grips and find the one that’s most likely to be successful. The preparatory lifts didn’t necessarily help the monkeys crack open more nuts, but they might be linked to “tuning” muscular coordination as the monkeys prepare themselves for a heavy lift. Essentially, the preparatory lifts may help the monkeys get a sense of what they’ll need their muscles to do when it comes time to lift the stone and strike the nut in earnest.</p>
<p>This same sort of <a href="https://www.sciencedirect.com/topics/neuroscience/haptic-perception">haptic perception</a> – the process of coming to understand an object by moving it around – plays a key role in your own ability to use tools with dexterity. In human beings’ evolutionary past, increasingly refined haptic perception may have contributed to advancing tool use.</p>
<p>Studying how animals think about and use tools offers scientists like me an exciting glimpse into what human evolutionary history may have looked like, while also helping us to better understand animals in their own right.</p>
<p class="fine-print"><em><span>Kristen S. Morrow does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Capuchin monkeys in Brazil use big stones to crush the shells of nuts they want to eat. An experiment in the field investigated how these monkeys prepare to use new, unfamiliar tools.Kristen S. Morrow, PhD Student in Anthropology and Integrative Conservation, University of GeorgiaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/970332018-09-17T09:33:44Z2018-09-17T09:33:44ZPlaying sound through the skin improves hearing in noisy places<figure><img src="https://images.theconversation.com/files/220759/original/file-20180529-80623-653tc0.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>Hundreds of thousands of people with severe hearing loss depend on surgically implanted electronic devices to recover some of their hearing. These devices, known as auditory or cochlear implants, aren’t perfect. In particular, implant users find it difficult to understand speech when there is background noise. We have a new approach to solve this problem that involves playing sound through the skin.</p>
<p>People with auditory implants hear the world in a very different way to people with healthy hearing (the video below simulates what it’s like to hear through an auditory implant). In an implant user, the sound that is usually transmitted to the brain by tens of thousands of extraordinarily sensitive cells in the ear is instead transmitted by just 22 tiny electrodes. This means that the information transmitted to the brain is severely limited.</p>
<p>This is a big problem in complex sound environments, with a conversation in the corner, music blaring, the bang of a door and the clatter of cutlery. The implant user is unable to join a conversation in a busy office or hear a teacher in a chaotic classroom. We need a new way to get crucial sound information to the brain and bypass the information bottleneck at the implant.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/n9fvlG7LfSc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">‘I beg your pardon?’ Auditory implant users struggle to understand speech in noisy places.</span></figcaption>
</figure>
<h2>Fusing the senses</h2>
<p>The brain is continuously combining information from all our senses to build a picture of the world. When a sense is impaired, as in a deaf or blind person, the brain can compensate by using information from another sense. </p>
<p>In the late 1960s, Paul Bach-y-Rita showed that <a href="https://www.nature.com/articles/221963a0">blind people are able to “see”</a> what is happening in a film when visual information is presented through vibration on the lower back. Since then, researchers have shown that people are able to <a href="https://ieeexplore.ieee.org/abstract/document/720206/">“see” using sound</a>, and that people who have lost their sense of balance are able to balance again when the missing information is <a href="https://www.ncbi.nlm.nih.gov/pubmed/19542601">presented through touch</a>. </p>
<p>As auditory implant users only get limited sound information through their implant, we wondered whether providing extra sound information through touch could improve their hearing.</p>
<p>To do this, we developed a simple, adaptable system that takes speech in a noisy environment and extracts its broad sound-level fluctuations, known as the “speech envelope”. This envelope information is not conveyed effectively by the implant, yet it is known to be important for <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2736727/">understanding speech in noise</a>. The envelope is then converted into small vibrations on the skin, and the brain can combine these signals with the implant signal to improve understanding of speech.</p>
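<p>As an illustration of the general approach – the exact parameters of our processing chain are simplified here, and the cutoff and carrier values below are assumed for the sketch – extracting a speech envelope and turning it into a vibrotactile signal can be written in a few lines of Python:</p>
<pre><code>import numpy as np
from scipy.signal import butter, filtfilt

def speech_envelope(audio, fs, cutoff_hz=20.0):
    """Extract the broad sound-level fluctuations (the 'speech envelope'):
    full-wave rectify, then low-pass filter so that only the slow
    amplitude changes remain. The 20 Hz cutoff is an illustrative value."""
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, np.abs(audio))

def envelope_to_vibration(envelope, fs, carrier_hz=230.0):
    """Amplitude-modulate a vibrotactile carrier with the envelope;
    ~230 Hz is near the skin's peak sensitivity to vibration."""
    t = np.arange(len(envelope)) / fs
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
</code></pre>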
<p>In our latest <a href="http://journals.sagepub.com/doi/full/10.1177/2331216518797838">study</a>, published in Trends in Hearing, we presented speech in noise with and without vibration from our system and measured how many words participants were able to identify. We found that the device improved word identification for seven of our eight participants. Training was important. Participants were able to identify an average of 5% more words in noise with the device when first using it, and an average of 11% more words, after just 30 minutes of practice. It is possible that, with everyday use, we may find even larger benefits.</p>
<p>Our goal is to develop a compact, inexpensive, wrist-worn device that can be used in the real world within two years. We hope that this device will help implant users hear in noisy places and expand their access to education, work and leisure.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>People who depend on auditory implants to hear struggle to understand speech in noisy places. A new device could change that.Sean R Mills, Post-Graduate Researcher in Tactile Neuroscience, University of SouthamptonMark Fletcher, Research Fellow in Auditory Neuroscience, University of SouthamptonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/540592016-02-05T13:39:10Z2016-02-05T13:39:10ZThe future of TV? How feely-vision could tickle all our senses<figure><img src="https://images.theconversation.com/files/110430/original/image-20160205-18264-spjjy4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/downloading_tips.mhtml?code=&id=264429647&size=huge&image_format=jpg&method=download&super_url=http%3A%2F%2Fdownload.shutterstock.com%2Fgatekeeper%2FW3siZSI6MTQ1NDcwMTY4NSwiYyI6Il9waG90b19zZXNzaW9uX2lkIiwiZGMiOiJpZGxfMjY0NDI5NjQ3IiwiayI6InBob3RvLzI2NDQyOTY0Ny9odWdlLmpwZyIsIm0iOiIxIiwiZCI6InNodXR0ZXJzdG9jay1tZWRpYSJ9LCJTQmdFNDRMZ3ZNQUZRdHo4MWVURFg4WTdmcjAiXQ%2Fshutterstock_264429647.jpg&racksite_id=ny&chosen_subscription=1&license=standard&src=-q7yoWLFd0FRzBr6RHpeVA-1-42">Shutterstock</a></span></figcaption></figure><p>Imagine a party on a warm summer’s evening. You can see the beautiful greenery and the dipping sun, you can smell the freshly cut grass and taste the cool drinks on offer. You hear someone walk up behind you and feel them tap you on the shoulder. Now imagine you’re not really at the party – but sat at home and the scene and all these sensations are coming from your TV. </p>
<p>Working out how television programmes could one day stimulate all our senses is an interesting question for researchers like myself, who are exploring the future of TV. But the bigger, more exciting challenge is how we can not only imitate what is happening on the screen, but also use smell, taste and touch in a way that’s not a novelty and enhances the emotional experience of a show, just as a soundtrack does.</p>
<p>There’s good reason to think about how the TV industry can innovate in this way. Despite the rise of online video, millions if not billions of people still watch <a href="http://qz.com/233451/chart-how-much-of-the-world-watches-tv-vs-internet-video/">traditional broadcast media</a> through television sets. TV remains a powerful format for making and watching programmes, one that follows specific restrictions and guidelines.</p>
<p>But more people are watching TV programmes online <a href="http://www.barb.co.uk/trendspotting/analysis/catch-up-viewing?_s=4">after their original broadcast</a>, on other devices such as <a href="http://media.ofcom.org.uk/news/2015/cmr-uk-2015/">tablets and phones</a>, and are even using <a href="http://www.digitaltrends.com/home-theater/multi-screen-viewing-feature/">multiple screens</a> to engage with more than one piece of content at once. Broadcasters need to create new ways of experiencing TV that capture the audience’s full attention and immerse them in a multisensory world.</p>
<h2>Experimenting with the senses</h2>
<p>Creating truly compelling TV that stimulates all our senses is not an easy task. Programme makers and technology manufacturers know how to design their products so you can see depth and distance on the screen. But sound and vision <a href="http://dl.acm.org/citation.cfm?id=2783433">aren’t always enough</a>. Being able to <a href="http://dl.acm.org/citation.cfm?id=2557008">smell the odours</a> that a character on screen would smell, or feel the objects or atmosphere they would feel, can create anticipation and build suspense in the same way as sound currently does.</p>
<p>Cinema is already experimenting with these extra senses. Films with touch and smell sensations can be experienced in newly equipped <a href="http://www1.cineworld.co.uk/4dx/">4DX cinemas</a>, such as the one in Milton Keynes. The sense of taste seems a final frontier for technology development, but the interest in <a href="http://dl.acm.org/citation.cfm?id=2557007">taste experiences</a> has started to take off. For example, audience members at <a href="http://ediblecinema.co.uk">Edible Cinema</a> each receive a package of food and drinks to match what characters on screen experience.</p>
<p>The question for the TV industry is what multisensory experiences should it design for – and how. My Sussex Computer Human Interaction lab is trying to better understand how we use our senses so that designers and developers can help us interact with their technology in the most compelling way possible.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/OBOxuFmsxBY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Our latest work focuses on cutting-edge technology such as the mid-air touch feedback or “haptic” device <a href="http://ultrahaptics.com">developed by Ultrahaptics</a>, a start-up in Bristol. We’re looking at how this technology could evoke emotions in the audience by allowing them to feel physical sensations without touching actual objects.</p>
<p>For example, projecting a pattern of ultrasound beams onto your hand can create different <a href="http://dl.acm.org/citation.cfm?id=2466220">tactile sensations</a>, such as a feeling of raindrops on your palm (without the water), or a flow of air as if you were holding your hand out of the window of a moving car. When carefully designed, this haptic feedback can produce even more specific patterns that allow you to feel different shapes, that change in size or that quickly move around.</p>
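<p>The principle behind such devices is phased-array focusing: if each transducer’s signal is phase-shifted to compensate for its distance to a chosen point in the air, the ultrasound waves arrive there in step and sum into a localised pressure spot you can feel. A minimal Python sketch with an assumed 40 kHz array layout (not Ultrahaptics’ actual hardware interface):</p>
<pre><code>import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
FREQUENCY = 40_000.0    # Hz; 40 kHz is typical for mid-air ultrasound arrays

def focus_phases(transducer_xyz, focal_point):
    """For each transducer in an array, compute the phase advance (radians)
    that makes all emitted waves arrive in phase at the focal point,
    where they sum into a pressure spot strong enough to feel."""
    positions = np.asarray(transducer_xyz, dtype=float)    # shape (N, 3)
    distances = np.linalg.norm(positions - focal_point, axis=1)
    wavelength = SPEED_OF_SOUND / FREQUENCY                # ~8.6 mm
    return (2 * np.pi * distances / wavelength) % (2 * np.pi)

# A hypothetical 16x16 grid of transducers spaced 10 mm apart, focusing
# on a point 20 cm above the centre of the array.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
phases = focus_phases(grid, np.array([0.075, 0.075, 0.20]))
</code></pre>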
<h2>Emotional feedback</h2>
<p>By experimenting with different shapes, <a href="http://dl.acm.org/citation.cfm?id=2702361">we’ve studied</a> how this kind of haptic feedback can produce different emotions. We found that short, sharp bursts of air to the area around the thumb, index finger and middle part of the palm generate excitement. Slow and moderate stimulation of the outer palm and the area around the little finger create sad feelings.</p>
<p>This gives us a starting point to find out how mid-air touch sensations could be meaningfully integrated into other experiences, such as watching a movie. One challenge will be to make haptic feedback enhance the viewing experience without seeming intrusive or creepy, as suggested by “the feelies” cinema experience portrayed in the dystopian novel Brave New World. </p>
<p>We’ve recently begun a five-year project to expand the research into taste and smell, as well as touch. The SenseX project will aim to provide inventors and innovators with guidelines and tools for designing and integrating sensory stimuli to create richer interactive experiences. Relatively soon, we may be able to realise truly compelling and multi-faceted media experiences, such as 9-dimensional TV (adding tastes on top of 4DX), that evoke emotions through all our senses.</p>
<p class="fine-print"><em><span>Marianna Obrist receives funding from the EC within the Horizon2020 programme through the ERC (Starting Grant Agreement 638605).</span></em></p>New technology could allow you to “feel” a TV show by transmitting touch feedback through the air.Marianna Obrist, Reader in interaction design, University of SussexLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/462262015-08-21T13:26:19Z2015-08-21T13:26:19ZWindows 95 turns 20 – and new ways of interacting show up desktop’s age<figure><img src="https://images.theconversation.com/files/92612/original/image-20150820-7246-1khx9v9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Windows 95 and DOS6: actual museum pieces.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/39908901@N06/7553233074/in/photolist-cvsi3s-4qmJd-9wh8gK-4nceyo-81ngrZ-ssAXm-pySXkd-a1cT8N-tY8LkY-7m8HCx-qVgfB1-62nUCf-o8tJ1N-nT2DYi-aZQQiB-oadDop-cVhbQj-7fHaxo-6BRKo7-99pCEn-nT2JsY-qjGZcb-x7LkbA-x8A5uD-nT2qQq-KGFTw-dAW8eV-onrJtj-jJ46T5-ppwXyX-sDESHv-b8WktX-78Bqme-f2ivoW-8Vaz9P-f7xMEb-ahDbrK-av6ZVx-5aTbBv-7SamS6-vCvu8D-jH3CZc-6YBCsm-x85e3U-dKCEk6-akuMkm-fvWK8M-cWYjHE-fmz5tk-bdSfSZ">m01229</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>The arrival of Microsoft Windows 95 on August 24 1995 brought about a desktop PC boom. With an easier and more intuitive graphical user interface than previous versions it appealed to more than just business, and Bill Gates’ stated aim of one PC per person per desk was set in motion. This was a time of 320Mb hard drives, 8Mb RAM and 15" inch CRT monitors. For most home users, the internet had only just arrived.</p>
<p>Windows 95 introduced the start menu, powered by a button in the bottom-left corner of the desktop. This gives a central point of entry into menus from which to choose commands and applications. The simplicity of this menu enables users to easily find commonly used documents and applications. All subsequent versions of Windows have kept this menu, with the notable exception of Windows 8, a change which <a href="http://www.extremetech.com/computing/141702-how-to-bring-the-start-menu-and-button-back-to-windows-8">prompted an enormous backlash</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/5VPFKnBYOSI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>We take these intuitive graphical interfaces for granted today, but earlier operating systems such as DOS and CP/M allowed the user to interact using only typed text commands. The groundwork for change was laid in the 1960s and 1970s, with Ivan Sutherland’s Sketchpad and its lightpen-controlled CRT display, Douglas Engelbart’s development of the computer mouse, and the <a href="http://www.parc.com/">Xerox PARC</a> research team’s creation of the <a href="https://www.youtube.com/watch?v=Cn4vC80Pv6Q">Windows, Icons, Menus, Pointer graphical interface paradigm</a> (WIMP) – the combination of mouse pointer, window and icons that remains standard to this day. By the early 1980s, Apple had developed graphical operating systems for its Lisa (released 1983) and Macintosh (1984) computers, and Microsoft had released Windows (1985).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=333&fit=crop&dpr=1 600w, https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=333&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=333&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=419&fit=crop&dpr=1 754w, https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=419&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/92611/original/image-20150820-7213-smquui.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=419&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">DOS - these were not good old days.</span>
<span class="attribution"><span class="source">Krzysztof Burghardt</span></span>
</figcaption>
</figure>
<h2>Imagining a desktop</h2>
<p>All these interfaces rely on the central idea of the desktop, a comprehensible metaphor for a computer. We work with information in files and organise them in folders, remove unwanted information to the trash can, and note something of interest with a bookmark. </p>
<p>Metaphors are useful. They enable users to grasp concepts faster, but rely on the metaphor remaining comprehensible to the user and useful for the designer and programmer putting it into effect – without stretching it beyond belief. The advantage is that the pictures used to represent functions (icons) look similar to those in the workplace, and so the metaphor is readily understandable. </p>
<h2>Breaking windows</h2>
<p>But 20 years after Windows 95, the world has changed. We have smartphones and smart televisions, and we use the internet for practically everything. Touchscreens are now arguably more ubiquitous than the classic mouse-driven interface, and screen resolution is so high that individual pixels can be difficult to see. We still have Windows, but things are changing. Indeed, they need to change.</p>
<p>The desktop metaphor has been the metaphor of choice for so long, and this ubiquity has helped computers find a place within households as a common, familiar tool rather than as specialist, computerised equipment. But is it still appropriate? After all, few of us sit in an office today with paper-strewn desks; books are read on a tablet or phone rather than hard-copies; printing emails is discouraged; most type their own letters and write their own emails; files are electronic not physical; we search the internet for information rather than flick through reference books; and increasingly the categorisation and organisation of data has taken second place to granular search.</p>
<p>Mouse-driven interfaces rely on a single point of input, but we’re increasingly seeing touch-based interfaces that accept swipes, touches and shakes in various combinations. We are moving away from the dictatorship of the mouse pointer. Dual-finger scrolling and pinch-to-zoom are new emerging metaphors – <a href="http://research.microsoft.com/en-us/collaboration/focus/nui/">natural user interfaces</a> (NUI) rather than graphical user interfaces. </p>
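<p>The arithmetic behind a gesture like pinch-to-zoom is refreshingly simple: track two touch points and scale the view by the ratio of their current to their previous separation, centred on their midpoint. A toolkit-agnostic sketch in Python:</p>
<pre><code>import math

def pinch_update(p1_prev, p2_prev, p1_now, p2_now):
    """Given previous and current positions of two touch points (x, y),
    return the zoom factor and the midpoint to zoom around. This is the
    arithmetic behind pinch-to-zoom, not any particular toolkit's API."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    scale = dist(p1_now, p2_now) / max(dist(p1_prev, p2_prev), 1e-9)
    centre = ((p1_now[0] + p2_now[0]) / 2, (p1_now[1] + p2_now[1]) / 2)
    return scale, centre

# Fingers moving apart to double their separation: 2x zoom about (150, 100).
print(pinch_update((100, 100), (200, 100), (50, 100), (250, 100)))
</code></pre>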
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/hSf2-jm0SsQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>What does the next 20 years hold?</h2>
<p>It’s hard to tell, but one thing is certain: interfaces will make use of more human senses to display information and to control the computer. Interfaces will become more transparent, more intuitive and less tied to items such as boxes, arrows or icons. Human gestures will be more commonplace. And such interfaces will be incorporated into technology throughout the world, through virtual reality and augmented reality.</p>
<p>These interfaces will appear and feel more natural. Some suitable devices already exist, such as <a href="http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5438990&tag=1">ShiverPad</a>, which applies <a href="https://www.youtube.com/watch?v=Z-jQyeYApU8">shear forces</a> to a surface to give touch devices a frictional feel. Or <a href="http://www.geomagic.com/en/products/phantom-desktop/overview/">Geomagic’s Touch X</a> (formerly the Sensable Phantom Desktop), which delivers three-dimensional forces to make 3D objects feel solid.</p>
<p>Airborne <a href="http://big.cs.bris.ac.uk/projects/ultrahaptics">haptics</a> are another promising technology, creating tactile interfaces in mid-air. Through ultrasound, users can feel <a href="http://www.gizmag.com/ultrasonic-tactile-haptic-interaction-holodeck/29360/">acoustic radiation fields</a> that emanate from devices, without needing to touch any physical surface. Videogame manufacturers have led the way with such interfaces, including the <a href="https://www.microsoft.com/en-us/kinectforwindows/">Microsoft Kinect</a> and <a href="https://www.microsoft.com/microsoft-hololens/en-us">HoloLens</a>, which let users control the interface with body gestures, or <a href="https://theconversation.com/eye-tracking-is-the-next-frontier-of-human-computer-interaction-37596">with their eyes through head-mounted displays</a>.</p>
<p>Once interaction with a computer or device can be commanded using natural gestures, movements of the body or spoken commands, the necessity for the Windows-based metaphor of computer interaction begins to look dated – as old as it is.</p>
<p class="fine-print"><em><span>Jonathan Roberts receives funding from AHRC, EPSRC.</span></em></p>The desktop interface originated in the 1970s and was exemplified by the arrival of Windows 95. Surely there’s a better approach, 20 years on?Jonathan Roberts, Senior Lecturer in Computer Science, Bangor UniversityLicensed as Creative Commons – attribution, no derivatives.