Robotics – The Conversation

‘Swarm of one’ robot is a single machine made up of independent modules

<figure><img src="https://images.theconversation.com/files/575061/original/file-20240212-16-ex7r9g.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C4000%2C3000&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">This robot mimics simple life forms.</span> <span class="attribution"><a class="source" href="https://ieeexplore.ieee.org/document/10342118">Trevor Smith</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>My colleagues and I have built a robot composed of many building blocks like the cells of a multicellular organism. Without a “brain” or a central controller in the system, our robot, dubbed Loopy, relies on the collective behavior of all of its cells to interact with the world. </p>
<p>In this sense, we call Loopy a <a href="https://doi.org/10.1109/JPROC.2021.3072740">robotic swarm</a>. But Loopy can also be seen as a single robot since all the cells are connected; therefore, Loopy is also “a swarm of one.” This research could lead to adaptive robots that tailor their shapes and movements to their environments – for example, in environmental cleanup applications.</p>
<p>Loopy is a <a href="https://doi.org/10.1109/IROS55552.2023.10342118">primitive form of multicellular robot</a> made of a ring of 36 cells. Each cell has a <a href="https://www.youtube.com/watch?v=tHOH-bYjR4k">rotary servo</a> – an electric motor that rotates a shaft to a precisely controlled angle – and sensors. Each cell acts on its own, taking input only from its two immediate neighbors. As the servos move, the angles between the cells determine Loopy’s overall shape.</p>
<p>Loopy is free to morph into various shapes and exhibit a range of motions, but random shapes and motions are not useful. We were hoping something interesting would emerge from self-organization – that is, the spontaneous creation of order from disorder – without us telling Loopy what to do directly. It turned out that Loopy forms stable shapes that recover after it bumps into obstacles.</p>
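<p>A minimal sketch of the kind of decentralized, neighbor-only update described above (an illustration, not the authors’ actual controller): each of 36 cells adjusts its joint angle using only its two immediate neighbors, and a coherent shape settles out with no central controller.</p>

```python
# Illustrative sketch (not the authors' code): 36 cells in a ring, each
# updating its joint angle from its two immediate neighbors only.
N_CELLS = 36

def step(angles, gain=0.5):
    """One decentralized update: each cell relaxes toward the mean of its
    two neighbors' angles, using no global information."""
    return [
        angles[i]
        + gain * ((angles[(i - 1) % N_CELLS] + angles[(i + 1) % N_CELLS]) / 2 - angles[i])
        for i in range(N_CELLS)
    ]

# Start from a jagged shape; the purely local rule smooths it out.
angles = [(-1) ** i * 10.0 for i in range(N_CELLS)]
for _ in range(50):
    angles = step(angles)
```

Because every cell runs the same local rule, order emerges from the interactions rather than from any single “brain” cell.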
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/lyohCt0UN6A?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Loopy exhibiting spontaneous shapes and motions.</span></figcaption>
</figure>
<p>Famed mathematician <a href="https://www.britannica.com/biography/Alan-Turing">Alan Turing</a> was interested in the idea of self-organization back in 1952. He even envisioned <a href="https://doi.org/10.1098/rstb.1952.0012">a ring of cells</a>. Turing hypothesized the existence of chemicals that diffuse and react with each other, leading to the creation of <a href="https://theconversation.com/how-animals-get-their-skin-patterns-is-a-matter-of-physics-new-research-clarifying-how-could-improve-medical-diagnostics-and-synthetic-materials-217035">patterns in nature</a> like those on birds’ feathers and seashells. This self-organization approach using simulated chemicals enabled Loopy to form and transition between various lobed shapes spontaneously. </p>
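<p>Turing’s idea can be illustrated with a toy reaction-diffusion simulation on a ring of cells. The reaction terms and parameter values below follow the well-known Gray-Scott model and are illustrative assumptions, not the simulated chemistry actually used in Loopy.</p>

```python
# Toy Turing pattern: two simulated chemicals reacting and diffusing on a
# ring of cells (Gray-Scott reaction terms; parameters are illustrative).
N = 36
Du, Dv, F, K = 0.16, 0.08, 0.035, 0.065  # diffusion rates, feed, kill

def laplacian(x, i):
    # Discrete diffusion on a ring: compare a cell with its two neighbors.
    return x[(i - 1) % N] + x[(i + 1) % N] - 2 * x[i]

def step(u, v, dt=1.0):
    un, vn = [], []
    for i in range(N):
        uvv = u[i] * v[i] * v[i]  # reaction: chemical v consumes chemical u
        un.append(u[i] + dt * (Du * laplacian(u, i) - uvv + F * (1 - u[i])))
        vn.append(v[i] + dt * (Dv * laplacian(v, i) + uvv - (F + K) * v[i]))
    return un, vn

u, v = [1.0] * N, [0.0] * N
for i in range(15, 21):  # seed a small patch of the second chemical
    v[i] = 0.5
for _ in range(1000):
    u, v = step(u, v)
```

Depending on the parameters, simulations like this settle into stable spatial patterns rather than uniform mixtures, which is the mechanism Turing proposed for pattern formation.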
<h2>Why it matters</h2>
<p>Engineered systems, and robots in particular, are predominantly designed with a <a href="https://doi.org/10.1007/s10514-007-9080-5">top-down approach</a>, where human designers anticipate the conditions the system may encounter and plan ahead through hardware designs, software programs or both. The problem is, the designers are not likely to be there when the robot encounters an unanticipated situation. </p>
<p>This micromanagement approach in robot design is like handing kids a detailed manual on their first day of school. A better way of parenting would be to provide general guidelines and feedback, and expect the kids to solve problems on their own. Similarly, a key motivation for developing Loopy is to unleash the power of <a href="https://link.springer.com/chapter/10.1007/978-3-642-79629-6_11">bottom-up collective “intelligence”</a> so Loopy can find new solutions on its own when a new situation arises – for example, finding the right shape to adapt itself to the environment.</p>
<h2>What other research is being done?</h2>
<p>The vision of programmable matter has been around for decades, yet tangible examples have been scarce. While researchers have explored complex shape formation through <a href="https://doi.org/10.1038/s44172-022-00034-3">self-assembly</a> or <a href="https://doi.org/10.1038/s44172-022-00034-3">reconfigurable robotic systems</a>, these often depend on predetermined shapes. </p>
<p>Similar to Loopy, researchers have applied Turing’s self-organization concept to <a href="https://doi.org/10.1126/scirobotics.aau9178">swarms of robots</a>, such as the small, simple, autonomous <a href="https://ssr.seas.harvard.edu/kilobots">Kilobots</a>, leading to the emergence of complex shapes. However, unlike Loopy, the physical forces between “cells” are not used to influence the final shape and behavior of the collective.</p>
<h2>What’s next?</h2>
<p>We would like Loopy to develop more lifelike traits, such as navigating unforeseen situations, seeking out better conditions, acquiring resources and mitigating threats. This vision extends to eventually enabling Loopy to perform tasks assigned by people, thereby bridging the gap between the open-ended creativity of self-organization and human guidance.</p>
<p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take on interesting academic work.</em></p>
<p class="fine-print"><em><span>Yu Gu works for West Virginia University. </span></em></p><p class="fine-print"><em><span>Trevor Smith works for West Virginia University.</span></em></p>

<p><em>Yu Gu, Professor of Mechanical and Aerospace Engineering, West Virginia University. Trevor Smith, PhD Candidate in Mechanical Engineering, West Virginia University. Licensed as Creative Commons – attribution, no derivatives.</em></p>

We designed wormlike, limbless robots that navigate obstacle courses – they could be used for search and rescue one day

<figure><img src="https://images.theconversation.com/files/571646/original/file-20240126-17-1c52dw.JPG?ixlib=rb-1.1.0&rect=55%2C0%2C4024%2C1578&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Limbless robots may not need lots of complex algorithms when they have mechanical intelligence. </span> <span class="attribution"><span class="source">Tianyu Wang</span></span></figcaption></figure><p>Scientists have been trying to build <a href="https://en.wikipedia.org/wiki/Snakebot">snakelike, limbless robots</a> for decades. These robots could come in handy in <a href="https://www.science.org/content/article/searching-survivors-mexico-earthquake-snake-robots">search-and-rescue</a> situations, where they could navigate collapsed buildings to find and assist survivors. </p>
<p>With slender, flexible bodies, limbless robots could readily move through confined and cluttered spaces such as debris fields, where walking or wheeled robots and human rescuers tend to fail.</p>
<p>However, even the most advanced limbless robots have not come close to moving with the agility and versatility of worms and snakes in difficult terrain. Even the tiny nematode worm <em><a href="http://www.wormbook.org/">Caenorhabditis elegans</a></em>, which has a relatively simple nervous system, can navigate through difficult physical environments. </p>
<p>As part of a team of <a href="https://www.lulab.gatech.edu/">engineers</a>, <a href="https://crablab.gatech.edu/">roboticists and physicists</a>, we wanted to explore this discrepancy in performance. But instead of looking to neuroscience for an answer, <a href="https://en.wikipedia.org/wiki/Biomechanics">we turned to biomechanics</a>. </p>
<p>We set out to build a robot model that drove its body using a mechanism similar to how worms and snakes power their movement. </p>
<h2>Undulators and mechanical intelligence</h2>
<p>Over thousands of years, organisms have evolved <a href="https://www.britannica.com/science/nervous-system">intricate nervous systems</a> that allow them to sense their physical surroundings, process this information and execute precise body movements to navigate around obstacles. </p>
<p>In robotics, engineers design algorithms that take in information from sensors on the robot’s body – a type of robotic nervous system – and use that information to decide how to move. These algorithms and systems are usually complex. </p>
<p>Our team wanted to figure out a way to simplify these systems by highlighting mechanically controlled approaches to dealing with obstacles that don’t require sensors or computation. To do that, we turned to examples from biology.</p>
<p>Animals don’t rely solely on their neurons – brain cells and <a href="https://my.clevelandclinic.org/health/body/23123-peripheral-nervous-system-pns">peripheral nerves</a> – to control movement. They also use the physical properties of their body – for example, the elasticity of their muscles – to help them react to their environment spontaneously, before their neurons even have a chance to respond.</p>
<p>While computational systems are governed by <a href="https://en.wikipedia.org/wiki/Computational_logic">the laws of mathematics</a>, mechanical systems are governed by physics. To achieve the same task, scientists can either design an algorithm or carefully design a physical system. </p>
<p>For example, limbless robots and animals move through the world by bending sections of their body left and right, <a href="https://en.wikipedia.org/wiki/Undulatory_locomotion">a type of movement called undulation</a>. If they collide with an obstacle, they have to turn away and go around it by bending more to one side than the other.</p>
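<p>Undulation is commonly generated in robots by driving each joint with a phase-shifted sine wave, often called a serpenoid gait. The sketch below is a generic illustration with made-up parameter values, not the authors’ controller.</p>

```python
import math

def joint_angles(t, n_joints=10, amplitude=30.0, wavelength=5.0, freq=1.0):
    """Joint angles (degrees) at time t for a traveling body wave: each joint
    lags its neighbor by a fixed phase, so the bend travels down the body."""
    return [
        amplitude * math.sin(2 * math.pi * (freq * t - i / wavelength))
        for i in range(n_joints)
    ]

angles_now = joint_angles(0.0)  # one snapshot of the undulating posture
```

Sampling this function over time and sending the angles to the joint motors makes the bend propagate from head to tail, which is what pushes an undulator forward.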
<p>Scientists could achieve this with a robot by attaching sensors to its head or body. They could then design an algorithm that tells the robot to turn away or wind around the obstacle when it “feels” a large enough force on its head or body. </p>
<p>Alternatively, scientists could carefully select the robot’s materials and the arrangement and strength of its motors so that collisions would spontaneously produce a body shape that led to a turn. This robot would have what scientists call “mechanical intelligence.”</p>
<p>If scientists like us can understand how organisms’ bodies respond mechanically to contact with objects in their environment, we can design better robots that can deal with obstacles without having to program complex algorithms. </p>
<p>If you compare a diverse set of undulating organisms with the increasingly large zoo of <a href="https://en.wikipedia.org/wiki/Snakebot">robotic “snakes</a>,” one difference between the robots and biological undulators stands out. Nearly all undulatory robots bend their bodies using a series of connected segments with motors at each joint. But that’s not how living organisms bend.</p>
<p>In contrast, all limbless organisms, from large snakes to the lowly, microscopic nematode, achieve bends not from a single rotational joint-motor system but instead through <a href="http://www.wormbook.org/chapters/www_bodywallmuscle/bodywallmuscle.html">two bands of muscles</a> on either side of the body. To an engineer, this design seems counterintuitive. Why control something with two muscles or motors when one could do the job? </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A diagram showing a gray worm with a window showing the inside of the worm's body, which has two bands of muscle on the left and right side, cuticle on the top and nerve cord on the bottom, top and sides." src="https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=283&fit=crop&dpr=1 600w, https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=283&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=283&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=355&fit=crop&dpr=1 754w, https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=355&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/575078/original/file-20240212-26-it6ean.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=355&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Nematodes have two bands of muscle on the sides of their bodies that control motion.</span>
<span class="attribution"><span class="source">Ralf J. Sommer and WormAtlas</span></span>
</figcaption>
</figure>
<p>To get to the bottom of this question, our team built a new robot called MILLR, for mechanically intelligent limbless robot, inspired by the two bands of muscle on snakes and worms. MILLR has two independently controlled cables that pull each joint left and right, bilaterally.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A diagram showing the design of MILLR, with servo motors on each body segment, and cables and pulleys connecting them." src="https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=275&fit=crop&dpr=1 600w, https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=275&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=275&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=345&fit=crop&dpr=1 754w, https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=345&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/575079/original/file-20240212-20-gtf8t7.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=345&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">MILLR’s design, inspired by nematode <em>C. elegans</em>.</span>
<span class="attribution"><span class="source">Tianyu Wang</span></span>
</figcaption>
</figure>
<p><a href="https://doi.org/10.1126/scirobotics.adi2243">We found</a> that this method allows the robot to spontaneously move around obstacles without having to sense its surroundings and actively change its body posture to comply with the environment.</p>
<h2>Building a mechanically intelligent robot</h2>
<p>Rather than mimicking the detailed muscular anatomy of a particular organism, MILLR applies forces to either side of the body by spooling and unspooling a cable. </p>
<p>This way, it mirrors the muscle activation methods that snakes and nematodes use, where the left and right sides take turns activating. This activation mode pulls the body toward one side or another by tightening on one side, while the other side relaxes and is pulled along passively. </p>
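<p>The alternating left-right activation can be sketched as a traveling wave mapped onto two antagonistic cable tensions per joint: only one side pulls at a time while the other is paid out passively. The function and parameter values below are illustrative, not MILLR’s actual control code.</p>

```python
import math

def cable_commands(t, joint_index, n_joints=8, amplitude=1.0):
    """Return (left_pull, right_pull) for one joint at time t.

    A traveling sine wave decides which side is active: a positive value
    pulls the left cable, a negative value pulls the right, and the passive
    side is left slack so the joint can comply with obstacle forces.
    """
    wave = amplitude * math.sin(2 * math.pi * (t - joint_index / n_joints))
    left_pull = max(wave, 0.0)    # active while bending left
    right_pull = max(-wave, 0.0)  # active while bending right
    return left_pull, right_pull

left, right = cable_commands(0.25, 0)  # at this phase the left cable pulls
```

At every instant at most one cable per joint carries tension, mirroring how the left and right muscle bands of a nematode take turns contracting.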
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="On the left, a photo showing a worm weaving between pegs. On the right, a photo showing a worm-like robot weaving between pegs." src="https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=122&fit=crop&dpr=1 600w, https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=122&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=122&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=153&fit=crop&dpr=1 754w, https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=153&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/575081/original/file-20240212-26-bro51v.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=153&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">MILLR’s design allows it to move through obstacles the same way worms do.</span>
<span class="attribution"><span class="source">Tianyu Wang and Christopher Pierce</span></span>
</figcaption>
</figure>
<p>By changing the amount of slack in the cables, <a href="https://doi.org/10.1126/scirobotics.adi2243">we can achieve</a> varying degrees of body stiffness. When the robot collides with an obstacle, depending on the cable tension, it selectively maintains its shape or bends under the force of the obstacle. </p>
<p><a href="https://doi.org/10.1126/scirobotics.adi2243">We found that</a> if the robot was actively bending to one side and it experienced a force in the same direction, the body complied with the force and bent further. If, alternatively, the robot experienced a force that opposed the bend, it would remain rigid and push itself off the obstacle. </p>
<p>Because of the pattern of the tension along the body, head-on collisions that would normally cause the robot to stop moving or jam itself instead naturally led to a redirection around the obstacle. The robot could push itself forward consistently. </p>
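<p>The compliance behavior described in the last two paragraphs can be summarized as a simple decision rule (a simplification for illustration, not the paper’s mechanical model): an obstacle force in the direction of the commanded bend adds to it, while an opposing force is resisted.</p>

```python
def joint_response(commanded_bend, obstacle_torque, compliance=0.5):
    """Realized bend after contact: comply when pushed with the bend,
    stay rigid (and push off the obstacle) when pushed against it."""
    if commanded_bend * obstacle_torque > 0:  # force aligned with the bend
        return commanded_bend + compliance * obstacle_torque
    return commanded_bend                     # opposing force: hold shape

assert joint_response(10.0, 4.0) == 12.0   # complies and bends further
assert joint_response(10.0, -4.0) == 10.0  # resists and holds its shape
```

In MILLR this rule is not computed at all; it falls out of the cable slack and tension for free, which is the point of mechanical intelligence.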
<h2>Testing MILLR</h2>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/21F7IOF9BMs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>To investigate the benefits of mechanical intelligence, we built tiny obstacle courses and sent nematode worms through them to see how well they performed. We sent MILLR through a similar course and compared the results.</p>
<p>MILLR moved through its course <a href="https://doi.org/10.1126/scirobotics.adi2243">about as effectively as the real worms</a>. We noticed that the worms made the same type of body movements when they collided with obstacles as MILLR did.</p>
<p>The principles of mechanical intelligence could extend beyond the realm of nematodes. Future research could look at designing robots based on a host of other types of organisms for applications ranging from search and rescue to <a href="https://youtu.be/e0D9IVo-E9M?si=d8jGaC5GDLaMbEeS">exploring other planets</a>.</p>
<p class="fine-print"><em><span>This work was supported by the National Science Foundation Physics of Living Systems Student Research Network, the NSF-Simons Southeast Center for Mathematics and Biology, an Army Research Office grant, and the Dunn Family Professorship.</span></em></p>

<p><em>Tianyu Wang, Ph.D. Student in Robotics, Georgia Institute of Technology. Christopher Pierce, Postdoctoral Scholar in Physics, Georgia Institute of Technology. Licensed as Creative Commons – attribution, no derivatives.</em></p>

Our robot harvests cotton by reaching out and plucking it, like a lizard’s tongue snatching flies

<figure><img src="https://images.theconversation.com/files/571200/original/file-20240124-15-t230yk.jpg?ixlib=rb-1.1.0&rect=12%2C6%2C4001%2C2593&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cotton in bloom in Oklahoma.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/field-of-cotton-royalty-free-image/148704945">John Elk/The Image Bank via Getty Images</a></span></figcaption></figure><p>Cotton is one of the most valuable crops grown in the U.S., with a harvest value of <a href="https://www.ers.usda.gov/topics/crops/cotton-and-wool/cotton-sector-at-a-glance/">some US$7 billion yearly</a>. It is cultivated across a crescent of 17 states stretching <a href="https://www.ers.usda.gov/topics/crops/cotton-and-wool/cotton-sector-at-a-glance/">from Virginia to California</a> and is used in <a href="https://www.ers.usda.gov/topics/crops/cotton-and-wool/cotton-sector-at-a-glance/">virtually every type of clothing</a>, as well as in medical supplies and home goods such as upholstery. </p>
<p>Cotton grows inside a hard, fibrous case called a boll. About 100 days after planting, the bolls mature and split open, revealing thousands of fluffy white fibers inside. Each boll contains 20 to 40 seeds with fibers attached to them, which is why the cotton plant’s fruit is called seed cotton. </p>
<p>Picking cotton manually, as is still done in some <a href="https://worldpopulationreview.com/country-rankings/cotton-production-by-country">major producing countries</a>, is a meticulous task. Workers have to bend to reach the bolls and can hurt their hands on <a href="https://libguides.nybg.org/c.php?g=1003078&p=7264406">hard, dry parts of the plants</a>. To harvest the seed cotton, they have to grab and twist it to separate it from the boll without leaving fiber behind. </p>
<p>Starting in the 1930s, cotton farmers in the U.S. shifted from manual labor to <a href="https://www.britannica.com/technology/cotton-harvester">large, heavy harvesters</a>. Now the industry is entering a new stage that promises to be more efficient and precise. </p>
<p>I am an engineer and have <a href="https://scholar.google.com/citations?user=AGlJEMQAAAAJ&hl=en">nearly 20 years of research experience</a> working on agricultural machinery. My current focus is on agricultural robotics and automation. During my Ph.D. program at Mississippi State University, I worked with <a href="https://www.abe.msstate.edu/people/faculty/j-alex-thomasson/">Alex Thomasson</a>, who heads the <a href="https://www.abe.msstate.edu/">agricultural and biological engineering department</a> and the <a href="https://www.aai.msstate.edu/">Agricultural Autonomy Institute</a>, to develop a <a href="https://doi.org/10.1016/j.compag.2023.107943">robotic cotton harvester</a> that picks cotton with less damage to the product and the soil where it grows.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man stands in front of a cotton field, next to a wheeled machine with a computer screen on top and wires hanging from it." src="https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571202/original/file-20240124-23-258u5r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Mississippi State University engineering professor Hussein Gharakhani with a prototype robotic cotton harvester.</span>
<span class="attribution"><span class="source">Hussein Gharakhani</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Why use robotics?</h2>
<p>Cotton farmers have economic, environmental and agricultural reasons to want a better option for harvesting. Traditional mechanical harvesters can be up to 14 feet long and weigh more than 30 tons. They remove cotton effectively without damaging the plants but also can cause problems. </p>
<p>One issue is prolonged fiber exposure. Cotton bolls don’t all mature at the same time; the first bolls to open in a field may wait up to 50 days to be picked while the bolls around them ripen. </p>
<p>Another challenge is that harvesting machines compact the soil as they roll over it. This makes it harder for water and fertilizer to penetrate down to plant roots. And the machines cost roughly US$1 million apiece but are used for only two to three months each year. </p>
<p>Robotics is a potential solution that farmers are already using for other crops, such as <a href="https://doi.org/10.1007/s11119-022-09913-3">fruits and vegetables</a>. Harvesting robots use cameras and sensors to detect when crops are ready to pick and can remove them without damaging the plant. </p>
<p>For cotton, robotics offers more targeted picking of bolls that are ready to harvest. It produces better-quality cotton fiber by picking seed cotton as soon as the bolls open, without leaving it exposed to the weather. The robot targets the seed cotton and avoids touching other parts of the plant. </p>
<p>With robotic picking, cotton farmers <a href="https://ipm.ucanr.edu/agriculture/cotton/scheduling-defoliation/">don’t need to use defoliants</a> to remove leaves from the plants prior to harvesting, which is a common practice now. And small, nimble robots don’t compress the soil as they move over it, so they help maintain soil health.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A large green machine drives through a cotton field with a man riding on an observation deck. The harvester is more than twice the man's height." src="https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=487&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=487&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=487&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=612&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=612&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571208/original/file-20240124-17-ekvdtw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=612&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A mechanical harvester picking cotton in Alabama in 2017.</span>
<span class="attribution"><a class="source" href="https://flic.kr/p/2myChzr">Katie Nichols/Alabama Extension/Flickr</a></span>
</figcaption>
</figure>
<h2>A bioinspired ‘picking hand’</h2>
<p>Our work focuses on designing <a href="https://doi.org/10.1016/j.atech.2022.100043">an end-effector for robotic cotton harvesting</a>. An end-effector is a robotic hand that enables the robot to interact with other objects. Ours is a three-fingered version designed for delicate and efficient cotton picking. It draws inspiration from nature, mimicking the hunting prowess of a lizard.</p>
<p>Each finger is a 3D-printed structure that contains a moving belt with pins attached to it. The pins help the hand grasp and pull in the seed cotton. Like a lizard <a href="https://www.youtube.com/watch?v=z3oh73amxQo">snatching prey with its sticky tongue</a>, our end-effector’s three fingers approach the seed cotton delicately. On contact, the cotton fibers stick to the machine’s fingers, much as an insect sticks to a lizard’s tongue. </p>
<p>Next, the hand retracts quickly, like the lizard’s tongue. The end-effector keeps working to “swallow” the seed cotton, transferring it out of the plant. As the harvester picks and transfers seed cotton out of the plant, the end-effector touches parts of the cotton boll with remaining seed cotton multiple times to pick as much as possible.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/IztKk3E7zSc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A robotic harvester picks cotton in a field test.</span></figcaption>
</figure>
<p>To pick cotton efficiently, our robot has to do three things: detect bolls that are ready for harvest, determine exactly where they are located in a three-dimensional space and pick the cotton.</p>
<p>The robot uses a deep-learning algorithm that we have trained to recognize open bolls on cotton plants. It uses a stereovision camera to calculate their 3D spatial coordinates, which it transfers to the robotic arm. A control algorithm monitors each cotton boll to ensure that the robot picks as much seed cotton as possible. </p>
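<p>The stereovision step can be sketched with the standard pinhole-camera triangulation formula, depth = focal length × baseline ÷ disparity. The function name and calibration values below are illustrative assumptions, not the actual camera’s parameters.</p>

```python
def boll_position(u_left, u_right, v, f=800.0, baseline=0.06, cx=640.0, cy=360.0):
    """Back-project a matched boll detection into 3D camera coordinates (meters).

    f is the focal length in pixels, baseline the distance between the two
    cameras, and (cx, cy) the image center -- all illustrative values.
    """
    disparity = u_left - u_right  # horizontal pixel shift between the views
    if disparity <= 0:
        raise ValueError("a boll in front of the cameras must have positive disparity")
    z = f * baseline / disparity  # depth from similar triangles
    x = (u_left - cx) * z / f
    y = (v - cy) * z / f
    return x, y, z

x, y, z = boll_position(700.0, 660.0, 400.0)  # 40 px disparity -> z = 1.2 m
```

The resulting (x, y, z) coordinates are what the system would hand to the robotic arm as a picking target.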
<h2>Testing and results</h2>
<p>So far, we have tested the robotic cotton harvester in <a href="https://youtu.be/WnzJNlSS5iU?si=HkSDbRiQp3Y-HSUj">the laboratory</a> and in <a href="https://youtu.be/IztKk3E7zSc?si=8iC9gVI3wfXZktPf">cotton fields</a>. The detection system found 78% of ripe cotton bolls; the localization system calculated 3D coordinates for 70% of the detected bolls; and the picking system successfully harvested 83% of these bolls. Overall, the robot picked about 50% of the cotton bolls that were within its reach. </p>
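<p>Because detection, localization and picking act in sequence, their success rates multiply, which lines up with the overall figure reported above:</p>

```python
# Stage success rates reported from the field tests.
detection, localization, picking = 0.78, 0.70, 0.83

# The stages act in sequence, so the overall rate is their product,
# roughly matching the reported "about 50%" of reachable bolls.
overall = detection * localization * picking  # ≈ 0.45
```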
<p>Our harvester picked cotton at a speed of 8.8 seconds per boll. If we can cut that time to 0.3 seconds and raise the robot’s efficiency so that it picks at least 90% of the bolls within its reach – by optimizing the system and adding more arms per robot – a fleet of 50 robots could harvest a cotton field as quickly as a mechanical harvester, with a comparable yield.</p>
<p>To improve the robot’s overall performance, we plan to adopt better artificial intelligence algorithms, improve our system’s camera and add another degree of movement to the robotic arm – for example, enabling the end-effector to rotate – to increase its dexterity. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman wearing a sun visor and with a cloth bag slung around her waist bends over plants in a cotton field." src="https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=408&fit=crop&dpr=1 600w, https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=408&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=408&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=513&fit=crop&dpr=1 754w, https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=513&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/571214/original/file-20240124-23-1cp4yw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=513&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A woman picks cotton at a plantation in Birlik, Uzbekistan.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/an-uzbek-woman-picks-cotton-buds-at-a-cotton-plantation-in-news-photo/1236076112">Vyacheslav Oseledko/AFP via Getty Images</a></span>
</figcaption>
</figure>
<p>We see great potential for our robot in major cotton-producing countries such as China, India, Pakistan and Uzbekistan, where cotton is currently picked by hand, often <a href="https://www.bbc.com/news/world-asia-34844992">by women and children</a> and sometimes <a href="https://www.reuters.com/sustainability/society-equity/regulatory-crackdown-slavery-cotton-supply-chains-wake-up-call-fashion-brands-2023-08-20/">under abusive conditions</a>. One way to make this technology available for small farmers in low-income countries would be to make smaller, semi-autonomous robots that would require fewer sensors. Producing higher-value cotton with less damage to plants and soil could improve life for millions of people who earn their livings raising this global crop.</p><img src="https://counter.theconversation.com/content/218331/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Hussein Gharakhani receives funding from Cotton Incorporated, a nonprofit research and marketing company that works to improve demand for and profitability of cotton.</span></em></p>Cotton is one of the world’s largest crops and is harvested with large, heavy machines. Robotic harvesting could yield higher-quality cotton with less damage to plants and soil.Hussein Gharakhani, Assistant Professor of Agricultural and Biological Engineering, Mississippi State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2133362024-01-26T13:17:43Z2024-01-26T13:17:43ZWhy are so many robots white?<figure><img src="https://images.theconversation.com/files/568910/original/file-20240111-27-va5e62.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2048%2C1364&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">This little guy is very cute − and very white.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/jiuguangw/4981810943/"> Jiuguang Wang/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>Problems of racial and gender bias in artificial intelligence algorithms and the data used to train large language models like ChatGPT have <a href="https://doi.org/10.1145/3597307">drawn the attention of researchers</a> and <a href="https://www.washingtonpost.com/technology/interactive/2023/ai-generated-images-bias-racism-sexism-stereotypes/">generated headlines</a>. But these problems also arise in social robots, which have physical bodies modeled on nonthreatening versions of humans or animals and are designed to interact with people.</p>
<p>The aim of the subfield of social robotics called socially assistive robotics is to interact with ever more diverse groups of people. Its practitioners’ noble intention is “to create machines that will best help people help themselves,” writes one of its pioneers, <a href="https://www.wsj.com/articles/how-to-build-robots-people-can-relate-to-11570807206">Maja Matarić</a>. The robots are already being used to help people on the <a href="https://theconversation.com/how-robots-can-help-us-embrace-a-more-human-view-of-disability-76815">autism spectrum</a>, children with special needs and stroke patients who need physical rehabilitation. </p>
<p>But these robots do not look like people or interact with people in ways that reflect even basic aspects of society’s diversity. As a <a href="https://scholar.google.com/citations?hl=en&user=9JvGLRcAAAAJ&view_op=list_works&sortby=pubdate">sociologist who studies human-robot interaction</a>, I believe that this problem is only going to get worse. Rates of diagnoses for autism in children of color are now <a href="https://www.cdc.gov/ncbddd/autism/addm-community-report/spotlight-on-racial-ethnic-differences.html">higher than for white kids</a> in the U.S. Many of these children could end up interacting with white robots.</p>
<p>So, to adapt the famous Twitter <a href="https://knowyourmeme.com/memes/oscars-so-white">hashtag around the Oscars</a> in 2015, why #robotssowhite?</p>
<h2>Why robots tend to be white</h2>
<p>Given the diversity of people they will be exposed to, why does <a href="https://robotsguide.com/robots/kaspar">Kaspar</a>, designed to interact with children with autism, have rubber skin that resembles a white person’s? Why are <a href="https://robotsguide.com/robots/nao">Nao</a>, <a href="https://robotsguide.com/robots/pepper">Pepper</a> and <a href="https://robotsguide.com/robots/icub">iCub</a>, robots used in schools and museums, clad with shiny, white plastic? In <a href="https://doi.org/10.1007/s13347-020-00415-6">The Whiteness of AI</a>, technology ethicist Stephen Cave and science communication researcher Kanta Dihal discuss racial bias in AI and robotics and note the preponderance of stock images online of robots with reflective white surfaces. </p>
<p>What is going on here?</p>
<p>One issue is what robots are already out there. Most robots are not developed from scratch but purchased by engineering labs for projects, adapted with custom software, and sometimes integrated with other technologies such as robot hands or skin. Robotics teams are therefore constrained by design choices that the original developers made (Aldebaran for Pepper, Italian Institute of Technology for iCub). These design choices tend to follow the clinical, clean look with shiny white plastic, similar to other technology products like the original iPod.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/wT0RtnCR13o?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Kaspar is a robot designed to interact with children with autism.</span></figcaption>
</figure>
<p>In a paper I presented at the 2023 American Sociological Association meeting, I call this “<a href="https://convention2.allacademic.com/one/asa/asa23/index.php?program_focus=view_paper&selected_paper_id=2066209&cmd=online_program_direct_link&sub_action=online_program">the poverty of the engineered imaginary</a>.”</p>
<h2>How society imagines robots</h2>
<p>In anthropologist Lucy Suchman’s <a href="https://www.cambridge.org/us/universitypress/subjects/psychology/developmental-psychology/human-machine-reconfigurations-plans-and-situated-actions-2nd-edition">classic book on human-machine interaction</a>, which was updated with chapters on robotics, she discusses a “cultural imaginary” of what robots are supposed to look like. A cultural imaginary is what is shared through representations in texts, images and films, and which collectively shapes people’s attitudes and perceptions. For robots, the cultural imaginary is derived from science fiction. </p>
<p>This cultural imaginary can be contrasted with the more practical concerns of how computer science and engineering teams view robot bodies, what Neda Atanasoski and Kalindi Vora call the “engineered imaginary.” This is a hotly contested area in feminist science studies, with, for example, Jennifer Rhee’s “<a href="https://www.upress.umn.edu/book-division/books/the-robotic-imaginary">The Robotic Imaginary</a>” and Atanasoski and Vora’s “<a href="https://www.dukeupress.edu/surrogate-humanity">Surrogate Humanity</a>” critical of the gendered and racial assumptions that lead people to design service robots – designed to carry out mundane tasks – as female.</p>
<p>The cultural imaginary that enshrines robots as white, and in fact usually female, stretches back to European antiquity, along with an explosion of novels and films at the height of industrial modernity. From the first mention of the word “android” in Auguste Villiers de l’Isle-Adam’s 1886 novel “The Future Eve,” the introduction of the word “robot” in Karel Čapek’s 1920 play “Rossum’s Universal Robots,” and the sexualized robot Maria in the 1925 novel “Metropolis” by Thea von Harbou – the basis of her husband Fritz Lang’s famous 1927 film of the same name – fictional robots were quick to be feminized and made servile. </p>
<p>Perhaps the prototype for this cultural imaginary lies in ancient Rome. A poem in Ovid’s “Metamorphoses” (8 C.E.) describes a statue of Galatea “of snow-white ivory” that its creator Pygmalion falls in love with. Pygmalion prays to Aphrodite that Galatea come to life, and his wish is granted. There are numerous literary, poetic and film adaptations of the story, including one of the first special effects in cinema in <a href="https://youtu.be/lw8ckUGbbMY">Méliès’ 1898 film</a>. Paintings that depict this moment, for example by Raoux (1717), Regnault (1786), and Burne-Jones (1868-70 and 1878), accentuate the whiteness of Galatea’s flesh.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A painting of a man embracing a nude female figure whose bottom half is a marble statue and upper half is a woman" src="https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=739&fit=crop&dpr=1 600w, https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=739&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=739&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=928&fit=crop&dpr=1 754w, https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=928&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/568932/original/file-20240111-23-qycib1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=928&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The painting Pygmalion and Galatea by Jean-Léon Gérôme depicts an ancient Roman tale of a statue brought to life.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/peterjr1961/2920107167/">Peter Roan/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<h2>Interdisciplinary route to diversity and inclusion</h2>
<p>What can be done to counter this cultural legacy? After all, all human-machine interaction should be designed with diversity and inclusion in mind, according to engineers <a href="https://theconversation.com/building-machines-that-work-for-everyone-how-diversity-of-test-subjects-is-a-technology-blind-spot-and-what-to-do-about-it-174757">Tahira Reid and James Gibert</a>. But outside of Japan’s ethnically Japanese-looking robots, robots designed to be nonwhite are rare. And Japan’s robots tend to follow the subservient <a href="https://www.forbes.com/sites/zarastone/2018/02/27/ten-incredibly-lifelike-humanoid-robots-to-get-on-your-radar/?sh=2f7f323334d2">female gender stereotype</a>.</p>
<p>The solution is not simply to encase machines in brown or black plastic. The problem goes deeper. The <a href="https://www.hansonrobotics.com/bina48-9/">Bina48 “custom character robot”</a> modeled on the head and shoulders of a millionaire’s African American wife, Bina Aspen, is notable, but its <a href="https://www.nytimes.com/2010/07/05/science/05robotside.html">speech and interactions are limited</a>. A series of conversations between Bina48 and the African American artist <a href="https://www.stephaniedinkins.com/about.html">Stephanie Dinkins</a> is the basis of a <a href="https://www.stephaniedinkins.com/conversations-with-bina48.html">video installation</a>. </p>
<p>The absurdity of talking about racism with a disembodied animated head becomes apparent in one such conversation – it literally has no personal experience to speak of, yet its AI-powered answers refer to an unnamed person’s experience of racism growing up. These are implanted memories, like the “memories” of the <a href="https://bladerunner.fandom.com/wiki/Replicant">replicant</a> androids in the <a href="https://www.imdb.com/list/ls092704633/">“Blade Runner” movies</a>.</p>
<p>Social science methods can help produce a more inclusive “engineered imaginary,” as I discussed at Edinburgh’s <a href="https://www.cdcs.ed.ac.uk/events/imagining-artificial-life">Being Human festival</a> in November 2022. For example, working with Guy Hoffman, a roboticist from Cornell, and Caroline Yan Zheng, then a Ph.D. design student from Royal College of Art, we invited contributions for a publication titled <a href="https://doi.org/10.1145/3594713">Critical Perspectives on Affective Embodied Interaction</a>. </p>
<p>One of the persistent threads in that collaboration and other work is just how much people’s bodies communicate to others through gesture and expression, as well as vocalization, and how this differs between cultures. Making robots’ appearance reflect the diversity of the people who benefit from their presence is one thing, then, but what about diversifying their forms of interaction? Along with making robots less universally white and female, social scientists, interaction designers and engineers can work together to produce more <a href="https://doi.org/10.1080/17458927.2023.2179231">cross-cultural sensitivity in gestures and touch</a>, for example. </p>
<p>Such work promises to make human-robot interaction less scary and <a href="https://doi.org/10.1027/2151-2604/a000486">uncanny</a>, especially for people who need assistance from the new breeds of socially assistive robots.</p><img src="https://counter.theconversation.com/content/213336/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Mark Paterson has received funding in the past from AHRC-EPSRC and OC Robotics in the U.K.</span></em></p>Humanoid robots tend to be white or resemble white people. Here’s why this is a problem and what social scientists, designers and engineers can do about it.Mark Paterson, Professor of Sociology, University of PittsburghLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2201242024-01-16T13:41:20Z2024-01-16T13:41:20ZWhat social robots can teach America’s students<figure><img src="https://images.theconversation.com/files/568716/original/file-20240110-29-vri22q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Some researchers predict social robots will become common in K-12 classrooms.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/elementary-schoolboy-touching-robotic-hand-royalty-free-image/1280407754">selimaksan/E+ Collection/Getty Images</a></span></figcaption></figure><p>How would you feel if your child were being tutored by a robot?</p>
<p><a href="https://doi.org/10.1016/S0921-8890(02)00373-1">Social robots</a> – robots that can talk and mimic and respond to human emotion – have been introduced into <a href="https://doi.org/10.1016/j.edurev.2021.100388">classrooms around the world</a>. Researchers have used them to <a href="https://www.youtube.com/watch?v=tBDI6kjj4nI">read stories</a> to <a href="https://www.imda.gov.sg/resources/blog/blog-articles/archived/2016/04/pepper-spices-up-classroom-learning">preschool students in Singapore</a>, help 12-year-olds in Iran <a href="https://doi.org/10.3102/0034654318821286">learn English</a>, <a href="http://www.doi.org/10.1109/HRI.2016.7451758">improve handwriting</a> among young children in Switzerland and teach students with autism in England <a href="https://doi.org/10.1007/s12369-014-0250-2">appropriate physical distance</a> during social interactions.</p>
<p>Some experts believe these robots could become <a href="https://www.doi.org/10.1126/scirobotics.aat5954">“as common as paper, whiteboards and computer tablets”</a> in schools. </p>
<p>Because social robots have a body, humans <a href="https://news.stanford.edu/2023/05/15/respond-social-robots/">react to them differently</a> than we do to a computer screen. Studies have shown that little children sometimes accept social robots as peers. For example, in the <a href="https://www.doi.org/10.1109/HRI.2016.7451758">handwriting study</a>, a 5-year-old boy continued to send letters to the robot months after the interactions ended. </p>
<p>As a professor of education, I study the different ways that <a href="https://scholar.google.com/citations?user=VCt87SkAAAAJ&hl=en">teachers around the world do their jobs</a>. To understand how social robots could affect teaching, graduate student Raisa Gray and I introduced a 4-foot-tall <a href="https://us.softbankrobotics.com/pepper">humanoid robot called “Pepper”</a> into a public elementary and middle school in the U.S. Our research <a href="https://doi.org/10.1111/jcal.12872">revealed many problems</a> with the current generation of social robots, making it unlikely that social robots will be running classrooms anytime soon.</p>
<h2>Not ready for prime time</h2>
<p>Much of the research on social robots in schools is done in <a href="https://link.springer.com/article/10.1007/s12369-010-0069-4">very restricted ways</a>. Children and social robots are not allowed to freely interact with each other without the assistance, or intervention, of researchers. Only a few studies have used social robots in <a href="https://doi.org/10.1016/j.edurev.2021.100388">real-life classroom settings</a>.</p>
<p>Also, robotic researchers often use <a href="https://doi.org/10.1007/s00146-021-01202-3">“Wizard of Oz” techniques</a> in classroom settings. That means that a person is operating the robot remotely, giving the impression that the robot can <a href="https://www.youtube.com/watch?v=zJHyaD1psMc">really talk to humans</a>. </p>
<h2>Limited social skills</h2>
<p>Robots need quiet.</p>
<p>Any kind of background noise – class-change bells, loudspeaker announcements or other conversations – can disrupt the robot’s ability to follow a conversation. This is one of the major problems facing the integration of robots into schools. </p>
<p>It is extremely difficult for programmers to create software and hardware systems that can achieve what humans do unconsciously. For example, the current generation of social robots cannot interact with a small group and, for example, track multiple people’s facial expressions. If a person is talking to two other people about their favorite football team and one of the listeners frowns or rolls their eyes, a human will likely pick up on that.</p>
<p>A robot will not. </p>
<p>Also, unless a bar code or other identification device is used, today’s social robots cannot recognize individuals. This makes it very unlikely for them to have realistic social interactions. Facial recognition software is difficult to use in a room full of moving, shifting people, and also raises serious ethical questions about keeping students’ personal information safe. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A child stands in front of Pepper the robot" src="https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=396&fit=crop&dpr=1 600w, https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=396&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=396&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=497&fit=crop&dpr=1 754w, https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=497&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/568317/original/file-20240108-19-ynwsuc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=497&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Students talked to the ‘Pepper’ robot as if it were a person.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/april-2018-hanover-germany-a-girl-speakng-with-the-robot-news-photo/978204290">Julian Stratenschulte/picture alliance via Getty Images</a></span>
</figcaption>
</figure>
<h2>Dialogue is preprogrammed</h2>
<p>To get the robot to perform, our students had to master the tutorials that came with the robot. Some students quickly figured out that the robot could respond only to certain basic routines.</p>
<p>For example, Pepper could respond to “How old are you?” but not “What age are you?” Other students kept trying to interact with the robot as if it were a person and got very frustrated with its nonhuman responses.</p>
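<p>The brittleness the students discovered can be shown with a toy example. The snippet below is hypothetical and is not Pepper’s actual software, but an exact-match dialogue table like it shares the same failure mode: a scripted question works, while an equivalent rephrasing falls through to a fallback:</p>

```python
# A toy preprogrammed dialogue table (hypothetical, for illustration only).
RESPONSES = {
    "how old are you?": "I am three years old.",
    "what is your name?": "My name is Pepper.",
}

def reply(utterance):
    """Answer only if the utterance exactly matches a scripted routine."""
    return RESPONSES.get(utterance.strip().lower(),
                         "Sorry, I did not understand that.")
```

<p>Here <code>reply("How old are you?")</code> matches the script, while the equivalent “What age are you?” gets only the fallback response, just as the students observed.</p>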
<p>When a robot <a href="https://doi.org/10.1016/j.chb.2017.12.030">fails to answer a question</a>, or responds in the wrong way, students realize the robot isn’t really understanding them and that the robot’s dialogue is preprogrammed. The robot can’t really make sense of the social context. </p>
<p>In our study, students learned to adapt to the robot.</p>
<p>One group of girls would stand around the robot while one kept petting its head. This caused the robot to do either its “I feel like a cat” or its “I’m ticklish today” routine. This seemed to delight the girls. They appeared content to have one person interact with the robot while others watched.</p>
<h2>Cannot move around classroom with ease</h2>
<p>Students who have seen YouTube videos of <a href="https://www.youtube.com/watch?v=bmNaLtC6vkU">robotic dogs</a> that run and jump may be disappointed to realize that most social robots can’t move around a classroom with ease. The teachers in our study were disappointed that Pepper couldn’t bring them coffee. </p>
<p>These problems aren’t limited to school settings.</p>
<p>Service robots in some health care facilities have been programmed to deliver medicine, but this requires special sensors and programming. And while stores and restaurants are experimenting with <a href="https://www.washingtonpost.com/technology/2019/01/14/giant-food-stores-will-place-robotic-assistants-inside-locations-company-says/">delivery and cleaning robots</a>, when a grocery store in Scotland tried to use Pepper for customer interactions, the robot was <a href="https://www.digitaltrends.com/cool-tech/pepper-robot-grocery-store">fired after a week</a>.</p>
<h2>What social robots can teach kids</h2>
<p>While the social robots currently used in schools are finicky and limited in functions, they can still provide useful learning experiences. Students can use them to learn more about robotics, artificial intelligence and the complexity of real human behavior. </p>
<p><a href="https://www.actapress.com/PaperInfo.aspx?paperId=43268">As one researcher wrote</a>, “Robots act as a bridge in enabling students to understand humans.”</p>
<p>Struggling with a robot’s limitations gives students real insights into the complicated nature of human social interaction. The opportunity to work hands-on with a social robot shows students how difficult it is to program robots to mimic human behavior.</p>
<p>Social robots can also provide students with important learning opportunities about artificial intelligence. In Japan, Pepper is being used to <a href="https://www.softbankrobotics.com/jp/product/education/">introduce students to generative AI</a>. Students can link ChatGPT with Pepper’s physical presence to see how much AI improves Pepper’s communication and whether that makes it more lifelike. </p>
<p>As AI becomes a bigger part of our work and lives, educators need to prepare students to think critically about what it means to live and work with social machines. And with a real human teacher’s guidance and oversight, students can explore why we want to talk to robots as if they were people.</p><img src="https://counter.theconversation.com/content/220124/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Gerald K. LeTendre receives funding from Harry L. Batschelet II Endowed Chair within the College of Education, The Pennsylvania State University</span></em></p>Social robots can be useful tools to help students learn about programming, but here’s why they won’t be replacing classroom teachers anytime soon.Gerald K. LeTendre, Professor of Educational Administration, Penn StateLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2186382024-01-03T13:19:07Z2024-01-03T13:19:07ZAI could make cities autonomous, but that doesn’t mean we should let it happen<p>You are walking back home. Suddenly the ground seems to open and a security drone
emerges, blocking your way to verify your identity. This might sound far-fetched but it is <a href="https://sunflower-labs.com/">based on an existing technology</a> – a drone system made by the AI company Sunflower Labs. </p>
<p>As part of an international project looking at the impact of AI on cities, we recently “broke ground” on <a href="https://journals.sagepub.com/doi/full/10.1177/00420980231203386">a new field of research called AI urbanism</a>. This is different from the concept of a “smart city”. Smart cities gather information from technology, such as sensor systems, and use it to manage operations and run services more smoothly.</p>
<p>AI urbanism represents a new way of shaping and governing cities, by means of artificial intelligence (AI). It departs substantially from contemporary models of urban development and management. While it’s vital that we closely monitor this emerging area, we should also be asking whether we should involve AI so closely in the running of cities in the first place.</p>
<p>The development of AI is intrinsically connected to the development of cities. Everything that city dwellers do teaches AI something precious about our world. The way you <a href="https://www.technologyreview.com/2022/05/27/1052826/ai-reinforcement-learning-self-driving-cars-autonomous-vehicles-wayve-waabi-cruise/">drive your car or ride your bike</a> helps train the AI behind an autonomous vehicle in how urban transport systems function.</p>
<p>What you eat and what you buy tells AI systems about your preferences. Multiply these individual records by the billions of people that live in cities, and you will get a feeling for how much data AI can harvest from urban settings.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/l9Rt8eh8_zU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Sunflower Labs has made a home security drone designed to verify the identity of visitors.</span></figcaption>
</figure>
<h2>Predictive policing</h2>
<p>Under the traditional concept of smart cities, technologies <a href="https://en.wikipedia.org/wiki/Internet_of_things">such as the Internet of Things</a> use connected sensors to observe and quantify what is happening. For example, smart buildings can calculate how much energy we consume and real-time technology can quantify how many people are using a subway at any one time. AI urbanism does not simply quantify, it tells stories, explaining why and how certain events take place.</p>
<p>We are not talking about complex narratives, but even a basic story can have substantial repercussions. Take the AI system developed by US company Palantir, that is <a href="https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd">already employed in several cities</a>, to predict where crimes will take place and who will be involved. </p>
<p>These predictions may be acted on by police officers <a href="https://www.theguardian.com/us-news/2021/nov/07/lapd-predictive-policing-surveillance-reform">in terms of where to assign resources</a>. Predictive policing in general is one of the most controversial powers that artificial intelligences are gaining under AI urbanism: the capacity to determine what is right or wrong, and who is “good” or “bad” in a city.</p>
<p>This is a problem because, as the <a href="https://link.springer.com/article/10.1007/s13347-023-00621-y">recent example of ChatGPT has made clear</a>, AI can produce a detailed account, without grasping its meaning. It is an amoral intelligence, in the sense that it is indifferent to questions of right or wrong. </p>
<p>And yet this is exactly the kind of question that we are increasingly delegating to AI in urban governance. This might save our city managers some time, given AI’s extraordinary velocity in analysing large volumes of data, but the price that we are paying in terms of social justice is enormous. </p>
<h2>A human problem</h2>
<p>Recent studies indicate that AI-made decisions are penalising racial minorities <a href="https://onlinelibrary.wiley.com/doi/full/10.1111/1468-2427.12833">in the fields of housing and real estate</a>. There is also a <a href="https://journals.sagepub.com/doi/full/10.1177/2053951720935141">substantial environmental cost to bear in mind</a>, since AI technology is energy intensive. It is projected to contribute significantly to carbon emissions from the tech sector in coming decades, and the infrastructure needed to maintain it consumes critical raw materials. AI seems to <a href="https://www.nature.com/articles/s41467-019-14108-y">promise a lot in terms of sustainability</a>, but when we look at its actual costs and applications in cities, the negatives can easily outweigh the positives.</p>
<p>It is not that AI is getting out of control, as we see in sci-fi movies and read in novels. Quite the opposite: we humans are consciously making political decisions that place AI in the position to make decisions about the governance of cities. We are willingly ceding some of our decision-making responsibilities to machines and, in different parts of the world, we can already see the genesis of new cities supposed to be completely operated by AI.</p>
<figure class="align-center ">
<img alt="The Line, artist's rendition." src="https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=350&fit=crop&dpr=1 600w, https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=350&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=350&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=440&fit=crop&dpr=1 754w, https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=440&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/564469/original/file-20231208-29-jfdagi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=440&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The NEOM project in Saudi Arabia would include a linear city called The Line.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/line-neom-sustainable-autonomous-futuristic-city-2292908383">Corona Borealis Studio / Shutterstock</a></span>
</figcaption>
</figure>
<p>This trend <a href="https://www.neom.com/en-us">is exemplified by Neom</a>, a colossal project of regional development currently under construction in Saudi Arabia. Neom will feature new urban spaces, including a linear city called The Line, managed by a multitude of AI systems, and it is supposed to become a paragon of urban sustainability. These cities of the future will feature self-driving vehicles transporting people, robots cooking and serving food and algorithms predicting your behaviour to anticipate your needs.</p>
<p>These visions resonate with the <a href="https://journals.sagepub.com/doi/full/10.1177/00420980231203386">concept of the autonomous city</a> which refers to urban spaces where AI autonomously performs social and managerial functions with humans out of the loop.</p>
<p>We need to remember that autonomy is a zero-sum game. As the autonomy of AI grows, ours decreases, and the rise of autonomous cities risks severely undermining our role in urban governance. A city run not by humans but by AIs would challenge not only the autonomy of human stakeholders but also many people’s wellbeing. </p>
<p>Are you going to qualify for a home mortgage and be able to buy a property to raise a family? Will you be able to secure life insurance? Is your name on a list of suspects that the police are going to target? Today the answers to these questions are already influenced by AI. In the future, should the autonomous city become the dominant reality, AI could become the sole arbiter.</p>
<p>AI needs cities to keep devouring our data. As citizens, it is now time to carefully question the spectre of the autonomous city as part of an expanded public debate, and ask one very simple question: do we really need AI to make our cities sustainable?</p><img src="https://counter.theconversation.com/content/218638/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Federico Cugurullo does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>AI could take us beyond the concept of smart cities, telling us how and why things happen in urban settings.Federico Cugurullo, Assistant Professor in Smart and Sustainable Urbanism, Trinity College DublinLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2116882023-11-23T08:26:29Z2023-11-23T08:26:29ZFaced with dwindling bee colonies, scientists are arming queens with robots and smart hives<p>Whether from the news or from the dwindling number of insects hitting your windscreen, it will not have escaped you that the insect world is in bad shape.</p>
<p>In the last three decades, the global biomass of flying insects has shrunk by <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0185809">75%</a>. Among the trend’s most notable victims is the world’s most important pollinator, the honeybee. In the United States, <a href="https://www.statista.com/chart/30260/honey-bee-colony-losses-in-the-united-states-timeline/">48% of honeybee colonies died</a> in 2023 alone, making it the second-deadliest year on record. This significant loss is due in part to colony collapse disorder (CCD), the sudden disappearance of bees. In contrast, European countries report lower but still worrisome rates of colony losses, <a href="https://www.tandfonline.com/doi/full/10.1080/00218839.2020.1797272">ranging from 6% to 32%</a>.</p>
<p>This decline causes many of our essential food crops to be under-pollinated, a phenomenon that threatens our society’s <a href="https://academic.oup.com/bioscience/article/70/2/109/5637848">food security</a>.</p>
<h2>Debunking the sci-fi myth of robotic bees</h2>
<p>So, what can be done? Given <a href="https://theconversation.com/deciphering-the-mysterious-decline-of-honey-bees-56648">pesticides’ role in the decline of bee colonies</a>, commonly proposed solutions include <a href="https://www.bee-life.eu/post/the-positive-impact-of-organic-farming-in-bee-health">a shift away from industrial farming</a> and toward less pesticide-intensive, more sustainable forms of agriculture.</p>
<p>Others tend to look toward the sci-fi end of things, with some scientists imagining that we could eventually replace live honeybees with robotic ones. Such artificial bees could interact with flowers like natural insects, maintaining pollination levels despite the declining numbers of natural pollinators. The vision of artificial pollinators contributed to ingenious designs of <a href="https://www.agritechfuture.com/robotics-automation/winged-robot-smaller-than-a-pea-could-pollinate-crops/">insect-sized robots capable of flying</a>.</p>
<p>In reality, such inventions reveal more about engineers’ fantasies than they do about how to revive bee colonies, so slim are their prospects of materialising. First, these artificial pollinators would have to be equipped for much more than just flying. Daily tasks carried out by the common bee include searching for plants, identifying flowers, unobtrusively interacting with them, locating energy sources, ducking potential predators, and dealing with adverse weather conditions. Robots would have to perform all of these tasks in the wild with a very high degree of reliability, since any broken-down or lost robot can cause damage and spread pollution. Second, it remains to be seen whether our technological knowledge is even capable of producing such inventions. This is without even mentioning the price tag of a swarm of robots capable of substituting for the pollination provided by a single honeybee colony.</p>
<h2>Inside a smart hive</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=746&fit=crop&dpr=1 600w, https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=746&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=746&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=938&fit=crop&dpr=1 754w, https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=938&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/561054/original/file-20231122-31-mn9kzr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=938&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Bees crawl on one of Hiveopolis’s augmented hives.</span>
<span class="attribution"><span class="source">Hiveopolis</span>, <span class="license">Fourni par l'auteur</span></span>
</figcaption>
</figure>
<p>Rather than trying to replace honeybees with robots, our two latest projects funded by the European Union propose that the robots and honeybees actually team up. Were these to succeed, struggling honeybee colonies could be transformed into bio-hybrid entities consisting of biological and technological components with complementary skills. This would hopefully boost and secure the colonies’ population growth as more bees survive over harsh winters and yield more foragers to pollinate surrounding ecosystems.</p>
<p>The first of these projects, <a href="https://cordis.europa.eu/project/id/824069">Hiveopolis</a>, investigates how the complex decentralised decision-making mechanism in a honeybee colony can be nudged by digital technology. Begun in 2019 and set to end in March 2024, the experiment introduces technology into three observation hives, each containing 4,000 bees, compared with the 40,000 bees of a normal colony.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/561057/original/file-20231122-15-d7ixut.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The foundation of an augmented honeycomb.</span>
<span class="attribution"><span class="source">Hiveopolis</span>, <span class="license">Fourni par l'auteur</span></span>
</figcaption>
</figure>
<p>Within this honeybee smart home, combs have integrated temperature sensors and heating devices, allowing the bees to enjoy optimal conditions inside the colony. Since bees tend to snuggle up to warmer locations, the combs also enable us to direct them toward different areas of the hive. And as if that control weren’t enough, the hives are also equipped with a system of electronic gates that monitors the insects’ movements. Both technologies allow us to decide where the bees store honey and pollen, but also when they vacate the combs so that we can harvest honey. Last but not least, the smart hive contains a robotic dancing bee that can direct foraging bees toward areas with plants to be pollinated.</p>
<p>Due to the experiment’s small scale, it is impossible to draw conclusions on the extent to which our technologies may have prevented bee losses. However, what we have seen thus far gives us reason to be hopeful. We can confidently assert that our smart beehives allowed colonies to survive extreme cold during the winter in a way that wouldn’t otherwise be possible. Precisely assessing how many bees these technologies have saved would require upscaling the experiment to hundreds of colonies.</p>
<h2>Pampering the queen bee</h2>
<p>Our second EU-funded project, RoboRoyale, focuses on the honeybee queen and her courtyard bees, with robots in this instance continuously monitoring and interacting with her Royal Highness.</p>
<p>Come 2024, we will equip each hive with a group of six bee-sized robots, which will groom and feed the honeybee queen to affect the number of eggs she lays. Some of these robots will be equipped with royal jelly micro-pumps to feed her, while others will feature compliant micro-actuators to groom her. These robots will then be connected to a larger robotic arm with infrared cameras that will continuously monitor the queen and her vicinity.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1335&fit=crop&dpr=1 600w, https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1335&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1335&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1677&fit=crop&dpr=1 754w, https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1677&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/561096/original/file-20231122-22-bmtv9s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1677&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A RoboRoyale robot arm susses out a honeybee colony.</span>
<span class="attribution"><span class="source">RoboRoyale</span>, <span class="license">Fourni par l'auteur</span></span>
</figcaption>
</figure>
<p>As witnessed by the photo to the right and also below, we have already been able to successfully introduce the robotic arm within a living colony. There it continuously monitored the queen and determined her whereabouts through light stimuli.</p>
<h2>Emulating the worker bees</h2>
<p>In a second phase, it is hoped the bee-sized robots and robotic arm will be able to emulate the behaviour of the workers, the female bees lacking reproductive capacity who attend to the queen and feed her royal jelly. Rich in water, proteins, carbohydrates, lipids, vitamins and minerals, this nutritious substance secreted by the glands of the worker bees enables the queen to lay up to thousands of eggs a day.</p>
<p>Worker bees also engage in cleaning the queen, which involves licking her. During such interactions, they collect some of the queen’s pheromones and disperse them throughout the colony as they move across the hive. The presence of these pheromones controls many of the colony’s behaviours and notifies the colony of a queen’s presence. For example, in the event of the queen’s demise, a new queen must be quickly reared from an egg laid by the late queen, leaving only a narrow time window for the colony to react.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=270&fit=crop&dpr=1 600w, https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=270&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=270&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=339&fit=crop&dpr=1 754w, https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=339&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/561113/original/file-20231122-25-d7ixut.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=339&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">One of RoboRoyale’s first experiments consisted of simple interactions with the queen bee through light stimuli. In the coming months, the robotic arm will stretch out to physically touch and groom her.</span>
<span class="attribution"><span class="source">RoboRoyale</span>, <span class="license">Fourni par l'auteur</span></span>
</figcaption>
</figure>
<p>Finally, it is believed worker bees may also act as the queen’s guides, leading her to lay eggs in specific comb cells. The size of these cells can determine whether the queen lays a diploid or haploid egg, resulting in the bee developing into either a drone (male) or a worker (female). Taking over these guiding duties could affect no less than the colony’s entire reproductive rate.</p>
<h2>How robots can prevent bee cannibalism</h2>
<p>This could have another virtuous effect: preventing cannibalism.</p>
<p>During tough times, such as long periods of rain, bees have to make do with little pollen intake. This forces them to feed young larvae to older ones so that at least the older larvae have a chance to survive. Through RoboRoyale, we will look not only to reduce the chances of this behaviour occurring, but also to quantify the extent to which it occurs under normal conditions.</p>
<p>Ultimately, our robots will enable us to deepen our understanding of the very complex regulation processes inside honeybee colonies through novel experimental procedures. The insights gained from these new research tracks will be necessary to better protect these valuable social insects and ensure sufficient pollination in the future – a high stakes enterprise for food security.</p>
<hr>
<p><em>This article is the result of The Conversation’s collaboration with <a href="https://ec.europa.eu/research-and-innovation/en/horizon-magazine">Horizon</a>, the EU research and innovation magazine. In February, the authors published an <a href="https://projects.research-and-innovation.ec.europa.eu/en/horizon-magazine/robotic-bees-and-roots-offer-hope-healthier-environment-and-sufficient-food">interview with the magazine</a>.</em></p><img src="https://counter.theconversation.com/content/211688/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Farshad Arvin is a member of the Department of Computer Science at Durham University in the UK. The research of Farshad Arvin is primarily funded by the EU H2020 and Horizon Europe programmes.</span></em></p><p class="fine-print"><em><span>Martin Stefanec is a member of the Institute of Biology at the University of Graz. He has received funding from the EU programs H2020 and Horizon Europe.</span></em></p><p class="fine-print"><em><span>Tomas Krajnik is member of the Institute of Electrical and Electronics Engineers (IEEE). The research of Tomas Krajnik is primarily funded by EU H2020 Horizon programme and Czech National Science Foundation.</span></em></p>Two EU-funded projects are looking at high-tech solutions that could transform honeybee colonies into bio-hybrid entities.Farshad Arvin, Associate professor in robotics, Durham UniversityMartin Stefanec, University assistant in biology, University of GrazTomas Krajnik, Associate professor in robotics, Czech Technical UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2054652023-08-17T12:34:06Z2023-08-17T12:34:06ZMobile robots get a leg up from a more-is-better communications principle<figure><img src="https://images.theconversation.com/files/542418/original/file-20230811-38693-1jf8u.jpg?ixlib=rb-1.1.0&rect=0%2C2%2C799%2C529&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Getting a leg up from mobile robots comes down to getting a bunch of legs.</span> <span class="attribution"><a class="source" href="https://research.gatech.edu/scurrying-centipedes-inspire-many-legged-robots-can-traverse-difficult-landscapes">Georgia Institute of Technology</a></span></figcaption></figure><p>Adding legs to robots that have minimal awareness of the environment around them can help the robots operate more effectively in difficult terrain, my colleagues and I found.</p>
<p>We were inspired by mathematician and engineer Claude Shannon’s <a href="https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/">communication theory</a> about how to transmit signals over distance. Instead of spending a huge amount of money to build the perfect wire, Shannon illustrated that it is good enough to use redundancy to reliably convey information over noisy communication channels. We wondered if we could do the same thing for transporting cargo via robots. That is, if we want to transport cargo over “noisy” terrain, say fallen trees and large rocks, in a reasonable amount of time, could we do it by just adding legs to the robot carrying the cargo and do so without sensors and cameras on the robot?</p>
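Shannon’s redundancy idea can be illustrated with a simple repetition code over a noisy channel. This is a hypothetical Python sketch, not the authors’ model: each extra copy of a bit, like each extra leg, makes success more reliable without any added intelligence at the receiving end.

```python
import random

random.seed(0)  # reproducible demo

def transmit(bit, flip_prob):
    """Send one bit through a noisy channel that flips it with probability flip_prob."""
    return bit ^ (random.random() < flip_prob)

def send_with_redundancy(bit, copies, flip_prob):
    """Repetition code: send several copies, decode by majority vote."""
    received = [transmit(bit, flip_prob) for _ in range(copies)]
    return int(sum(received) > copies / 2)

def error_rate(copies, flip_prob=0.2, trials=20_000):
    """Fraction of transmissions decoded incorrectly."""
    return sum(send_with_redundancy(1, copies, flip_prob) != 1
               for _ in range(trials)) / trials

# Error rate falls sharply as redundancy grows, even over a very noisy channel.
for copies in (1, 3, 9, 15):
    print(copies, error_rate(copies))
```

With a 20% flip probability, a single copy fails about one time in five, while fifteen copies decoded by majority vote almost never fail – the same more-is-better logic the robot exploits with legs instead of bits.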
<p>Most mobile robots use inertial sensors to gain an awareness of <a href="https://doi.org/10.3390/designs6010017">how they are moving through space</a>. Our key idea is to forget about inertia and replace it with the simple function of repeatedly making steps. In doing so, our theoretical analysis confirms our hypothesis of reliable and predictable robot locomotion – and hence cargo transport – without additional sensing and control.</p>
<p>To verify our hypothesis, we built robots inspired by centipedes. We discovered that the more legs we added, <a href="https://doi.org/10.1126/science.ade4985">the better the robot could move across uneven surfaces</a> without any additional sensing or control technology. Specifically, we conducted a series of experiments where we built terrain to mimic an inconsistent natural environment. We evaluated the robot locomotion performance by gradually increasing the number of legs in increments of two, beginning with six legs and eventually reaching a total of 16 legs. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/6NhOervars4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Navigating rough terrain can be as simple as taking it a step at a time, at least if you have a lot of legs.</span></figcaption>
</figure>
<p>As the number of legs increased, we observed that the robot exhibited enhanced agility in traversing the terrain, even in the absence of sensors. To further assess its capabilities, we conducted outdoor tests on real terrain to evaluate its performance in more realistic conditions, where it performed just as well. There is potential to use many-legged robots for agriculture, space exploration and search and rescue.</p>
<h2>Why it matters</h2>
<p>Transporting things – food, fuel, building materials, medical supplies – is essential to modern societies, and effective goods exchange is the cornerstone of commercial activity. For centuries, transporting material on land has required building roads and tracks. However, roads and tracks are not available everywhere. Places such as hilly countryside have had limited access to cargo. Robots might be a way to transport payloads in these regions.</p>
<h2>What other research is being done in this field</h2>
<p>Other researchers have been developing <a href="https://doi.org/10.1017/S0269888919000158">humanoid robots</a> and <a href="https://doi.org/10.1016/j.asej.2020.11.005">robot dogs</a>, which have become increasingly agile in recent years. These robots rely on accurate sensors to know where they are and what is in front of them, and then make decisions on how to navigate. </p>
<p>However, their strong dependence on environmental awareness <a href="https://doi.org/10.1109/ACCESS.2020.2975643">limits them in unpredictable environments</a>. For example, in search-and-rescue tasks, sensors can be damaged and environments can change.</p>
<h2>What’s next</h2>
<p>My colleagues and I have taken valuable insights from our research and applied them to the field of crop farming. We have founded a company that uses these robots to efficiently weed farmland. As we continue to advance this technology, we are focused on refining the robot’s design and functionality. </p>
<p>While we understand the functional aspects of the centipede robot framework, our ongoing efforts are aimed at determining the optimal number of legs required for motion without relying on external sensing. Our goal is to strike a balance between cost-effectiveness and retaining the benefits of the system. Currently, we have shown that 12 is the minimum number of legs for these robots to be effective, but we are still investigating the ideal number.</p>
<p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take on interesting academic work.</em></p><img src="https://counter.theconversation.com/content/205465/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The author has received funding from NSF-Simons Southeast Center for Mathematics and Biology (Simons Foundation SFARI 594594), Georgia Research Alliance (GRA.VL22.B12), Army Research Office (ARO) MURI program, Army Research Office Grant W911NF-11-1-0514 and a Dunn Family Professorship.
The author and his colleagues have one or more pending patent applications related to the research covered in this article.
The author and his colleagues have established a start-up company, Ground Control Robotics, Inc., partially based on this work.</span></em></p>A study found that adding legs does more for you than having a good sense of the ground around you − if you’re a mobile robot.Baxi Chong, Postdoctoral Fellow, Complex Rheology And Biomechanics Lab, Georgia Institute of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2094912023-07-12T19:39:14Z2023-07-12T19:39:14ZPutting a price on exoskeleton assistance puts users in the driver’s seat of honing the tech<figure><img src="https://images.theconversation.com/files/537138/original/file-20230712-22-keuc4s.jpg?ixlib=rb-1.1.0&rect=0%2C16%2C2761%2C3389&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How much would these robo-boots be worth to you?</span> <span class="attribution"><span class="source">Neurobionic Lab/University of Michigan Robotics Department</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>My colleagues and I have used a tool from economics to measure the costs and benefits of wearing an exoskeleton, and we found that it offers a modest average benefit of US$3.40 per hour while walking uphill, when considering the combined effects of the assistance and device weight. This modest value is in contrast to the value of the assistance alone, which was much greater, at $19.80 per hour. These values were derived using our novel approach, which subtracts the value of the costs from the value of the benefits. </p>
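As a back-of-the-envelope check on those figures (a hypothetical sketch using the two dollar values reported above; the weight-cost figure is inferred by subtraction, not reported directly):

```python
# Figures from the article, in US dollars per hour of uphill walking.
assistance_value = 19.80  # value of the exoskeleton's assistance alone
net_value = 3.40          # value of assistance and device weight combined

# The implied cost of carrying the device's weight is the difference.
weight_cost = round(assistance_value - net_value, 2)
print(weight_cost)  # 16.4
```

In other words, most of the assistance’s value is eaten up by the burden of the hardware itself.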
<p>Exoskeletons are mechanical devices that people can wear to boost their power or efficiency. They can be used to assist in manual labor or to aid in rehabilitation from injuries. Our approach brings the user into the evaluation process, which makes it possible to take into account the many ways people experience exoskeletons.</p>
<p>Perceptions are important. Users must want exoskeletons in their lives if the technology is to meet its potential of assisting mobility, endurance and safety. And that means users must perceive the benefits, and that these <a href="https://books.google.com/books?hl=en&lr=&id=v1ii4QsB7jIC&oi=fnd&pg=PR15&dq=Rogers+EM.+Diffusion+of+innovations.+Simon+and+Schuster%3B+2010+Jul+6.&ots=DM_xrPYsaV&sig=rhK6garGd-qY4CFKCb75s92fA24#v=onepage&q=Rogers%20EM.%20Diffusion%20of%20innovations.%20Simon%20and%20Schuster%3B%202010%20Jul%206.&f=false">benefits must outweigh the costs</a> of wearing the exoskeleton, including any added discomfort, weight or noise.</p>
<p>As a way to measure user perception, <a href="https://doi.org/10.1038/s44172-023-00091-2">we studied the economic value</a>, measured in U.S. dollars, of wearing an exoskeleton. To find these dollar values, we asked people how much money would be required for them to walk uphill for two minutes on a treadmill. The two-minute bouts were repeated in series for approximately 30 minutes. We also repeated these trials with users wearing the exoskeleton while it was unpowered, as well as not wearing the exoskeleton. By comparing the costs among conditions, our approach provides a foundation for assessing the economic value of the complete exoskeleton, its assistance and the cost of the added weight. </p>
<p>To ensure the participants provide a truthful estimate of their cost to walk for two minutes – and do not simply maximize their earnings – we used a special type of auction that ensures honest valuations, known as the <a href="https://doi.org/10.2307/2977633">Vickrey auction</a>. In a seller’s Vickrey auction, the lowest bidder wins but is paid the second-lowest bid. It is often used to measure the value of abstract concepts, because the Vickrey auction breaks the link between the auction winner and their bid and removes incentives to under- or overbid.</p>
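The mechanics of a seller’s (reverse) Vickrey auction can be sketched in a few lines of Python; the participant names and bid amounts are hypothetical:

```python
def reverse_vickrey_winner(bids):
    """Seller-side (reverse) Vickrey auction: the lowest bidder wins,
    but is paid the second-lowest bid, so truthful bidding is optimal."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    winner, _ = ranked[0]
    payment = ranked[1][1]  # second-lowest bid sets the price paid
    return winner, payment

# Example: three participants state the dollar amount they require
# to walk uphill for two minutes.
bids = {"A": 0.40, "B": 0.65, "C": 1.10}
print(reverse_vickrey_winner(bids))  # ('A', 0.65): A walks, paid B's bid
```

Because the winner’s payment is set by someone else’s bid, bidding one’s true cost is the best strategy, which is what makes the resulting dollar values trustworthy.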
<p>Today, many ways of evaluating exoskeletons are focused on hard-to-get data, such as <a href="https://doi.org/10.1126/science.aal5054">number of calories burned</a> and <a href="https://doi.org/10.1152/japplphysiol.01133.2014">complex motion analysis</a>. More subjective measures, such as <a href="https://doi.org/10.1126/scirobotics.abj3487">user preferences</a> of exoskeleton assistance, are difficult to standardize and are only recently beginning to gain traction. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/YDqp5mcxQus?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Exoskeletons are being used in construction and manufacturing.</span></figcaption>
</figure>
<h2>Why it matters</h2>
<p>Our study highlights the sizable potential benefit of exoskeleton assistance, even though most of that value is offset by the cost of the extra weight. By measuring this user experience, researchers and developers can refine exoskeletons based on user perceptions. For example, these results highlight the need to develop lighter and more compact exoskeletons.</p>
<p>Our method of finding the economic value is intuitive, because the public is more likely to understand value expressed in dollars than in energy-related biomechanical metrics such as watts or joules. This approach also does not require the expensive specialized equipment needed to measure metabolic rate – how much energy a person burns while wearing an exoskeleton.</p>
<p>This technique can be used to measure the value of not only different types of exoskeleton assistance, but also a wide range of technologies, activities and experimental conditions, and represents a useful alternative to the standard approach of assessing metabolic rate.</p>
<h2>What’s next</h2>
<p>With this data and approach, we plan to design exoskeletons that reduce the cost of wearing the added weight of the device. We also plan to research better control systems to increase the economic benefits.</p>
<p>In our study, we found that the benefits and costs varied greatly across individuals. Some people reported a negative overall value. We want to study these differences to determine why people have such different perceptions. Doing so can help overcome major hurdles in the adoption of this technology.</p><img src="https://counter.theconversation.com/content/209491/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The company that developed the exoskeleton used in this research has licensed intellectual property on which Prof. Rouse is an inventor. </span></em></p>Asking users the dollar value of the costs and benefits of walking in exoskeletons is a better way of finding out how users feel about them than measuring calories saved.Elliott Rouse, Associate Professor of Robotics and Mechanical Engineering, University of MichiganLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2083262023-06-23T12:29:43Z2023-06-23T12:29:43ZTitan submersible disaster underscores dangers of deep-sea exploration – an engineer explains why most ocean science is conducted with crewless submarines<figure><img src="https://images.theconversation.com/files/533576/original/file-20230622-19-hnt7xe.jpg?ixlib=rb-1.1.0&rect=6%2C6%2C4594%2C3055&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Researchers are increasingly using small, autonomous underwater robots to collect data in the world's oceans.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/noaaphotolib/27555260673/">NOAA Teacher at Sea Program,NOAA Ship PISCES</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p><em>Rescuers spotted debris from the tourist submarine Titan on the ocean floor near the wreck of the Titanic on June 22, 2023, <a href="https://www.nytimes.com/live/2023/06/22/us/titanic-missing-submarine/heres-the-latest-on-the-missing-submersible">indicating that the vessel suffered a catastrophic failure</a> and the five people aboard were killed.</em></p>
<p><em>Bringing people to the bottom of the deep ocean is inherently dangerous. At the same time, climate change means collecting data from the world’s oceans is more vital than ever. Purdue University mechanical engineer <a href="https://scholar.google.com/citations?user=z1BeTeYAAAAJ&hl=en">Nina Mahmoudian</a> explains how researchers reduce the risks and costs associated with deep-sea exploration: Send down subs, but keep people on the surface.</em></p>
<h2>Why is most underwater research conducted with remotely operated and autonomous underwater vehicles?</h2>
<p>When we talk about water studies, we’re talking about vast areas. And covering vast areas requires tools that can work for extended periods of time, sometimes months. Having people aboard underwater vehicles, especially for such long periods of time, is expensive and dangerous.</p>
<p>One of the tools researchers use is <a href="https://oceanexplorer.noaa.gov/facts/rov.html">remotely operated vehicles</a>, or ROVs. Basically, a cable between the vehicle and the operator allows the operator to command and move the vehicle, while the vehicle relays data in real time. ROV technology has progressed a lot and can now reach the deep ocean – down to a depth of 6,000 meters (19,685 feet). ROVs are also better able to provide the mobility necessary for observing the seabed and gathering data.</p>
<p><a href="https://oceanexplorer.noaa.gov/facts/auv.html">Autonomous underwater vehicles</a> provide another opportunity for underwater exploration. They are usually not tethered to a ship, and they are typically programmed ahead of time to carry out a specific mission. While underwater, they usually don’t have constant communication with operators. At set intervals, they surface, relay all of the data they have gathered, swap or recharge their batteries and receive updated instructions before submerging again to continue their mission.</p>
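That surface-relay-recharge cycle can be sketched as a simple loop. Everything here is illustrative – the class and method names are invented stand-ins, not a real AUV control API:

```python
class SimulatedAUV:
    """Toy stand-in for a real vehicle; every method just logs its action."""
    def __init__(self):
        self.log = []
    def submerge(self):
        self.log.append("submerge")
    def execute(self, leg):
        # Operate autonomously, with no constant communication while underwater.
        self.log.append(f"survey:{leg}")
        return f"data:{leg}"
    def surface(self):
        self.log.append("surface")
    def relay(self, data):
        self.log.append(f"relay:{data}")  # upload everything gathered on this leg
    def recharge(self):
        self.log.append("recharge")       # swap or top up the battery

def run_mission(auv, legs):
    # Each leg: dive and gather data, then surface to relay the data,
    # recharge and pick up instructions for the next leg.
    for leg in legs:
        auv.submerge()
        data = auv.execute(leg)
        auv.surface()
        auv.relay(data)
        auv.recharge()

auv = SimulatedAUV()
run_mission(auv, ["leg1", "leg2"])
```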
<h2>What can remotely operated and autonomous underwater vehicles do that crewed submersibles can’t, and vice versa?</h2>
<p>Crewed submersibles are exciting for the public and those involved, and they benefit from the capabilities humans bring in operating instruments and making decisions, much like crewed space exploration. However, they are far more expensive than uncrewed exploration because of the required size of the platforms and the need for life-support and safety systems. Crewed submersibles today <a href="https://www.nytimes.com/2015/09/15/science/piloted-deep-sea-research-is-bottoming-out.html">cost tens of thousands of dollars a day</a> to operate.</p>
<p>Uncrewed systems provide better opportunities for exploration at lower cost and risk when operating over vast areas and in inhospitable locations. Remotely operated and autonomous underwater vehicles let operators perform tasks that are dangerous for humans, like observing under ice and detecting underwater mines.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/1jCdAwRML7I?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Remotely operated vehicles can operate under Antarctic ice and other dangerous places.</span></figcaption>
</figure>
<h2>How has the technology for deep ocean research evolved?</h2>
<p>The technology has advanced dramatically in recent years due to progress in sensors and computation. There has been great progress in <a href="https://doi.org/10.3390%2Fs21237849">miniaturization of acoustic sensors and sonars</a> for use underwater. Computers have also become more miniaturized, capable and power efficient. There has been a lot of work on battery technology and connectors that are watertight. <a href="https://www.additivemanufacturing.media/articles/autonomous-underwater-vehicle-with-3d-printed-hull-the-cool-parts-show-24">Additive manufacturing and 3D printing also help build hulls</a> and components that can withstand the high pressures at depth at much lower costs.</p>
<p>There has also been great progress toward increasing autonomy using more advanced algorithms, in addition to traditional methods for navigation, localization and detection. For example, machine learning algorithms can <a href="https://doi.org/10.1109/ICITR49409.2019.9407797">help a vehicle detect and classify objects</a>, whether stationary like a pipeline or mobile like schools of fish. </p>
<h2>What kinds of discoveries have been made using remotely operated and autonomous underwater vehicles?</h2>
<p>One example is underwater gliders. These are buoyancy-driven autonomous underwater vehicles that can stay in the water for months, collecting data on pressure, temperature and salinity as they move up and down through the water column. All of this data helps researchers understand the changes happening in the oceans. </p>
<p>One of these platforms traveled across the North Atlantic Ocean <a href="https://www.marine.ie/site-area/news-events/news/silbo-autonomous-glider-finds-its-way-ireland-having-travelled-across">from the coast of Massachusetts to Ireland</a> for nearly a year in 2016 and 2017. The amount of data captured in that time was unprecedented. To put it in perspective, a vehicle like that costs about $200,000. The operators were remote. Every eight hours the glider came to the surface, got connected to GPS and said, “Hey, I am here,” and the crew basically gave it the plan for the next leg of the mission. If a crewed ship were sent to gather that amount of data over that period, it would cost millions of dollars. </p>
<p>In 2019, researchers used an autonomous underwater vehicle to <a href="https://www.wired.com/story/submarine-under-thwaites-glacier-gauge-rising-seas/">collect invaluable data</a> about the <a href="https://doi.org/10.1126/sciadv.abd7254">seabed beneath the Thwaites glacier</a> in Antarctica.</p>
<p>Energy companies are also using remotely operated and autonomous underwater vehicles for <a href="https://www.offshore-technology.com/news/deepocean-autonomous-drone-offshore/">inspecting and monitoring</a> offshore renewable energy and oil and gas infrastructure on the seabed.</p>
<h2>Where is the technology headed?</h2>
<p>Underwater systems are slow-moving platforms, so deploying them in large numbers would give researchers an advantage in covering large areas of ocean. A great deal of effort is being put into coordination and fleet-oriented autonomy of these platforms, as well as into advancing data gathering using onboard sensors such as cameras, sonars and dissolved oxygen sensors. Another aspect of advancing vehicle autonomy is real-time underwater decision-making and data analysis.</p>
<h2>What is the focus of your research on these submersibles?</h2>
<p>My team and I focus on developing navigational and mission-planning algorithms for persistent operations, meaning long-term missions with minimal human oversight. The goal is to respond to two of the main constraints in the deployment of autonomous systems. One is battery life. The other is unknown situations. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/_kS0_-qc_r0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The author’s research includes a project to allow autonomous underwater vehicles to recharge their batteries without human intervention.</span></figcaption>
</figure>
<p>For battery life, we work on at-sea recharging, both underwater and surface water. We are developing tools for autonomous deployment, recovery, recharging and data transfer for longer missions at sea. For unknown situations, we are working on recognizing and avoiding obstacles and adapting to different ocean currents – basically allowing a vehicle to navigate in rough conditions on its own. </p>
<p>To adapt to changing dynamics and component failures, we are working on methodologies to help the vehicle detect the change and compensate to be able to continue and finish the mission.</p>
<p>These efforts will enable long-term ocean studies including observing environmental conditions and mapping uncharted areas.</p><img src="https://counter.theconversation.com/content/208326/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nina Mahmoudian receives funding from National Science Foundation and Office of Naval Research. </span></em></p>Dramatic improvements in computing, sensors and submersible engineering are making it possible for researchers to ramp up data collection from the oceans while also keeping people out of harm’s way.Nina Mahmoudian, Associate Professor of Mechanical Engineering, Purdue UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2062972023-05-25T15:54:59Z2023-05-25T15:54:59ZAI could threaten some jobs, but it is more likely to become our personal assistant<figure><img src="https://images.theconversation.com/files/527950/original/file-20230524-23-zxjoti.jpg?ixlib=rb-1.1.0&rect=0%2C11%2C7460%2C4943&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">AI systems are likely to find quick traction as assistants to humans.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/world-tablet-overlay-business-black-woman-2264932239">PeopleImages.com - Yuri A / Shutterstock</a></span></figcaption></figure><p>BT recently announced that it would be reducing its staff by 55,000, with around 11,000 of these related to the use of artificial intelligence (AI). The remainder of the cuts were due to business efficiencies, such as replacing copper cables with more reliable <a href="https://www.bbc.co.uk/news/business-65631168">fibre optic alternatives</a>.</p>
<p>The point regarding AI raises several questions about its effect on the wider economy: what jobs will be most affected by the technology, how will these changes happen and how will these changes be felt?</p>
<p>The development of technology and its associated impact on job security has been a recurring theme since the industrial revolution. Where mechanisation was once the cause of anxiety about job losses, today it is more capable AI algorithms. But for many or most categories of job, retaining humans will remain vital for the foreseeable future.</p>
<p>The technology behind this current revolution is primarily what is known as a large language model (LLM), which is capable of producing relatively human-like responses to questions. It is the basis for OpenAI’s ChatGPT, Google’s Bard system and Microsoft’s Bing AI. </p>
<p>These are all neural networks: mathematical computing systems crudely modelled on the way nerve cells (neurons) fire in the human brain. These complex neural networks are trained on – or familiarised with – text, often sourced from the internet.</p>
<p>The training process enables a user to ask a question in conversational language. The algorithm breaks the question down into components, which are then processed to generate a response appropriate to the question asked. </p>
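The idea of learning from text and then generating a reply one step at a time can be illustrated with a drastically simplified toy. Real LLMs use neural networks with billions of parameters, not word counts; this sketch, built on a made-up corpus, only conveys the flavour:

```python
from collections import defaultdict

corpus = "robots can help people and robots can do work".split()

# "Training": count which word follows which in the text.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(word, length=4):
    """Generate a reply one word at a time from the learned statistics."""
    out = [word]
    for _ in range(length):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(nxt[0])  # deterministic: pick the first observed follower
    return " ".join(out)

print(generate("robots"))  # -> "robots can help people and"
```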
<p>The result is a system that’s able to provide sensible sounding answers to any question it gets asked. The implications are more wide-ranging than they might seem.</p>
<h2>Humans in the loop</h2>
<p>In the same way that GPS navigation for a driver can replace the need for them to know a route, AI provides an opportunity for workers to have all the information they need at their fingertips, without “Googling”.</p>
<p>Effectively, it removes humans from the loop, meaning any job that involves looking up pieces of information and making links between them could be at risk. The most obvious example here is call centre jobs. </p>
<p>However, it remains possible that members of the public would not accept an AI solving their problems, even if call waiting times became much shorter.</p>
<p>Manual jobs, by contrast, face only a very remote risk of replacement. While robotics is becoming more capable and dexterous, robots operate in highly constrained environments, relying on sensors to give information about the world and then making decisions based on this imperfect data. </p>
<figure class="align-center ">
<img alt="Plumber at work" src="https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/528010/original/file-20230524-19393-l8xecx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Plumbers, electricians and other more complex manual roles are not under immediate threat.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/high-angle-view-male-plumber-overall-555972427">Andrey_Popov / Shutterstock</a></span>
</figcaption>
</figure>
<p>AI isn’t ready for this workspace just yet: the world is a messy and uncertain place that adaptable humans excel in. Plumbers, electricians and complex jobs in manufacturing – for example, automotive or aircraft – face little or no competition in the long term. </p>
<p>However, AI’s true impact is likely to be felt in terms of efficiency savings rather than outright job replacement. The technology is likely to find quick traction as an assistant to humans. This is already happening, especially in domains such as software development. </p>
<p>Rather than using Google to find out how to write a particular piece of code, it’s much more efficient to ask ChatGPT. The solution that comes back can be tailored strictly to a person’s requirements, delivered efficiently and without unnecessary detail. </p>
<h2>Safety-critical systems</h2>
<p>This type of application will become more commonplace as future AI tools become true intelligent assistants. Whether companies use this as a reason to reduce their workforces will depend on their workload. </p>
<p>As the UK is suffering a shortage of Stem (science, technology, engineering and mathematics) graduates, especially in disciplines such as engineering, it’s unlikely that there will be a loss of jobs in this area, just a more efficient manner of tackling the current workload. </p>
<p>This relies on staff making the most of the opportunities that the technology affords. Naturally, there will always be scepticism, and the adoption of AI into the development of safety-critical systems, <a href="https://www.thelancet.com/journals/landig/article/PIIS2589-7500(23)00083-3/fulltext">such as medicine</a>, will take a considerable amount of time. This is because trust in the developer is key, and the simplest way that it develops is through having a human at the heart of the process.</p>
<p>This is critical, as these LLMs are trained using the internet, so biases and errors are woven in. These can arise accidentally – for example, by linking a person to a particular event simply because they share the same name as someone else. More seriously, errors may be introduced through malicious intent, by deliberately presenting training data that is wrong or intentionally misleading. </p>
<p>Cybersecurity becomes an increasing concern as systems become more networked, as does the source of data used to build the AI. LLMs rely on open information as a building block that is refined by interaction. This raises the possibility of new methods for attacking systems by creating deliberate falsehoods. </p>
<p>For example, hackers could create malicious sites and put them in places where they are likely to be <a href="https://ui.adsabs.harvard.edu/abs/2023arXiv230212173G/abstract">picked up by an AI chatbot</a>. Because of the requirement to train the systems on lots of data, it’s difficult to verify everything is correct.</p>
<p>As workers, then, we need to harness the capability of AI systems and use them to their full potential, which means always questioning what we receive from them rather than trusting their output blindly. This period brings to mind the early days of GPS, when the systems often led users down roads unsuitable for their vehicles. </p>
<p>If we apply a sceptical mindset to how we use this new tool, we’ll maximise its capability while simultaneously growing the workforce – as we’ve seen through all the previous industrial revolutions.</p><img src="https://counter.theconversation.com/content/206297/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jonathan Aitken does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>AI is definitely going to change the workplace, but don’t get too worried about your job.Jonathan Aitken, Senior University Teacher in Robotics, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2022792023-04-27T15:22:31Z2023-04-27T15:22:31ZWe need to discuss what jobs robots should do, before the decision is made for us<figure><img src="https://images.theconversation.com/files/519868/original/file-20230406-217-ddq4a5.jpg?ixlib=rb-1.1.0&rect=662%2C6%2C3347%2C2139&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/robotic-housekeeper-using-vacuum-cleaner-on-1929438644">Shutterstock / Frame Stock Footage</a></span></figcaption></figure><p>The social separation imposed by the pandemic led us to rely on technology to an extent we might never have imagined – from Teams and Zoom to online banking and vaccine status apps.</p>
<p>Now, society faces an increasing number of decisions about our relationship with technology. For example, do we want our workforce needs fulfilled by automation, migrant workers, or an increased birth rate?</p>
<p>In the coming years, we will also need to balance technological innovation with people’s wellbeing – both in terms of the work they do and the social support they receive.</p>
<p>And there is the question of trust. When humans should trust robots, and vice versa, is a question our <a href="https://trust.tas.ac.uk/team">Trust Node team</a> is researching as part of the <a href="https://tas.ac.uk/home/the-nodes/">UKRI Trustworthy Autonomous Systems</a> hub. We want to better understand human-robot interactions – based on an individual’s <a href="https://www.sciencedirect.com/science/article/pii/S2590260122000145">propensity to trust others</a>, the <a href="https://www.abotdatabase.info/collection">type of robot</a>, and the nature of the task. This, and projects like it, could ultimately help inform robot design.</p>
<p>This is an important time to discuss what roles we want robots and AI to take in our collective future – before decisions are taken that may prove hard to reverse. One way to frame this dialogue is to think about the various roles robots can fulfil.</p>
<h2>Robots as our servants</h2>
<p>The word “robot” was first used by the Czech writer Karel Čapek in his 1920 sci-fi play <a href="https://www.gutenberg.org/files/59112/59112-h/59112-h.htm">Rossum’s Universal Robots</a>. It comes from the word “robota”, meaning drudgery or donkey work. This etymology suggests robots exist to do work that humans would rather not. And there should be no obvious controversy, for example, in tasking robots with maintaining nuclear power plants or repairing offshore wind farms.</p>
<figure class="align-center ">
<img alt="The Softbank Pepper robot." src="https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/522201/original/file-20230420-14-eyi8jp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The more human a robot looks, the more we trust it.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/softbank-pepper-robot-provide-assistance-automation-1313364728">Antonello Marangi/Shutterstock</a></span>
</figcaption>
</figure>
<p>However, some service tasks assigned to robots are more controversial, because they could be seen as taking jobs from humans. </p>
<p>For example, studies show that people who have lost movement in their upper limbs could benefit from <a href="https://www.science.org/doi/10.1126/scirobotics.abm6010">robot-assisted dressing</a>. But this could be seen as automating tasks that nurses currently perform. Equally, it could free up time for nurses and careworkers – currently sectors that are very short-staffed – to focus on other tasks that require more sophisticated human input.</p>
<h2>Authority figures</h2>
<p>The dystopian 1987 film <a href="https://www.imdb.com/title/tt0093870/">Robocop</a> imagined the future of law enforcement as autonomous, privatised, and delegated to cyborgs or robots. </p>
<p>Today, some elements of this vision are not so far away: the San Francisco Police Department has <a href="https://eu.usatoday.com/story/news/nation/2022/11/30/california-police-deploy-robots-kill/10801825002/">considered deploying robots</a> – albeit under direct human control – to kill dangerous suspects. </p>
<figure class="align-center ">
<img alt="MAARS military robot." src="https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/522204/original/file-20230420-2117-p65p5l.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">This US military robot is fitted with a machine gun to turn it into a remote weapons platform.</span>
<span class="attribution"><a class="source" href="https://www.army.mil/article/11592/robots_can_stand_in_for_soldiers_during_risky_missions">US Army</a></span>
</figcaption>
</figure>
<p>But having robots as authority figures needs careful consideration, as research has shown that humans can place excessive trust in them.</p>
<p><a href="https://ieeexplore.ieee.org/document/7451740/">In one experiment</a>, a “fire robot” was assigned to evacuate people from a building during a simulated blaze. All 26 participants dutifully followed the robot, even though half had previously seen the robot perform poorly in a navigation task.</p>
<h2>Robots as our companions</h2>
<p>It might be difficult to imagine that a human-robot attachment would have the same quality as that between humans or with a pet. However, increasing levels of loneliness in society might mean that for some people, having a non-human companion is better than nothing.</p>
<p><a href="https://www.paroseal.co.uk">The Paro Robot</a> is one of the most commercially successful companion robots to date – and is designed to look like a baby harp seal. Yet research suggests that the more human a robot looks, <a href="https://dl.acm.org/doi/abs/10.1145/3319502.3374839">the more we trust it</a>. </p>
<figure class="align-center ">
<img alt="Paro robot" src="https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/522206/original/file-20230420-16-g0flvn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The Paro companion robot is designed to look like a baby seal.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/fukuoka-japanmay-12-2017-paro-therapeutic-651654589">Angela Ostafichuk / Shutterstock</a></span>
</figcaption>
</figure>
<p>A study has also shown that <a href="https://royalsocietypublishing.org/doi/epdf/10.1098/rstb.2018.0033">different areas of the brain</a> are activated when humans interact with either another human or a robot. This suggests our brains may recognise interactions with a robot differently from human ones.</p>
<p>Creating useful robot companions involves a complex interplay of computer science, engineering and psychology. A robot pet might be ideal for someone who is not physically able to take a dog for its exercise. It might also be able to detect falls and remind someone to take their medication. </p>
<p>How we tackle social isolation, however, raises questions for us as a society. Some might regard efforts to “solve” loneliness with technology as the wrong solution for this pervasive problem.</p>
<h2>What can robotics and AI teach us?</h2>
<p>Music is a source of interesting observations about the differences between human and robotic talents. The errors that humans commit all the time, but that robots might not, appear to be a vital component of creativity.</p>
<p><a href="https://dl.acm.org/doi/abs/10.1145/3290605.3300260">A study by Adrian Hazzard and colleagues</a> pitted professional pianists against an autonomous disklavier (an automated piano with keys that move as if played by an invisible pianist). The researchers discovered that, eventually, the pianists made mistakes. But they did so in ways that were interesting to humans listening to the performance.</p>
<p>This concept of “aesthetic failure” can also be applied to how we live our lives. It offers a powerful counter-narrative to the idealistic and perfectionist messages we constantly receive through television and social media – on everything from physical appearance to career and relationships.</p>
<p>As a species, we are approaching many crossroads, including how to respond to climate change, gene editing, and the role of robotics and AI. However, these dilemmas are also opportunities. AI and robotics can mirror our less-appealing characteristics, such as gender and racial biases. But they can also free us from drudgery and highlight unique and appealing qualities, such as our creativity.</p>
<p>We are in the driving seat when it comes to our relationship with robots – nothing is set in stone, yet. But to make educated, informed choices, we need to learn to ask the right questions, starting with: what do we actually want robots to do for us?</p><img src="https://counter.theconversation.com/content/202279/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Thusha Rajendran receives funding from the UKRI and EU. He would like to acknowledge evolutionary anthropologist Anna Machin’s contribution to this article through her book Why We Love, personal communications and draft review.</span></em></p>Robots and AI could transform our lives, so we must decide how we want to use them.Thusha Rajendran, Professor of Psychology, The National Robotarium, Heriot-Watt UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1970652023-04-12T12:12:01Z2023-04-12T12:12:01ZRobots are everywhere – improving how they communicate with people could advance human-robot collaboration<figure><img src="https://images.theconversation.com/files/520308/original/file-20230411-28-8juan4.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2070%2C1449&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">'Emotionally intelligent' robots could improve their interactions with people. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/robotic-arm-holding-flower-royalty-free-image/1349021072">Andriy Onufriyenko/Moment via Getty Images</a></span></figcaption></figure><p><a href="https://robots.ieee.org/learn/what-is-a-robot/">Robots</a> are machines that can sense the environment and use that information to perform an action. You can find them nearly everywhere in industrialized societies today. There are household robots that vacuum floors and <a href="https://www.osha.gov/robotics">warehouse robots</a> that pack and ship goods. <a href="https://www.dailycal.org/2020/05/03/uc-berkeley-ucsf-researchers-use-robotics-to-expedite-covid-19-testing">Lab robots</a> test hundreds of clinical samples a day. <a href="https://doi.org/10.3389/feduc.2019.00125">Education robots</a> support teachers by acting as one-on-one tutors, assistants and discussion facilitators. 
And <a href="https://www.wired.com/story/this-brain-controlled-robotic-arm-can-twist-grasp-and-feel/">medical robots</a> such as prosthetic limbs can enable someone to grasp and pick up objects with their thoughts. </p>
<p>Figuring out how humans and robots can collaborate to effectively carry out tasks together is a rapidly growing area of interest to the scientists and engineers who design robots as well as the people who will use them. For successful collaboration between humans and robots, communication is key.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Physical therapist monitoring young patient walking on treadmill with robotic assistance" src="https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/520314/original/file-20230411-26-dhdpcu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Robotics can help patients recover physical function in rehabilitation.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/rossetti-health-center-france-rehabilitation-center-with-news-photo/838193362">BSIP/Universal Images Group via Getty Images</a></span>
</figcaption>
</figure>
<h2>How people communicate with robots</h2>
<p>Robots were originally designed to <a href="https://futura-automation.com/2019/05/15/a-history-timeline-of-industrial-robotics/">undertake repetitive and mundane tasks</a> and operate exclusively in robot-only zones like factories. Robots have since advanced to work collaboratively with people, which requires new ways for humans and robots to communicate with each other.</p>
<p><a href="https://doi.org/10.1007/s12541-012-0128-x">Cooperative control</a> is one way to transmit information and messages between a robot and a person. It involves combining human abilities and decision making with robot speed, accuracy and strength to accomplish a task. </p>
<p>For example, robots in the <a href="https://doi.org/10.3390/agronomy11091818">agriculture industry</a> can help farmers monitor and harvest crops. A human can control a semi-autonomous vineyard sprayer through a user interface, as opposed to manually spraying their crops or broadly spraying the entire field and risking pesticide overuse. </p>
<p>Robots can also <a href="https://doi.org/10.1186/s12984-018-0383-x">support patients in physical therapy</a>. Patients who had a stroke or spinal cord injury can use robots to practice hand grasping and assisted walking during rehabilitation.</p>
<p>Another form of communication, <a href="https://www.pbs.org/wgbh/nova/article/robots-emotional-intelligence/">emotional intelligence perception</a>, involves developing robots that adapt their behaviors based on social interactions with humans. In this approach, the robot detects a person’s emotions when collaborating on a task, assesses their satisfaction, then modifies and improves its execution based on this feedback. </p>
<p>For example, if the robot detects that a physical therapy patient is dissatisfied with a specific rehabilitation activity, it could direct the patient to an alternate activity. <a href="https://doi.org/10.3389/frobt.2021.730317">Facial expression</a> and body gesture recognition ability are important design considerations for this approach. <a href="https://doi.org/10.3389/frobt.2020.532279">Recent advances in machine learning</a> can help robots decipher emotional body language and better interact with and perceive humans.</p>
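<p>The detect-assess-adapt loop described above can be sketched in a few lines of Python. This is purely an illustration of the control flow, not any cited system: the activity names, the satisfaction scores and the threshold are all hypothetical, and <code>estimate_satisfaction</code> stands in for a real facial-expression or body-gesture recognition model.</p>

```python
# Hypothetical sketch of an emotion-adaptive rehabilitation session.
# estimate_satisfaction stands in for a trained emotion-recognition
# model; the activity names and scores are placeholder assumptions.

def estimate_satisfaction(activity: str) -> float:
    """Placeholder: a real robot would score the patient's facial
    expression and body language, returning a value in 0.0-1.0."""
    scores = {"hand_grasping": 0.3, "assisted_walking": 0.8}
    return scores.get(activity, 0.5)

def run_session(activities: list[str], threshold: float = 0.5) -> list[str]:
    """Work through each activity, switching to an alternative
    whenever estimated satisfaction falls below the threshold."""
    log = []
    queue = list(activities)
    while queue:
        activity = queue.pop(0)
        if estimate_satisfaction(activity) < threshold and queue:
            log.append(f"{activity}: dissatisfied, switching")
        else:
            log.append(f"{activity}: continuing")
    return log

print(run_session(["hand_grasping", "assisted_walking"]))
```

<p>The essential design point is the feedback loop: the robot's next action depends on a continuously updated estimate of the person's emotional state rather than on a fixed schedule.</p>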
<h2>Robots in rehab</h2>
<p>Questions such as how to make robotic limbs feel more natural, and capable of more complex functions such as typing and playing musical instruments, have yet to be answered.</p>
<p>I am an <a href="https://scholar.google.com/citations?user=Ok92zD4AAAAJ&hl=en">electrical engineer</a> who studies how the brain controls and communicates with other parts of the body, and <a href="http://vinjamurilab.cs.umbc.edu">my lab</a> investigates in particular how the <a href="https://doi.org/10.3390/s22145349">brain</a> and <a href="https://doi.org/10.3390/s22114177">hand</a> coordinate signals between each other. Our goal is to design technologies like prosthetic and wearable <a href="https://doi.org/10.1109/TBCAS.2019.2950145">robotic exoskeleton devices</a> that could help improve function for individuals with stroke, spinal cord and traumatic brain injuries. </p>
<p>One approach is through <a href="https://theconversation.com/brain-computer-interfaces-could-allow-soldiers-to-control-weapons-with-their-thoughts-and-turn-off-their-fear-but-the-ethics-of-neurotechnology-lags-behind-the-science-194017">brain-computer interfaces</a>, which use brain signals to communicate between robots and humans. By accessing an individual’s brain signals and providing targeted feedback, this technology can potentially improve recovery time in <a href="https://doi.org/10.1088/1741-2552/aba162">stroke rehabilitation</a>. Brain-computer interfaces may also help <a href="https://doi.org/10.1016/S1388-2457(02)00057-3">restore some communication abilities</a> and <a href="https://doi.org/10.1016/s0140-6736(12)61816-9">physical manipulation of the environment</a> for patients with motor neuron disorders.</p>
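<p>At its simplest, a brain-computer interface is a pipeline: record a neural signal, filter it, extract a feature, and map that feature to a robot command. The toy sketch below illustrates the idea with a list of numbers standing in for a recorded signal; the filter, the threshold and the command names are illustrative assumptions, not details of any clinical system.</p>

```python
# Hedged sketch of a brain-computer interface pipeline: smooth a raw
# EEG-like signal, take a single amplitude feature, and map it to a
# gripper command. All numbers and names here are illustrative.

def moving_average(signal: list[float], window: int = 4) -> list[float]:
    """Crude low-pass filter: average each sample with its neighbours."""
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

def decode_intent(signal: list[float], threshold: float = 0.5) -> str:
    """Map the mean filtered amplitude to a grasp or release command."""
    filtered = moving_average(signal)
    mean_amplitude = sum(filtered) / len(filtered)
    return "close_gripper" if mean_amplitude > threshold else "open_gripper"

# A burst of high activity decodes as a grasp intent.
print(decode_intent([0.9, 1.1, 0.8, 1.0, 0.9, 1.2]))  # close_gripper
print(decode_intent([0.1, 0.2, 0.1, 0.0, 0.1, 0.2]))  # open_gripper
```

<p>Real systems replace each stage with far more sophisticated machinery, such as band-pass filtering and machine-learned decoders, but the record-filter-decode-act structure is the same.</p>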
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Person sitting in chair wearing electrode cap with a computer screen and robotic arms on a table in front of them" src="https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=270&fit=crop&dpr=1 600w, https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=270&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=270&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=340&fit=crop&dpr=1 754w, https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=340&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/520871/original/file-20230413-26-98uwwp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=340&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Brain-computer interfaces could allow people to control robotic arms by thought alone.</span>
<span class="attribution"><span class="source">Ramana Kumar Vinjamuri</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>The future of human-robot interaction</h2>
<p>Effective integration of robots into human life requires balancing responsibility between people and robots, and designating clear roles for both in different environments.</p>
<p>As robots are increasingly working hand in hand with people, the ethical questions and challenges they pose cannot be ignored. Concerns surrounding <a href="https://ssrn.com/abstract=1599189">privacy</a>, <a href="https://doi.org/10.1007/s11948-017-9975-2">bias and discrimination</a>, <a href="https://doi.org/10.1145/2909824.3020255">security risks</a> and <a href="https://doi.org/10.1145/2696454.2696458">robot morality</a> need to be seriously investigated in order to create a more comfortable, safer and trustworthy world with robots for everyone. Scientists and engineers studying the <a href="https://doi.org/10.1109/HRI.2019.8673184">“dark side” of human-robot interaction</a> are developing guidelines to identify and prevent negative outcomes.</p>
<p>Human-robot interaction has the potential to affect every aspect of daily life. It is the collective responsibility of both the designers and the users to create a human-robot ecosystem that is safe and satisfactory for all.</p>
<p><em>A photo was replaced to more accurately reflect the work of the author.</em></p><img src="https://counter.theconversation.com/content/197065/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ramana Vinjamuri receives funding from National Science Foundation. </span></em></p>Robots are already carrying out tasks in clinics, classrooms and warehouses. Designing robots that are more receptive to human needs could help make them more useful in many contexts.Ramana Vinjamuri, Assistant Professor of Computer Science and Electrical Engineering, University of Maryland, Baltimore CountyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2012212023-03-07T21:19:17Z2023-03-07T21:19:17ZAmazon still seems hell bent on turning workers into robots – here’s a better way forward<figure><img src="https://images.theconversation.com/files/513745/original/file-20230306-1219-3pj7o2.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Workers at Amazon fulfilment centres are under enormous pressure.</span> <span class="attribution"><a class="source" href="https://www.alamy.com/stock-photo-swansea-wales-uk-1st-september-2016-workers-pass-rows-and-rows-of-116968599.html?imageid=13E771BE-5781-41EA-9C1A-0F01A3C073DA&p=316069&pn=1&searchId=d027e7e14481635dde30dc7eb70d8566&searchtype=0">Robert Melen/Alamy</a></span></figcaption></figure><p><a href="https://theface.com/society/meet-the-amazon-uk-workers-fighting-back-stikes-coventry">The strikes</a> by hundreds of <a href="https://www.theguardian.com/commentisfree/2023/feb/28/amazon-warehouse-robots-striking-50p-pay-jeff-bezos">Amazon workers</a> at the company’s Coventry warehouse in the English Midlands have brought into relief some of the problems of work in today’s high-tech society. </p>
<p>While primarily focused on pay, the workers are <a href="https://www.theguardian.com/business/2023/jan/25/the-job-is-not-human-uk-retail-warehouse-staff-describe-gruelling-work">pushing back</a> against long hours and an automated surveillance system that times how long they take to do each task, as well as going to the toilet. It all contributes to a high pressure and intensive work environment – plus more <a href="https://www.theguardian.com/technology/2020/feb/17/concerns-over-safety-at-amazon-warehouses-as-number-of-incidents-rise">accidents</a>. </p>
<p>We have much to learn from this painful situation about the future of work and technology. On the one hand, Amazon’s whole employment model goes against the <a href="https://www.economist.com/graphic-detail/2018/04/24/a-study-finds-nearly-half-of-jobs-are-vulnerable-to-automation">general assumption</a> that technology destroys jobs. Equally, however, the company’s employment practices show how automation can make workplaces oppressive, forcing workers to become more like robots.</p>
<p>Pessimistic predictions about the threat to jobs from technology are nothing new. One <a href="https://www.oxfordmartin.ox.ac.uk/publications/the-future-of-employment/">frequently cited study</a> published in 2013 predicted that up to 47% of jobs in the US would be removed by automation over a 20-year period. Now that we’re halfway through that period, <a href="https://www.commerce.gov/news/blog/2023/02/news-unemployment-its-lowest-level-54-years">jobs in the US</a> remain plentiful and unemployment is low. Similarly, there’s <a href="https://cepr.org/voxeu/columns/rise-robots-german-labour-market">evidence from Germany</a> that the use of robots has had no effect on total employment. </p>
<p>Across the G7 as a whole, employment has been holding up well. Aside from a COVID blip, unemployment has generally been falling for the past decade in spite of automation and robotics gradually becoming more important to workplaces. The reality is that paid employment has been surviving bouts of technological progress for centuries. </p>
<p><strong>G7 unemployment rates 2005-21</strong></p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Graph showing unemployment rates across the OECD" src="https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=295&fit=crop&dpr=1 600w, https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=295&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=295&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=371&fit=crop&dpr=1 754w, https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=371&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/513642/original/file-20230306-28-4yvfjl.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=371&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://data.oecd.org/unemp/unemployment-rate.htm">OECD</a></span>
</figcaption>
</figure>
<p>As the Amazon example suggests, the greater threat from technology is almost certainly to the quality of jobs. This threat should concern us in thinking about ways to use and deploy technology in workplaces now and in the future. </p>
<h2>Reimagining automation</h2>
<p>In a <a href="https://link.springer.com/article/10.1007/s10551-022-05258-z">recent paper</a>, I put forward a couple of basic principles related to the aims behind automation for society as a whole. </p>
<p>Firstly, automation should help to promote more meaningful work. In discussions about the future of work, fears of job losses are often the starting point for arguing that workers’ wages will need to be replaced by a universal basic income. But this sees work as purely instrumental, pursued for income only. Work also matters for who we are and are able to become. </p>
<p>Once you recognise these intrinsic benefits to doing a job, it becomes important to see technology not as a way to eliminate work but to make it better. This means automating the least appealing aspects of work. Technology should complement labour that enlivens and enthuses workers. You can see the potential in sectors as diverse as <a href="https://www.heraldscotland.com/business_hq/14959986.rise-robots-bringing-huge-benefits-farming/">farming</a>, where robots can replace the human toil of harvesting produce, and <a href="https://www.cnbc.com/2015/04/30/pricy-robots-tug-hospital-supplies.html">medicine</a>, where they can be used to transport things like medical waste around hospitals.</p>
<p>Secondly, automation should make it possible for people to spend more time away from work. This is not to contradict the idea that work is beneficial for our well-being but to recognise that a life well lived entails experiencing rewarding activities in and outside of work. Automation should make more time for us to achieve well-being in work and leisure. </p>
<h2>The reality</h2>
<p>Unfortunately, these aims are not generally the priority with technological advancement. This stems from the fact that employees have less say over its nature and direction than employers, which explains why automation makes many workers anxious. </p>
<p>When workers are given more say, the dynamic can change. Take Germany, where there is evidence that the use of robots has <a href="https://cepr.org/voxeu/columns/rise-robots-german-labour-market">actually improved</a> the chances of workers staying in employment. The presence of works councils and strong labour unions in Germany <a href="https://cepr.org/voxeu/columns/effect-new-technologies-workers-jobs-and-skills">seems to be</a> a large part of the explanation. </p>
<p>This partnership approach appears to have helped to create an environment that has protected jobs while allowing workers to upskill to adjust to technological change. It’s no coincidence that Germany has the second lowest unemployment rate in the G7. </p>
<p>Amazon has been <a href="https://www.theguardian.com/technology/2022/sep/29/amazon-us-workers-battle-unionize-second-warehouse">introducing robots</a> over the past decade to help make its warehouses more efficient too. It looks likely to scale this up in the next few years, though the company insists this is not about removing jobs. </p>
<p>Time will tell on that front, but it is hard to be confident in Amazon’s approach to technology when its workers’ interests seem so subordinate to those of the company. In tandem with the UK protests, Amazon workers in places like <a href="https://www.theguardian.com/technology/2022/sep/29/amazon-us-workers-battle-unionize-second-warehouse">the US</a> and <a href="https://www.reuters.com/markets/europe/german-union-warns-amazon-rolling-pre-christmas-strikes-2022-12-18/">Germany</a> have also been battling against its conditions. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Workman holding a laptop next to some boiler pipes" src="https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/513737/original/file-20230306-28-a3y2nz.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It doesn’t need to be a disaster.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/professional-worker-modern-factory-using-laptop-1944508705">1st Footage</a></span>
</figcaption>
</figure>
<p>Amazon <a href="https://www.etf-europe.org/amazon-has-a-european-works-council-despite-management/">did agree</a> in 2022 to form a European works council, which has worker representatives from 35 countries including the UK, and is consulted on company issues that cross borders. But the council’s operations are fairly restricted, while the company’s <a href="https://www.theguardian.com/technology/2022/sep/29/amazon-us-workers-battle-unionize-second-warehouse">general reluctance</a> to engage with unions suggests that warehouse workers are still struggling to further their interests. </p>
<p>In the end, technology will only work for workers if it is democratised. If workers and society rather than big tech companies such as Amazon are to benefit from automation, they need to have a larger <a href="https://wol.iza.org/articles/who-owns-the-robots-rules-the-world/long">influence and stake</a> in it. If this can be achieved, <a href="https://www.politybooks.com/bookdetail?book_slug=making-light-work-an-end-to-toil-in-the-twenty-first-century--9781509548620">less and better work</a> remains the prize.</p><img src="https://counter.theconversation.com/content/201221/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>David Spencer has received funding from the ESRC </span></em></p>With Amazon facing worker battles in the UK, US and Germany, no wonder people worry about how technology is changing workplaces.David Spencer, Professor of Economics and Political Economy, University of LeedsLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1966642023-01-31T19:12:20Z2023-01-31T19:12:20ZOur future could be full of undying, self-repairing robots. Here’s how<figure><img src="https://images.theconversation.com/files/507247/original/file-20230131-24-1wnmot.jpg?ixlib=rb-1.1.0&rect=419%2C14%2C4109%2C2200&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">frank60/Shutterstock</span></span></figcaption></figure><p>With generative artificial intelligence (AI) systems such as <a href="https://theconversation.com/chatgpt-dall-e-2-and-the-collapse-of-the-creative-process-196461">ChatGPT</a> and <a href="https://theconversation.com/ai-image-generation-is-advancing-at-astronomical-speeds-can-we-still-tell-if-a-picture-is-fake-191674">StableDiffusion</a> being the talk of the town right now, it might feel like we’ve taken a giant leap closer to a sci-fi reality where AIs are physical entities all around us.</p>
<p>Indeed, computer-based AI appears to be advancing at an unprecedented rate. But the rate of advancement in robotics – which we could think of as the potential physical embodiment of AI – is slow.</p>
<p>Could it be that future AI systems will need robotic “bodies” to interact with the world? If so, will nightmarish ideas like the self-repairing, shape-shifting <a href="https://en.wikipedia.org/wiki/T-1000">T-1000 robot</a> from the Terminator 2 movie come to fruition? And could a robot be created that could “live” forever?</p>
<h2>Energy for ‘life’</h2>
<p>Biological lifeforms like ourselves need energy to operate. We get ours via a combination of food, water, and oxygen. The majority of plants also need access to light to grow.</p>
<p>By the same token, an everlasting robot needs an ongoing energy supply. Currently, electrical power dominates energy supply in the world of robotics. Most robots are powered by the <a href="https://blog.mentyor.com/chemistry-of-batteries/">chemistry of batteries</a>. </p>
<p>An alternative battery type has been proposed that uses <a href="https://www.popularmechanics.com/science/green-tech/a35970222/radioactive-diamond-battery-will-run-for-28000-years/">nuclear waste and ultra-thin diamonds at its core</a>. The inventors, a San Francisco startup called <a href="https://ndb.technology/">Nano Diamond Battery</a>, claim a possible battery life of tens of thousands of years. Very small robots would be an ideal user of such batteries.</p>
<p>But a more likely long-term solution for powering robots may involve different chemistry – and even biology. In 2021, scientists from the Berkeley Lab and UMass Amherst in the US demonstrated that tiny nanobots can get their energy from chemicals in the <a href="https://newscenter.lbl.gov/2021/12/08/liquid-robots-never-run-out/">liquid they swim in</a>.</p>
<p>The researchers are now working out how to scale up this idea to larger robots that can work on solid surfaces.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/BdS72O2c9nQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Repairing and copying oneself</h2>
<p>Of course, an undying robot might still need occasional repairs.</p>
<p>Ideally, a robot would repair itself if possible. In 2019, a Japanese research group demonstrated <a href="https://robots.ieee.org/robots/pr2/">a research robot called PR2</a> tightening its <a href="https://ieeexplore.ieee.org/document/9035045">own screw using a screwdriver</a>. This is like self-surgery! However, such a technique would only work if non-critical components needed repair.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/47NjYRWVjLk?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Other research groups are exploring how soft robots can self-heal when damaged. A group in Belgium showed how a robot they developed recovered after being stabbed six times in one of its legs. It stopped for a few minutes until its skin healed itself, <a href="https://www.newscientist.com/article/2350609-self-healing-robot-recovers-from-being-stabbed-then-walks-off/">and then walked off</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/KTJaxxzTKYc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Another unusual concept for repair is to use other things a robot might find in the environment to replace its broken part.</p>
<p>Last year, scientists reported how <a href="https://www.popularmechanics.com/technology/robots/a40746165/dead-spider-leg-grippers/">dead spiders can be used as robot grippers</a>. This form of robotics is known as “necrobotics”. The idea is to use dead animals as ready-made mechanical devices and attach them to robots to become part of the robot.</p>
<figure class="align-center ">
<img alt="A video of a spider attached to a syringe being lowered onto another spider and picking it up" src="https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=472&fit=crop&dpr=1 600w, https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=472&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=472&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=593&fit=crop&dpr=1 754w, https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=593&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/507011/original/file-20230130-26-2uvwwp.gif?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=593&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The proof-of-concept in necrobotics involved taking a dead spider and ‘reanimating’ its hydraulic legs with air, creating a surprisingly strong gripper.</span>
<span class="attribution"><span class="source">Preston Innovation Laboratory/Rice University</span></span>
</figcaption>
</figure>
<h2>A robot colony?</h2>
<p>From all these recent developments, it’s quite clear that in principle, a single robot may be able to live forever. But there is a very long way to go.</p>
<p>Most of the proposed solutions to the energy, repair and replication problems have only been demonstrated in the lab, in very controlled conditions and generally at tiny scales.</p>
<p>The ultimate solution may be one of large colonies or swarms of tiny robots that share a common brain, or mind. After all, this is exactly how many species of insects have evolved.</p>
<p>The concept of the “mind” of an ant colony has been pondered for decades. Research published in 2019 showed ant colonies themselves have a form of memory that is <a href="https://aeon.co/ideas/an-ant-colony-has-memories-that-its-individual-members-dont-have">not contained within any of the ants</a>.</p>
<p>This idea aligns very well with one day having massive clusters of robots that could use this trick to replace individual robots when needed, but keep the cluster “alive” indefinitely.</p>
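The idea of a collective memory that outlives any individual member can be sketched in a few lines of code. This is my own toy illustration, not drawn from the ant research or any robot system: each "fact" is replicated across several members, so any one member can be replaced without the cluster forgetting.

```python
import random

class Swarm:
    """Toy model: the swarm's memory lives only in its members, with each
    fact replicated on several of them, so an individual can be swapped
    out without the collective losing what it knows."""

    def __init__(self, size, replicas=3):
        self.members = [set() for _ in range(size)]
        self.replicas = replicas

    def remember(self, fact):
        # Store each fact on `replicas` distinct, randomly chosen members.
        for i in random.sample(range(len(self.members)), self.replicas):
            self.members[i].add(fact)

    def recall(self):
        # The colony-level memory is the union of what individuals hold.
        return set().union(*self.members)

    def replace(self, i):
        # Member i "dies"; a blank replacement re-learns the facts it held
        # from the surviving copies, restoring redundancy.
        lost = self.members[i]
        self.members[i] = set()
        survivors = self.recall()
        for fact in lost & survivors:
            self.members[i].add(fact)

swarm = Swarm(size=10)
for fact in ["nest-at-A", "food-at-B", "danger-at-C"]:
    swarm.remember(fact)
swarm.replace(0)  # the collective memory survives the replacement
```

Because every fact is held by at least three members, replacing any single member never erases it from `recall()` — a crude stand-in for how a colony's memory can persist while individuals come and go.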
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A close-up swarm of orange ants forming a living bridge between two green leaves" src="https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/507246/original/file-20230130-10893-la43e0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ant colonies can contain ‘memories’ that are distributed between many individual insects.</span>
<span class="attribution"><span class="source">frank60/Shutterstock</span></span>
</figcaption>
</figure>
<p>Ultimately, the scary robot scenarios outlined in countless science fiction books and movies are unlikely to suddenly develop without anyone noticing.</p>
<p>Engineering ultra-reliable hardware is extremely difficult, especially with complex systems. There are currently no engineered products that can last forever, or even for hundreds of years. If we do ever invent an undying robot, we’ll also have the chance to build in some safeguards.</p><img src="https://counter.theconversation.com/content/196664/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jonathan Roberts is Director of the Australian Cobotics Centre, the Technical Director of the Advanced Robotics for Manufacturing (ARM) Hub, and is a Chief Investigator at the QUT Centre for Robotics. He receives funding from the Australian Research Council. He was the co-founder of the UAV Challenge - an international drone competition.</span></em></p>If we’re going to put an AI brain somewhere, it’s likely going to be a robot. The next step – making that robot immortal.Jonathan Roberts, Professor in Robotics, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1928002022-10-19T18:37:54Z2022-10-19T18:37:54ZA new type of material called a mechanical neural network can learn and change its physical properties to create adaptable, strong structures<figure><img src="https://images.theconversation.com/files/490679/original/file-20221019-12170-qt1idp.JPG?ixlib=rb-1.1.0&rect=48%2C78%2C3977%2C2939&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">This connection of springs is a new type of material that can change shape and learn new properties.</span> <span class="attribution"><span class="source">Jonathan Hopkins</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>A new type of material can learn and improve its ability to deal with unexpected forces thanks to a unique lattice structure with connections of variable stiffness, as <a href="https://doi.org/10.1126/scirobotics.abq7278">described in a new paper</a> by my colleagues and me. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A hand holding a small, complex cube of plastic." src="https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/490682/original/file-20221019-23-rnqscu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Architected materials – like this 3D lattice – get their properties not from what they are made out of, but from their structure.</span>
<span class="attribution"><span class="source">Ryan Lee</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>The new material is a type of architected material, which gets its properties mainly from the geometry and specific traits of its design rather than what it is made out of. Take hook-and-loop fabric closures like Velcro, for example. It doesn’t matter whether it is made from cotton, plastic or any other substance. As long as one side is a fabric with stiff hooks and the other side has fluffy loops, the material will have the sticky properties of Velcro.</p>
<p>My colleagues and I based our new material’s architecture on that of an artificial neural network – layers of interconnected nodes that can <a href="https://doi.org/10.1109/ACCESS.2019.2945545">learn to do tasks</a> by changing how much importance, or weight, they place on each connection. We hypothesized that a mechanical lattice with physical nodes could be trained to take on certain mechanical properties by adjusting each connection’s rigidity. </p>
<p>To find out if a mechanical lattice would be able to adopt and maintain new properties – like taking on a new shape or changing directional strength – we started off by building a computer model. We then selected a desired shape for the material as well as input forces and had a computer algorithm tune the tensions of the connections so that the input forces would produce the desired shape. We did this training on 200 different lattice structures and found that a triangular lattice was best at achieving all of the shapes we tested. </p>
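The training loop described above — pick a desired shape, apply known input forces, and let an algorithm tune the connections until those forces produce that shape — can be sketched in miniature. The following is a hypothetical one-dimensional toy (a serial chain of springs tuned by gradient descent), not the authors' 200-lattice simulation; the names `train` and `displacements` are my own.

```python
# Toy model: "train" a chain of springs so a fixed end load produces a
# desired set of node displacements -- the same idea, in one dimension,
# as tuning a mechanical neural network's connection stiffnesses.

def displacements(stiffness, force):
    """Node displacements for springs in series under an end load.

    Every spring carries the same force, so node i moves by
    force * (sum of compliances 1/k of springs 1..i)."""
    out, x = [], 0.0
    for k in stiffness:
        x += force / k
        out.append(x)
    return out

def train(stiffness, force, target, lr=0.1, steps=2000):
    """Tune each spring's compliance by gradient descent until the
    input force produces the target displacements."""
    comp = [1.0 / k for k in stiffness]  # optimise in compliance space
    for _ in range(steps):
        for j in range(len(comp)):
            x = displacements([1.0 / c for c in comp], force)
            # Node i depends on compliance j only when i >= j.
            grad = 2 * force * sum(x[i] - target[i]
                                   for i in range(j, len(comp)))
            comp[j] = max(comp[j] - lr * grad, 1e-6)  # keep springs physical
    return [1.0 / c for c in comp]

force = 1.0
target = [0.5, 1.5, 2.0]                  # the desired "shape"
tuned = train([1.0, 1.0, 1.0], force, target)
```

Once tuned, the chain reproduces the target shape whenever the same load is applied: the training is stored in the stiffness values themselves, echoing the point that the structure of the material is the memory.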
<p>Once the many connections are tuned to achieve a set of tasks, the material will continue to react in the desired way. The training is – in a sense – remembered in the structure of the material itself.</p>
<p>We then built a physical prototype lattice with adjustable electromechanical springs arranged in a triangular lattice. The prototype is made of 6-inch connections and is about 2 feet long by 1½ feet wide. And it worked. When the lattice and algorithm worked together, the material was able to learn and change shape in particular ways when subjected to different forces. We call this new material a mechanical neural network.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A photo of hydraulic springs arranged in a triangular lattice" src="https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/490683/original/file-20221019-14-emmwwr.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The prototype is 2D, but a 3D version of this material could have many uses.</span>
<span class="attribution"><span class="source">Jonathan Hopkins</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Why it matters</h2>
<p>Besides some <a href="https://doi.org/10.1007/BF00436764">living tissues</a>, very few materials can learn to be better at dealing with unanticipated loads. Imagine a plane wing that suddenly catches a gust of wind and is forced in an unanticipated direction. The wing can’t change its design to be stronger in that direction.</p>
<p>The prototype lattice material we designed can adapt to changing or unknown conditions. In a wing, for example, these changes could be the accumulation of internal damage, changes in how the wing is attached to a craft or fluctuating external loads. Every time a wing made out of a mechanical neural network experienced one of these scenarios, it could strengthen and soften its connections to maintain desired attributes like directional strength. Over time, through successive adjustments made by the algorithm, the wing adopts and maintains new properties, adding each behavior to the rest as a sort of muscle memory.</p>
<p>This type of material could have far-reaching applications for the longevity and efficiency of built structures. Not only could a wing made of a mechanical neural network material be stronger, it could also be trained to morph into shapes that maximize fuel efficiency in response to changing conditions around it.</p>
<h2>What’s still not known</h2>
<p>So far, our team has worked only with 2D lattices. But using computer modeling, we predict that 3D lattices would have a much larger capacity for learning and adaptation. This increase is due to the fact that a 3D structure could have tens of times more connections, or springs, that don’t intersect with one another. However, the mechanisms we used in our first model are far too complex to support in a large 3D structure. </p>
<h2>What’s next</h2>
<p>The material my colleagues and I created is a proof of concept and shows the potential of mechanical neural networks. But to bring this idea into the real world will require figuring out how to make the individual pieces smaller and with precise properties of flex and tension.</p>
<p>We hope new research in the <a href="https://doi.org/10.1039/C8MH01100A">manufacturing of materials at the micron scale</a>, as well as work on <a href="https://doi.org/10.1016/j.eml.2020.101120">new materials with adjustable stiffness</a>, will lead to advances that make powerful smart mechanical neural networks with micron-scale elements and dense 3D connections a ubiquitous reality in the near future.</p><img src="https://counter.theconversation.com/content/192800/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ryan Lee has received funding from the Air Force Office of Science Research . </span></em></p>Computer-based neural networks can learn to do tasks. A new type of material, called a mechanical neural network, applies similar ideas to a physical structure.Ryan H. Lee, PhD Student in Mechanical and Aerospace Engineering, University of California, Los AngelesLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1921702022-10-16T19:02:23Z2022-10-16T19:02:23Z‘Killer robots’ will be nothing like the movies show – here’s where the real threats lie<figure><img src="https://images.theconversation.com/files/489521/original/file-20221013-12-lm966h.jpg?ixlib=rb-1.1.0&rect=143%2C201%2C1386%2C862&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Ghost Robotics Vision 60 Q-UGV.</span> <span class="attribution"><a class="source" href="https://www.dvidshub.net/image/7351259/ghost-robotics-vision-60-q-ugv-demo">US Space Force photo by Senior Airman Samuel Becker</a></span></figcaption></figure><p>You might suppose Hollywood is good at predicting the future. Indeed, Robert Wallace, head of the CIA’s Office of Technical Service and the US equivalent of MI6’s fictional Q, has recounted how Russian spies <a href="https://www.popularmechanics.com/military/a12043/4267549/">would watch the latest Bond movie</a> to see what technologies might be coming their way.</p>
<p>Hollywood’s continuing obsession with killer robots might therefore be of significant concern. The newest such movie is Apple TV’s forthcoming <a href="https://www.thewrap.com/florence-pugh-dolly-movie-murderous-sex-robot-apple-tv-plus/">sex robot courtroom drama Dolly</a>.</p>
<p>I never thought I’d write the phrase “sex robot courtroom drama”, but there you go. Based on a <a href="https://apex-magazine.com/short-fiction/dolly/">2011 short story</a> by Elizabeth Bear, the plot concerns a billionaire killed by a sex robot that then asks for a lawyer to defend its murderous actions.</p>
<h2>The real killer robots</h2>
<p>Dolly is the latest in a long line of movies featuring killer robots – including HAL in Kubrick’s 2001: A Space Odyssey, and Arnold Schwarzenegger’s T-800 robot in the Terminator series. Indeed, conflict between robots and humans was at the centre of the very first feature-length science fiction film, Fritz Lang’s 1927 classic <a href="https://www.britannica.com/topic/Metropolis-film-1927">Metropolis</a>.</p>
<p>But almost all these movies get it wrong. Killer robots won’t be sentient humanoid robots with evil intent. This might make for a dramatic storyline and a box office success, but such technologies are many decades, if not centuries, away.</p>
<p>Indeed, contrary to recent fears, robots may never be sentient.</p>
<p>It’s much simpler technologies we should be worrying about. And these technologies are starting to turn up on the battlefield today in places like Ukraine and <a href="https://www.militarystrategymagazine.com/article/drones-in-the-nagorno-karabakh-war-analyzing-the-data/">Nagorno-Karabakh</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/drones-over-ukraine-fears-of-russian-killer-robots-have-failed-to-materialise-180244">Drones over Ukraine: fears of Russian 'killer robots' have failed to materialise</a>
</strong>
</em>
</p>
<hr>
<h2>A war transformed</h2>
<p>Movies that feature much simpler armed drones, like Angel has Fallen (2019) and Eye in the Sky (2015), paint perhaps the most accurate picture of <a href="https://theconversation.com/eye-in-the-sky-movie-gives-a-real-insight-into-the-future-of-warfare-56684">the real future of killer robots</a>. </p>
<p>On the nightly TV news, we see how modern warfare is being transformed by ever-more autonomous drones, tanks, ships and submarines. These robots are only a little more sophisticated than those you can buy in your local hobby store. </p>
<p>And increasingly, the decisions to identify, track and destroy targets are being handed over to their algorithms. </p>
<p>This is taking the world to a dangerous place, with a host of moral, legal and technical problems. Such weapons will, for example, further upset our troubled geopolitical situation. We already see <a href="https://www.forbes.com/sites/amirhusain/2022/06/30/turkey-builds-a-hyperwar-capable-military/?sh=1500c4b855e1">Turkey emerging as a major drone power</a>.</p>
<p>And such weapons cross a moral red line into a terrible and terrifying world where unaccountable machines decide who lives and who dies. </p>
<p>Robot manufacturers are, however, starting to push back against this future.</p>
<h2>A pledge not to weaponise</h2>
<p>Last week, six leading robotics companies pledged they would <a href="https://www.theguardian.com/technology/2022/oct/07/killer-robots-companies-pledge-no-weapons">never weaponise their robot platforms</a>. The companies include Boston Dynamics, which makes the Atlas humanoid robot, which can <a href="https://youtu.be/knoOXBLFQ-s">perform an impressive backflip</a>, and the Spot robot dog, which looks like it’s <a href="https://youtu.be/wlkCQXHEgjA">straight out of the Black Mirror TV series</a>. </p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1578400002056953858"}"></div></p>
<p>This isn’t the first time robotics companies have spoken out about this worrying future. Five years ago, I organised <a href="https://newsroom.unsw.edu.au/news/science-tech/world%E2%80%99s-tech-leaders-urge-un-ban-killer-robots">an open letter</a> signed by Elon Musk and more than 100 founders of other AI and robot companies calling for the United Nations to regulate the use of killer robots. The letter even knocked the Pope into third place for a <a href="https://newsroom.unsw.edu.au/news/science-tech/unsws-toby-walsh-voted-runner-global-award">global disarmament award</a>.</p>
<p>However, the fact that leading robotics companies are pledging not to weaponise their robot platforms is more virtue signalling than anything else.</p>
<p>We have, for example, already seen <a href="https://www.vice.com/en/article/m7gv33/robot-dog-not-so-cute-with-submachine-gun-strapped-to-its-back">third parties mount guns</a> on clones of Boston Dynamics’ Spot robot dog. And such modified robots have proven effective in action. Iran’s top nuclear scientist was <a href="https://www.nytimes.com/2021/09/18/world/middleeast/iran-nuclear-fakhrizadeh-assassination-israel.html">assassinated by Israeli agents</a> using a robot machine gun in 2020.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/lethal-autonomous-weapons-and-world-war-iii-its-not-too-late-to-stop-the-rise-of-killer-robots-165822">Lethal autonomous weapons and World War III: it's not too late to stop the rise of 'killer robots'</a>
</strong>
</em>
</p>
<hr>
<h2>Collective action to safeguard our future</h2>
<p>The only way we can safeguard against this terrifying future is if nations collectively take action, as they have with chemical weapons, biological weapons and even nuclear weapons.</p>
<p>Such regulation won’t be perfect, just as the regulation of chemical weapons isn’t perfect. But it will prevent arms companies from openly selling such weapons and thus their proliferation. </p>
<p>More important than any pledge from robotics companies, therefore, is that the UN Human Rights Council <a href="https://www.ohchr.org/en/news/2022/10/human-rights-council-adopts-six-resolutions-appoints-special-rapporteur-situation">has recently unanimously decided</a> to explore the human rights implications of new and emerging technologies like autonomous weapons.</p>
<p>Several dozen nations have already called for the UN to regulate killer robots. The European Parliament, the African Union, the UN Secretary General, Nobel peace laureates, church leaders, politicians and thousands of AI and robotics researchers like myself have all called for regulation. </p>
<p>Australia has not, so far, supported these calls. But if you want to avoid this Hollywood future, you may want to take it up with your political representative next time you see them.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/new-zealand-could-take-a-global-lead-in-controlling-the-development-of-killer-robots-so-why-isnt-it-166168">New Zealand could take a global lead in controlling the development of 'killer robots' — so why isn't it?</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/192170/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Toby Walsh does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The sentient, murderous humanoid robot is a complete fiction, and may never become reality. But that doesn’t mean we’re safe from autonomous weapons – they are already here.Toby Walsh, Professor of AI at UNSW, Research Group Leader, UNSW SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1917612022-10-03T19:03:32Z2022-10-03T19:03:32ZTesla’s Optimus robot isn’t very impressive – but it may be a sign of better things to come<figure><img src="https://images.theconversation.com/files/487699/original/file-20221003-12-a5mrry.jpg?ixlib=rb-1.1.0&rect=17%2C21%2C2846%2C1481&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Tesla</span></span></figcaption></figure><p>In August 2021, Tesla CEO Elon Musk <a href="https://www.washingtonpost.com/technology/2021/08/19/tesla-ai-day-robot/">announced</a> the electric car manufacturer was planning to get into the robot business. In a presentation accompanied by a human dressed as a robot, Musk said work was beginning on a “friendly” humanoid robot to “navigate through a world built for humans and eliminate dangerous, repetitive and boring tasks”.</p>
<p>Musk has now <a href="https://www.abc.net.au/news/2022-10-01/elon-musk-unveils-hummanoid-robot-optimus/101493862">unveiled</a> a prototype of the robot, called Optimus, which he hopes to mass-produce and sell for less than US$20,000 (A$31,000).</p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1576045399697084416"}"></div></p>
<p>At the unveiling, the robot walked on a flat surface and waved to the crowd, and was shown doing simple manual tasks such as carrying and lifting in a video. As a robotics researcher, I didn’t find the demonstration very impressive – but I am hopeful it will lead to bigger and better things.</p>
<h2>Why would we want humanoid robots?</h2>
<p>Most of the robots used today don’t look anything like people. Instead, they are machines designed to carry out a specific purpose, like the industrial robots used in factories or the robot vacuum cleaner you might have in your house.</p>
<p>So why would you want one shaped like a human? The basic answer is they would be able to operate in environments designed for humans. </p>
<p>Unlike industrial robots, humanoid robots might be able to move around and interact with humans. Unlike robot vacuum cleaners, they might be able to go up stairs or traverse uneven terrain.</p>
<p>And as well as practical considerations, the idea of “artificial humans” has long had an appeal for inventors and science-fiction writers! </p>
<h2>Room for improvement</h2>
<p>Based on what we saw in the Tesla presentation, Optimus is a long way from being able to operate with humans or in human environments. The capabilities of the robot showcased fall far short of the state of the art in humanoid robotics.</p>
<p>The <a href="https://www.bostondynamics.com/atlas">Atlas robot</a> made by Boston Dynamics, for example, can walk outdoors and carry out flips and other acrobatic manoeuvres. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/tF4DML7FIWk?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The Atlas robot, made by Boston Dynamics, has some impressive skills.</span></figcaption>
</figure>
<p>And while Atlas is an experimental system, even the commercially available <a href="https://agilityrobotics.com/robots">Digit</a> from Agility Robotics is much more capable than what we have seen from Optimus. Digit can walk on various terrains, avoid obstacles, rebalance itself when bumped, and pick up and put down objects.</p>
<p>Bipedal walking (on two feet) alone is no longer a great achievement for a robot. Indeed, with a bit of knowledge and determination you can build such a robot yourself using <a href="https://hackaday.io/project/181799-redacted-the-first-fully-open-bipedal-robot">open source software</a>.</p>
<p>There was also no sign in the Optimus presentation of how it will interact with humans. This will be essential for any robot that works in human environments: not only for collaborating with humans, but also for basic safety.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-robot-breaks-the-finger-of-a-7-year-old-a-lesson-in-the-need-for-stronger-regulation-of-artificial-intelligence-187612">A robot breaks the finger of a 7-year-old: a lesson in the need for stronger regulation of artificial intelligence</a>
</strong>
</em>
</p>
<hr>
<p>It can be very tricky for a robot to accomplish seemingly simple tasks such as handing an object to a human, but this is something we would want a domestic humanoid robot to be able to do. </p>
<h2>Sceptical consumers</h2>
<p>Others have tried to build and sell humanoid robots in the past, such as Honda’s <a href="https://asimo.honda.com">ASIMO</a> and SoftBank’s <a href="https://www.bbc.com/news/technology-57651405">Pepper</a>. But so far they have never really taken off.</p>
<p>Amazon’s recently released <a href="https://www.cnet.com/home/smart-home/amazon-astro-review/">Astro robot</a> may make inroads here, but it may also go the way of its predecessors.</p>
<p>Consumers seem to be sceptical of robots. To date, the only widely adopted household robots are the Roomba-like vacuum cleaners, which have been available since 2002. </p>
<p>To succeed, a humanoid robot will need to be able to do something humans can't, to justify the price tag. At this stage the use case for Optimus is still not very clear.</p>
<h2>Hope for the future</h2>
<p>Despite these criticisms, I am hopeful about the Optimus project. It is still in the very early stages, and the presentation seemed to be aimed at recruiting new staff as much as anything else.</p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1576382868062081024"}"></div></p>
<p>Tesla certainly has plenty of resources to throw at the problem. We know it has the capacity to mass produce the robots if development gets that far.</p>
<p>Musk’s knack for gaining attention may also be helpful – not only for attracting talent to the project, but also to drum up interest among consumers.</p>
<p>Robotics is a challenging field, and it’s difficult to move fast. I hope Optimus succeeds, both to make something cool we can use – and to push the field of robotics forward.</p><img src="https://counter.theconversation.com/content/191761/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Wafa Johal receives funding from the Australian Research Council. </span></em></p>Humanoid robots could be useful in all kinds of situations, but the one Elon Musk unveiled last week is far from being ready to roll out.Wafa Johal, Senior Lecturer, Computing & Information Systems, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1885972022-09-22T20:17:59Z2022-09-22T20:17:59ZAustralia has relied on agricultural innovation to farm our dry land. We’ll need more for the uncertain years ahead<figure><img src="https://images.theconversation.com/files/484187/original/file-20220913-1734-gdncvt.jpg?ixlib=rb-1.1.0&rect=6%2C0%2C4083%2C2152&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Since European colonisation, Australia’s farmers have had to pioneer new technologies to adapt agriculture to this dry land.</p>
<p>Think of innovations such as the world’s first <a href="https://www.samemory.sa.gov.au/site/page.cfm?u=323">mechanical grain stripper</a>, which saved workers from the tedious task of stripping wheat from the stalk, or the <a href="https://trove.nla.gov.au/newspaper/article/93182324">stump jump plough</a>, invented to avoid ploughs constantly breaking when they hit mallee roots on newly cleared ground. </p>
<p>The pace of innovation hasn’t slowed, and has led in part to Australia becoming an agricultural powerhouse. We produce <a href="https://www.afgc.org.au/news-and-media/2020/06/no-need-to-panic-australia-produces-enough-food-for-75-million">enough food</a> for 75 million people, according to the Australian Food and Grocery Council, and export around 70% of the food we produce. </p>
<p>We will need more innovation to cope with the changing climate – which will make water supplies more uncertain and add heat stress to livestock – as well as other environmental issues such as nutrient runoff from too much fertiliser. </p>
<p>In future, expect to see farmers go high-tech, relying more on drones to optimise fertiliser and water use, on harvest robots to tackle challenges with labour shortages, and on sensors to measure the health of the soil. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="stump jump plough" src="https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=386&fit=crop&dpr=1 600w, https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=386&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=386&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=485&fit=crop&dpr=1 754w, https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=485&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/484190/original/file-20220913-1755-juud1b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=485&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The stump jump plough was an early Australian innovation designed to stop mallee roots breaking ploughs.</span>
<span class="attribution"><span class="source">State Library of South Australia</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>We need agricultural innovation, now more than ever</h2>
<p>It’s impossible to overstate the importance of agricultural innovation. By some estimates, close to half of the world’s population <a href="https://www.bbc.com/news/business-38305504">owes its existence</a> to the Haber-Bosch process, which pulls nitrogen from the air to produce fertiliser. The famous mid-20th-century Green Revolution that introduced high-yield varieties of crops also paved the way for major boosts in food security – and population. </p>
<p>Now we face a less certain future. Hunger is growing again. Last year, around 828 million people went to bed <a href="https://www.wfp.org/global-hunger-crisis">hungry every night</a>. </p>
<p>For farmers, it has been a difficult few years. COVID travel restrictions and supply chain disruption, coupled with the Russian invasion of Ukraine, have caused global uncertainty – and major increases in costs for farmers. </p>
<p>It’s become harder to find workers. Fertilisers have become more expensive, as have herbicides, insecticides, seeds and fuel. Some of the increases are huge: fertiliser costs shot up from A$380 a tonne to a whopping $867 a tonne in just two months, between December 2021 and January 2022.</p>
<p>We will need ways of optimising how we farm and making the most of our farmland if we are to make farming more resilient to climate shocks, use water, fertilisers and chemicals more efficiently, and keep food affordable. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/robot-farmers-could-improve-jobs-and-help-fight-climate-change-if-theyre-developed-responsibly-162718">Robot farmers could improve jobs and help fight climate change – if they're developed responsibly</a>
</strong>
</em>
</p>
<hr>
<p>Luckily, innovators are responding. By 2030, high-tech agricultural approaches are expected to add up to A$20 billion a year in farm production, according to the Australian Farm Institute. </p>
<p>If these new approaches deliver on their promise, it would take us most of the way to the industry’s goal of $100 billion by the end of the decade. At present, we produce <a href="https://www.abs.gov.au/statistics/industry/agriculture/value-agricultural-commodities-produced-australia/latest-release">$71 billion</a> worth of food a year. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Sheep from above" src="https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/484192/original/file-20220913-26-xkr789.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Precision agriculture is about optimising farming and producing more with less.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>What does high-tech farming look like?</h2>
<p>Traditionally, farmers have relied on common sense and experience to gauge the health of their soils and how well their crops are growing.</p>
<p>Increasingly, though, it’s becoming possible to get real-time information on a field-by-field basis using agricultural sensors. Sensors can measure soil moisture, temperature and salinity. If you deploy sensors throughout your fields, you can find out about issues early and respond quickly. </p>
<p>Broader technological advances are proving their worth for farmers too. Drones can give farmers an <a href="https://www.daf.qld.gov.au/agtech/be-inspired/future-trends/drones">eye in the sky</a>, which, coupled with AI image recognition, can detect and classify issues affecting plants. Think of getting a notification if telltale signs of an insect pest or destructive fungus are spotted on your farm. Farmers are already using drones to <a href="https://www.abc.net.au/news/rural/2022-06-24/drones-agtech-hands-free-farming/101174622">spot feral pigs</a>. Drones can even apply fertiliser or agrochemicals in hard-to-access places. </p>
<p>For livestock farmers, drones offer a much faster way to count stock. Soon, drones may even be able to muster sheep or cattle. For plantation managers, drones can be used to <a href="https://www.abc.net.au/news/2022-06-19/giant-drones-dropping-tree-seeds/101150496">plant trees</a> by firing bundles of seeds and nutrients into the ground. </p>
<h2>Farm robots and vertical farms</h2>
<p>New advances in robotics are similarly useful. Many farmers were hard-hit by labour shortages due to COVID-linked lockdowns and travel restrictions. In response, some are turning to the fast-developing field of farm robots, which can fertilise, apply pesticides and mow, and are even becoming capable of picking fruit and vegetables. </p>
<p>Here, too, Australia has innovators such as Queensland’s SwarmFarm, which makes robots that can accurately spray weeds with herbicide and perform other routine tasks. As one farmer <a href="https://www.abc.net.au/news/2021-02-28/swarmfarm-agricultural-robots-queensland/13193394">told the ABC</a>, the robot has cut his use of chemicals by fully 80%. Overseas, robots are even being used to <a href="https://www.nytimes.com/2020/02/13/science/farm-agriculture-robots.html">speed up the breeding</a> of new crop hybrids. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="farm robot" src="https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/484194/original/file-20220913-18-11jo3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The capabilities of farm robots are growing rapidly.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>Vertical farming – growing crops indoors in stacked layers – has the potential to slash water use and food miles and boost climate resilience. Queensland’s Vertical Farm Systems is one of the leaders making vertical farms cheaper, long a challenge slowing uptake. Its automated farms for growing leafy greens are <a href="https://www.sunshinecoastnews.com.au/2021/12/02/vertical-farming-systems/">now exported</a> to countries such as Canada and the United Arab Emirates.</p>
<p>Inventing and applying advanced technologies helps Australian farmers make decisions backed by hard data, to boost productivity and profitability. Some new technologies can also help prevent the overuse of fertilisers and other agrochemicals, and help make the wider environment cleaner. </p>
<p>Chemical overuse in farming is a well-known problem. Its effects range from dangerous blue-green algae blooms linked to nitrogen fertiliser run-off from farms, to human health issues caused by chemicals leaching into groundwater and watercourses or consumed directly, such as traces of pesticides on food.</p>
<p>What these agricultural innovations have in common is a focus on precision, where key inputs like fertiliser and herbicides are applied as needed – no more, no less. Similarly, real-time data makes it possible for farmers to make the most out of their crop by fine-tuning irrigation and fertiliser as the plants require. </p>
<p>We will need all of these innovations – and more – to meet the challenges ahead. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/3-technologies-poised-to-change-food-and-the-planet-153852">3 technologies poised to change food and the planet</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/188597/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Sam Rudd works as a researcher at the University of South Australia and is a co-inventor on a Joint patent and co-author of joint publications with Sentek Sensor Technology. Sam is currently on a SIEF Ross Metcalf STEM+Business Fellowship, supported by the Science and Industry Endowment Fund (SIEF) and Sentek to facilitate the development of a world-first sensor based on the joint patent.</span></em></p><p class="fine-print"><em><span>Drew Evans works as a Professor and Professorial Lead at the University of South Australia. He receives funding from Sentek Sensor Technology, is co-inventor on a joint patent and co-author on joint publications with Sentek staff. Drew is a current member of the National Committee for Materials Science and Engineering under the Australian Academy of Science, and executive member of the Australian Materials Research Society.</span></em></p>To maintain our role as a global food bowl, Australia has to keep innovating in agricultural technology.Sam Rudd, SIEF Ross Metcalf STEM+Business Fellow - Future Industries Institute, University of South AustraliaDrew Evans, Associate Professor of Energy & Advanced Manufacturing, Australian Research Council Future Fellow, University of South AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1842272022-09-07T12:23:00Z2022-09-07T12:23:00ZWhy household robot servants are a lot harder to build than robotic vacuums and automated warehouse workers<figure><img src="https://images.theconversation.com/files/483088/original/file-20220906-16-3sovqs.jpg?ixlib=rb-1.1.0&rect=28%2C28%2C3804%2C3430&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Who wouldn’t want a robot to handle all the household drudgery?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/robot-assistant-domestic-cleaner-robot-royalty-free-illustration/886205496">Skathi/iStock via Getty 
Images</a></span></figcaption></figure><p>With recent advances in artificial intelligence and robotics technology, there is growing interest in developing and marketing household robots capable of handling a variety of domestic chores. </p>
<p>Tesla is <a href="https://www.theregister.com/2022/08/05/tesla_musk_robot/">building a humanoid robot</a>, which, according to CEO Elon Musk, could be used for cooking meals and helping elderly people. Amazon recently <a href="https://press.aboutamazon.com/news-releases/news-release-details/amazon-and-irobot-sign-agreement-amazon-acquire-irobot">acquired iRobot</a>, a prominent robotic vacuum manufacturer, and has been investing heavily in the technology through the <a href="https://www.amazon.science/research-areas/robotics">Amazon Robotics program</a> to expand robotics technology to the consumer market. In May 2022, Dyson, a company renowned for its power vacuum cleaners, announced that it plans to build the U.K.’s largest robotics center devoted to <a href="https://www.theguardian.com/technology/2022/may/25/dyson-reveals-its-big-bet-robots">developing household robots</a> that carry out daily domestic tasks in residential spaces. </p>
<p>Despite the growing interest, would-be customers may have to wait awhile for those robots to come on the market. While devices such as smart thermostats and security systems are widely used in homes today, the commercial use of household robots is still in its infancy.</p>
<p>As a <a href="https://scholar.google.com/citations?hl=en&user=Ul2F7OwAAAAJ&view_op=list_works&sortby=pubdate">robotics researcher</a>, I know firsthand how household robots are considerably more difficult to build than smart digital devices or industrial robots.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/DTGfY_Dl9wY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Robots that can handle a variety of domestic chores are an age-old staple of science fiction.</span></figcaption>
</figure>
<h2>Handling objects</h2>
<p>One major difference between digital and robotic devices is that household robots <a href="https://manipulation.csail.mit.edu/intro.html">need to manipulate objects</a> through physical contact to carry out their tasks. They have to carry the plates, move the chairs and pick up dirty laundry and place it in the washer. These operations require the robot to be able to handle fragile, soft and sometimes heavy objects with irregular shapes. </p>
<p>State-of-the-art AI and machine learning algorithms perform well in simulated environments. But contact with objects in the real world often trips them up, because physical contact is difficult to model and even harder to control. While a human can easily perform these tasks, significant technical hurdles remain before household robots can match human-level ability to handle objects. </p>
<p>Robots have difficulty in two aspects of manipulating objects: control and sensing. Many pick-and-place robot manipulators like those on assembly lines are equipped with a simple gripper or specialized tools dedicated only to certain tasks like grasping and carrying a particular part. They often struggle to manipulate objects with irregular shapes or elastic materials, especially because they lack the efficient <a href="https://doi.org/10.3389/fnbot.2019.00053">force, or haptic, feedback</a> humans are naturally endowed with. Building a general-purpose robot hand with flexible fingers is still technically challenging and expensive.</p>
<p>It is also worth noting that traditional robot manipulators require a stable platform to operate accurately. Their accuracy drops considerably when they are mounted on platforms that move around, particularly over varied surfaces. Coordinating locomotion and manipulation in a mobile robot is an open problem in the robotics community that needs to be addressed before broadly capable household robots can make it onto the market. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/PvxrM0-qhlQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A sophisticated robotic kitchen is already on the market, but it operates in a highly structured environment, meaning all of the objects it interacts with – cookware, food containers, appliances – are where it expects them to be, and there are no pesky humans to get in the way.</span></figcaption>
</figure>
<h2>They like structure</h2>
<p>In an assembly line or a warehouse, the environment and sequence of tasks are strictly organized. This allows engineers to preprogram the robot’s movements or use simple methods like QR codes to locate objects or target locations. However, household items are often disorganized and placed randomly.</p>
<p>Home robots must deal with many uncertainties in their workspaces. A robot must first locate and identify the target item among many others. Quite often it must also clear or avoid other obstacles in the workspace to reach the item and carry out its task. This requires the robot to have an excellent perception system, efficient navigation skills, and powerful and accurate manipulation capability.</p>
<p>For example, users of robot vacuums know they must remove all small furniture and other obstacles such as cables from the floor, because even the best robot vacuum cannot clear them by itself. Even more challenging, the robot has to operate in the presence of moving obstacles when people and pets walk within close range. </p>
<h2>Keeping it simple</h2>
<p>While they appear straightforward for humans, many household tasks are too complex for robots. Industrial robots are excellent for repetitive operations in which the robot motion can be preprogrammed. But household tasks are often unique to the situation and could be full of surprises that require the robot to constantly make decisions and change its route in order to perform the tasks. </p>
<p>Think about cooking or cleaning dishes. In the course of a few minutes of cooking, you might grasp a sauté pan, a spatula, a stove knob, a refrigerator door handle, an egg and a bottle of cooking oil. To wash a pan, you typically hold and move it with one hand while scrubbing with the other, and ensure that all cooked-on food residue is removed and then all soap is rinsed off.</p>
<p>There has been significant progress in recent years in using machine learning to train robots to make intelligent decisions when picking and placing different objects – that is, grasping objects and moving them from one spot to another. However, training robots to master all the different types of kitchen tools and household appliances would be another level of difficulty, even for the best learning algorithms.</p>
<p>Not to mention that people’s homes often have stairs, narrow passageways and high shelves. Those hard-to-reach spaces limit the use of today’s mobile robots, which tend to use wheels or four legs. Humanoid robots, which would more closely match the environments humans build and organize for themselves, have yet to be reliably used outside of lab settings. </p>
<p>A solution to task complexity is to build special-purpose robots, such as robot vacuum cleaners or kitchen robots. Many different types of such devices are likely to be developed in the near future. However, I believe that general-purpose home robots are still a long way off.</p><img src="https://counter.theconversation.com/content/184227/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ayonga Hereid does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Videos of humanoid robots dancing and performing backflips in the lab notwithstanding, robots that wash your dishes and fold your laundry are still years away. A roboticist explains why.Ayonga Hereid, Assistant Professor of Mechanical and Aerospace Engineering, The Ohio State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1885202022-08-21T20:03:06Z2022-08-21T20:03:06ZAustralia’s pursuit of ‘killer robots’ could put the trans-Tasman alliance with New Zealand on shaky ground<figure><img src="https://images.theconversation.com/files/479984/original/file-20220818-546-nyccc.jpg?ixlib=rb-1.1.0&rect=0%2C242%2C8986%2C4944&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Getty Images</span></span></figcaption></figure><p>Australia’s recently <a href="https://www.defence.gov.au/about/reviews-inquiries/defence-strategic-review">announced</a> defence review, intended to be the most thorough in almost four decades, will give us a good idea of how Australia sees its role in an increasingly tense strategic environment.</p>
<p>As New Zealand’s only formal military ally, Australia’s defence choices will have significant implications, both for New Zealand and regional geopolitics.</p>
<p>There are several areas of contention in the trans-Tasman relationship. One is Australia’s pursuit of nuclear-powered submarines, which clashes with New Zealand’s anti-nuclear stance. Another lies in the two countries’ diverging approaches to autonomous weapons systems (AWS), colloquially known as “killer robots”. </p>
<figure class="align-center ">
<img alt="Boeing Australia's autonomous 'loyal wingman' aircraft" src="https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/479242/original/file-20220816-20306-j1c4ti.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Boeing Australia is developing autonomous ‘loyal wingman’ aircraft to complement manned aircraft.</span>
<span class="attribution"><a class="source" href="https://www.flightglobal.com/defence/boeing-australia-pushes-loyal-wingman-maiden-flight-to-2021/141691.article">Boeing</a>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>In general, AWS are <a href="https://www.beehive.govt.nz/sites/default/files/2021-11/Autonomous-Weapons-Systems-Cabinet-paper.pdf">considered</a> to be “weapons systems that, once activated, can select and engage targets without further human intervention”. There is, however, no internationally agreed definition.</p>
<p>New Zealand is involved with international attempts to ban and regulate AWS. It seeks a ban on systems that “are not sufficiently predictable or controllable to meet legal or ethical requirements” and advocates for “rules or limits to govern the development and use of AWS”. </p>
<p>If this seems vague to you, it should. This ambiguity in definition makes it difficult to determine which systems New Zealand seeks to ban or regulate.</p>
<h2>Australia’s prioritisation of AWS</h2>
<p>Australia, meanwhile, has been developing what it more commonly refers to as robotics and autonomous systems (RAS) with <a href="https://www.tandfonline.com/doi/full/10.1080/10357718.2022.2095615">gusto</a>. Since 2016, Australia has identified RAS as a priority area of development and substantially increased <a href="https://www.dst.defence.gov.au/nextgentechfund">funding</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/new-zealand-could-take-a-global-lead-in-controlling-the-development-of-killer-robots-so-why-isnt-it-166168">New Zealand could take a global lead in controlling the development of 'killer robots' — so why isn't it?</a>
</strong>
</em>
</p>
<hr>
<p>The Australian <a href="https://www.navy.gov.au/sites/default/files/documents/RAN_WIN_RASAI_Strategy_2040f2_hi.pdf">navy</a>, <a href="https://researchcentre.army.gov.au/sites/default/files/2020-03/robototic_autonomous_systems_strategy.pdf">army</a> and defence force (<a href="https://tasdcrc.com.au/wp-content/uploads/2020/12/ADF-Concept-Robotics.pdf">ADF</a>) have each released concept documents since 2018, discussing RAS and their associated benefits, risks, challenges and opportunities.</p>
<p>Key systems Australia is pursuing include the autonomous aircraft <a href="https://news.defence.gov.au/service/introducing-ghost-bat">Ghost Bat</a>, three different kinds of <a href="https://www.australiandefence.com.au/defence/sea/navy-s-uncrewed-undersea-plans">extra-large underwater autonomous vehicles</a> and <a href="https://www.minister.defence.gov.au/minister/melissa-price/media-releases/autonomous-truck-project-passes-major-milestone">autonomous trucks</a>.</p>
<h2>Why is Australia seeking to develop these technologies?</h2>
<p>The short answer is threefold: military advantage, saving lives and economics.</p>
<p>Australia and its allies and partners, particularly the US, are <a href="https://www.ussc.edu.au/analysis/us-china-technology-competition-and-what-it-means-for-australia">fearful</a> of losing the technological superiority they have long held over rivals such as China. </p>
<p>Large military capabilities, like nuclear-powered submarines, take both time and money to acquire. Australia is further limited in what it can do by the size of its defence force. RAS are seen as a way to potentially maintain advantage, and to do more with less.</p>
<p>RAS are also seen as a way to save lives. A <a href="https://media.defense.gov/2020/Nov/23/2002540369/-1/-1/1/WYATT.PDF">survey</a> of Australian military personnel found they considered reduction of harm and injury to defence personnel, allied personnel and civilians among the most important potential benefits of RAS. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/un-fails-to-agree-on-killer-robot-ban-as-nations-pour-billions-into-autonomous-weapons-research-173616">UN fails to agree on 'killer robot' ban as nations pour billions into autonomous weapons research</a>
</strong>
</em>
</p>
<hr>
<p>The Australian Defence Force also <a href="https://tasdcrc.com.au/wp-content/uploads/2020/12/ADF-Concept-Robotics.pdf">believes</a> RAS will be cheaper than large platforms. Inflation means money already committed to defence has less purchasing power. RAS present an opportunity to achieve the same outcomes at a lower cost.</p>
<p>Meanwhile, in 2018, the Australian government outlined its intention to become a top-ten <a href="https://www.ft.com/content/d743d758-04b2-11e8-9650-9c0ad2d7c5b5">defence exporter</a>. There are keen <a href="https://breakingdefense.com/2022/03/aussies-aim-for-1b-in-exports-of-loyal-wingman-now-ghost-bat/">hopes</a> the Ghost Bat will become a successful defence export. </p>
<p>At the same time, the government is keen to <a href="https://apo.org.au/sites/default/files/resource-files/2016-02/apo-nid93621.pdf">build</a> closer ties between defence, industry and academia. Industry and academia both vie for defence funding, and this drives development of RAS.</p>
<p>Of course, the technology is new. It’s not guaranteed RAS will save lives, save money or achieve military advantage. The extent to which RAS will be used, and what they will be used for, is not foreseeable. It is in this uncertainty that New Zealand must make judgments about AWS and alliance management.</p>
<figure class="align-center ">
<img alt="Armed Autonomous aerial vehicle on runway" src="https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/479985/original/file-20220818-164-hnhgr1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Autonomous systems are seen as a way to save lives.</span>
<span class="attribution"><span class="source">Getty Images</span></span>
</figcaption>
</figure>
<h2>What this means for the trans-Tasman relationship</h2>
<p>Nuclear-powered submarines captured attention when Australia’s new AUKUS partnership with the US and UK was announced, but the pact’s primary purpose is much broader: a partnership that shares defence technology, including RAS. </p>
<p>The most recent statement from the AUKUS working groups <a href="https://www.gov.uk/government/news/readout-of-aukus-joint-steering-group-meetings--2">says</a> they “will seek opportunities to engage allies and close partners”. Last week, US Deputy Secretary of State Wendy Sherman made it clear New Zealand was one such <a href="https://www.rnz.co.nz/news/political/472583/us-would-have-conversations-with-new-zealand-if-time-comes-for-others-to-join-aukus-top-diplomat">partner</a>.</p>
<p>Australia’s focus on RAS, particularly in the context of AUKUS, may soon bring alliance questions to the fore. Strategic studies expert Robert Ayson has argued AUKUS, combined with increased strategic tension, <a href="https://pacforum.org/publication/pacnet-48-new-zealand-and-aukus-affected-without-being-included">means</a> that “year by year New Zealand’s alliance commitment to the defence of Australia will carry bigger implications”. AWS will play a role in these implications.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/nukes-allies-weapons-and-cost-4-big-questions-nzs-defence-review-must-address-188732">Nukes, allies, weapons and cost: 4 big questions NZ's defence review must address</a>
</strong>
</em>
</p>
<hr>
<p>AWS may seem an insignificant trans-Tasman difference compared to the use of nuclear technologies. But AWS come with a lot more uncertainty and fuzziness than, say, <a href="https://www.smh.com.au/world/oceania/not-in-our-waters-ardern-says-no-to-visits-from-australia-s-new-nuclear-subs-20210916-p58s7k.html">banning</a> nuclear-powered submarines in New Zealand waters. This fuzziness creates ample room for misperceptions and poor communication.</p>
<p>Trust in alliance relationships is easily damaged, and difficult to manage. Clear communication and ensuring a good understanding of each other’s positions is essential. The ambiguity of AWS makes these things difficult. </p>
<p>New Zealand and Australia may need to clarify their respective positions before Australia’s defence review is released next March. Otherwise, they run the risk of fuelling misunderstandings at a delicate moment for trans-Tasman relations.</p><img src="https://counter.theconversation.com/content/188520/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Sian Troath receives funding from The Royal Society of New Zealand Marsden Fund.</span></em></p>Diverging views on automated weapons systems could make it difficult for Australia and New Zealand to manage military ties at a delicate time in trans-Tasman relations.Sian Troath, Postdoctoral fellow, University of CanterburyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1808842022-06-15T03:30:05Z2022-06-15T03:30:05Z‘I couldn’t see a future’: what ex-automotive workers told us about job loss, shutdowns, and communities on the edge<figure><img src="https://images.theconversation.com/files/457026/original/file-20220407-11-t6cnqf.jpg?ixlib=rb-1.1.0&rect=48%2C97%2C3546%2C2274&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Economies are forever changing and the loss of some industries or businesses is part of that transformation. But change often comes at great cost for workers, many of whom are already vulnerable.</p>
<p>The stories of retrenched workers give us important insights into the often complex effects of job loss. To find out more about these experiences, we interviewed 28 workers made redundant from the auto sector around South Australia and Victoria over the past five years, as part of a larger research project about disadvantaged communities.</p>
<p>Our <a href="https://rsa.tandfonline.com/doi/full/10.1080/21681376.2022.2078737#.YqVksHZBw2w">paper, published in the journal Regional Studies, Regional Science</a>, reveals how economic change interrupts careers and life plans, casting people into new worlds of precarious work and long, indefinite journeys in search of security.</p>
<p>The stories of these automotive workers are not unique; they reflect the experiences of many workers in Australia who have faced retrenchment and redundancy as industries and businesses have closed.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-the-departure-of-toyota-holden-and-ford-really-means-for-workers-23137">What the departure of Toyota, Holden and Ford really means for workers</a>
</strong>
</em>
</p>
<hr>
<h2>Bad jobs are easy to find</h2>
<p>Since being retrenched, many of our interviewees have struggled to find a job that is secure, safe and pays a decent wage.</p>
<p>Bad jobs – with undesirable hours and low pay – are easy to find, and many are forced to take them. Many are also shocked by what they find at their new workplaces – poor safety standards, toxic cultures and boring or “disgusting” work. These include jobs as diverse as food processing, cleaning, warehousing, chicken killing and grout manufacturing. </p>
<p>As one worker who’d been made redundant three years before <a href="https://rsa.tandfonline.com/doi/full/10.1080/21681376.2022.2078737#.YqVksHZBw2w">told</a> us:</p>
<blockquote>
<p>I got a job as a prefabrication supervisor […] And that was absolutely horrible, horrible, horrible […] just the safety stuff, you know, like they talked a lot of safety, but there was never much action […] just a bullying culture.</p>
</blockquote>
<p>Another left a processing job with a food company after just two days, saying:</p>
<blockquote>
<p>I couldn’t do that job. It was absolutely disgusting. It was hot. They were arrogant towards you.</p>
</blockquote>
<p>Workers often left jobs quickly, or struggled through while looking for something else. The result was a high level of employment instability, as people cycled through multiple jobs searching for one they could tolerate long term.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two men working on automotive engineering." src="https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=402&fit=crop&dpr=1 600w, https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=402&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=402&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=505&fit=crop&dpr=1 754w, https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=505&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/457029/original/file-20220407-19249-cwi93t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=505&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ex-automotive workers shared their experiences candidly.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>‘It really, really scarred me’</h2>
<p>Workers at the bottom of the labour market often experience demanding or demoralising recruitment processes for casual positions through labour hire agencies. These workers are made to feel they can’t afford to be choosy:</p>
<blockquote>
<p>So labour hire, I just pretty much I just said yes to everything. And that’s the way, that’s the work in labour hire. If you start saying no, then you go to the back of the list.</p>
</blockquote>
<p>Casual jobs often serve as a kind of probation, but there are no guarantees:</p>
<blockquote>
<p>I couldn’t see a future. Yeah. So I would just continue to look around […] because I couldn’t see them taking me any further than casual.</p>
</blockquote>
<p>One worker who had already experienced bad employers <a href="https://rsa.tandfonline.com/doi/full/10.1080/21681376.2022.2078737#.YqVksHZBw2w">described</a> the difficult choice she faced:</p>
<blockquote>
<p>I would like [to leave this job and look for something] permanent. But I really don’t want to go into another workplace like [company name], it really, really scarred me.</p>
</blockquote>
<p>Workers want their old lives back – even if that’s not the “real world” any more. As one <a href="https://rsa.tandfonline.com/doi/full/10.1080/21681376.2022.2078737#.YqVksHZBw2w">put it</a>:</p>
<blockquote>
<p>I just think there’s a lot of work out there that, there’s just bits and pieces, and it doesn’t really support someone to have a proper job or be able to afford a decent life […] I’ve probably had maybe six, seven, eight jobs since [the closures]. And none of them have been that good. And I mean, I’ve hated most of them.</p>
</blockquote>
<h2>A new world of precarious work</h2>
<p>In many established sectors, workers once enjoyed good working conditions – often over decades of employment in what they believed were “jobs for life”. Job loss thrust them into a new world of precarious work very different from what they’d known.</p>
<p>Many were downhearted about this new reality:</p>
<blockquote>
<p>It’s just very, very dodgy […] it’s sad, really sad to think that there’s, like, these places out there. And there’s so many of them and they’re operating the way they do and, and nobody’s really controlling any of it.</p>
</blockquote>
<p>Some never stopped longing for a job that made them feel the way their old job did:</p>
<blockquote>
<p>I just miss [my old firm], I miss their way of working. Building up you as a person, as a team.</p>
</blockquote>
<p>Even those who had adjusted to their new working lives admitted that you needed to be willing to do anything:</p>
<blockquote>
<p>[T]here is work out there […] Too many people are too choosy, that’s the problem […] I didn’t give a shit what sort of work I did […] There’s money in shit.</p>
</blockquote>
<h2>Better jobs – not just more jobs</h2>
<p>At the start of the pandemic, the nation’s leaders talked about “building back better”. </p>
<p>For those living on the margins of our workforce and those made redundant through processes beyond their control, “building back better” means finding ways to create better – not just more – jobs.</p>
<p>Australian workers want security, decent conditions and job satisfaction, not a choice between one “shit” workplace and another.</p>
<p>Most of all, they want work they can build their lives around. If we don’t listen to the voices of those living on the fringe, the problems we know all too well today will haunt our communities into the future.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australias-choice-pay-for-a-car-industry-or-live-with-the-consequences-8305">Australia's choice: pay for a car industry, or live with the consequences</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/180884/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>This story is part of The Conversation's Breaking the Cycle series, which is about escaping cycles of disadvantage. It is supported by a philanthropic grant from the Paul Ramsay Foundation.</span></em></p><p class="fine-print"><em><span>Andrew Beer receives funding from Australian Research Council. </span></em></p>Our interviews with ex-automotive workers reveal how economic change interrupts lives, casting people into new worlds of precarious work and long, indefinite journeys in search of security.Helen Dinmore, Research Fellow, University of South AustraliaAndrew Beer, Executive Dean, UniSA Business, University of South AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1847902022-06-14T15:57:57Z2022-06-14T15:57:57ZFive of the world’s tiniest robots<figure><img src="https://images.theconversation.com/files/468754/original/file-20220614-12-4k2fet.jpg?ixlib=rb-1.1.0&rect=50%2C0%2C6659%2C4466&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The RoboFly</span> <span class="attribution"><a class="source" href="https://drive.google.com/drive/u/0/folders/16SfWScilJ02rqkcvoQk6BQMsowKSqJG9">University of Washington</a></span></figcaption></figure><p>Allow me to take you on a trip down my memory lane. As a young lad, a film I saw captured my imagination: <a href="https://www.imdb.com/title/tt0060397/">Fantastic Voyage</a>, a 1966 release about people shrunk to microscopic size and sent into the body of an injured scientist to repair his brain. The idea struck a chord with me. I envisioned one day science would be able to create some sort of miniature machine that performs medical procedures from the inside. </p>
<p>Fast forward several decades into the 21st century, when I started my career as a robotics researcher taking inspiration from neuroscience to implement artificial perception systems. I thought of robots as machines that range from the size of a pet animal to big devices designed to carry out heavy-duty chores. However, I soon started to hear the first hints about research into miniature robots playing exactly the type of role the miniature scientists in Fantastic Voyage acted out. Did this mean that what I imagined as a child was about to come true? </p>
<p>Recently, a team of researchers from Stanford University, California, achieved the first milestone towards the development of 7.8mm-wide origami robots: a proof-of-concept prototype. They dubbed it a <a href="https://www.nature.com/articles/s41467-022-30802-w">millirobot</a>. The robot uses the folding and unfolding of Kresling origami to roll, flip and spin. These robots are operated wirelessly using magnetic fields to move in narrow spaces and morph their shape for medical tasks, such as disease diagnosis, drug delivery and even surgery. They are part of a new trend in what is called <a href="https://spectrum.ieee.org/the-tiny-robots-will-see-you-now">“tiny robot” research</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/xnwQx55ZgOY?wmode=transparent&start=40" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>The range of technologies and uses for tiny robots is broad, from drones to pills, and from surveillance and rescue to <a href="https://www.science.org/doi/10.1126/scirobotics.aam6431">biomedicine</a>.</p>
<p>Here are five outstanding examples of tiny robots: </p>
<h2>1. Black hornet spy drones</h2>
<p>Designed and commercialised by American <a href="https://www.flir.co.uk/products/black-hornet-prs/?vertical=uas-norway&segment=uis">tech conglomerate Teledyne</a>, this drone gives foot soldiers covert awareness of potential threats. It’s small enough to fit into an adult’s palm and is almost silent, with a battery life of up to 25 minutes and a range of up to 2km. These drones transmit live video and high-definition images back to the operator. They <a href="https://boingboing.net/2022/02/25/this-is-the-us-militarys-200k-drone-that-fits-in-your-palm.html">cost $200,000</a> (£165,000).</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/9IkaP6XMNZw?wmode=transparent&start=470" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>2. RoboBee</h2>
<p>A robot inspired by the <a href="https://wyss.harvard.edu/technology/robobees-autonomous-flying-microrobots/">biology of a bee</a>. It’s about the size of a penny and has potential future uses in crop pollination, search and rescue missions and surveillance, as well as weather and climate monitoring. The model robot is powered and controlled by a small electrical tether. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/9JWGiyr9FcE?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>3. RoboFly</h2>
<p><a href="https://www.washington.edu/news/2018/05/15/robofly/">Very similar</a> to the RoboBee (in fact the development team from the University of Washington includes one of the original researchers behind the RoboBee). It’s slightly heavier than a toothpick, around the size of a real fly and powered by a laser beam that needs to be pointed towards its body. Robofly’s makers hope it will eventually be able to find gas leaks and harvest energy from radio frequency signals or use a little battery as a power source. </p>
<figure class="align-center ">
<img alt="Red laser beam points at RoboFly" src="https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/468753/original/file-20220614-16-x0bhx7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The RoboFly is powered by a laser.</span>
<span class="attribution"><a class="source" href="https://drive.google.com/drive/u/0/folders/16SfWScilJ02rqkcvoQk6BQMsowKSqJG9">MarkStone/University of Washington</a></span>
</figcaption>
</figure>
<h2>4. Micro-scallops</h2>
<p>Engineered scallops <a href="https://spectrum.ieee.org/robotic-microscallops-can-swim-through-your-eyeballs">a fraction of a millimetre</a> in size. These prototypes are designed to navigate inside the bloodstream or around the eye, and are intended for future medical applications. Like the millirobots, they are powered by an external magnetic field. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/eZ05z6ebKDQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>5. Rani Therapeutics’ robotic pill</h2>
<p>A <a href="https://www.ranitherapeutics.com/technology/">purple capsule</a> which can be swallowed and navigates through the stomach and intestines to inject drugs such as insulin into the intestinal wall, where there are no sharp pain receptors. </p>
<figure class="align-center ">
<img alt="Large purple pills held in the palm of a hand" src="https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=470&fit=crop&dpr=1 600w, https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=470&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=470&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=591&fit=crop&dpr=1 754w, https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=591&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/468813/original/file-20220614-18-tkoza3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=591&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Rani Therapeutics RaniPill.</span>
</figcaption>
</figure>
<p>All these systems seem to share the same <a href="https://spectrum.ieee.org/the-tiny-robots-will-see-you-now">challenges</a>. They are hard to power: there is still no battery technology small enough for energy storage on these robots. That means these robots either have a short operating life, or they need a tether or some kind of wireless energy source pointed at them. </p>
<p>They are also too small to carry and power fully-fledged artificial brains. The computational hardware you can mount on one of these robots to provide on-board intelligence will not provide the capability to go much beyond “how to flap its left wing”; nothing that would make them fully autonomous. The RoboBee would know how to flap its wings but would be incapable of deciding where to go on its own.</p>
<p>Tiny robots are closer than ever to mainstream use. Proofs of concept are here, and <a href="https://spectrum.ieee.org/the-tiny-robots-will-see-you-now">becoming more and more convincing</a>. But there is no guarantee this technology will be viable in the near future (the next ten years). Nonetheless, the child in me tells me that these tiny robots have a big future ahead of them.</p><img src="https://counter.theconversation.com/content/184790/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Joao Filipe Ferreira does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Robots a fraction of a millimetre small could swim around your eye or bloodstream for medical treatments in the future.Joao Filipe Ferreira, Senior lecturer in Computer Science, Nottingham Trent UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1828142022-05-11T12:05:04Z2022-05-11T12:05:04ZBeyond honey: 4 essential reads about bees<figure><img src="https://images.theconversation.com/files/462307/original/file-20220510-12-bnz0cm.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2330%2C1681&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Bumblebees at work, dotted with pollen.</span> <span class="attribution"><a class="source" href="https://flic.kr/p/cFk2Cm">Crabchick/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>As spring gardening kicks into high gear, bees emerge from hibernation and start moving from flower to flower. These hardworking insects play an essential role <a href="https://www.fs.fed.us/wildflowers/pollinators/animals/bees.shtml">pollinating plants</a>, but they’re also interesting for many other reasons. Scientists study bees to learn about their intricate social networks, learning patterns and adaptive behaviors. These four stories from The Conversation’s archive offer diverse views of life in the hive.</p>
<h2>1. Females are the future</h2>
<p>The survival of bee colonies <a href="https://theconversation.com/spring-signals-female-bees-to-lay-the-next-generation-of-pollinators-134852">depends on female bees</a>, although they play different roles depending on their species. In social bee species, females find nesting spots to establish new colonies and lay hundreds of eggs there. </p>
<p>Other species are solitary, meaning that each bee lives alone. Females create segmented nests, lay an egg in each segment, deposit a ball of pollen to feed the larva, and then die off. </p>
<p>Female bees need support, especially early in the year when foraging options are few, doctoral student <a href="https://www.researchgate.net/profile/Lila-Westreich">Lila Westreich</a> notes. “It’s best to provide female bees with many early spring flowers – they rely on nectar from flowers to fuel their search for a nesting spot. Planting early-flowering plants such as willow, poplar, cherry trees and other spring blooms provides nectar for queen bees,” she writes. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/spring-signals-female-bees-to-lay-the-next-generation-of-pollinators-134852">Spring signals female bees to lay the next generation of pollinators</a>
</strong>
</em>
</p>
<hr>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vf8QyIF3eoY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">In solitary bee species, females play the roles of queen and worker.</span></figcaption>
</figure>
<h2>2. Some bees are curious, others are focused</h2>
<p>All bees forage, but they do it in different ways. Some become very focused on the smell, colors and locations of known food sources and return to those flowers over and over. Others are more willing to explore and will change their behavior when they learn about new food sources. </p>
<p>As part of an experiment, Marquette University biologist <a href="https://scholar.google.com/citations?user=lGDvqJ8AAAAJ&hl=en">Chelsea Cook</a> and her colleagues bred populations of bees that were genetically programmed to be <a href="https://theconversation.com/some-bees-are-born-curious-while-others-are-more-single-minded-new-research-hints-at-how-the-hive-picks-which-flowers-to-feast-on-144900">either curious or focused</a>, and a colony that mixed these two styles together. Then they offered the bees a familiar food source and novel sources. Sure enough, the focused colony concentrated on the familiar source and the curious colony visited both known and novel sources. </p>
<p>In the mixed colony, bees came to concentrate more on the familiar source than the new ones over time. Why? The researchers observed how the bees communicated through their “waggle dance,” which tells nestmates where to find food, and saw that the focused bees were dancing faster. This conveyed their message more intensely than signals from slower dancers.</p>
<p>“Because curious bees are interested in everything, including new information about possible food locations, they are perfect listeners and are easily convinced to visit the chosen feeder of their enthusiastic nestmates,” Cook observes.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/some-bees-are-born-curious-while-others-are-more-single-minded-new-research-hints-at-how-the-hive-picks-which-flowers-to-feast-on-144900">Some bees are born curious while others are more single-minded – new research hints at how the hive picks which flowers to feast on</a>
</strong>
</em>
</p>
<hr>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/12Q8FfyLLso?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A scientist breaks down bees’ waggle dance.</span></figcaption>
</figure>
<h2>3. It takes a colony</h2>
<p>Bees communicate with one another about <a href="https://theconversation.com/honey-bees-cant-practice-social-distancing-so-they-stay-healthy-in-close-quarters-by-working-together-141106">many things besides food</a>. For example, bees use dancing to persuade their colony to move to a new nest site, write Providence College biologist <a href="https://scholar.google.com/citations?user=B6WmgvLL8vMC&hl=en">Rachael Bonoan</a> and Tufts University biologist <a href="https://scholar.google.com/citations?user=KF4sBDIAAAAJ&hl=en">Phil Starks</a>.</p>
<p>And bees work together to defend their colonies against external threats. Bonoan and Starks analyzed how honeybee colonies of varying sizes protected themselves against a fungus that causes a bee disease called chalkbrood. To do this, the researchers infected the colonies with the fungus and tracked the bees’ responses with thermal imaging.</p>
<p>The pathogen needs cool temperatures to infect bees, so the bees respond with heat. “When this pathogen is detected, worker bees protect the vulnerable young by contracting their large flight muscles to generate heat. This raises the temperature in the brood comb area of the hive just enough to kill the pathogen,” the biologists explain. Worker bees also remove diseased and dead young from the colony, which reduces the chance of infection spreading.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/honey-bees-cant-practice-social-distancing-so-they-stay-healthy-in-close-quarters-by-working-together-141106">Honey bees can't practice social distancing, so they stay healthy in close quarters by working together</a>
</strong>
</em>
</p>
<hr>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1395430362315067396&quot;}"></div></p>
<h2>4. Straining for the good of the swarm</h2>
<p>Computer scientist <a href="https://scholar.google.com/citations?user=xH5Ryy4AAAAJ&hl=en">Orit Peleg</a> at the University of Colorado Boulder studied yet another way in which bees work together for the good of the group. Peleg and her colleagues analyzed swarms that European honeybees form when a colony becomes so large that it’s about to split into two new groups. The relocating group forms a swarm that can hang from objects such as tree branches, and can change its shape, with each bee essentially holding hands with others next to it.</p>
<p>The scientists used a motor to shake a wooden board with a swarm of 10,000 honeybees hanging from the underside. By seeing how the swarm <a href="https://theconversation.com/what-a-bundle-of-buzzing-bees-can-teach-engineers-about-robotic-materials-125194">responded to shaking in various directions</a>, they hoped to gain insights that could inform the creation of adaptive structures made up of robots linked together. </p>
<p>“Using a computational model, we showed that bonds between bees located closer to where the swarm attaches to the board stretch more than bonds between bees at the far tip of the swarm,” Peleg recounts. “Bees could sense these different amounts of stretching, and use them as a directional signal to move upwards and make the swarm spread.”</p>
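<p>The bond-stretching idea can be illustrated with a toy model (a hypothetical sketch, not the researchers’ actual simulation): treat the swarm as a hanging chain of bees joined by springs. Each bond supports the weight of every bee below it, so bonds near the attachment point stretch most – exactly the gradient a bee could follow to move upwards.</p>

```python
# Toy model of a hanging bee swarm as a spring chain.
# All parameter values are illustrative, not taken from the study.

N = 10    # number of bees in the chain
k = 50.0  # spring stiffness of a bee-to-bee bond (arbitrary units)
m = 1.0   # mass per bee (arbitrary units)
g = 9.81  # gravitational acceleration

# Bond i (counting from the attachment point) carries the weight of the
# N - i bees hanging below it; Hooke's law gives its stretch.
stretches = [(N - i) * m * g / k for i in range(N)]

# Stretch decreases monotonically towards the free tip of the swarm, so
# "move towards the neighbour whose bond is more stretched" always points
# up the chain -- a purely local cue that produces global upward motion.
assert all(stretches[i] > stretches[i + 1] for i in range(N - 1))
```

In this sketch the directional signal emerges from mechanics alone: no bee needs to know where the attachment point is, only which of its own bonds is under more strain.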
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A football-shaped cluster of bees hangs from a branch." src="https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=806&fit=crop&dpr=1 600w, https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=806&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=806&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1013&fit=crop&dpr=1 754w, https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1013&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/462309/original/file-20220510-16-hqlk80.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1013&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Bee swarm on a tree branch in Arkansas.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Swarming_(honey_bee)#/media/File:Bee_Swarm.JPG">Mark Osgathard/Wikipedia</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Put another way, the bees moved from locations where bonds stretched less to locations where they stretched more. “This behavioral response improves the collective stability of the swarm as a whole at the expense of increasing the average burden experienced by the individual bee,” Peleg concludes. </p>
<p>They found that when they shook the board horizontally, the swarm spread out into a wider, more stable cone. But it was less able to react to vertical shaking and eventually broke apart. That’s because vertical shaking didn’t stretch the bonds between individual bees as much as horizontal shaking did, so the bees lacked the signal that would prompt the swarm to change its shape.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-a-bundle-of-buzzing-bees-can-teach-engineers-about-robotic-materials-125194">What a bundle of buzzing bees can teach engineers about robotic materials</a>
</strong>
</em>
</p>
<hr>
<p><em>Editor’s note: This story is a roundup of articles from The Conversation’s archive.</em></p><img src="https://counter.theconversation.com/content/182814/count.gif" alt="The Conversation" width="1" height="1" />
Bees offer insights into many scientific questions, from cooperating in close quarters to strategies for finding food.Jennifer Weeks, Senior Environment + Cities Editor, The ConversationLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1811692022-04-14T05:23:19Z2022-04-14T05:23:19ZArtificial intelligence may take your job. Some lessons from my grandmother<figure><img src="https://images.theconversation.com/files/457825/original/file-20220413-17-8go2vm.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Siblings on the way to school</span> </figcaption></figure><p>My grandmother, Claire Hastings, was born in the 1920s on a farm in Armidale, northern New South Wales. That was a relatively common thing, with just <a href="https://www.ausstats.abs.gov.au/ausstats/free.nsf/0/BB80C710FF1DD730CA2578F0001502EE/$File/1921%20Census%20-%20Bulletin%20No%202.pdf">43% of the population</a> living in cities, compared with <a href="https://www.abs.gov.au/ausstats/abs@.nsf/Lookup/2071.0main+features1132016">more than 70% now</a>. </p>
<p>She lived in a small wooden hut, with a chicken coop out the front and fields out the back. When she and her siblings came home from school, they helped plough the fields with a horse-drawn plough until sundown. </p>
<p>Little did she know this life would soon disappear. The “second industrial revolution” (of mass production and standardisation) was creating machines to replace human and horse power. A plough pulled by a tractor could do in hours what took Grandma and her siblings a week. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=451&fit=crop&dpr=1 600w, https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=451&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=451&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/457824/original/file-20220413-17-jddua2.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Grandma’s brother, John, working the plough (c1929)</span>
<span class="attribution"><span class="source">Bradley Hastings</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>By the time she left school, age 17, she wasn’t needed on the farm. So she instead went to college, became a teacher, got married and raised a family. Now 93, she lives in a comfy suburban four-bedroom home, enjoys dining at restaurants, and loves going to the theatre and on ocean cruises. </p>
<p>Her story is far from unique. Around the world, industrialisation has reduced farm employment enormously. In the United States, for example, 40% of the labour force worked on farms in 1920; <a href="https://www.mckinsey.com/%7E/media/mckinsey/industries/public%20and%20social%20sector/our%20insights/what%20the%20future%20of%20work%20will%20mean%20for%20jobs%20skills%20and%20wages/mgi-jobs-lost-jobs-gained-executive-summary-december-6-2017.pdf">now it is about 2%</a>.</p>
<p>The loss of those jobs, and their replacement, is worth remembering as we now confront the “fourth industrial revolution”, with robots and artificial intelligence tipped to take <a href="https://fortune.com/2019/01/10/automation-replace-jobs/">up to 40% of the jobs</a> now done by humans within two decades. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/behind-those-headlines-why-not-to-rely-on-claims-robots-threaten-half-our-jobs-125935">Behind those headlines. Why not to rely on claims robots threaten half our jobs</a>
</strong>
</em>
</p>
<hr>
<p>The <a href="https://www.mckinsey.com/%7E/media/mckinsey/industries/public%20and%20social%20sector/our%20insights/what%20the%20future%20of%20work%20will%20mean%20for%20jobs%20skills%20and%20wages/mgi%20jobs%20lost-jobs%20gained_report_december%202017.pdf">hit list is long</a>, from drivers and call-centre workers to computer programmers and university lecturers like myself (we face being replaced by AI avatars, delivering animated content online).</p>
<p>But just as disappearing farm jobs didn’t lead to permanent mass unemployment, so we need not fear this next stage of technological development.</p>
<h2>Improving quality of life</h2>
<p>While industrial farming was not universally embraced as progress, the huge reductions in farming labour over the 20th century were key to a better life for most people (though poverty and glaring economic inequality still exist). </p>
<p>To cite just one measure, when my grandmother was born the average life expectancy in Australia was 60 years. Now it’s <a href="https://www.abs.gov.au/AUSSTATS/abs@.nsf/Lookup/4102.0Main+Features10Mar+2011">more than 80</a>.</p>
<p>The underlying forces driving such advances are twofold. </p>
<p>First, the mechanisation of farming made food cheaper. <a href="https://www.aei.org/carpe-diem/over-100-years-food-prices-have-fallen-by-82/">US data</a> shows the price of a common basket of groceries is now about 80% cheaper than a century ago. Similar trends exist for virtually every other consumable product. </p>
<p>Second, spending less on food meant people could spend more on other things. New industries sprang up – automobiles, holidays, health care, finance, fitness, education and so on. Sectors virtually unknown in the 1920s now employ more than <a href="https://www.mckinsey.com/%7E/media/mckinsey/industries/public%20and%20social%20sector/our%20insights/what%20the%20future%20of%20work%20will%20mean%20for%20jobs%20skills%20and%20wages/mgi-jobs-lost-jobs-gained-executive-summary-december-6-2017.pdf">half of the population</a>.</p>
<hr>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Visualising 150 years of employment history (US)." src="https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=326&fit=crop&dpr=1 600w, https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=326&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=326&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=410&fit=crop&dpr=1 754w, https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=410&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/458071/original/file-20220414-18-waic33.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=410&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">McKinsey</span></span>
</figcaption>
</figure>
<hr>
<p>These new industries have both underpinned improvements in our quality of life and, crucially, created new jobs. </p>
<p>As artificial intelligence and robotics develop, services such as banking, insurance and transport will become cheaper. As a consequence, we will have more money to spend on other items – on health and fitness, travel and leisure and possibilities yet to be conceived. </p>
<p>Whatever these new or expanded industries are, jobs will evolve at the same time as quality of life improves for all. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/artificial-intelligence-can-deepen-social-inequality-here-are-5-ways-to-help-prevent-this-152226">Artificial intelligence can deepen social inequality. Here are 5 ways to help prevent this</a>
</strong>
</em>
</p>
<hr>
<h2>Two lessons from my grandmother</h2>
<p>None of this, of course, will necessarily make you feel better if you have (and love) a job under threat from automation. </p>
<p>Some lessons from my grandma’s life may help. </p>
<p>First, she didn’t take the changes personally. She understood that times were changing, and that she would have to change with them. She embraced the challenge rather than being defeated by it.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="The author with his grandmother on her 90th birthday." src="https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=291&fit=crop&dpr=1 600w, https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=291&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=291&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=366&fit=crop&dpr=1 754w, https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=366&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/458068/original/file-20220414-13-uxe72h.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=366&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The author with his grandmother on her 90th birthday.</span>
<span class="attribution"><span class="source">Bradley Hastings</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Second, she understood she had to develop new skills. At the same time as farm jobs were diminishing, she saw growing demand for teachers, underpinned by government regulations requiring children to stay in school longer. So too, today, education is the key to future jobs. </p>
<p>None of us knows what the future holds. But if our collective future is to replicate the advances my grandmother has seen over her life, it is inevitable that artificial intelligence and robots will take over jobs, just as the tractor once did.</p>
<p>I asked grandma if we should be worried. “Life moves on,” she told me. </p>
<p>And so must we.</p>
<p class="fine-print"><em><span>Bradley Hastings does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Up to 40% of all jobs now are tipped to be taken over by AI and robots in the next few decades. My grandmother, born on a farm almost a century ago, has some advice on how to cope.
Bradley Hastings, Research Fellow, UNSW Sydney
Licensed as Creative Commons – attribution, no derivatives.