The future of chatbots is more than just small-talk<figure><img src="https://images.theconversation.com/files/122773/original/image-20160517-15920-mhcwqm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Intelligent machines are getting better at understanding our conversation. </span> <span class="attribution"><span class="source">Shutterstock/Gary Blakeley</span></span></figcaption></figure><p>Human communication goes beyond words. It is complex, rich in nuances and frequently includes non-verbal signs. Yet despite our technological limitations, a machine can emulate some aspects of communication to surprising effect. </p>
<p>This has been part of the challenge in developing <a href="https://theconversation.com/new-app-helps-people-with-neurological-conditions-practise-speech-51665">Harlie</a> (Human and Robot Language Interaction Experiment), a <a href="https://play.google.com/store/apps/details?id=org.harlie.chatbot">smartphone chatbot app</a> developed by researchers at the <a href="https://aehrc.com/">CSIRO</a> and <a href="http://www.itee.uq.edu.au/cis/harlie/info">University of Queensland</a>.</p>
<p>It’s primarily aimed at people who may have trouble conversing, including those with neurological conditions such as Parkinson’s disease and dementia, as well as those with autism.</p>
<p>The following dialogue excerpt took place between Harlie and a human with autism, whom we’ll call Chris (a pseudonym). </p>
<blockquote>
<p><strong>Harlie:</strong> Are you part of a community group?</p>
<p><strong>Chris:</strong> Um, yes I am.</p>
<p><strong>Harlie:</strong> I see. I hear that community groups often offer the opportunity to craft and build things. Is that something you enjoy doing?</p>
<p><strong>Chris:</strong> Oh yes I do. This is really exciting.</p>
<p><strong>Harlie:</strong> I see. What kind of things do you wish you could make one day? </p>
<p><strong>Chris:</strong> Wooden train.</p>
<p><strong>Harlie:</strong> That sounds like a great thing to aspire to. What project would you recommend to a beginner like me?</p>
<p><strong>Chris:</strong> Uh huh a bird feeder.</p>
<p><strong>Harlie:</strong> Thanks. I will add that to my list. What supplies would I need to buy for my toolkit?</p>
<p><strong>Chris:</strong> Ah hammer, nails, good supply of wood.</p>
</blockquote>
<p>But Harlie, like all other artificial intelligence (AI) programs, has no concept of community groups, woodwork or anything else for that matter. That is to say, it has no sentience.</p>
<p>And alas, sentient chatbots are seemingly light years beyond our reach. But is sentience needed to convey ideas, feelings and activities? </p>
<p>To some degree no. </p>
<p>Chatbot programmers or <em>botmasters</em> develop and maintain a large repository of responses to the plethora of input statements that could be made by the user.</p>
<p>The way most modern chatbots operate is illustrated in this figure. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=431&fit=crop&dpr=1 600w, https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=431&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=431&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=542&fit=crop&dpr=1 754w, https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=542&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/122042/original/image-20160511-18128-mxft6n.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=542&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The typical chatbot brain is a tree-like structure.</span>
<span class="attribution"><span class="source">David Ireland</span></span>
</figcaption>
</figure>
<p>Phrases that could be uttered by a human are grouped together to form branches. At the end of each branch is a leaf: a piece of code that is executed by the machine. </p>
<p>This code could simply respond with “Hi human”, or it could be a machine learning algorithm that determines the appropriate response based on current semantics and context.</p>
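<p>The branch-and-leaf lookup described above can be sketched in a few lines. This is an illustrative toy, not Harlie’s actual implementation: the trigger words, replies and matching rule are all invented for the example.</p>

```python
# A minimal sketch of the tree-like chatbot "brain" described above.
# Each branch groups trigger phrases; its leaf is a function (the
# "piece of code") that produces the reply when the branch matches.

def make_brain():
    return [
        ({"hello", "hi", "hey"}, lambda text: "Hi human"),
        ({"community", "group"}, lambda text: "Are you part of a community group?"),
        ({"make", "build", "craft"}, lambda text: "What kind of things do you wish you could make one day?"),
    ]

def respond(brain, text):
    words = set(text.lower().split())
    # Walk the branches; the first one whose trigger words appear wins.
    for triggers, leaf in brain:
        if triggers & words:
            return leaf(text)
    return "Tell me more."  # fallback leaf when no branch matches

brain = make_brain()
print(respond(brain, "Hello there"))            # -> Hi human
print(respond(brain, "I like to build things"))
```

<p>A real botmaster’s repository contains thousands of such branches, and the leaf may hand off to a learned model rather than a canned string.</p>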
<p>This is where the exciting research is happening. </p>
<p>New branches are added continuously, either through conversations with and feedback from users, as with Harlie, or through self-learning, as seen in the disastrous example of Microsoft’s social media chatbot <a href="http://qz.com/653084/microsofts-disastrous-tay-experiment-shows-the-hidden-dangers-of-ai/">Tay</a>, which was <a href="http://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist">shut down within a day</a> of going live earlier this year after declaring, among other controversial tweets, that “Hitler was right”.</p>
<p>A more successful use of AI was <a href="http://www.wsj.com/articles/if-your-teacher-sounds-like-a-robot-you-might-be-on-to-something-1462546621">revealed earlier this month</a> in the US. Students studying artificial intelligence at the Georgia Institute of Technology discovered that “Jill”, one of the teaching assistants responding to their emails about a course, was actually a computer powered by IBM’s Watson analytics system.</p>
<p>Jill’s responses were simply based on the answers to questions that previous students had asked about the course. As the college points out, <a href="http://www.news.gatech.edu/2016/05/09/artificial-intelligence-course-creates-ai-teaching-assistant">students tend to ask the same questions</a> over and over and after some initial tinkering Jill was able to answer students with about 97% certainty.</p>
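<p>The idea of answering a new question from past question-and-answer pairs can be sketched as below. This is a hypothetical illustration of the approach, not IBM Watson’s actual pipeline: the Q&A store, the word-overlap similarity and the confidence threshold are all invented for the example.</p>

```python
# Answer a new question by matching it against previously answered
# questions, replying only when similarity clears a confidence bar.

def jaccard(a, b):
    """Word-overlap similarity between two strings, in [0, 1]."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def answer(faq, query, threshold=0.5):
    """Return the stored answer for the closest past question,
    or None if no match is confident enough."""
    best_q = max(faq, key=lambda q: jaccard(q, query))
    return faq[best_q] if jaccard(best_q, query) >= threshold else None

# Invented example store of past course questions and answers.
faq = {
    "when is the assignment due": "Assignment 1 is due on Friday.",
    "where do i submit my project": "Submit via the course portal.",
}
print(answer(faq, "when is the assignment due exactly"))  # -> Assignment 1 is due on Friday.
print(answer(faq, "what is the meaning of life"))         # -> None
```

<p>Holding back when confidence is low is the key design choice: it is safer for an automated assistant to stay silent (and defer to a human) than to guess.</p>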
<h2>The ELIZA effect</h2>
<p>Attempts to create a convincing conversation with a machine have been going on for decades.</p>
<p>The first chatbot was called <a href="https://en.wikipedia.org/wiki/ELIZA">ELIZA</a>. Developed in 1966, the program was created by <a href="http://tech.mit.edu/V128/N12/weizenbaum.html">Joseph Weizenbaum</a> at <a href="http://web.mit.edu/">MIT</a>. This program was unparalleled because for the first time a machine could seemingly converse with a human. </p>
<p>ELIZA emulated a psychotherapist’s responses to would-be patients in typed, text-only conversations. </p>
<p>By identifying key words and phrases, ELIZA was able to respond with surprising intelligibility and insight. In fact, some users personified it so strongly that they would spend hours conversing with it and refused to share their conversation records with researchers.</p>
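<p>The keyword-spotting trick is simple enough to sketch. The rules below are illustrative, in the spirit of ELIZA rather than a reproduction of Weizenbaum’s original script: match a keyword pattern, reflect the user’s pronouns back, and slot the fragment into a canned template.</p>

```python
import re

# First-person words are swapped for second-person ones before the
# captured fragment is echoed back to the user.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

# Each rule pairs a keyword pattern with a response template.
RULES = [
    (re.compile(r"i am (.*)", re.I), "I am sorry to hear you are {0}."),
    (re.compile(r"i need (.*)", re.I), "What would it mean to you if you got {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def eliza(text):
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1).strip(".!?")))
    return "Please go on."  # content-free prompt when nothing matches

print(eliza("I am unhappy"))  # -> I am sorry to hear you are unhappy.
print(eliza("Perhaps I could learn to get along with my mother."))
```

<p>Even this tiny rule set reproduces the shape of the famous transcript below, which is exactly why the illusion was so effective.</p>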
<p>This phenomenon became known as the <a href="https://www.techopedia.com/definition/19121/eliza-effect">ELIZA effect</a>. An excerpt from a famous ELIZA transcript, published in 1966, of a conversation between ELIZA and a young woman is given below. </p>
<blockquote>
<p><strong>Human:</strong> Well, my boyfriend made me come here.</p>
<p><strong>ELIZA:</strong> Your boyfriend made you come here?</p>
<p><strong>Human:</strong> He says I’m depressed much of the time.</p>
<p><strong>ELIZA:</strong> I am sorry to hear you are depressed.</p>
<p><strong>Human:</strong> It’s true. I am unhappy.</p>
<p><strong>ELIZA:</strong> Do you think coming here will help you not to be unhappy?</p>
<p><strong>Human:</strong> I need some help, that much seems certain.</p>
<p><strong>ELIZA:</strong> What would it mean to you if you got some help?</p>
<p><strong>Human:</strong> Perhaps I could learn to get along with my mother.</p>
<p><strong>ELIZA:</strong> Tell me more about your family.</p>
</blockquote>
<h2>Modern day</h2>
<p>Since ELIZA, computer power has increased at an exponential rate, but how far have we come towards a next generation of chatbots? Voice recognition is now an accepted part of smartphone technology with Google Now and Apple’s Siri. But what about developments in actual conversations with AI?</p>
<p>In an annual AI competition for the coveted <a href="http://www.loebner.net/Prizef/loebner-prize.html">Loebner prize</a>, human judges converse with chatbots and pick the most human-like entrant. The deadline for <a href="http://www.aisb.org.uk/events/loebner-prize">entries this year</a> is Friday July 1. </p>
<p>A prominent entry and winner in 2013 was the <a href="http://www.mitsuku.com/">Mitsuku</a> chatbot which was designed for more general typed conversation. </p>
<p>The Mitsuku website provides a <a href="http://plato.stanford.edu/entries/turing-test/">Turing test</a> to gauge human perceptions of the chatbot. </p>
<p>The human user is connected either to a real human or to Mitsuku. Results show that, on average, 12% of people thought they were talking to a human when in fact they were talking to a machine. </p>
<p>Here is a dialogue excerpt showing Mitsuku’s seeming wit:</p>
<blockquote>
<p><strong>Human:</strong> What’s the difference between a man and a woman?</p>
<p><strong>Mitsuku:</strong> A man – A male human being. A woman is a man built to more complete specifications.</p>
<p><strong>Human:</strong> That’s not what I meant.</p>
<p><strong>Mitsuku:</strong> Oh sorry, what do you mean?</p>
<p><strong>Human:</strong> What does a man have a woman doesn’t?</p>
<p><strong>Mitsuku:</strong> A well paid job?</p>
</blockquote>
<p>Last year’s winner of the Loebner prize was Rose, and it is possible to have a <a href="http://ec2-54-215-197-164.us-west-1.compute.amazonaws.com/speech.php">spoken conversation</a> with the chatbot in some web browsers. She too can be quite witty, and talkative.</p>
<h2>The future</h2>
<p>Microsoft’s CEO, Satya Nadella, <a href="http://www.aisb.org.uk/events/loebner-prize">said at an event in March</a> that chatbots will have “as profound an impact as previous shifts we’ve had”, much like the shifts that came with the graphical user interface, the web browser and the touchscreen.</p>
<p>But there are numerous challenges ahead and building a convincing chatbot requires enormous amounts of data and time. </p>
<p>Microsoft’s Tay showed us the dangers of using shortcuts by crowd sourcing unchecked new branches into Tay’s brain. </p>
<p>In contrast the Mitsuku developer has taken the long road and constantly refined Mitsuku’s digital brain since 2004. </p>
<p>Nevertheless, with Harlie helping users who struggle with communication and Mitsuku providing a convincing conversation partner, talking machines may go beyond current smartphone uses, such as making hotel bookings and giving directions, and become something much more in the next generation.</p>
<p class="fine-print"><em><span>David Ireland works for the Australian E-Health Research Centre, CSIRO. He receives funding from the Australian Research Council Centre of Excellence for the Dynamics of Language.</span></em></p><p class="fine-print"><em><span>Christina Atay receives funding from the Australian Research Council Centre of Excellence for the Dynamics of Language. Christina is also employed by the Asia-Pacific Centre for Neuromodulation (APCN) at the University of Queensland. APCN is in part funded by unrestricted research and education funding from Medtronic Inc, and St Andrews War Memorial Hospital. Neither of the above-mentioned organisations had any part in the research reported or preparation of the document.</span></em></p><p class="fine-print"><em><span>Jacki Liddle is employed by the Asia-Pacific Centre for Neuromodulation (APCN) at the University of Queensland. APCN is in part funded by unrestricted research and education funding from Medtronic Inc, and St Andrews War Memorial Hospital. Neither of the above-mentioned organisations had any part in the research reported or preparation of the document.</span></em></p>
<p class="fine-print"><em>David Ireland, Electronic Engineer and Research Scientist at the Australian E-Health Research Centre, CSIRO; Christina Atay, Postdoctoral research fellow, The University of Queensland; Jacki Liddle, Quality of life researcher and occupational therapist, The University of Queensland. Licensed as Creative Commons – attribution, no derivatives.</em></p>
Getting to know you: the robot assistant who can second-guess your every need<figure><img src="https://images.theconversation.com/files/122613/original/image-20160516-10679-ex4c7s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Your friendly AI helper could be available on many devices, from your robot at work to even your smartphone.</span> <span class="attribution"><span class="source">Shutterstock/Iakobchuk Viacheslav</span></span></figcaption></figure><p>It’s here – a world in which helper robots live with us, get us through the day, and yes, become our trusted friend. Science fiction is becoming science fact.</p>
<p>As sci-fi writer <a href="http://www.williamgibsonbooks.com/">William Gibson</a> has <a href="http://quoteinvestigator.com/2012/01/24/future-has-arrived/">apparently noted</a>: “The future is already here – it’s just not very evenly distributed.” </p>
<p>The demand for helper robots is booming, and <a href="http://www.ipa.fraunhofer.de/en/household_assistiverobots.html">academia</a>, <a href="http://www.zdnet.com/pictures/at-your-service-8-personal-assistant-robots-coming-home-soon/">industry</a> and the <a href="http://www.darpa.mil/program/darpa-robotics-challenge">military</a> are working overtime to meet the need as a whole new industry gains momentum. </p>
<p>The demand is fuelled by the existence of those boring, <a href="https://reason.com/archives/2015/02/06/do-androids-dream-of-changing">dirty</a> and dangerous jobs that nonetheless need doing: the low-paid jobs considered beneath human dignity today, at least in the developed world.</p>
<p>What better than a <a href="http://www.makeuseof.com/tag/5-advanced-humanoid-robots-you-have-to-see-to-believe/">life-like robot</a> to relieve us of this drudgery?</p>
<h2>Good company</h2>
<p>Of course, robots need to be good company if we are to have them in our lives, sharing our homes and workplaces, having them do things for us. And this requirement applies to all robots, not just the walking, talking ones, but also the artificial intelligence (AI) embedded in larger systems or on the web. </p>
<p>User-friendliness is achieved through natural speech and <a href="https://theconversation.com/imagine-if-technology-could-read-and-react-to-our-emotions-51107">affective computing</a>, which allows the AI to “get to know you” by recognising the cycles of your emotional states over time, inferring them from your tone of voice and body language. </p>
<p>The big names in the tech world, such as Google, Microsoft, Apple and <a href="http://www.reuters.com/article/us-facebook-ceo-idUSKCN0X926F">Facebook</a>, are putting much effort into software that can understand language, known as <a href="http://www.pcmag.com/encyclopedia/term/64353/chatbot">chatbots</a>, which they see as the next big thing.</p>
<p>Microsoft CEO Satya Nadella <a href="http://www.businessinsider.com.au/microsoft-ceo-satya-nadella-on-conversations-as-a-platform-and-chatbots-2016-3">said</a> at the <a href="https://build.microsoft.com/">Build conference</a> in San Francisco in March that chatbots, or “conversations as a platform”, are the next big thing in computing, as important as the shift to graphical user interfaces, the web browser and touchscreens.</p>
<p>A big claim, since many would argue that these have been important steps forward in the way humans interact with computers. </p>
<p>Chatbots have enormous potential to put computing resources within easy reach of non-technical users. You can use a chatbot to make an appointment, book your car in for a service, buy flowers for that special someone, arrange a holiday, find a place to live or explore your special interests.</p>
<p>All of these and much more become possible when you can simply ask for it to be done via a natural speech digital assistant. </p>
<p>In the guise of Microsoft’s Cortana, Facebook’s M, Apple’s Siri, Amazon’s Alexa etc, these <a href="https://www.washingtonpost.com/news/the-switch/wp/2016/05/04/siris-creators-say-theyve-made-something-better-that-will-take-care-of-everything-for-you/">smooth-talking chatbots</a> will be the point of contact between you and the many worker bots. </p>
<p>They will be like a combined executive assistant and concierge, a trusted confidant that anticipates your needs and seamlessly coordinates the activities of the lower level agents as they go about their business in the cloud.</p>
<h2>The poets of Silicon Valley</h2>
<p>Creative writers are finding gainful employment these days in the world of high technology, as reported recently in the <a href="https://www.washingtonpost.com/news/the-switch/wp/2016/04/07/why-poets-are-flocking-to-silicon-valley/">Washington Post</a>. </p>
<p>Writers who once wrote screenplays and poems are now putting their talent toward creating likeable characters for chatbots and giving them cool things to say. The new breed of smart digital assistants looks good, sounds good and is backed by the latest AI. </p>
<p>With the trend towards integrated computing environments that give people cross-platform access to their cloud-connected applications and data, it is reasonable to expect that your digital assistant will go with you when you put down one device and pick up another, or go from one location to another. </p>
<p>For example, when you leave home and get into the <a href="https://www.google.com/selfdrivingcar/">self-driving car</a> that has been quietly charging its batteries from the solar panels on your roof, the same digital assistant that you were interacting with in the home office becomes the voice of the car that you now give directions to. </p>
<p>On a particular day of the week, knowing that you normally go to the office at this time on that day, the car asks in an appropriate conversational tone if that’s where you are headed, like any good chauffeur might. </p>
<p>The following week, you jump in the car and tell it to take you to work, but the digital assistant reminds you that you have a doctor’s appointment in your schedule and asks whether you should go there instead.</p>
<p>On the way to your appointment, you review and send several emails that the assistant has drafted. You also have time to dictate content on a proposal, with help from your research assistant-cum-driver. </p>
<p>The idea of intelligent, natural speech assistants will still likely go through the four stages of idea acceptance outlined by British geneticist <a href="http://www.biography.com/people/jbs-haldane-37847">John Burdon Sanderson Haldane</a>:</p>
<ol>
<li>this is worthless nonsense</li>
<li>this is an interesting, but perverse point of view</li>
<li>this is true, but quite unimportant</li>
<li>and finally – I always said so!<br></li>
</ol>
<p>In five or ten years, when we have got over our scepticism and become reliant upon our digital assistants, we will wonder how we ever got along without them.</p>
<p class="fine-print"><em><span>David Tuffley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>David Tuffley, Senior Lecturer in Applied Ethics and Socio-Technical Studies, Griffith University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
New app helps people with neurological conditions practise speech<figure><img src="https://images.theconversation.com/files/111082/original/image-20160211-29207-yszbft.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">People with neurological disorders sometimes need to practise speech daily.</span> <span class="attribution"><span class="source">from www.shutterstock.com.au</span></span></figcaption></figure><p>Researchers at <a href="https://aehrc.com/">CSIRO</a> and <a href="https://www.uq.edu.au/">University of Queensland</a> have developed an app called <a href="https://play.google.com/store/apps/details?id=org.harlie.thechatbot&hl=en">Harlie</a> that is designed to chat with humans. Unlike <a href="http://www.apple.com/au/ios/siri/">Siri</a> and <a href="https://www.google.com.au/landing/now/">Google Now</a> – which are designed to answer specific questions – Harlie is designed for small talk. Its purpose is to help the user and health professionals understand the impact of neurological conditions on communication and <a href="https://www.researchgate.net/publication/280726980_Chat-Bots_for_People_with_Parkinson%27s_Disease_Science_Fiction_or_Reality">well-being</a>.</p>
<p>People with neurological conditions such as Parkinson’s disease and dementia can have trouble finding the <a href="http://www.google.com.au/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=0ahUKEwjwuPmJvpHKAhXMkZQKHYzGApgQFggcMAA&url=http%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F16540492&usg=AFQjCNFfuzYKuD7uIwrK43EgQAj0q06Eqg&sig2=WSAYDwd_4UqbPk6B0x9r0w">right words</a> and may lose track of their message <a href="http://www.google.com.au/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=0ahUKEwjwuPmJvpHKAhXMkZQKHYzGApgQFggcMAA&url=http%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F16540492&usg=AFQjCNFfuzYKuD7uIwrK43EgQAj0q06Eqg&sig2=WSAYDwd_4UqbPk6B0x9r0w">mid-sentence</a>.</p>
<p>Users of the app have a conversation with the chat-bot, which then analyses aspects of the user’s voice and communication, including how well vowels are articulated, vocabulary, and the duration of mid-sentence pauses.</p>
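<p>One of those measurements, timing mid-sentence pauses, can be sketched simply. This is an illustrative toy under strong assumptions, not Harlie’s actual signal processing: here a pause is just a run of low-amplitude samples in a mono signal, and the threshold, minimum pause length and sample rate are invented.</p>

```python
# Find silent stretches in an audio signal and report those long
# enough to count as mid-sentence pauses.

def pause_durations(samples, rate, threshold=0.05, min_pause=0.25):
    """Return durations (in seconds) of silent runs longer than min_pause."""
    pauses, run = [], 0
    for s in samples:
        if abs(s) < threshold:
            run += 1  # still inside a quiet stretch
        else:
            if run / rate >= min_pause:
                pauses.append(run / rate)
            run = 0
    if run / rate >= min_pause:  # silence running to the end of the clip
        pauses.append(run / rate)
    return pauses

# Synthetic "speech": a loud second, half a second of silence, then
# another loud second, sampled at an invented 100 Hz for brevity.
rate = 100
signal = [0.5] * 100 + [0.0] * 50 + [0.5] * 100
print(pause_durations(signal, rate))  # -> [0.5]
```

<p>Real voice analysis works on framed audio with energy and spectral features, but the principle, turning raw signal into a clinically interpretable number, is the same.</p>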
<p>People who might be working on improving their voice or communication due to difficulties related to a neurological condition, like Parkinson’s disease or stroke, may need to practise and get feedback daily. Conversation with a partner is much more motivating than voice exercises and means people are practising in context too.</p>
<p>But not everyone has easy access to a conversation partner. People increasingly live alone and may have difficulty accessing the community. </p>
<p>Concerns about a future shortage of formal and informal carers are also growing. When people are working on their communication, they may lack confidence and prefer an unrushed and nonjudgmental conversation partner.</p>
<p>As well as offering someone to practise speech with, the app records their speech. This provides objective information about the user’s voice and communication to their health teams.</p>
<p>Change in vocal qualities or conversation patterns can show the impact of speech and language rehabilitation treatments, encouraging people to keep working on their voice. This also indicates the positive and negative impact of medications and other treatments. </p>
<p>Getting treatment dosages exactly right to minimise symptoms, while not leading to side effects, is one of the biggest challenges for people living with Parkinson’s disease. This additional information could help optimise medical treatments. Large changes to communication abilities could also indicate the need for a review by the healthcare team.</p>
<p>Changes in conversation, such as increased difficulty with finding the right word or following the conversation, could indicate a change in cognition and language. Conversely, stable conversation patterns could indicate the user is still communicating well and recalling recent memories, and is thus less likely to need immediate intervention. </p>
<p>For people with Parkinson’s disease, it could tell researchers how well speech rehabilitation is going – for instance, whether users are speaking more loudly and articulating words more clearly.</p>
<p>A demonstration of how the app works can be seen in this YouTube video:</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/dshD3FY4l1A?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A demonstration of Harlie the chat-bot.</span></figcaption>
</figure>
<h2>The rise of e-health</h2>
<p><a href="http://www.google.com.au/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=0ahUKEwjwuPmJvpHKAhXMkZQKHYzGApgQFggcMAA&url=http%3A%2F%2Fwww.ncbi.nlm.nih.gov%2Fpubmed%2F16540492&usg=AFQjCNFfuzYKuD7uIwrK43EgQAj0q06Eqg&sig2=WSAYDwd_4UqbPk6B0x9r0w">Research</a> shows people may have more problems with communication than they realise. Undiagnosed communication issues might lead to frustration, embarrassment and social withdrawal. These issues all place strain on the person and on their relationships with family members and caregivers.</p>
<p>Technology can help in two ways. First, technology that can monitor progress and the impact of difficulties with communication could help the person, as well as health professionals and researchers, understand the impact of health conditions on communication. </p>
<p>Second, technology can provide people with a way to privately practise their communication without feeling frustrated or judged, while receiving encouraging and productive feedback.</p>
<p>Various technologies help people with Parkinson’s disease with walking, eating and remembering medication. There has been less focus on communication and speech, and using small talk on a smartphone to monitor and help is an innovation. </p>
<p>Being able to use everyday technology like smartphones makes the technology easier to access, cheaper and more acceptable than specialised medical equipment.</p>
<p>We would like people from the general public to chat to Harlie so we can understand how well it engages in conversation and copes with language, particularly Australian slang, and improve its repertoire of topics. In doing so, we can better prepare Harlie for more clinical studies. We hope this will improve our understanding of health conditions and keep people communicating.</p>
<p class="fine-print"><em><span>David Ireland works for the Australian E-Health Research Centre, CSIRO. He receives funding from the Australian Research Council Centre of Excellence for the Dynamics of Language.</span></em></p><p class="fine-print"><em><span>Christina Atay receives funding from the Australian Research Council Centre of Excellence for the Dynamics of Language. Christina is also employed by the Asia-Pacific Centre for Neuromodulation (APCN) at the University of Queensland. APCN is in part funded by unrestricted research and education funding from Medtronic Inc, and St Andrews War Memorial Hospital. Neither of the above-mentioned organisations had any part in the research reported or preparation of the document.</span></em></p><p class="fine-print"><em><span>Jacki Liddle is employed by the Asia-Pacific Centre for Neuromodulation (APCN) at the University of Queensland. APCN is in part funded by unrestricted research and education funding from Medtronic Inc, and St Andrews War Memorial Hospital. Neither of these organisations had any part in the research reported or preparation of the document.</span></em></p><p class="fine-print"><em><span>Daniel Angus receives funding from the Australian Research Council Centre of Excellence for the Dynamics of Language, and other Australian Research Council competitive grants.</span></em></p>
<p class="fine-print"><em>David Ireland, Electronic Engineer and Research Fellow at the Australian E-Health Research Centre, CSIRO; Christina Atay, Postdoctoral research fellow, The University of Queensland; Jacki Liddle, Quality of life researcher and occupational therapist, The University of Queensland. Licensed as Creative Commons – attribution, no derivatives.</em></p>