tag:theconversation.com,2011:/africa/topics/computer-science-6612/articlesComputer science – The Conversation2024-01-02T16:49:59Ztag:theconversation.com,2011:article/2177332024-01-02T16:49:59Z2024-01-02T16:49:59ZAI can now attend a meeting and write code for you – here’s why you should be cautious<p>Microsoft recently <a href="https://blogs.microsoft.com/blog/2023/09/21/announcing-microsoft-copilot-your-everyday-ai-companion/">launched</a> a new version of all of its software with the addition of an artificial intelligence (AI) assistant that can do a variety of tasks for you. <a href="https://adoption.microsoft.com/en-us/copilot/">Copilot</a> can summarise verbal conversations on <a href="https://support.microsoft.com/en-us/office/join-a-meeting-in-microsoft-teams-1613bb53-f3fa-431e-85a9-d6a91e3468c9">Teams</a> online meetings, present arguments for or against a particular point based on verbal discussions and answer a portion of your emails. It can even write computer code.</p>
<p>This quickly developing technology appears to take us even closer to a future where AI makes our lives easier and takes away all of the boring and repetitive things we have to do as humans. </p>
<p>But while these advancements are all very impressive and useful, we must be cautious in our use of such <a href="https://www.techopedia.com/definition/34948/large-language-model-llm">large language models</a> (LLMs). Despite their intuitive nature, they still require skill to use effectively, reliably and safely.</p>
<h2>Large language models</h2>
<p>LLMs, a type of “deep learning” neural network, are designed to understand the user’s intent by analysing the probability of different responses based on the prompt provided. So, when a person inputs a prompt, the LLM examines the text and determines the most likely response. </p>
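<p>A toy example can make this concrete. The sketch below is a deliberately simplified illustration (real LLMs use deep neural networks trained on vast corpora, not word-pair counts); it simply picks the statistically most frequent next word, with no understanding involved:</p>

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a tiny corpus,
# then answer a prompt with the most probable continuation.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def most_likely_next(word):
    # Return the most frequent continuation seen in the corpus.
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" - it follows "the" twice; "mat" and "fish" only once
```

<p>The model has no idea what a cat is; it has only counted patterns. LLMs operate on the same principle, just at an enormously larger scale.</p>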
<p><a href="https://chat.openai.com">ChatGPT</a>, a prominent example of an LLM, can provide answers to prompts on a wide range of subjects. However, despite its seemingly knowledgeable responses, ChatGPT <a href="https://venturebeat.com/ai/llms-have-not-learned-our-language-were-trying-to-learn-theirs%EF%BF%BC/">does not</a> possess actual knowledge. Its responses are simply the most probable outcomes based on the given prompt.</p>
<p>When people provide ChatGPT, Copilot and other LLMs with detailed descriptions of the tasks they want to accomplish, these models can excel at providing high-quality responses. This could include generating text, images or computer code. </p>
<p>But, as humans, we often push the boundaries of what technology can do and what it was originally designed for. Consequently, we start using these systems to do the legwork that we should have done ourselves.</p>
<figure class="align-center ">
<img alt="The Microsoft 365 Copilot logo is displayed on a smartphone screen held in a hand." src="https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&rect=53%2C8%2C6000%2C3979&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/562981/original/file-20231201-29-8xiuff.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Microsoft Copilot is available in Windows 11 and Microsoft 365.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/june-7-2023-brazil-this-photo-2314245893">rafapress/Shutterstock</a></span>
</figcaption>
</figure>
<h2>Why over-reliance on AI could be a problem</h2>
<p>Despite their seemingly intelligent responses, we cannot blindly <a href="https://www.scientificamerican.com/article/how-can-we-trust-ai-if-we-dont-know-how-it-works/#:%7E:text=Humans%20are%20largely%20predictable%20to,make%20it%20worthy%20of%20trust.">trust</a> LLMs to be accurate or reliable. We must carefully evaluate and verify their outputs, ensuring that our initial prompts are reflected in the answers provided. </p>
<p>To effectively verify and validate LLM outputs, we need to have a strong understanding of the subject matter. Without expertise, we cannot provide the necessary quality assurance.</p>
<p>This becomes particularly critical in situations where we are using LLMs to bridge gaps in our own knowledge. Here, our lack of knowledge may leave us simply unable to determine whether the output is correct. This can happen with generated text and code alike. </p>
<p>Using AI to attend meetings and summarise the discussion presents obvious risks around reliability. While the record of the meeting is based on a transcript, the meeting notes are still generated in the same fashion as other text from LLMs. They are still based on language patterns and probabilities of what was said, so they require verification before they can be acted upon. </p>
<p>They also suffer from interpretation problems due to <a href="https://ieeexplore.ieee.org/abstract/document/9016769">homophones</a>, words that are pronounced the same but have different meanings. People are good at understanding what is meant in such circumstances due to the context of the conversation.</p>
<p>But AI is not good at deducing context, nor does it understand nuance. So, expecting it to formulate arguments based upon a potentially erroneous transcript poses further problems still. </p>
<p>Verification is even harder if we are using AI to generate computer code. Testing computer code with test data is the only reliable method for validating its functionality. While this demonstrates that the code operates as intended, it doesn’t guarantee that its behaviour aligns with real-world expectations. </p>
<p>Suppose we use generative AI to create code for a sentiment analysis tool. The goal is to analyse product reviews and categorise sentiments as positive, neutral or negative. We can test the functionality of the system and validate the code functions correctly – that it is sound from a technical programming point of view. </p>
<p>However, imagine that we deploy such software in the real world and it starts to classify sarcastic product reviews as positive. The sentiment analysis system lacks the contextual knowledge necessary to understand that sarcasm is not positive feedback; quite the opposite. </p>
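<p>A deliberately simple sketch illustrates the gap between technical correctness and real-world correctness (the keyword classifier and review texts below are hypothetical illustrations, not code from any real product):</p>

```python
POSITIVE_WORDS = {"great", "love", "excellent", "perfect"}
NEGATIVE_WORDS = {"bad", "broken", "awful", "terrible"}

def classify(review: str) -> str:
    # Naive keyword-counting sentiment classifier.
    words = review.lower().replace(".", "").replace(",", "").replace("!", "").split()
    score = sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Functional tests pass - the code is sound from a technical programming point of view...
assert classify("This product is excellent, I love it") == "positive"
assert classify("Awful quality, arrived broken") == "negative"

# ...yet a sarcastic review is happily labelled positive.
print(classify("Great. Just great. It stopped working after one day"))  # "positive"
```

<p>Every test passes, but only a person with domain expertise would notice that the last result is wrong in the real world.</p>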
<p>Verifying that the code’s output matches the desired outcomes in nuanced situations such as this requires expertise. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/chatgpt-turns-1-ai-chatbots-success-says-as-much-about-humans-as-technology-218704">ChatGPT turns 1: AI chatbot's success says as much about humans as technology</a>
</strong>
</em>
</p>
<hr>
<p>Non-programmers will have no knowledge of the software engineering principles used to ensure code is correct, such as planning, methodology, testing and documentation. Programming is a complex discipline, and software engineering emerged as a field to manage software quality. </p>
<p>There is a significant risk, as my own <a href="https://www.researchgate.net/publication/372606390_Experimenting_with_ChatGPT_for_Spreadsheet_Formula_Generation_Evidence_of_Risk_in_AI_Generated_Spreadsheets#fullTextFileContent">research</a> has shown, that non-experts will overlook or skip critical steps in the software design process, leading to code of unknown quality.</p>
<h2>Validation and verification</h2>
<p>LLMs such as ChatGPT and Copilot are powerful tools that we can all benefit from. But we must be careful to not blindly trust the outputs given to us. </p>
<p>We are right at the start of a great revolution based on this technology. AI has infinite possibilities, but it needs to be shaped, checked and verified. And at present, human beings are the only ones who can do this.</p>
<p class="fine-print"><em><span>Simon Thorne does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Microsoft Copilot can summarise meetings and even formulate arguments. But as good as that sounds, we shouldn’t blindly trust its accuracy.Simon Thorne, Senior Lecturer in Computing and Information Systems, Cardiff Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2198022023-12-14T04:06:45Z2023-12-14T04:06:45ZThe AI industry is on the verge of becoming another boys’ club. We’re all going to lose out if it does<figure><img src="https://images.theconversation.com/files/565707/original/file-20231214-23-2bm6wg.jpg?ixlib=rb-1.1.0&rect=63%2C63%2C5993%2C3968&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A recent New York Times <a href="https://www.nytimes.com/2023/12/03/technology/ai-key-figures.html">article</a> published a list of people “behind the dawn of the modern artificial intelligence movement” – and not a single woman was named. It came less than a week after news of a fake auto-generated woman being listed as a speaker on the agenda <a href="https://apnews.com/article/tech-conference-fake-women-ai-generated-devternity-98ed551e90ec49e81589cc928715ae3c">for a software conference</a>.</p>
<p>Unfortunately, the omission of women from the history of STEM isn’t a new phenomenon. Women have been missing from these narratives for centuries.</p>
<p>In the wake of recent AI developments, we now have a choice: are we going to leave women out of these conversations as well – even as they continue to make massive contributions to the AI industry? </p>
<p>Doing so risks leading us into the same fallacy that established computing itself as a “man’s world”. The reality, of course, is quite different. </p>
<h2>A more accurate history</h2>
<p>Prior to computers as we know them, “computer” was the title given to people who performed complex mathematical calculations. These people <a href="https://www.smithsonianmag.com/science-nature/history-human-computers-180972202">were commonly women</a>.</p>
<p>English mathematician Ada Lovelace (1815–1852) is often referred to as <a href="https://www.newyorker.com/tech/annals-of-technology/ada-lovelace-the-first-tech-visionary">the first computer programmer</a>. She was the <a href="https://lemelson.mit.edu/resources/ada-lovelace">first person to realise</a> computers could do much more than just math calculations. Her work on <a href="https://www.britannica.com/technology/Analytical-Engine">the analytical engine</a> – a proposed automatic and fully programmable mechanical computer – dates back to the mid-1800s.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A blue plaque in St James’s Square in London marks the location Ada Lovelace once lived.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>By the 1870s, a group of about 80 women worked as computers <a href="https://www.thecrimson.com/article/2019/9/26/women-computers-observatory/">at the Harvard Observatory</a>. They catalogued and analysed copious amounts of astronomic data for astronomer Edward Charles Pickering (who exploited the fact they’d work for less money than men, or even as volunteers).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=469&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=469&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=469&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=590&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=590&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=590&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 1886, Pickering put Williamina P.S. Fleming in charge of the Harvard computers. Over the course of her career she discovered 10 novae, 52 nebulae and hundreds of stars.</span>
<span class="attribution"><span class="source">Wikimedia</span></span>
</figcaption>
</figure>
<p>By the late 19th century, increased access to education meant there was an entire generation of women trained in maths. These women computers were cheaper labour than men at the time, and so <a href="https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/">employing them</a> significantly reduced the costs of computation.</p>
<p>During the first world war, women were hired to <a href="https://cs.brown.edu/courses/cs1951i/lightWhenComputersWereWomen.pdf">calculate artillery trajectories</a>. This work continued into the second world war, when they were actively encouraged to <a href="https://www.history.com/news/coding-used-to-be-a-womans-job-so-it-was-paid-less-and-undervalued">take on wartime jobs</a> as computers in the absence of men. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=700&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=700&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=700&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=880&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=880&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=880&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Former NASA mathematician Katherine Johnson was awarded the Presidential Medal of Freedom in 2015.</span>
<span class="attribution"><span class="source">NASA/Bill Ingalls</span></span>
</figcaption>
</figure>
<p>Women continued to work as computers into the early days of the <a href="https://education.nationalgeographic.org/resource/women-nasa/">American space program in the 1960s</a>, playing a pivotal role in advancing NASA’s space projects. One of these computers was <a href="https://www.nasa.gov/centers-and-facilities/langley/katherine-johnson-biography/">Katherine Johnson</a>, who was responsible for quality-checking the outputs of early IBM computers for an orbital mission in 1962. </p>
<p>Many women made significant contributions to computing, yet few were recognised for these contributions – let alone financially compensated. <a href="https://books.google.com.au/books?id=GWOIXDsLQWwC&printsec=frontcover&dq=Recoding+Gender:+Women%2527s+Changing+Participation+in+Computing&hl=en&sa=X&redir_esc=y#v=onepage&q=salary&f=false">According to</a> Virginia Tech professor Janet Abbate, by 1969 a female computer specialist’s median salary was US$7,763, compared to US$11,193 for a male computer specialist.</p>
<p>Women computers worked behind the scenes, while their male counterparts received recognition, awards and publicity.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-your-money-is-helping-subsidise-sexism-in-academia-and-what-you-can-do-about-it-218347">How your money is helping subsidise sexism in academia – and what you can do about it</a>
</strong>
</em>
</p>
<hr>
<h2>Women in AI</h2>
<p>Computing and programming are the foundation of AI as we know it today. At a basic level, today’s generative and predictive AI systems work by analysing large amounts of data and <a href="https://medium.com/@stahl950/the-math-behind-predictions-in-ai-unraveling-the-magic-44b4fcb8af6">finding patterns in it</a>. </p>
<p>The women who pioneered computing from as early as the 1800s laid the foundations for this work. The work they were doing by hand for more than a century has now been replaced by machines capable of analysing much larger quantities of data in a much shorter time.</p>
<p>This transition does not diminish women’s contributions to the field of computing and, more recently, AI. Myriad women are doing pioneering work in the AI industry today, including the 12 women named in this recent <a href="https://medium.com/womenintechnology/ny-times-missed-these-12-trailblazers-meet-the-women-transforming-ai-ae522f52a8b7">Medium article</a>. </p>
<p>From Google’s ex-chief decision scientist Cassie Kozyrkov, to Canadian computer scientist Joy Buolamwini, to OpenAI’s CTO Mira Murati (pictured in this article’s banner image) – these women are helping make AI safer, more accurate, more accessible, more inclusive and more reliable.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=841&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=841&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=841&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1057&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1057&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1057&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Joy Buolamwini is a Rhodes scholar, Fulbright fellow, Stamps scholar, Astronaut scholar and Anita Borg Institute scholar. Her work focuses on reducing bias in AI.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Joy_Buolamwini#/media/File:Joy_Buolamwini_-_Wikimania_2018_01.jpg">Wikimedia</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>And they’re taking these strides despite working in a heavily male-dominated industry. <a href="https://medium.com/element-ai-research-lab/estimating-the-gender-ratio-of-ai-researchers-around-the-world-81d2b8dbe9c3">One 2018 study</a> of 4,000 researchers who had been published in leading AI conferences found women made up just 12% of this group.</p>
<h2>The impact of omission</h2>
<p>The omission of women isn’t limited to the AI industry, or even to STEM. As historian Bettany Hughes notes, women occupy a <a href="https://www.english-heritage.org.uk/visit/inspire-me/blog/blog-posts/why-were-women-written-out-of-history-an-interview-with-bettany-hughes/#">meagre 0.5%</a> of recorded history. Clearly, a lack of gender diversity in the workforce is part of a much larger, systemic problem – one that affects many more people than the individuals being excluded. </p>
<p>In 1983, NASA engineers suggested packing 100 tampons on the <a href="https://prospect.org/culture/books/astronaut-sally-ride-burden-first/">Challenger space shuttle</a> for astronaut Sally Ride – for a trip that was one week long. Such an incident is seemingly harmless on the surface. But what happens when gender bias and stereotypes bleed into the design and development of AI? </p>
<p>Research <a href="https://edition.cnn.com/2023/06/12/tech/facebook-job-ads-gender-discrimination-asequals-intl-cmd/index.html">reported in 2023</a> by international non-profit Global Witness found Facebook’s job ad platform, which uses algorithms to target users with ads, based its targeting on sexist stereotypes. For example, ads for mechanics were targeted mostly at men, while ads for preschool teachers were targeted mostly at women. </p>
<p>Another <a href="https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">2018 study</a> found computer vision systems reported higher error rates for recognising women, and in particular women with darker skin tones. </p>
<p>A <a href="https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/">lack of gender diversity</a> in AI has a demonstrated ability to harm and disadvantage women and, by extension, all of us. While many argue that improving AI training datasets could address the gender gap, others rightly point out that women should also be included in <a href="https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/?sh=35e1baed534f">data-collection processes</a>.</p>
<h2>Breaking the glass ceiling</h2>
<p>Speaking at the <a href="https://www.heforshe.org/en/join-us-heforshe-summit-2023">UN Women’s HeForShe summit</a> earlier this year, <a href="https://huggingface.co/">Hugging Face</a> research scientist Sasha Luccioni made a <a href="https://www.unwomen.org/en/news-stories/feature-story/2023/09/heforshe-summit-discusses-gender-bias-in-ai-and-how-to-encourage-male-feminist-allies">salient point</a>:</p>
<blockquote>
<p>AI bias doesn’t come from thin air – it comes from the patterns we perpetuate in our societies.</p>
</blockquote>
<p>The recent New York Times article is an example of how both media and industry play a role in reinforcing a status quo that disproportionately favours men. This form of bias does nothing to help close a persistent and problematic gender gap.</p>
<p>Despite <a href="https://www.smh.com.au/national/tie-research-funding-to-progress-on-diversity-stem-review-says-20230814-p5dw8j.html">millions of dollars</a> being spent to encourage women to take up careers in STEM, these fields are struggling to <a href="https://www.lgea.org.au/Scientists/News/2021_women_in_stem_report.aspx">retain women workers</a>. </p>
<p>Women’s contributions to AI are not insignificant. Failing to acknowledge this can make the glass ceiling seem impossible to break through.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/chief-scientist-women-in-stem-are-still-far-short-of-workplace-equity-covid-19-risks-undoing-even-these-modest-gains-143092">Chief Scientist: women in STEM are still far short of workplace equity. COVID-19 risks undoing even these modest gains</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Zena Assaad does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>For decades, woman ‘computers’ worked behind the scenes while their male counterparts received recognition. The AI industry must not be an example of history repeating itself.Zena Assaad, Senior Lecturer, School of Engineering, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2158042023-10-19T04:59:00Z2023-10-19T04:59:00ZQuantum computers in 2023: how they work, what they do, and where they’re heading<figure><img src="https://images.theconversation.com/files/554450/original/file-20231018-29-xrpphz.jpg?ixlib=rb-1.1.0&rect=17%2C43%2C5757%2C3800&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A complex cooling rig is needed to maintain the ultracold working temperatures required by a superconducting quantum computer.</span> <span class="attribution"><a class="source" href="https://newsroom.ibm.com/media-quantum-innovation">IBM</a></span></figcaption></figure><p>In June, an IBM computing executive claimed <a href="https://www.nytimes.com/2023/06/14/science/ibm-quantum-computing.html">quantum computers were entering the “utility” phase</a>, in which high-tech experimental devices become useful. In September, Australia’s Chief Scientist Cathy Foley went so far as to declare “<a href="https://www.chiefscientist.gov.au/news-and-media/its-time-australia-leverage-our-resources-and-tech-skills-prosper-new-economy">the dawn of the quantum era</a>”. </p>
<p>This week, Australian physicist <a href="https://www.abc.net.au/news/science/2023-10-16/prime-minister-science-prize-michelle-simmons-quantum-physics/102979096">Michelle Simmons won the nation’s top science award</a> for her work on developing silicon-based quantum computers.</p>
<p>Obviously, quantum computers are having a moment. But – to step back a little – what exactly <em>are</em> they? </p>
<h2>What is a quantum computer?</h2>
<p>One way to think about computers is in terms of the kinds of numbers they work with.</p>
<p>The digital computers we use every day rely on whole numbers (or <em>integers</em>), representing information as strings of zeroes and ones, which they rearrange according to complicated rules. There are also analogue computers, which represent information as continuously varying numbers (or <em>real numbers</em>), manipulated via electrical circuits or spinning rotors or moving fluids.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/theres-a-way-to-turn-almost-any-object-into-a-computer-and-it-could-cause-shockwaves-in-ai-62235">There's a way to turn almost any object into a computer – and it could cause shockwaves in AI</a>
</strong>
</em>
</p>
<hr>
<p>In the 16th century, the Italian mathematician Girolamo Cardano invented another kind of number, the <em>complex numbers</em>, to solve seemingly impossible tasks such as finding the square root of a negative number. In the 20th century, with the advent of quantum physics, it turned out complex numbers also naturally describe the fine details of light and matter.</p>
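<p>In modern programming languages this once “impossible” task is routine. Python, for instance, supports complex numbers natively (a quick illustration):</p>

```python
import cmath

# The square root of -1 is the imaginary unit i, written 1j in Python.
print(cmath.sqrt(-1))   # 1j

# Complex arithmetic behaves as expected: i * i = -1.
print((1j * 1j).real)   # -1.0
```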
<p>In the 1990s, physics and computer science collided when it was discovered that some problems could be solved much faster with algorithms that work directly with complex numbers as encoded in quantum physics. </p>
<p>The next logical step was to build devices that work with light and matter to do those calculations for us automatically. This was the birth of quantum computing.</p>
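<p>As a rough illustration of why complex numbers are the natural language here, a single qubit can be modelled on an ordinary computer (as a toy simulation only, not a real quantum device) as a pair of complex amplitudes whose squared magnitudes give measurement probabilities:</p>

```python
import math

# A qubit state is two complex amplitudes; |amplitude|^2 gives the
# probability of measuring 0 or 1. Start in the definite state |0>.
state = [1 + 0j, 0 + 0j]

def hadamard(s):
    # The Hadamard gate puts a definite state into an equal superposition.
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[1]), h * (s[0] - s[1])]

state = hadamard(state)
probs = [abs(amp) ** 2 for amp in state]
print(probs)  # roughly [0.5, 0.5]: equal chance of measuring 0 or 1
```

<p>Real quantum computers manipulate such amplitudes physically; simulating many entangled qubits this way quickly becomes intractable on classical hardware, which is precisely where the quantum advantage lies.</p>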
<h2>Why does quantum computing matter?</h2>
<p>We usually think of the things our computers do in terms that mean something to us — balance my spreadsheet, transmit my live video, find my ride to the airport. However, all of these are ultimately computational problems, phrased in mathematical language. </p>
<p>As quantum computing is still a nascent field, most of the problems we know quantum computers will solve are phrased in abstract mathematics. Some of these will have “real world” applications we can’t yet foresee, but others will find a more immediate impact.</p>
<p>One early application will be cryptography. Quantum computers will be able to crack today’s internet encryption algorithms, so we will need quantum-resistant cryptographic technology. Provably secure cryptography and a fully quantum internet would use quantum computing technology.</p>
<figure class="align-center ">
<img alt="A microscopic view of a square, iridescent computer chip against an orange background." src="https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=395&fit=crop&dpr=1 600w, https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=395&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=395&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=496&fit=crop&dpr=1 754w, https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=496&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/554626/original/file-20231018-19-68uhls.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=496&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Google has claimed its Sycamore quantum processor can outperform classical computers at certain tasks.</span>
<span class="attribution"><span class="source">Google</span></span>
</figcaption>
</figure>
<p>In materials science, quantum computers will be able to simulate molecular structures at the atomic scale, making it faster and easier to discover new and interesting materials. This may have significant applications in batteries, pharmaceuticals, fertilisers and other chemistry-based domains.</p>
<p>Quantum computers will also speed up many difficult optimisation problems, where we want to find the “best” way to do something. This will allow us to tackle larger-scale problems in areas such as logistics, finance, and weather forecasting.</p>
<p>Machine learning is another area where quantum computers may accelerate progress. This could happen indirectly, by speeding up subroutines in digital computers, or directly if quantum computers can be reimagined as learning machines.</p>
<h2>What is the current landscape?</h2>
<p>In 2023, quantum computing is moving out of the basement laboratories of university physics departments and into industrial research and development facilities. The move is backed by the chequebooks of multinational corporations and venture capitalists. </p>
<p>Contemporary quantum computing prototypes – built by <a href="https://www.ibm.com/quantum">IBM</a>, <a href="https://quantumai.google/">Google</a>, <a href="https://ionq.com/">IonQ</a>, <a href="https://www.rigetti.com/">Rigetti</a> and others – are still some way from perfection. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/error-correcting-the-things-that-go-wrong-at-the-quantum-computing-scale-84846">Error correcting the things that go wrong at the quantum computing scale</a>
</strong>
</em>
</p>
<hr>
<p>Today’s machines are of modest size and susceptible to errors, in what has been called the “<a href="https://thequantuminsider.com/2023/03/13/what-is-nisq-quantum-computing/">noisy intermediate-scale quantum</a>” phase of development. The delicate nature of tiny quantum systems means they are prone to many sources of error, and correcting these errors is a major technical hurdle.</p>
<p>The holy grail is a large-scale quantum computer which can correct its own errors. A whole ecosystem of research groups and commercial enterprises is pursuing this goal via diverse technological approaches. </p>
<h2>Superconductors, ions, silicon, photons</h2>
<p>The current leading approach uses loops of electric current inside superconducting circuits to store and manipulate information. This is the technology adopted by <a href="https://quantumai.google/hardware">Google</a>, <a href="https://www.ibm.com/topics/quantum-computing">IBM</a>, <a href="https://www.rigetti.com/what-we-build">Rigetti</a> and others. </p>
<p>Another method, the “trapped ion” technology, works with groups of electrically charged atomic particles, using the inherent stability of the particles to reduce errors. This approach has been spearheaded by <a href="https://ionq.com/technology">IonQ</a> and <a href="https://www.honeywell.com/us/en/company/quantum">Honeywell</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Illustration showing glowing dots and patterns of light." src="https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=519&fit=crop&dpr=1 600w, https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=519&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=519&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=653&fit=crop&dpr=1 754w, https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=653&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/554627/original/file-20231018-29-hte4r6.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=653&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An artist’s impression of a semiconductor-based quantum computer.</span>
<span class="attribution"><a class="source" href="https://www.sqc.com.au">Silicon Quantum Computing</a></span>
</figcaption>
</figure>
<p>A third route of exploration is to confine electrons within tiny particles of semiconductor material, which could then be melded into the well-established silicon technology of classical computing. <a href="https://sqc.com.au/">Silicon Quantum Computing</a> is pursuing this angle.</p>
<p>Yet another direction is to use individual particles of light (photons), which can be manipulated with high fidelity. A company called PsiQuantum is designing <a href="https://www.nature.com/articles/s41467-023-36493-1">intricate “guided light” circuits</a> to perform quantum computations. </p>
<p>There is no clear winner yet from among these technologies, and it may well be a hybrid approach that ultimately prevails.</p>
<h2>Where will the quantum future take us?</h2>
<p>Attempting to forecast the future of quantum computing today is akin to predicting flying cars and ending up with cameras in our phones instead. Nevertheless, there are a few milestones that many researchers would agree are likely to be reached in the next decade.</p>
<p>Better error correction is a big one. We expect to see a transition from the era of noisy devices to small devices that can sustain computation through active error correction.</p>
<p>Another is the advent of post-quantum cryptography. This means the establishment and adoption of cryptographic standards that can’t easily be broken by quantum computers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/quantum-computers-threaten-our-whole-cybersecurity-infrastructure-heres-how-scientists-can-bulletproof-it-196065">Quantum computers threaten our whole cybersecurity infrastructure: here's how scientists can bulletproof it</a>
</strong>
</em>
</p>
<hr>
<p>Commercial spin-offs of technology such as quantum sensing are also on the horizon.</p>
<p>The demonstration of a genuine “quantum advantage” will also be a likely development. This means a compelling application where a quantum device is unarguably superior to the digital alternative.</p>
<p>And a stretch goal for the coming decade is the creation of a large-scale quantum computer free of errors (with active error correction). </p>
<p>When this has been achieved, we can be confident the 21st century will be the “quantum era”.</p>
<p class="fine-print"><em><span>Christopher Ferrie receives funding from the Australian Research Council. He is a co-founder of quantum startup Eigensystems. </span></em></p>After decades of hype, quantum computers are on the verge of becoming useful. Here’s a refresher on why they’re such a big dealChristopher Ferrie, Senior Lecturer, UTS Chancellor's Postdoctoral Research and ARC DECRA Fellow, University of Technology SydneyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2147212023-10-16T19:05:07Z2023-10-16T19:05:07ZAI is closer than ever to passing the Turing test for ‘intelligence’. What happens when it does?<figure><img src="https://images.theconversation.com/files/553931/original/file-20231016-17-wzq8rn.jpg?ixlib=rb-1.1.0&rect=77%2C113%2C3916%2C3880&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Pexels/Google Deepmind</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>In 1950, British computer scientist Alan Turing proposed an experimental method for answering the question: can machines think? He suggested if a human couldn’t tell whether they were speaking to an artificially intelligent (AI) machine or another human after five minutes of questioning, this would demonstrate AI has human-like intelligence.</p>
<p>Although AI systems remained far from passing Turing’s test during his lifetime, he speculated that</p>
<blockquote>
<p>“[…] in about fifty years’ time it will be possible to programme computers […] to make them play the imitation game so well that an average interrogator will not have more than 70% chance of making the right identification after five minutes of questioning.”</p>
</blockquote>
<p>Today, more than 70 years after Turing’s proposal, no AI has managed to successfully pass the test by fulfilling the specific conditions he outlined. Nonetheless, as <a href="https://www.nature.com/articles/d41586-023-02361-7">some headlines</a> <a href="https://www.washingtonpost.com/technology/2022/06/17/google-ai-lamda-turing-test/">reflect</a>, a few systems have come quite close.</p>
<p><a href="https://browse.arxiv.org/pdf/2305.20010.pdf">One recent experiment</a> tested three large language models, including GPT-4 (the AI technology behind ChatGPT). The participants spent two minutes chatting with either another person or an AI system. The AI was prompted to make small spelling mistakes – and quit if the tester became too aggressive. </p>
<p>With this prompting, the AI did a good job of fooling the testers. When paired with an AI bot, testers could only correctly guess whether they were talking to an AI system 60% of the time. </p>
<p>Given the rapid progress achieved in the design of natural language processing systems, we may see AI pass Turing’s original test within the next few years. </p>
<p>But is imitating humans really an effective test for intelligence? And if not, what are some alternative benchmarks we might use to measure AI’s capabilities?</p>
<h2>Limitations of the Turing test</h2>
<p>While a system passing the Turing test gives us <em>some</em> evidence it is intelligent, this test is not a decisive test of intelligence. One problem is that it can produce “false negatives”. </p>
<p>Today’s large language models are often designed to immediately declare they are not human. For example, when you ask ChatGPT a question, it often prefaces its answer with the phrase “as an AI language model”. Even if AI systems have the underlying ability to pass the Turing test, this kind of programming would override that ability.</p>
<p>The test also risks certain kinds of “false positives”. As philosopher Ned Block <a href="https://www.jstor.org/stable/2184371">pointed out</a> in a 1981 article, a system could conceivably pass the Turing test simply by being hard-coded with a human-like response to any possible input.</p>
<p>Beyond that, the Turing test focuses on human cognition in particular. If AI cognition differs from human cognition, an expert interrogator will be able to find some task where AIs and humans differ in performance.</p>
<p>Regarding this problem, Turing wrote:</p>
<blockquote>
<p>This objection is a very strong one, but at least we can say that if, nevertheless, a machine can be constructed to play the imitation game satisfactorily, we need not be troubled by this objection.</p>
</blockquote>
<p>In other words, while passing the Turing test is good evidence a system is intelligent, failing it is not good evidence a system is <em>not</em> intelligent.</p>
<p>Moreover, the test is not a good measure of whether AIs are conscious, whether they can feel pain and pleasure, or whether they have moral significance. According to many cognitive scientists, consciousness involves a particular cluster of mental abilities, including having a working memory, higher-order thoughts, and the ability to perceive one’s environment and model how one’s body moves around it.</p>
<p>The Turing test does not answer the question of whether or not AI systems <a href="https://arxiv.org/abs/2308.08708">have these abilities</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-pioneer-geoffrey-hinton-says-ai-is-a-new-form-of-intelligence-unlike-our-own-have-we-been-getting-it-wrong-this-whole-time-204911">AI pioneer Geoffrey Hinton says AI is a new form of intelligence unlike our own. Have we been getting it wrong this whole time?</a>
</strong>
</em>
</p>
<hr>
<h2>AI’s growing capabilities</h2>
<p>The Turing test is based on a certain logic. That is: humans are intelligent, so anything that can effectively imitate humans is likely to be intelligent.</p>
<p>But this idea doesn’t tell us anything about the nature of intelligence. A different way to measure AI’s intelligence involves thinking more critically about what intelligence is. </p>
<p>There is currently no single test that can authoritatively measure artificial or human intelligence. </p>
<p>At the broadest level, we can think of intelligence as the <a href="https://arxiv.org/pdf/2303.12712.pdf">ability</a> to achieve a range of goals in different environments. More intelligent systems are those which can achieve a wider range of goals in a wider range of environments. </p>
<p>As such, the best way to keep track of advances in the design of general-purpose AI systems is to assess their performance across a variety of tasks. Machine learning researchers have developed a range of benchmarks that do this.</p>
<p>For example, GPT-4 was <a href="https://openai.com/research/gpt-4">able to correctly answer</a> 86% of questions in Massive Multitask Language Understanding (MMLU) – a benchmark measuring performance on multiple-choice tests across a range of college-level academic subjects. </p>
<p>It also scored favourably in <a href="https://arxiv.org/pdf/2308.03688.pdf">AgentBench</a>, a tool that can measure a large language model’s ability to behave as an agent by, for example, browsing the web, buying products online and competing in games.</p>
<h2>Is the Turing test still relevant?</h2>
<p>The Turing test is a measure of imitation – of AI’s ability to simulate human behaviour. Large language models are expert imitators, which is now being reflected in their potential to pass the Turing test. But intelligence is not the same as imitation.</p>
<p>There are as many types of intelligence as there are goals to achieve. The best way to understand AI’s intelligence is to monitor its progress in developing a range of important capabilities.</p>
<p>At the same time, it’s important we don’t keep “changing the goalposts” when it comes to the question of whether AI is intelligent. Since AI’s capabilities are rapidly improving, critics of the idea of AI intelligence are constantly finding new tasks AI systems may struggle to complete – only to find they have jumped over <a href="https://www.newyorker.com/culture/annals-of-inquiry/the-mechanical-muse">yet another hurdle</a>. </p>
<p>In this setting, the relevant question isn’t whether AI systems are intelligent — but more precisely, what <em>kinds</em> of intelligence they may have.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The Turing test, first proposed in 1950 by Alan Turing, was framed as a test that could supposedly tell us whether an AI system could ‘think’ like a human.Simon Goldstein, Associate Professor, Dianoia Institute of Philosophy, Australian Catholic University, Australian Catholic UniversityCameron Domenico Kirk-Giannini, Assistant Professor of Philosophy, Rutgers UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2130422023-09-26T12:25:50Z2023-09-26T12:25:50ZWhat are APIs? A computer scientist explains the data sockets that make digital life possible<figure><img src="https://images.theconversation.com/files/549831/original/file-20230922-15-szible.png?ixlib=rb-1.1.0&rect=147%2C77%2C887%2C611&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Under the hood of your digital life, APIs are making connections.</span> <span class="attribution"><a class="source" href="https://www.loc.gov/resource/hec.04117/">Library of Congress</a></span></figcaption></figure><p>APIs, or application programming interfaces, are the gateways to the digital world. They link a wide array of software applications and systems. APIs facilitate communication between different software systems, and so power everything from social media – think of the share buttons on webpages – to e-commerce transactions. </p>
<p>At a simple level, APIs are like electrical sockets. A software application that you’re using, say the playback controls for a video on a webpage, is like an appliance. The system that provides data or services that the application needs, say YouTube, is like the electrical grid. The API, in this example the <a href="https://developers.google.com/youtube/iframe_api_reference">YouTube Player API</a>, is like the standard electrical outlet that lets any appliance plug in to the grid.</p>
<p>APIs are not really so simple, though. Another analogy is a restaurant. The customer is the software application, the chef is the data or service, and the waiter is the API. The waiter brings the customer the menu, which lists available dishes – i.e., options for accessing data or service – and then brings the customer’s request to the chef.</p>
<p>APIs rely on defined rules and protocols that ensure accurate data exchange and effective collaboration. Different kinds of APIs exist to suit specific uses and software developers’ preferences. </p>
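<p>The restaurant analogy can be sketched in a few lines of Python. Everything here – the <code>get_video</code> function and the <code>VIDEO_DB</code> data – is hypothetical rather than any real service’s API, but the shape mirrors real HTTP APIs: a fixed contract of inputs, status codes and structured responses.</p>

```python
# A toy sketch of the request/response contract behind an API call.
# Names (get_video, VIDEO_DB) are illustrative, not a real service's API.

VIDEO_DB = {  # the "chef": the service's internal data, hidden from callers
    "abc123": {"title": "Quantum computing in 10 minutes", "duration": 600},
}

def get_video(video_id):
    """The "waiter": a fixed contract between caller and service.

    Returns a status code and a JSON-style dict, as an HTTP API would.
    """
    if video_id in VIDEO_DB:
        return 200, VIDEO_DB[video_id]
    return 404, {"error": "video not found"}

# The "customer": an application that knows only the contract, not the kitchen
status, body = get_video("abc123")
print(status, body["title"])  # 200 Quantum computing in 10 minutes
```

<p>The caller never touches <code>VIDEO_DB</code> directly; it can only ask the waiter. That separation is what lets the service change its kitchen without breaking every customer.</p>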
<h2>Why APIs matter</h2>
<p>APIs power various applications and services across many diverse industries. Facebook, Instagram and Twitter, now rebranded as X, let users share their content across these social media platforms. By leveraging their social media credentials, users can log into websites, weather apps and games to simplify their online experiences. Amazon and PayPal depend on APIs for secure payment processing and efficient order fulfillment. Navigation services like Google Maps leverage APIs to provide real-time location data and accurate directions. Even voice-activated smart assistants like Amazon’s Alexa and Google Assistant use APIs to manage and control smart home devices.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/-mN3VyJuCjM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A widely used API is critical for most mobile and web apps.</span></figcaption>
</figure>
<p>Who has access to an API also matters. For example, in March 2023, X began charging a wider range of users for access to its <a href="https://developer.twitter.com/en/products/twitter-api">data API</a>, which lets users collect large numbers of tweets to see what people are tweeting about. Businesses use the API for market and competitive research. But many people with limited resources, like developers of some free apps and <a href="https://theconversation.com/twitters-new-data-fees-leave-scientists-scrambling-for-funding-or-cutting-research-199238">social science researchers</a>, also rely on it.</p>
<p>APIs are also playing a role in making artificial intelligence widely available. For example, <a href="https://developers.generativeai.google/">Google</a>, <a href="https://azure.microsoft.com/en-us/products/ai-services/?activetab=pivot:azureopenaiservicetab">Microsoft</a> and <a href="https://openai.com/product">OpenAI</a> provide APIs for software developers to incorporate AI in their products.</p>
<p>As APIs continue to shape the digital landscape, developers face challenges. Ensuring the security and privacy of data exchanged through APIs is paramount, given their integration into critical systems. As APIs evolve, managing their complex ecosystems and making sure old programs can use new APIs will be a considerable task.</p>
<p class="fine-print"><em><span>Tam Nguyen does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>How do all the different pieces of digital technology you use every day – weather apps, online banking, games and so on – talk to each other? Via application programming interfaces, or APIs.Tam Nguyen, Associate Professor of Computer Science, University of DaytonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2128602023-09-11T20:09:04Z2023-09-11T20:09:04ZWhy ChatGPT isn’t conscious – but future AI systems might be<figure><img src="https://images.theconversation.com/files/547419/original/file-20230911-27-sdkyzm.jpg?ixlib=rb-1.1.0&rect=0%2C59%2C8000%2C4431&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/3d-digital-abstract-human-face-on-2138818011">Shutterstock</a></span></figcaption></figure><p>In June 2022, Google engineer Blake Lemoine made headlines by claiming the company’s LaMDA chatbot had achieved sentience. The software had the conversational ability of a precocious seven-year-old, <a href="https://www.washingtonpost.com/technology/2022/06/11/google-ai-lamda-blake-lemoine/">Lemoine said</a>, and we should assume it possessed a similar awareness of the world. </p>
<p>LaMDA, later released to the public as <a href="https://blog.google/technology/ai/bard-google-ai-search-updates/">Bard</a>, is powered by a “large language model” (LLM) of the kind that also forms the engine of OpenAI’s ChatGPT bot. Other big tech companies are rushing to deploy similar technology. </p>
<p>Hundreds of millions of people have now had the chance to play with LLMs, but few seem to believe they are conscious. Instead, in linguist and data scientist <a href="https://dl.acm.org/doi/pdf/10.1145/3442188.3445922">Emily Bender’s poetic phrase</a>, they are “stochastic parrots”, which chatter convincingly without understanding. But what about the next generation of artificial intelligence (AI) systems, and the one after that? </p>
<p>Our team of philosophers, neuroscientists and computer scientists looked to current scientific theories of how human consciousness works to draw up a <a href="https://arxiv.org/abs/2308.08708">list of basic computational properties</a> that any hypothetically conscious system would likely need to possess. In our view, no current system comes anywhere near the bar for consciousness – but at the same time, there’s no obvious reason future systems won’t become truly aware.</p>
<h2>Finding indicators</h2>
<p>Since computing pioneer Alan Turing proposed his “<a href="https://theconversation.com/turing-test-why-it-still-matters-123468">Imitation Game</a>” in 1950, the ability to successfully impersonate a human in conversation has often been taken as a reliable marker of consciousness. This is usually because the task has seemed so difficult that it must require consciousness. </p>
<p>However, as with chess computer Deep Blue’s 1997 <a href="https://www.ibm.com/ibm/history/ibm100/us/en/icons/deepblue/">defeat of grandmaster Garry Kasparov</a>, the conversational fluency of LLMs may just move the goalposts. Is there a principled way to approach the question of AI consciousness that does not rely on our intuitions about what is difficult or special about human cognition? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/a-google-software-engineer-believes-an-ai-has-become-sentient-if-hes-right-how-would-we-know-185024">A Google software engineer believes an AI has become sentient. If he’s right, how would we know?</a>
</strong>
</em>
</p>
<hr>
<p>Our recent <a href="https://arxiv.org/abs/2308.08708">white paper</a> aims to do just that. We compared current scientific theories of what makes humans conscious to compile a list of “indicator properties” that could then be applied to AI systems. </p>
<p>We don’t think systems that possess the indicator properties are definitely conscious, but the more indicators, the more seriously we should take claims of AI consciousness. </p>
<h2>The computational processes behind consciousness</h2>
<p>What sort of indicators were we looking for? We avoided overt behavioural criteria – such as being able to hold conversations with people – because these tend to be both human-centric and easy to fake. </p>
<p>Instead, we looked at theories of the computational processes that support consciousness in the human brain. These can tell us about the sort of information-processing needed to support subjective experience. </p>
<p>“Global workspace theories”, for example, postulate that consciousness arises from the presence of a capacity-limited bottleneck which collates information from all parts of the brain and selects information to make globally available. “Recurrent processing theories” emphasise the role of feedback from later processes to earlier ones. </p>
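<p>As a purely illustrative sketch (not a model from the theories’ authors), one step of a “global workspace” can be mimicked in Python: several modules propose content, a capacity-limited bottleneck selects the most salient item, and only that item is made globally available.</p>

```python
# Toy sketch of a global-workspace step: modules compete for a
# capacity-limited bottleneck, and the winner is broadcast to all.
# The modules, contents and salience values are hypothetical.

def global_workspace_step(proposals, capacity=1):
    """Select the top-`capacity` proposals by salience and broadcast them."""
    winners = sorted(proposals, key=lambda p: p["salience"], reverse=True)
    return [p["content"] for p in winners[:capacity]]

proposals = [
    {"module": "vision",  "content": "red light ahead", "salience": 0.9},
    {"module": "hearing", "content": "radio chatter",   "salience": 0.3},
    {"module": "memory",  "content": "turn left soon",  "salience": 0.6},
]

broadcast = global_workspace_step(proposals)
print(broadcast)  # ['red light ahead']
```

<p>However crude, the sketch captures the structural claim: not everything the modules compute becomes globally available, only what passes through the bottleneck.</p>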
<p>Each theory in turn suggests more specific indicators. Our final list contains 14 indicators, each focusing on an aspect of how systems <em>work</em> rather than how they <em>behave</em>. </p>
<p><iframe id="uvK17" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/uvK17/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>No reason to think current systems are conscious</h2>
<p>How do current technologies stack up? Our analysis suggests there is no reason to think current AI systems are conscious. </p>
<p>Some do meet a few of the indicators. Systems using the transformer architecture, a kind of machine-learning model behind <a href="https://arstechnica.com/science/2023/07/a-jargon-free-explanation-of-how-ai-large-language-models-work/">ChatGPT and similar tools</a>, meet three of the “global workspace” indicators, but lack the crucial ability for global rebroadcast. They also fail to satisfy most of the other indicators. </p>
<p>So, despite ChatGPT’s impressive conversational abilities, there is probably nobody home inside. Other architectures similarly meet at best a handful of criteria. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/not-everything-we-call-ai-is-actually-artificial-intelligence-heres-what-you-need-to-know-196732">Not everything we call AI is actually 'artificial intelligence'. Here's what you need to know</a>
</strong>
</em>
</p>
<hr>
<p>Most current architectures meet only a few of the indicators. However, for most of the indicators, there is at least one current architecture that meets it.</p>
<p>This suggests there are no obvious, in-principle technical barriers to building AI systems that satisfy most or all of the indicators. </p>
<p>It is probably a matter of <em>when</em> rather than <em>if</em> some such system is built. Of course, plenty of questions will still remain when that happens. </p>
<h2>Beyond human consciousness</h2>
<p>The scientific theories we canvass (and the authors of the paper!) don’t always agree with one another. We used a list of indicators rather than strict criteria to acknowledge that fact. This can be a powerful methodology in the face of scientific uncertainty. </p>
<p>We were inspired by similar debates about animal consciousness. Most of us think at least some nonhuman animals are conscious, despite the fact they cannot converse with us about what they’re feeling. </p>
<p>A 2021 <a href="https://www.lse.ac.uk/News/News-Assets/PDFs/2021/Sentience-in-Cephalopod-Molluscs-and-Decapod-Crustaceans-Final-Report-November-2021.pdf">report</a> from the London School of Economics arguing that cephalopods such as octopuses likely feel pain was instrumental <a href="https://www.abc.net.au/news/2021-12-16/the-uk-has-recognised-octopuses-crabs-and-lobsters-as-sentient-b/100698106">in changing UK animal ethics policy</a>. A focus on structural features has the surprising consequence that even some simple animals, like insects, <a href="https://theconversation.com/what-it-is-like-to-be-a-bee-insects-can-teach-us-about-the-origins-of-consciousness-57792">might possess a minimal form of consciousness</a>. </p>
<p>Our report does not make recommendations for what to do with conscious AI. This question will become more pressing as AI systems inevitably become more powerful and widely deployed. </p>
<p>Our indicators will not be the last word – but we hope they will become a first step in tackling this tricky question in a scientifically grounded way.</p>
<p class="fine-print"><em><span>Colin Klein receives funding from The Templeton World Charity Foundation (TWCF-2020-20539)</span></em></p>The science of human consciousness offers new ways of gauging machine minds – and suggests there’s no obvious reason computers can’t develop awareness.Colin Klein, Professor, School of Philosophy, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2051712023-05-23T12:25:42Z2023-05-23T12:25:42ZNew approach to teaching computer science could broaden the subject’s appeal<figure><img src="https://images.theconversation.com/files/527051/original/file-20230518-23-xsgvbi.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Language arts students can program chatbots for literary characters.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/side-view-of-youthful-african-american-schoolboy-royalty-free-image/1425235236">shironosov/iStock/Getty Images Plus</a></span></figcaption></figure><p>Despite <a href="https://www.bls.gov/ooh/computer-and-information-technology/computer-and-information-research-scientists.htm#tab-1">growing demand for computer science skills</a> in professional careers and many areas of life, K-12 schools <a href="https://www.eschoolnews.com/steam/2023/02/23/what-is-computer-science-education-lacking/">struggle to teach</a> computer science to the next generation.</p>
<p>However, a new approach to computer science education – called <a href="https://www.fierceeducation.com/teaching-learning/teaching-computational-thinking-essential-future-college-students">integrated computing</a> – addresses the main barriers that schools face when adding computer science education. These barriers include a <a href="https://news.gallup.com/reports/196379/trends-state-computer-science-schools.aspx">lack of qualified computer science teachers</a>, a lack of funds and a focus on courses tied to standardized tests.</p>
<p>Integrated computing teaches computer science skills like programming and computer literacy within traditional courses. For example, students can use integrated computing activities to <a href="https://youtu.be/KG_JqpmmkdQ">create geometric patterns in math</a>, <a href="https://youtu.be/x5w6x7f33Wk">simulate electromagnetic waves in science</a> and <a href="https://youtu.be/654BOJwAWCg">create chatbots for literary characters</a> in language arts. </p>
<p>As a <a href="https://education.gsu.edu/profile/lauren-margulieux/">professor of learning technologies</a>, I have been <a href="https://scholar.google.com/citations?user=YGV0Y24AAAAJ&hl=en&oi=sra">designing integrated computing activities</a> for K-12 students for the past five years. I work with faculty and students in teacher training programs to <a href="http://www.doi.org/10.26716/jcsi.2022.11.15.35">create and test integrated computing activities</a> across all academic subjects. </p>
<p>In <a href="https://laurenmarg.com/research/">my research</a>, I have found that integrated computing solves three major hurdles to teaching computer science education in K-12 schools.</p>
<h2>Challenges to teaching computer science</h2>
<p>Fitting a new academic discipline into an <a href="https://www.oecd-ilibrary.org/sites/0ebc645c-en/index.html?itemId=/content/component/0ebc645c-en">already crowded curriculum</a> can be a challenge. Integrated computing allows computer science education to become part of learning in other classes, the way reading skills are also used in science, math and language arts classes. </p>
<p>Teacher knowledge is <a href="https://doi.org/10.1080/07380569.2023.2178868">another difficulty when it comes to teaching computer science</a> in K-12 schools. While people who specialize in computer science are often recruited to more lucrative careers than teaching, integrated computing develops all teachers’ computer science knowledge. Teachers do not need to become computer science experts to teach computer literacy and programming skills to their students. </p>
<figure class="align-center ">
<img alt="Teacher holds tablet while working in classroom" src="https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/527129/original/file-20230518-19-2wsuw7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Teachers do not need a computer science degree to incorporate computing into their classrooms.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/indian-teacher-using-digital-tablet-in-classroom-royalty-free-image/526297603">LWA/Dann Tardif/DigitalVision Collection/Getty Images</a></span>
</figcaption>
</figure>
<p>In fact, the most surprising result of my research is how quickly teachers learn to teach integrated computing activities. In about two hours, <a href="https://www.doi.org/10.26716/jcsi.2022.11.15.35">teachers can use a pre-made computer science lesson</a> in their classrooms. In the future, I will teach them to use artificial intelligence to create their own lessons for their students. For example, a science teacher recently asked me how she could create a data analysis activity for her class. AI tools would allow her to <a href="https://www.ironhack.com/us/en/blog/chatgpt-for-data-analysts">quickly design the technical aspects</a> of this activity. </p>
<p>And finally, integrated computing also addresses students’ reluctance to take elective computer science classes when they have little knowledge of computer science. In 2022, over half of U.S. public high schools offered computer science, but just <a href="https://www.edweek.org/technology/computer-science-education-is-gaining-momentum-but-some-say-not-fast-enough/2022/09">6% of students</a> took these classes. Students who do take computer science in high school typically have had <a href="https://doi.org/10.2190/9LE6-MBXA-JDPG-UG90">early exposure to computer science</a>. Integrated computing can give all students early exposure to computer science, which I believe will increase the number of students who take computer science courses later in school. </p>
<h2>Computer science for everyone</h2>
<p>Early exposure to computer science in school is especially important for students from groups <a href="https://www.brookings.edu/research/exploring-the-state-of-computer-science-education-amid-rapid-policy-expansion/">underrepresented in computer science</a>. A <a href="https://advocacy.code.org/stateofcs">2022 report</a> from Code.org, a nonprofit that advocates for more computer science education in K-12 schools, found that students who are Latino, female or from low-income or rural areas are <a href="https://www.edweek.org/technology/computer-science-education-is-gaining-momentum-but-some-say-not-fast-enough/2022/09">less likely</a> to be enrolled in foundational computer science courses.</p>
<p>Teachers who want to build their computer science knowledge and apply it to their classroom can try these free self-paced, online <a href="https://gavirtualpd.catalog.instructure.com/browse/computerscience">integrated computing courses</a> that I developed, and which are tied to micro-credentials. Also, this sortable list of <a href="https://integratedcomputing.org/">integrated computing activities</a> provides free lesson plans. The activities require only a computer – no prior knowledge is needed, and young learners can complete them outside of class, too.</p>
<p>Integrated computing provides a path to increase computer literacy for all K-12 students. As technology advances at an increasing rate, I believe schools must take care that our young people do not fall behind.</p><img src="https://counter.theconversation.com/content/205171/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Lauren Margulieux receives funding from Snap, Inc., Google, the National Science Foundation, and the US Department of Education. </span></em></p>Integrated computing enables teachers to incorporate basic programming skills into K-12 students’ regular math, science and language arts classes.Lauren Margulieux, Associate Professor of Learning Technologies, Georgia State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1850962022-08-15T12:38:15Z2022-08-15T12:38:15ZComputer science benefits students with learning disabilities – but not always for the long term<figure><img src="https://images.theconversation.com/files/474160/original/file-20220714-32419-6llaip.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5150%2C3423&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Schools can help students see themselves working in computer science.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/students-working-in-computer-lab-royalty-free-image/102754951?adppopup=true">Hill Street Studios/Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em> </p>
<h2>The big idea</h2>
<p>When computer science courses are delivered through career and technical education in high school, the courses can help students with learning disabilities feel better about their ability to succeed in STEM. The classes also help the students see the usefulness of computer science.</p>
<p>This is what we found in a <a href="https://journals.sagepub.com/doi/abs/10.1177/14782103211049913">recent study</a> with our co-authors – education scholars <a href="https://scholar.google.com/citations?user=uEh9GkMAAAAJ&hl=en">Michael Gottfried</a> and <a href="https://www.researchgate.net/profile/Jennifer-Freeman-16">Jennifer Freeman</a>.</p>
<p>We used national survey data from more than 20,000 students across the country to dig into this connection between computer science and science, technology, engineering or mathematics, a group of subjects generally known as STEM.</p>
<p>In our work, we found that – compared with other students with learning disabilities – those who took computer science courses in a career and technical education program were more likely to believe they could succeed in STEM. They were also more likely to believe STEM was useful for future employment or college options.</p>
<p>We also found that – within career and technical education programs – students with learning disabilities were just as likely to take computer science courses as students without learning disabilities. All our findings were still evident even after we took into account key student characteristics, such as family income, first language, gender and racial or ethnic identity.</p>
<p>Students with learning disabilities in our study are those who have a disability that affects their learning to write, read, spell or perform mathematical calculations. </p>
<h2>Why it matters</h2>
<p>Computer science is one of the <a href="https://www.bls.gov/ooh/computer-and-information-technology/home.htm">fastest-growing</a> fields in the current economy. Employment experts predict a 13% increase – about 667,000 new jobs – in these computer occupations from 2020 to 2030. That’s more than three times the rate of anticipated overall job growth. </p>
<p>However, there have not been enough <a href="https://www.techservealliance.org/news/the-state-of-the-technology-talent-shortage/">computer science graduates</a> in recent years to fill these jobs.</p>
<p>Based on our work, computer science courses appear to help students with learning disabilities develop positive attitudes toward STEM. These attitudes are linked to persistence in both <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5403895/">computer science</a> and <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5839644/">STEM more generally</a>. This makes it important for educators to encourage students to study, and stick with, computer science and STEM – and to make sure these students have access to these courses. </p>
<p>At the moment, students with learning disabilities are <a href="https://ieeexplore.ieee.org/document/4198258">underrepresented in computer science fields</a> in college and the labor market. Specifically, <a href="https://cra.org/crn/2020/11/expanding-the-pipeline-the-status-of-persons-with-disabilities-in-the-computer-science-pipeline/">fewer than 8%</a> of students in undergraduate computer science programs have any disability. This is compared with about <a href="https://nces.ed.gov/programs/digest/d18/tables/dt18_311.10.asp">19% of all undergraduates</a>. </p>
<h2>What still isn’t known</h2>
<p>A big question that remains is why students with learning disabilities don’t persist in computer science fields in college and, ultimately, pursue careers in the field. Even though computer science courses in high school help develop confidence and a sense of purpose, that may not be enough to encourage them to stick with it longer term. </p>
<p>One possible explanation might be that students with learning disabilities don’t see themselves as part of the STEM community. In our research, we looked to see if there was a link between computer science coursework and a feeling of STEM community membership. We found this connection for general education students but not for students with learning disabilities.</p>
<p>Another possible explanation may be that students with learning disabilities start high school with lower levels of STEM confidence and less of a sense that computer science will be useful to them in the future. Just participating in computer science courses may not be enough to make up the difference in this regard.</p>
<h2>What’s next</h2>
<p>One important next step will be to look at the factors that help students with learning disabilities keep studying computer science and STEM. For example, does a positive attitude toward STEM actually lead students with learning disabilities to study computer science or pursue careers in the field? We plan to explore such a question in future work.</p><img src="https://counter.theconversation.com/content/185096/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jay Stratte Plasman receives funding from the National Science Foundation and the Institute of Education Sciences. </span></em></p><p class="fine-print"><em><span>Shaun M. Dougherty does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>While computer science courses can help students with learning disabilities see themselves in careers in the field, they are still underrepresented. A team of researchers explores why.Jay Stratte Plasman, Assistant Professor in Workforce Development and Education at The Ohio State University, The Ohio State UniversityShaun M. Dougherty, Associate Professor of Public Policy & Education, Vanderbilt UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1879972022-08-10T03:02:15Z2022-08-10T03:02:15ZHow complex is your life? Computer scientists found a way to measure it<figure><img src="https://images.theconversation.com/files/478418/original/file-20220809-15-eziox2.jpeg?ixlib=rb-1.1.0&rect=35%2C65%2C3958%2C2598&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Unsplash/Susan Q Yin</span></span></figcaption></figure><p>Nobel laureate economist Richard Thaler famously <a href="https://freakonomics.com/podcast/people-arent-dumb-the-world-is-hard-2/">quipped</a>: </p>
<blockquote>
<p>People aren’t dumb, the world is hard.</p>
</blockquote>
<p>Indeed, we routinely encounter problems in our everyday lives that feel complex – from choosing the best electricity plan, to deciding how to effectively spend our money. </p>
<p>Australians pay hundreds of millions of dollars each year to <a href="https://www.iselect.com.au/content/uploads/2021/08/FY21-Results-Announcement.pdf">comparison websites</a> and consumer-focused groups such as <a href="https://www.choice.com.au/about-us/how-choice-is-funded">CHOICE</a> to help them make decisions about products and services.</p>
<p>But how can we objectively measure how “complex” our decisions really are? Our recently published <a href="https://www.nature.com/articles/s41598-022-16565-w">research</a> offers one potential way to do this, by drawing on concepts from computer and systems science. </p>
<h2>Why bother measuring complexity?</h2>
<p>There are several factors when it comes to measuring complexity in any scenario. For instance, there may be a number of <a href="https://linkinghub.elsevier.com/retrieve/pii/S0047272710000290">options to choose from</a> and each option may have several <a href="https://www.esri.ie/publications/price-lab-an-investigation-of-consumers-capabilities-with-complex-products">different features</a> to consider. </p>
<p>Suppose you want to buy jam. This will be easy if there are only two flavours available, but difficult if there are <a href="https://medium.com/@FlorentGeerts/the-jam-experiment-how-choice-overloads-makes-consumers-buy-less-d610f8c37b9b">dozens</a>. Yet choosing an electricity plan would be much harder even with just two options. </p>
<p>In other words, you can’t isolate one particular factor when trying to determine the complexity of something. You have to consider the problem as a whole – and this requires a lot more work.</p>
<p>The ability to accurately measure complexity could have a wide range of practical applications, including informing the design of: </p>
<ul>
<li><p>regulation on how complex products should be</p></li>
<li><p>easy to navigate digital systems including websites, apps and smart device programs</p></li>
<li><p>easy to understand products. These may be financial products (superannuation and insurance plans, credit card schemes), physical products (devices) or virtual products (software)</p></li>
<li><p>artificial intelligence (AI) that offers advice when problems are too complex for humans. For example, a scheduler AI may let you book meetings yourself, before jumping in to suggest optimal meeting times and locations based on your history.</p></li>
</ul>
<h2>How we study human decision-making</h2>
<p>Computer science can help us solve problems: information goes in and one (or more) solutions come out. However, the amount of computation needed for this can vary a lot, depending on the problem.</p>
<p>We and our colleagues used a precise mathematical framework, called “computational complexity theory”, that quantifies how much computation is needed to solve any given problem.</p>
<p>The idea behind it is to measure the amount of computational resources (such as time or memory) a computer algorithm needs when problem-solving. The more time or memory it needs, the more complex the problem is. </p>
<p>Once this is established, problems can be categorised into “classes” based on their complexity.</p>
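<p>As a loose illustration (a sketch of the general idea, not code from the study), the contrast between complexity "classes" comes down to how fast the number of basic steps grows with the size of the input:</p>

```python
# A loose illustration (not from the study): complexity theory compares
# how the number of basic steps an algorithm needs grows with input size n.

def steps_linear_scan(n):
    """Checking each of n items once: work grows linearly with n."""
    return n

def steps_subset_search(n):
    """Trying every subset of n items: work doubles with each added item."""
    return 2 ** n

# Even modest inputs separate the two growth rates dramatically.
for n in (10, 20, 30):
    print(n, steps_linear_scan(n), steps_subset_search(n))
```

<p>A problem that can only be solved by something like the second kind of search lands in a far harder class than one solvable by the first.</p>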
<p>In our work, we were particularly interested in how complexity (as determined through computational complexity theory) corresponds with the actual amount of effort people must put into solving certain problems. </p>
<p>We wanted to know whether computational complexity theory could accurately predict how much humans would struggle in a certain situation and how accurate their problem-solving would be.</p>
<h2>Testing our hypothesis</h2>
<p>We focused on three types of experimental tasks, for which you can see examples below. All of these task types sit within a broader class of complex problems called “NP-complete” problems. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=459&fit=crop&dpr=1 600w, https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=459&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=459&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=577&fit=crop&dpr=1 754w, https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=577&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/477446/original/file-20220803-16-7f9x9s.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=577&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Here are example cases for the three experimental tasks, each of which required a yes or no answer from our research participants.</span>
<span class="attribution"><span class="source">Juan Pablo Franco Ulloa/Karlo Doroc/Nitin Yadav</span></span>
</figcaption>
</figure>
<p>Performing well in each type of task requires a different ability. Specifically: </p>
<ul>
<li>“satisfiability” tasks require abstract logic</li>
<li>“travelling salesperson” tasks require spatial navigation skills and</li>
<li>“knapsack” tasks require arithmetic.</li>
</ul>
<p>All three are ubiquitous in real life and reflect day-to-day problems such as software testing (satisfiability), planning a road trip (travelling salesperson), and shopping or investing (knapsack). </p>
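<p>To make the "knapsack" task concrete, here is a minimal brute-force sketch of the yes/no decision version (the item weights, values, capacity and target below are invented for illustration, not taken from our experiments):</p>

```python
from itertools import combinations

# Hypothetical sketch of the yes/no "knapsack" decision task: is there a
# subset of items whose total weight fits the capacity AND whose total
# value reaches the target? Brute force tries every subset, so the work
# doubles with each extra item - one reason the task sits in a hard class.

def knapsack_decision(weights, values, capacity, target):
    n = len(weights)
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if (sum(weights[i] for i in subset) <= capacity
                    and sum(values[i] for i in subset) >= target):
                return True   # a "yes" instance: such a subset exists
    return False              # a "no" instance: no subset qualifies

weights = [3, 4, 5, 8]
values  = [4, 5, 6, 10]
print(knapsack_decision(weights, values, capacity=10, target=11))  # prints True
```

<p>Here items with weights 4 and 5 fit the capacity of 10 and together reach the value target of 11, so the answer is "yes" – but an algorithm (or a person) may have to sift through many subsets to discover that.</p>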
<p>We recruited 67 people, split them into three groups, and had each group solve 64 to 72 different variations of one of the three types of task.</p>
<p>We also used computational complexity theory and computer algorithms to figure out which tasks were “high complexity” for a computer, before comparing these with the results from our human problem solvers. </p>
<p>We expected – assuming computational complexity theory is congruent with how real people solve problems – that our participants would spend more time on tasks identified as being “high complexity” for a computer. We also expected lower problem-solving accuracy on these tasks.</p>
<p>In both cases that’s exactly what we found. On average, people did twice as well on the lowest complexity cases compared to the highest complexity cases. </p>
<h2>Computer science can measure ‘complexity’ for humans</h2>
<p>Our results suggest effort alone is not enough to ensure someone does well on a complex problem. Some problems will be hard no matter what – and these are the spaces in which advanced decision aids and AI can shine.</p>
<p>In practical terms, being able to gauge the complexity of a wide range of tasks could help provide people with the necessary support they need to tackle these tasks day-to-day.</p>
<p>The most important result was that our computational complexity theory-based predictions about which tasks humans would find harder were consistent across all three types of task – despite each requiring different abilities to solve. </p>
<p>Moreover, if computational complexity theory can predict how hard humans will find tasks within these three problems, it should be able to do the same for the more than 3,000 other NP-complete problems.</p>
<p>These include similarly common hurdles such as <a href="https://www.sciencedirect.com/science/article/pii/S0022000075800080">task scheduling</a>, <a href="https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(17)30193-6">shopping</a>, <a href="https://eprints.soton.ac.uk/265340/1/jpms-wodes08.pdf">circuit design</a> and <a href="https://en.wikipedia.org/wiki/List_of_NP-complete_problems#Games_and_puzzles">gameplay</a>. </p>
<h2>Now, to put research into practice</h2>
<p>While our results are exciting, there’s still a long way to go. For one, our research used quick and abstract tasks in a controlled laboratory environment. These tasks can <em>model</em> real-life choices, but they’re not representative of <em>actual</em> real-life choices. </p>
<p>The next step is to apply similar techniques to tasks that more closely resemble real-life choices. For instance, can we use computational complexity theory to measure the complexity of choosing between different credit cards? </p>
<p>Progress in this space could help us unlock new ways to aid people in making better choices, every day, across various facets of life. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/weve-crunched-the-numbers-in-mcdonalds-monopoly-challenge-to-find-your-chance-of-winning-102763">We've crunched the numbers in McDonald's Monopoly challenge to find your chance of winning</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/187997/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Karlo Doroc receives funding from a University of Melbourne Graduate Research Scholarship from the Faculty of Business and Economics, a Kinsman Scholarship, and Australian Government Research Training Program.</span></em></p>Is it harder to choose between four flavours of jam, or two different electricity plans?Karlo Doroc, PhD Candidate in Decision Science, Centre for Brain, Mind and Markets, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1852562022-06-23T11:50:39Z2022-06-23T11:50:39ZOnly about 1 in 5 engineering degrees go to women<figure><img src="https://images.theconversation.com/files/469564/original/file-20220617-15-euge0j.jpeg?ixlib=rb-1.1.0&rect=9%2C0%2C6017%2C4011&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Research shows women who study engineering do better when mentored by other women.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/milling-machine-setup-process-female-african-royalty-free-image/1350414597?adppopup=true">Nitat Termmee/Moment via Getty Images</a></span></figcaption></figure><figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=255&fit=crop&dpr=1 600w, https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=255&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=255&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=321&fit=crop&dpr=1 754w, https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=321&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/469516/original/file-20220617-24-upljnk.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=321&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Despite various efforts to <a href="https://beta.nsf.gov/funding/initiatives/broadening-participation/supporting-women-and-girls-stem">encourage more women to study STEM fields</a> in college, the percentage of engineering bachelor’s degrees earned by women in the United States hasn’t increased much in the 21st century. Specifically, it has risen from 18% in 1998 to 22% in 2018. </p>
<p><iframe id="PkKnG" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/PkKnG/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Of all the fields in STEM – or science, technology, engineering and mathematics – the engineering workforce <a href="https://www.pewresearch.org/social-trends/2018/01/09/diversity-in-the-stem-workforce-varies-widely-across-jobs/">has the lowest proportion of women</a>, at 14%.</p>
<p>That low participation matters for several reasons. Women are not only being left out of some of the <a href="https://www.bls.gov/ooh/architecture-and-engineering/home.htm">highest-paying jobs in STEM</a>, but companies are losing out as well. Research shows that gender-diverse teams <a href="https://www.forbes.com/sites/eriklarson/2017/09/21/new-research-diversity-inclusion-better-decision-making-at-work/?sh=71ea3baa4cbf">make better business decisions</a> than teams that are all-male.</p>
<p>So why aren’t women going into engineering? And what, if anything, can be done to help women who decide to study engineering stay the course? The Society of Women Engineers reports that <a href="https://alltogether.swe.org/2019/11/swe-research-update-women-in-engineering-by-the-numbers-nov-2019/#_ednref7">over 32% of female STEM majors switch to another major</a>. Research shows this rate is <a href="https://www.rise.hs.iastate.edu/projects/CBiRC/IJEE-WhyTheyLeave.pdf">typically higher</a> than the rate at which men leave engineering. Of those women who leave the engineering profession, 30% cite the workplace environment as the reason, the society reports. A 2017 study of over 5,000 women who earned bachelor’s degrees in engineering <a href="https://doi.org/10.3389/fpsyg.2017.00875">found that 10% never entered the field and 27% left the profession</a>.</p>
<h2>Colleges intervene</h2>
<p>These are all issues I’ve been <a href="https://scholar.google.com/citations?user=sZGzlnMAAAAJ&hl=en&oi=ao">researching</a> as associate director of the <a href="https://cwit.umbc.edu/mission-vision/">Center for Women in Technology</a> at the University of Maryland, Baltimore County, or UMBC. In 2018, several colleagues and I found that computing and engineering students who are supported by the center <a href="https://dl.acm.org/doi/abs/10.1145/3159450.3159533">graduate within four years at a rate of 61.2%</a> – a <a href="https://cdn.theconversation.com/static_files/files/2143/A_Model_for_Increasing_Gender_Diversity_in_Technology.pdf?1655991489">full 19 percentage points higher</a> than students who are not supported by the center. The center supports students through scholarships and extensive academic and social support; in the 2021-22 academic year, 73% of students supported were women.
And recently two alumnae of the center – one in <a href="https://umbc.edu/stories/fourteen-umbc-students-and-recent-alumni-receive-fulbright-awards-setting-new-record/">2019</a> and one in <a href="https://umbc.edu/stories/umbc-2022-fulbright-student-scholars/">2022</a> – have become <a href="https://us.fulbrightonline.org/">Fulbright Scholars</a>.</p>
<p>The program at UMBC is by no means the only campus-based program in the nation that supports female students in their plans to enter engineering and computer science – two areas in which women are <a href="https://www.pewresearch.org/social-trends/2018/01/09/diversity-in-the-stem-workforce-varies-widely-across-jobs/">persistently underrepresented</a>. Through my research, I have discovered that there are more than two dozen such programs or initiatives at colleges and universities throughout the nation. They include, for example, the
<a href="https://sites.udel.edu/wie/">Women in Engineering Program</a> at University of Delaware, the <a href="https://wise.ncsu.edu/">Women in Science and Engineering program</a> at North Carolina State University and the <a href="https://awe.seas.upenn.edu/#:%7E:text=Women%20make%20up%20approximately%2040,and%20opportunities%20to%20Penn%20Engineering.">Advancing Women in Engineering</a> program at the University of Pennsylvania. </p>
<p>To better understand the necessity of such programs, consider the abundance of research that has found women who study STEM report <a href="https://doi.org/10.1353/csd.2016.0072">“chilly” and “negative” experiences</a> in the classroom and on campus. This includes being subjected to gender-based harassment and a “<a href="https://doi.org/10.1353/csd.2016.0072">perception that women are unable to ‘do science.</a>’” Colleges also have long struggled with how to help women <a href="https://www.ijemst.net/index.php/ijemst/article/view/293/141">see themselves as part of the scientific community</a>.</p>
<h2>Proven strategies</h2>
<p>It doesn’t have to be that way. Research shows that when female engineering students are mentored by female peers, they feel less anxious about their ability, have <a href="https://doi.org/10.1073/pnas.1613117114">more positive academic experiences</a> and are more likely to stick with STEM as a major. Peer-based tutoring has also <a href="https://peer.asee.org/examining-the-effectiveness-of-scholars-assisting-scholars-program-among-undergraduate-engineering-students">been shown to help students get their grades up</a>.</p>
<p>With support from an approximately $233,000 grant from the National Science Foundation, I have also been looking at <a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=2025349&HistoricalAwards=false">what kinds of academic experiences and supports</a> help female engineering students stay the course.</p>
<p>Based on my analysis of 356 female engineering students at UMBC from 2007 to 2016, what follows are preliminary findings from my National Science Foundation research:</p>
<h2>1. High school math and grades make a difference</h2>
<p>Starting college in a higher level of college math and having a higher high school GPA both help. Specifically, starting college at a higher level of college math – such as Advanced Calculus or Differential Equations – increases the likelihood of graduating with an engineering degree within five years by 8% over those who start at lower levels of college math. Having a higher high school GPA increases the likelihood even more.</p>
<p>To boost the number of women who earn engineering degrees, educators must help girls get on track at the high school level. This means establishing a strong record of success in their high school math and science courses.</p>
<h2>2. Gateway engineering courses matter</h2>
<p>By “gateway” courses, I mean classes that are required to officially declare the engineering major and that faculty identified as critical for success. In other words, classes that make or break an engineer. This would include courses such as Principles of Digital Design in computer engineering, Statics in mechanical engineering and Chemical Process Thermodynamics in chemical engineering.</p>
<p>I found that women who took more gateway engineering courses were less likely to leave their intended engineering major.</p>
<h2>3. Freshman and sophomore years in college are critical</h2>
<p>Making it through the first four semesters is critical. Among women students who left engineering, 59% – about three out of five – did so during those first four semesters.</p>
<p>This points toward the need for colleges and universities to provide very deliberate academic and social supports – such as tutoring and mentoring – for female engineering students at the very start of their college careers.</p>
<p>With only 1 in 5 bachelor’s degrees in engineering awarded to women, it may take these efforts and more to bring that number anywhere close to parity with the proportion awarded to men.</p><img src="https://counter.theconversation.com/content/185256/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Danyelle Tauryce Ireland works for the Center for Women in Technology at the University of Maryland, Baltimore County. She receives funding from the National Science Foundation. </span></em></p>A negative environment dissuades many women engineering students from staying in the field. Can colleges and universities do anything to reverse the trend?Danyelle Tauryce Ireland, Associate Director of the Center for Women in Technology and Research Assistant Professor in the Engineering and Computing Education Program, University of Maryland, Baltimore CountyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1818292022-06-01T12:44:03Z2022-06-01T12:44:03ZWhat are digital twins? A pair of computer modeling experts explain<figure><img src="https://images.theconversation.com/files/465820/original/file-20220527-25-pntn83.jpg?ixlib=rb-1.1.0&rect=0%2C17%2C5934%2C5063&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A digital twin attempts to capture every aspect of a real thing, including up-to-the-moment changes.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/analog-collage-with-female-portrait-and-her-mirror-royalty-free-image/1309294833">lambada/E+ via Getty Images</a></span></figcaption></figure><p>A digital twin is a virtual representation of a real system – a building, the power grid, a city, even a human being – that mimics the characteristics of the system. A digital twin is more than just a computer model, however. It receives data from sensors in the real system to constantly parallel the system’s state.</p>
<p>A digital twin helps people analyze and predict a system’s behavior under different conditions. The systems being twinned are typically <a href="https://doi.org/10.1109/PerCom53586.2022.9762405">very complex and require significant effort to model and track</a>.</p>
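This sense-then-simulate loop can be pictured in a few lines of code. The sketch below is purely illustrative and not from the article: the class, its fields and the trivial linear “drift” model are all invented for the example, standing in for the far richer models real digital twins use.

```python
from dataclasses import dataclass, field


@dataclass
class DigitalTwin:
    """Toy twin: mirrors sensor-reported state and runs what-if scenarios."""
    state: dict = field(default_factory=dict)

    def ingest(self, sensor_reading: dict) -> None:
        # Continuous updates from the physical system keep the twin in sync.
        self.state.update(sensor_reading)

    def simulate(self, scenario: dict, steps: int = 10) -> float:
        # Explore a hypothetical condition without touching the real system.
        # Here: a deliberately trivial linear model of temperature drift.
        temp = self.state.get("temperature", 20.0)
        drift = scenario.get("drift_per_step", 0.5)
        return temp + drift * steps


twin = DigitalTwin()
twin.ingest({"temperature": 22.0})  # live sensor update from the real system
forecast = twin.simulate({"drift_per_step": 0.3}, steps=10)
print(forecast)  # 25.0
```

The point of the pattern is the separation: the physical system only ever feeds `ingest`, while any number of candidate changes can be explored through `simulate` at no real-world cost.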
<p>Digital twins are useful in a wide variety of domains, including <a href="https://doi.org/10.1007/s11036-020-01557-9">supply chains</a>, <a href="http://dx.doi.org/10.3233/FAIA190139">health care</a>, <a href="https://www.ashrae.org/File%20Library/Conferences/Specialty%20Conferences/2018%20Building%20Performance%20Analysis%20Conference%20and%20SimBuild/Papers/C110.pdf">buildings</a>, <a href="https://www.its.ucla.edu/project/digital-twins-for-bridge-health-monitoring-management/">bridges</a>, <a href="https://www.uni-stuttgart.de/en/university/news/all/Digital-twin-for-autonomous-driving/">self-driving cars</a> and <a href="https://futureofretail.io/trends/digital-twins">retail customer personas</a> to improve efficiency and reliability. For example, a warehouse operator can optimize a warehouse’s performance by exploring the response of its digital twin to various material handling policies and equipment without incurring the cost of making actual changes. </p>
<p>Even a wildfire can be <a href="https://doi.org/10.1109/ICUFN.2019.8806107">represented by a digital twin</a>. Government agencies can predict the spread of the fire and its impact under different conditions such as wind velocity, humidity and proximity to habitats, and use this information to guide evacuations.</p>
<h2>Why digital twins matter</h2>
<p>Digital twins are often used to model, understand and analyze complex systems where performance, reliability and security of the system are critical. In such systems it is paramount to test any changes, whether planned or unplanned. </p>
<p>To test changes to the actual system and the effects of any possible stimulus, the digital twin must accurately represent the physical system in its current state. This requires the digital twin to receive continuous updates from the physical system via fast and reliable communications channels. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/HftDI09LVI0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Digital twins are a key part of the push to create “smart” cities.</span></figcaption>
</figure>
<p>Creating and maintaining digital twins often involves vast amounts of data to represent various features of the real system. Collecting and processing this data requires advanced communication and computing technologies. Communication support typically involves high-speed internet connections and wireless networks such as Wi-Fi and 5G. Computational support is typically in the form of servers, either in the cloud or closer to the physical system. </p>
<p>We and other faculty members at Rochester Institute of Technology and the University of California, Irvine are starting the <a href="https://www.rit.edu/cssr">Center for Smart Spaces Research</a>, a research center sponsored by the National Science Foundation. One of the primary ongoing projects within this center is building the basic technologies for creating digital twins in a variety of applications. </p>
<p><em>Read other short, accessible explanations of newsworthy subjects written by academics in their areas of expertise for The Conversation U.S. <a href="https://theconversation.com/us/topics/significant-terms-105996">here</a>.</em></p><img src="https://counter.theconversation.com/content/181829/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Amlan Ganguly receives funding from US NSF, DARPA, AFRL, Raymond Corp and Bryx Corp. </span></em></p><p class="fine-print"><em><span>Nalini Venkatasubramanian receives research funding from the National Science Foundation and other federal agencies </span></em></p>A digital twin is to a computer model as live video is to a still photo. These virtual replicas can be used to understand and make predictions about a wide range of complex systems, including people.Amlan Ganguly, Associate Professor of Computer Engineering, Rochester Institute of TechnologyNalini Venkatasubramanian, Professor of Computer Science, University of California, IrvineLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1816972022-05-26T12:26:30Z2022-05-26T12:26:30ZWant to expand computer science education? Educate more teachers<figure><img src="https://images.theconversation.com/files/464866/original/file-20220523-23-ehi9kr.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5691%2C3797&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A teacher works with students in a computer lab.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/mature-teacher-assisting-female-students-using-royalty-free-image/1055844022">Maskot via Getty Images</a></span></figcaption></figure><p>When advocates push for computer science education, usually they’re talking about boosting the number of schools offering computer science classes – <a href="https://advocacy.code.org/2021_state_of_cs.pdf">with the intent to reach more students</a>. But from our perspective as scholars of computer science education, a key factor is how many teachers are qualified to teach the subject.</p>
<p>Data from 2020 indicates that in one of the most advanced high school computer science classes taught around the country, the College Board’s Advanced Placement Computer Science Principles course, enrollment <a href="https://cs4all.home.blog">grew from nearly 44,000 in 2017 to more than 114,000 in 2020</a>. The growth in enrollment – for that class and other computer science courses leading up to it – has been driven by more teachers taking quick classes on how to teach computer science.</p>
<p>Expanding the number of computer science courses depends on educating even more teachers to teach them. But almost half of all U.S. states don’t have a plan to teach computer science at the K-12 level. Eight states lack certification for computer science teachers. And 27 states and the District of Columbia don’t offer incentives for higher education institutions to offer computer science teacher education programs, according to data from <a href="https://code.org/advocacy/landscape.pdf">Code.org</a>.</p>
<p>What this means is schools won’t have enough teachers to expand computer science education. Increasing high-quality access to computer science is important for students who want to use computing as a tool for <a href="https://dl.acm.org/doi/fullHtml/10.1145/3029595">problem-solving</a> and <a href="https://www.learntechlib.org/p/151572/">creativity</a>.</p>
<p><iframe id="E0xM0" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/E0xM0/16/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>Teacher education programs</h2>
<p>The <a href="https://dl.acm.org/doi/10.1145/1953163.1953193">National Science Foundation</a> and private groups have <a href="https://www.nsf.gov/discoveries/disc_summ.jsp?cntn_id=134316&org=NSF">set up programs to increase</a> the number of computer science teachers. But most of those training efforts happen in <a href="https://code.org/educate/professional-development-online">one- to two-week sessions</a> that typically prepare teachers without a computing background to <a href="https://www.pltw.org/our-programs/pltw-computer-science">teach basic computer science principles</a>.</p>
<p>These sessions do teach some of the computer science content teachers will need to impart, but teachers often emerge from the training lacking the <a href="https://narst.org/research-matters/pedagogical-content-knowledge">ability</a> to translate that content for students. The short-term courses don’t offer that level of depth.</p>
<p>Without policies and incentives for more dedicated teacher preparation, we believe many new computer science teachers won’t be adequately prepared. Two-week training courses can give prospective computer science teachers a grounding in the basics. But in our view they can’t provide enough depth to prepare teachers to deliver high-quality computer science instruction. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Girls work on computers while a woman assists." src="https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/464060/original/file-20220518-21-dyxnjp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Fifth graders at Marshall Elementary School in Marysville, Wash., participate in computer science class.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/GirlsTechScores/f038776721b740dcb797dce201f86061/photo">AP Photo/Elaine Thompson</a></span>
</figcaption>
</figure>
<h2>A combination as a solution</h2>
<p>At <a href="https://education.msu.edu/news/2021/msu-helps-increase-computer-science-equity-and-access-in-detroit/">Michigan State University</a>, in partnership with University of Detroit-Mercy, we have begun exploring another approach that we hope will better prepare school teachers to teach a full range of computer science courses.</p>
<p>Our effort puts university instructors with deep knowledge of computer science in high school computer science classes alongside a schoolteacher who is seeking to become a computer science teacher. The university instructor initially takes the lead, teaching the high school students while simultaneously demonstrating best practices for the teacher. As the year progresses, the high school teacher gains knowledge and experience, ultimately taking on more responsibility in the classroom.</p>
<p>We expect our evaluations to find that this method will allow the teachers to become more comfortable with the content. Then they can independently offer high-quality computer science instruction.</p>
<p>We have also seen great opportunities arise for schoolteachers to connect with their students’ identities and interests to explore computer science. For instance, one teacher used a coding tool called <a href="https://csdt.org/culture/cornrowcurves/index.html">Cornrow Curves</a> – named after an African and African American style of hair braiding – to <a href="https://doi.org/10.1145/3379918">explain and explore how algorithms work</a>.</p>
<p>More recently, we have been thinking about how to build on social relationships that students value – such as with coaches and barbers – to design a computationally and <a href="https://doi.org/10.1145/3379918">culturally rich learning environment</a>.</p><img src="https://counter.theconversation.com/content/181697/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Aman Yadav receives funding from National Science Foundation, Robin Hood Foundation, and Apple. </span></em></p><p class="fine-print"><em><span>Michael Lachney does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Without university-level programs to provide teacher training for advanced computer science, states will not be able to offer high-quality computer science education to all students.Aman Yadav, Professor of Educational Psychology and Educational Technology, Michigan State UniversityMichael Lachney, Assistant Professor of Educational Psychology and Educational Technology, Michigan State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1810792022-04-14T13:19:04Z2022-04-14T13:19:04ZDo mushrooms really use language to talk to each other? A fungi expert investigates<figure><img src="https://images.theconversation.com/files/458150/original/file-20220414-95-8uscit.jpg?ixlib=rb-1.1.0&rect=0%2C422%2C6000%2C3565&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/mexican-magic-mushroom-psilocybe-cubensis-whose-1821529910">Alexander_Volkov/Shutterstock</a></span></figcaption></figure><p>Nearly all of Earth’s organisms communicate with each other in one way or another, from the nods and dances and squeaks and bellows of animals, through to the invisible chemical signals emitted by plant leaves and roots. But what about fungi? Are mushrooms as inanimate as they seem – or is something more exciting going on beneath the surface?</p>
<p><a href="https://royalsocietypublishing.org/doi/10.1098/rsos.211926">New research</a> by computer scientist <a href="https://people.uwe.ac.uk/Person/AndrewAdamatzky">Andrew Adamatzky</a> at the Unconventional Computing Laboratory of the University of the West of England, suggests this ancient kingdom has an electrical “language” all of its own – far more complicated than anyone previously thought. According to the study, fungi might even use “words” to form “sentences” to communicate with neighbours. </p>
<p>Almost all communication within and between multi-cellular animals involves highly specialised cells called nerves (or neurones). These transmit messages from one part of an organism to another via a connected network called a nervous system. The “language” of the nervous system comprises distinctive patterns of spikes of electrical potential (otherwise known as impulses), which help creatures detect and respond rapidly to what’s going on in their environment. </p>
<p>Despite lacking a nervous system, fungi seem to transmit information using electrical impulses across thread-like filaments called hyphae. The filaments form a thin web called a mycelium that links fungal colonies within the soil. These networks are remarkably similar to animal nervous systems. By measuring the frequency and intensity of the impulses, it may be possible to unpick and understand the languages used to communicate within and between organisms across the kingdoms of life.</p>
<p>Using tiny electrodes, Adamatzky recorded the rhythmic electrical impulses transmitted across the mycelium of four different species of fungi.</p>
<p>He found that the impulses varied by amplitude, frequency and duration. By drawing mathematical comparisons between the patterns of these impulses with those more typically associated with human speech, Adamatzky suggests they form the basis of a fungal language comprising up to 50 words organised into sentences. The complexity of the languages used by the different species of fungi appeared to differ, with the split gill fungus (<em>Schizophyllum commune</em>) using the most complicated lexicon of those tested. </p>
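One simplified way to picture this kind of analysis is to group a train of spike times into “words” wherever the gap between consecutive spikes is small. The sketch below is our illustrative reading, not the study’s exact method: the grouping rule and the threshold value are assumptions made for the example.

```python
def group_spikes_into_words(spike_times, gap_threshold):
    """Group spike times (seconds) into 'words': consecutive spikes closer
    than gap_threshold are treated as belonging to the same word."""
    if not spike_times:
        return []
    words, current = [], [spike_times[0]]
    for t in spike_times[1:]:
        if t - current[-1] < gap_threshold:
            current.append(t)  # same burst: extend the current word
        else:
            words.append(current)  # long silence: start a new word
            current = [t]
    words.append(current)
    return words


# Invented spike train: two bursts and a lone spike.
spikes = [0.0, 0.4, 0.9, 5.0, 5.3, 12.0]
words = group_spikes_into_words(spikes, gap_threshold=2.0)
print(len(words))  # 3 "words", of lengths 3, 2 and 1
```

Counting how often words of each length occur, and comparing that distribution with word-length distributions in human languages, is the flavour of mathematical comparison the article describes.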
<figure class="align-center ">
<img alt="A collection of mushrooms with frilly edges." src="https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=384&fit=crop&dpr=1 600w, https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=384&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=384&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=483&fit=crop&dpr=1 754w, https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=483&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/458144/original/file-20220414-24-gbe2cc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=483&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The split gill fungus is common in rotting wood and is reported to have more than 28,000 sexes.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Schizophyllum_commune#/media/File:Schizophyllum_commune_(Split_gill)_(33389628036).jpg">Bernard Spragg/Wikipedia</a></span>
</figcaption>
</figure>
<p>This raises the possibility that fungi have their own electrical language to share specific information about food and other resources nearby, or potential sources of danger and damage, between themselves or even with more distantly connected partners.</p>
<h2>Underground communication networks</h2>
<p>This isn’t the first evidence of fungal mycelia transmitting information. </p>
<p>Mycorrhizal fungi – near-invisible thread-like fungi that form intimate partnerships with plant roots – have extensive networks in the soil that connect neighbouring plants. Through these associations, plants usually gain access to nutrients and moisture supplied by the fungi from the tiniest of pores within the soil. This vastly expands the area that plants can draw sustenance from and boosts their tolerance of drought. In return, the plant transfers sugars and fatty acids to the fungi, meaning both benefit from the relationship.</p>
<figure class="align-center ">
<img alt="A clump of soil containing fine, white threads." src="https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/458145/original/file-20220414-18-gbe2cc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The mycelium of mycorrhizal fungi enable symbiotic relationships with plants.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/fungal-mycelium-mycorrhizae-that-provide-symbiotic-1596740905">KYTan/Shutterstock</a></span>
</figcaption>
</figure>
<p><a href="https://onlinelibrary.wiley.com/doi/full/10.1111/ele.12115">Experiments using plants</a> connected only by mycorrhizal fungi have shown that when one plant within the network is attacked by insects, the defence responses of neighbouring plants activate too. It seems that warning signals are transmitted via the fungal network. </p>
<p>Other research has shown that plants can transmit more than just information across these fungal threads. <a href="https://www.nature.com/articles/41557...">In some studies</a>, it appears that plants, including trees, can transfer carbon-based compounds such as sugars to neighbours. These transfers of carbon from one plant to another via fungal mycelia could be particularly helpful in supporting seedlings as they establish. This is especially the case when those seedlings are shaded by other plants and so limited in their abilities to photosynthesise and fix carbon for themselves. </p>
<p>Exactly how these underground signals are transmitted remains a matter of some debate though. It is possible the fungal connections carry chemical signals from one plant to another within the hyphae themselves, in a similar way to how the electrical signals featured in the new research are transmitted. But it is also possible that signals become dissolved in a <a href="https://onlinelibrary.wiley.com/doi/full/10.1111/ele.12115">film of water</a> held in place and moved across the network by surface tension. Alternatively, other microorganisms could be involved. <a href="https://nph.onlinelibrary.wiley.com/doi/full/10.1111/nph.17081">Bacteria in and around fungal hyphae</a> might change the <a href="https://www.nature.com/articles/s41396-021-00920-2">composition of their communities</a> or function in response to changing root or fungal chemistry and induce a response in neighbouring fungi and plants. </p>
<p>The new research showing transmission of language-like electrical impulses directly along fungal hyphae provides new clues about how messages are conveyed by fungal mycelium.</p>
<h2>Mushroom for debate?</h2>
<p>Although interpreting the electrical spiking in fungal mycelia as a language is appealing, there are alternative ways to look at the new findings. </p>
<p>The rhythm of electrical pulses bears some similarity to <a href="https://www.sciencedirect.com/science/article/pii/S1087184507000448">how nutrients flow along fungal hyphae</a>, and so may reflect processes within fungal cells that are not directly related to communication. The rhythmic pulses of nutrients and electricity may reveal the patterns of fungal growth as the organism explores its surroundings for nutrients. </p>
<p>Of course, the possibility remains that the electrical signals do not represent communication in any form at all. Rather, charged hyphal tips passing the electrode could have generated the spikes in activity observed in the study.</p>
<figure class="align-center ">
<img alt="Small mushrooms with brown, pointy caps growing out of a mossy log." src="https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/458009/original/file-20220413-1403-rwy6gk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">What on Earth are they talking about?</span>
<span class="attribution"><span class="source">Katie Field</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>More research is clearly needed before we can say with any certainty what the electrical impulses detected in this study mean. What we can take from the research is that electrical spikes are, potentially, a new mechanism for transmitting information across fungal mycelia, with important implications for our understanding of the role and significance of fungi in ecosystems. </p>
<p>These results could represent the first insights into fungal intelligence, even consciousness. That’s a very big “could”, but depending on the definitions involved, the possibility remains, though it would seem to exist on time scales, frequencies and magnitudes not easily perceived by humans.</p><img src="https://counter.theconversation.com/content/181079/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Katie Field receives funding from NERC, BBSRC, ERC, and the Leverhulme Trust. </span></em></p>New research has found what may be language in electrical impulses transmitted between mushrooms.Katie Field, Professor in Plant-Soil Processes, University of SheffieldLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1780742022-03-20T09:28:15Z2022-03-20T09:28:15ZA computer science technique could help gauge when the pandemic is ‘over’<figure><img src="https://images.theconversation.com/files/452518/original/file-20220316-7998-1x93dbq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The world wants the pandemic to end and life to return to normal. When will that happen?</span> <span class="attribution"><span class="source">Marc Fernandes/NurPhoto via Getty Images</span></span></figcaption></figure><p>In early 2022, nearly two years after Covid was declared a pandemic by the World Health Organization, experts are <a href="https://www.science.org/content/article/when-pandemic-over">mulling a big question</a>: when is a pandemic “over”? </p>
<p>So, what’s the answer? What criteria should be used to determine the “end” of Covid’s pandemic phase? These are deceptively simple questions and there are no easy answers.</p>
<p>I am a computer scientist who <a href="https://scholar.google.com/citations?hl=en&user=lccln9YAAAAJ&view_op=list_works&sortby=pubdate">investigates</a> the development of ontologies. In computing, ontologies are a means to formally structure knowledge of a subject domain, with its entities, relations and constraints, so that a computer can process it in various applications and help humans to be more precise.</p>
<p>Ontologies can discover knowledge that’s been overlooked until now: in <a href="https://academic.oup.com/bioinformatics/article/22/14/e530/227867">one instance</a>, an ontology identified two additional functional domains in phosphatases (a group of enzymes) and a novel domain architecture of a part of the enzyme. Ontologies also underlie <a href="https://blog.google/products/search/introducing-knowledge-graph-things-not/">Google’s Knowledge Graph</a> that’s behind those knowledge panels on the right-hand side of a search result.</p>
<p>Applying ontologies to the questions I posed at the start is useful. This approach helps to clarify why it is difficult to specify a cut-off point at which a pandemic can be declared “over”. The process involves collecting definitions and characterisations from domain experts, like epidemiologists and infectious disease scientists, consulting relevant research and other ontologies and investigating the nature of what entity “X” is. </p>
<p>“X”, here, would be the pandemic itself – not a mere shorthand definition, but looking into the properties of that entity. Such a precise characterisation of the “X” will also reveal when an entity is “not an X”. For instance, if X = house, a property of houses is that they all must have a roof; if some object doesn’t have a roof, it definitely isn’t a house.</p>
<p>With those characteristics in hand, a precise, formal specification can be formulated, aided by additional methods and tools. From that, the what or when of “X” – the pandemic is over or it is not – would logically follow. If it doesn’t, at least it will be possible to explain why things are not that straightforward. </p>
<p>This sort of precision complements health experts’ efforts, helping humans to be more precise and communicate more precisely. It forces us to make implicit assumptions explicit and clarifies where disagreements may be. </p>
<h2>Definitions and diagrams</h2>
<p>I <a href="https://keet.wordpress.com/2022/01/26/what-is-a-pandemic-ontologically/">conducted an ontological analysis</a> of “pandemic”. First, I needed to find definitions of a pandemic. </p>
<p>Informally, an epidemic is an occurrence during which there are multiple instances of an infectious disease in organisms, for a limited duration of time, that affects a community of said organisms living in some region. A pandemic, as a minimum, extends the region where the infections take place. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/when-will-the-covid-19-pandemic-end-4-essential-reads-on-past-pandemics-and-what-the-future-could-bring-175587">When will the COVID-19 pandemic end? 4 essential reads on past pandemics and what the future could bring</a>
</strong>
</em>
</p>
<hr>
<p>Next, I drew from an existing foundational ontology. This contains generic categories like “object”, “process”, and “quality”. I also used domain ontologies, which contain entities specific to a subject domain, like infectious diseases. Among other resources, I consulted the <a href="https://doi.org/10.1007/978-1-4419-1327-2_19">Infectious Disease Ontology</a> and the <a href="http://wonderweb.man.ac.uk/deliverables/documents/D18.pdf">Descriptive Ontology for Linguistic and Cognitive Engineering</a>.</p>
<p>First, I aligned “pandemic” to a foundational ontology, using a <a href="https://dl.acm.org/doi/10.1145/2505515.2505539">decision diagram</a> to simplify the process. This helped to work out what kind of <a href="https://people.cs.uct.ac.za/%7Emkeet/files/OEbook.pdf#page=145">thing and generic category</a> “pandemic” is:</p>
<p>(1) Is [pandemic] something that is happening or occurring? Yes (perdurant, i.e., something that unfolds in time, rather than being wholly present). </p>
<p>(2) Are you able to be present or participate in [a pandemic]? Yes (event). </p>
<p>(3) Is [a pandemic] atomic, i.e., does it have no subdivisions and a definite end point? No (accomplishment). </p>
<p>The word “accomplishment” may seem strange here. But, in this context, it makes clear that a pandemic is a <a href="https://doi.org/10.1007/978-3-319-69904-2_33">temporal entity</a> with a limited lifespan and will evolve – that is, <a href="http://ceur-ws.org/Vol-2050/CREOL_paper_1.pdf">cease to be a pandemic and evolve back to epidemic</a>, as indicated in this diagram. </p>
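<p>The three questions above can be read as a small decision procedure. The sketch below is a toy encoding of just those three branches (my own illustration, not the published decision diagram, which has many more categories); the category names follow the DOLCE-style distinctions discussed here.</p>

```python
# Toy encoding of the three classification questions (illustrative only;
# the real decision diagram covers many more branches and categories).

def classify(is_occurring: bool, can_participate: bool, is_atomic: bool) -> str:
    """Assign a simplified foundational-ontology category to an entity."""
    if not is_occurring:
        return "endurant"        # wholly present at any time, e.g. a house
    if not can_participate:
        return "process/state"   # unfolds in time, but not an attended event
    if is_atomic:
        return "achievement"     # event with no subdivisions, definite end point
    return "accomplishment"      # extended event with phases

# A pandemic: occurring, something one can be present in, not atomic.
print(classify(is_occurring=True, can_participate=True, is_atomic=False))
# -> accomplishment
```

<p>Answering the same questions differently lands on a different category, which is exactly how the alignment to a foundational ontology proceeds.</p>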
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=440&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=440&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=440&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=553&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=553&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452458/original/file-20220316-25-s1jqfd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=553&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Maria Keet</span></span>
</figcaption>
</figure>
<h2>Characteristics</h2>
<p>Next, I examined a pandemic’s characteristics described in the literature. A comprehensive list is described in <a href="https://academic.oup.com/jid/article/200/7/1018/903237">a paper</a> by US infectious disease specialists published in 2009 during the global H1N1 influenza virus outbreak. They collated eight characteristics of a pandemic.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/new-covid-data-south-africa-has-arrived-at-the-recovery-stage-of-the-pandemic-177933">New COVID data: South Africa has arrived at the recovery stage of the pandemic</a>
</strong>
</em>
</p>
<hr>
<p>I listed them and assessed them from an ontological perspective:</p>
<ol>
<li><p>Wide geographic extension. This is an imprecise feature – be it <a href="https://towardsdatascience.com/a-very-brief-introduction-to-fuzzy-logic-and-fuzzy-systems-d68d14b3a3b8?gi=31f44d216a95">fuzzy</a> in the mathematical sense or estimated by other means: there isn’t a crisp threshold for where “wide” starts or ends.</p></li>
<li><p>Disease movement: there’s transmission from place to place and that can be traced. A yes/no characteristic, but it could be made categorical or with ranges of how slowly or fast it moves.</p></li>
<li><p>High attack rates and explosiveness, or: many people are affected in a short timespan. Many, short, fast – all indicate imprecision.</p></li>
<li><p>Minimal population immunity: immunity is relative. You have it to a degree to some or all of the variants of the infectious agent, and likewise for the population. This is an inherently fuzzy feature.</p></li>
<li><p>Novelty: A yes/no feature, but one could add “partial”.</p></li>
<li><p>Infectiousness: it must be infectious (excluding non-infectious things, like obesity), so a clear yes/no.</p></li>
<li><p>Contagiousness: this may be from person to person or through some other medium. This property includes human-to-human, human-animal intermediary (e.g., fleas, rats), and human-environment (notably: water, as with cholera), and their attendant aspects.</p></li>
<li><p>Severity: Historically, the term “pandemic” has been applied more often for severe diseases or those with high fatality rates (e.g., HIV/AIDS) than for milder ones. This has some subjectivity, and thus may be fuzzy.</p></li>
</ol>
<p>Properties with imprecise boundaries annoy epidemiologists because they may lead to <a href="https://www.nature.com/articles/s41598-021-81814-3">different outcomes of their prediction models</a>. But from my ontologist’s viewpoint, we’re getting somewhere with these properties. From the computational side, <a href="https://www.sciencedirect.com/science/article/abs/pii/S095741741100978X">automated reasoning with fuzzy features</a> is possible. </p>
<p>COVID, at least early in 2020, easily ticked all eight boxes. A suitably automated reasoner would have classified that situation as a pandemic. But now, in early 2022? Severity (point 8) has largely decreased and immunity (point 4) has risen. Point 5 – are there worse variants of concern to come – is the million-dollar question. More ontological analysis is needed.</p>
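<p>To make the fuzziness concrete, here is a minimal sketch – my own toy model, not the ontology-driven fuzzy reasoners cited above – that scores the eight characteristics as degrees between 0 and 1 and classifies a situation as a pandemic only if every degree clears an (assumed) threshold:</p>

```python
# Toy fuzzy classifier for the eight characteristics (illustrative only;
# the degree values and the 0.5 threshold are assumptions, not data).

CHARACTERISTICS = [
    "wide_geographic_extension", "disease_movement", "high_attack_rate",
    "minimal_population_immunity", "novelty", "infectiousness",
    "contagiousness", "severity",
]

def is_pandemic(degrees: dict, threshold: float = 0.5) -> bool:
    """degrees maps each characteristic to a membership degree in [0, 1]."""
    return all(degrees[c] >= threshold for c in CHARACTERISTICS)

# Early 2020: all eight boxes easily ticked.
early_2020 = {c: 0.9 for c in CHARACTERISTICS}
# Early 2022: severity largely decreased, population immunity risen.
early_2022 = dict(early_2020, severity=0.3, minimal_population_immunity=0.2)

print(is_pandemic(early_2020))  # -> True
print(is_pandemic(early_2022))  # -> False
```

<p>The point of the exercise is not the particular numbers but that, once the characteristics are made explicit, disagreement shifts to where the thresholds should lie.</p>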
<h2>Highlighting the difficulties</h2>
<p>Ontologically speaking, then, a pandemic is an event (“accomplishment”) that unfolds in time. To be classified as a pandemic, there are a number of features that aren’t all crisp and for which the imprecise boundaries haven’t all been set. Conversely, it implies that classifying the event as “not a pandemic” is just as imprecise. </p>
<p>This isn’t a full answer as to what a pandemic is ontologically, but it does shed light on the difficulties of calling it “over” – and illustrates well that there will be disagreement about it.</p>
<p class="fine-print"><em><span>Maria Keet does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>This sort of precision complements health experts’ efforts, helping humans to be more precise and communicate more precisely.Maria Keet, Associate professor in Computer Science, University of Cape TownLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1730482021-12-21T14:55:12Z2021-12-21T14:55:12ZNickel oxide is a material that can ‘learn’ like animals and could help further artificial intelligence research<figure><img src="https://images.theconversation.com/files/436979/original/file-20211210-136652-1mgxcfu.JPG?ixlib=rb-1.1.0&rect=0%2C530%2C3953%2C3095&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Nickel oxide, the gray-and-black-striped material, demonstrates unique properties when exposed to hydrogen.</span> <span class="attribution"><span class="source">Purdue University/Kayla Wiles</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>Nickel oxide, a unique material, demonstrates the <a href="https://doi.org/10.1073/pnas.2017239118">ability to learn things about its environment</a> in a way that emulates the most basic learning abilities of animals, as my colleagues and I describe in a new paper.</p>
<p>For over half a century, neuroscientists have studied sea slugs to understand basic animal learning. Two fundamental concepts of learning are <a href="https://doi.org/10.1016/j.nlm.2008.09.012">habituation</a> and <a href="https://doi.org/10.1126/science.11560">sensitization</a>. Habituation occurs when an organism’s response to a repeated stimulus continuously decreases. When researchers first touch a sea slug, its gills retract. But the more they touch the slug, the <a href="https://doi.org/10.1126/science.167.3926.1745">less it retracts its gills</a>. Sensitization is an organism’s extreme reaction to a harmful or unexpected stimulus. If researchers then shock a sea slug, it will <a href="https://doi.org/10.1126/science.11560">retract its gills much more dramatically</a> than when it was merely touched. This is sensitization. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A small square of material inside a test chamber of metal with tubes." src="https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/436815/original/file-20211209-27-12uc827.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">When nickel oxide is alternately bathed in hydrogen gas and air, its behavior changes.</span>
<span class="attribution"><span class="source">Purdue University/Kayla Wiles</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Nickel oxide has features that are strikingly similar to this learning behavior. Instead of gills retracting, we measured the change in electrical conductivity of the material. The stimulus, instead of a finger poke, was repeatedly alternating the environment of the nickel oxide between normal air and hydrogen gas.</p>
<p>Nickel oxide is interesting because when you expose it to hydrogen gas, its crystalline structure subtly changes and <a href="https://doi.org/10.1002/pssa.200778914">more electrons become available to generate an electrical current</a>. In our experiment, we kept switching between the hydrogen-only and regular air environments. You would expect the electrical conductivity to oscillate up and down directly in relation to the exposure to hydrogen or air. But just as with the sea slugs, the change in conductivity of the nickel oxide slowly went down the more we stimulated it. It got habituated to the hydrogen.</p>
<p>When we exposed the material to bright light or ozone, though, it rapidly changed its conductivity – the same way a slug will always respond dramatically to a small shock.</p>
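<p>Numerically, habituation can be caricatured as a response that shrinks with each repeat of the same stimulus, while a novel stimulus still triggers a full-strength response. The sketch below is a toy model with an assumed decay factor, not the physical model from our paper:</p>

```python
# Toy habituation model (assumed exponential decay; not the paper's physics).

def responses(n_repeats: int, decay: float = 0.7, initial: float = 1.0):
    """Response magnitude to each of n_repeats identical stimuli."""
    return [initial * decay ** i for i in range(n_repeats)]

r = responses(5)
print(r[0])    # 1.0: full response to the first exposure
print(r[-1])   # about 0.24: much weaker response by the fifth exposure

# A novel stimulus (bright light, ozone) is modelled as still producing a
# full-strength response, regardless of prior habituation.
novel_response = 1.0
```

<p>In the experiment, the shrinking quantity was the change in the material’s electrical conductivity rather than an abstract “response”, but the qualitative pattern is the same.</p>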
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A small piece of material underneath a large piece of scientific equipment." src="https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/436813/original/file-20211209-140267-qt8jx9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The conductivity of nickel oxide stores information similarly to the way slugs learn.</span>
<span class="attribution"><span class="source">Purdue University/Kayla Wiles</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Why it matters</h2>
<p>The ability to learn, remember or forget information as needed is a powerful skill for any animal or machine. So far, the vast majority of research in the field of artificial intelligence has <a href="https://doi.org/10.1126/science.aaa8415">focused on software-based approaches to machine learning</a>, with far less effort dedicated to <a href="https://doi.org/10.1063/1.5113574">studying the learning abilities of materials</a>.</p>
<p>At the center of these two related areas of research lies the field of <a href="https://doi.org/10.1038/s41586-019-1677-2">brain-inspired computers</a>. For intelligence to be encoded into hardware, scientists need semiconductors that can learn from past experience and adapt to dynamic environments in a physical way similar to that of neurons in animal brains. Our new research showing how nickel oxide demonstrates features of learning hints at how this or similar materials could serve as building blocks for computers of the future. </p>
<h2>What still isn’t known</h2>
<p>Before such materials can be incorporated into computer chips there are some knowledge gaps that need to be addressed. For instance, it is not yet clear at what <a href="https://doi.org/10.1146/annurev-neuro-090919-022842">time scales a material needs to learn</a> for it to be useful in electrical systems. How quickly does something need to learn or forget to be useful? Another unknown is how or whether it is possible to change the structure of nickel oxide to produce different learning behaviors.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A small square of gray material with stripes." src="https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/436812/original/file-20211209-141979-3zqxkp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It is unclear whether nickel oxide itself can be used for computing, but the concepts at play could inspire further innovation.</span>
<span class="attribution"><span class="source">Purdue University/Erin Easterling</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>What’s next</h2>
<p>In addition to further experiments on the material itself, there are theoretical lessons to explore. Observations of collective behavior of animals in nature – such as bird flocks and schools of fish – have <a href="https://doi.org/10.1007/0-387-27705-6_6">inspired researchers to develop fields of AI like swarm intelligence</a>. In a similar fashion, the interesting collective motion of atoms and electrons in materials could inspire AI and hardware design in the future. </p>
<p>As new materials that can accommodate mobile atoms are discovered, I am optimistic we will see further breakthroughs that can bring researchers one step closer to designing computers that emulate animal brains.</p>
<p class="fine-print"><em><span>S. Ramanathan receives funding from the National Science Foundation, Department of Defense agencies for basic research in physical sciences and engineering.</span></em></p>The ability to store information is central to learning and the field of artificial intelligence. Researchers have shown how a unique material shows basic learning properties similar to that of slugs.Shriram Ramanathan, Professor of Materials Engineering, Purdue UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1722792021-11-24T13:41:51Z2021-11-24T13:41:51ZStereotypes about girls dissuade many from careers in computer science<figure><img src="https://images.theconversation.com/files/433194/original/file-20211122-27-gyvft9.jpg?ixlib=rb-1.1.0&rect=8%2C0%2C2986%2C2001&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Only about 1 in 5 computer scientists are women. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/project-mc2-mika-abdalla-victoria-vida-and-genneya-walton-news-photo/871499520?adppopup=true">Rachel Murray/Getty Images for MGA Entertainment</a></span></figcaption></figure><p>Stereotypes about what boys and girls supposedly like aren’t hard to find.</p>
<p><a href="https://www.theatlantic.com/business/archive/2014/12/toys-are-more-divided-by-gender-now-than-they-were-50-years-ago/383556/">Toy advertisements</a> send signals that science and electronic toys are intended for boys rather than girls. Computer scientists and engineers on <a href="https://seejane.org/wp-content/uploads/portray-her-full-report.pdf">television shows and movies</a> are often white men, like the guys on “The Big Bang Theory.”</p>
<p>Policymakers, teachers and <a href="https://psycnet.apa.org/buy/2002-11235-005">parents</a> sometimes subscribe to these stereotypes, too. They might <a href="https://doi.org/10.3389/fpsyg.2015.00049">spread them to children</a>.</p>
<p><a href="https://doi.org/10.1146/annurev-psych-073115-103235">Efforts</a> to <a href="https://www.nytimes.com/2017/08/07/business/google-women-engineer-fired-memo.html">combat these stereotypes</a> often focus on boys’ and girls’ abilities.</p>
<p>But as researchers who specialize in <a href="https://scholar.google.com/citations?user=_UmfrM8AAAAJ&hl=en&oi=ao">motivation</a>, <a href="https://scholar.google.com/citations?user=JmWiiRAAAAAJ&hl=en&oi=ao">identity</a> and <a href="https://scholar.google.com/citations?user=5jrePlgAAAAJ&hl=en&oi=ao">cognitive development</a>, we think society has largely overlooked another harmful stereotype. And that is the notion that girls are less interested than boys are in STEM.</p>
<p>In our peer-reviewed research – published in November 2021 in <a href="https://www.pnas.org/content/118/48/e2100030118">Proceedings of the National Academy of Sciences</a> – we found that these stereotypes about girls’ interest in science, technology, engineering and math – or lack thereof – are fairly widespread among young people today. We also found that these stereotypes actually have an effect on girls’ motivation and sense of belonging in computer science and engineering.</p>
<h2>Gains made</h2>
<p>Fields like math are <a href="https://ncses.nsf.gov/pubs/nsf21321">close to having gender parity</a> – that is to say, roughly equal numbers of men and women – and <a href="https://ncses.nsf.gov/pubs/nsf21321">women are actually overrepresented</a> in fields like biology among college graduates in the U.S.</p>
<p>Yet, the nation is still failing to diversify computer science and engineering. <a href="https://ncses.nsf.gov/pubs/nsf21321">Only about 1 in 5</a> degrees in computer science and engineering go to women.</p>
<p>Our research shows that societal stereotypes linking these fields with boys and men act as a barrier that keeps girls and young women away. There have been many conversations about <a href="https://www.nytimes.com/2013/10/06/magazine/why-are-there-still-so-few-women-in-science.html">the harm caused</a> by <a href="https://www.theatlantic.com/magazine/archive/1999/08/thin-ice-stereotype-threat-and-black-college-students/304663">stereotypes about natural talent</a>, which assert that men are better than women at STEM. But what might be even more detrimental for girls’ motivation are stereotypes that men are more interested than women in these activities and careers. These stereotypes may give girls the sense that they don’t belong.</p>
<h2>Probing children’s perceptions</h2>
<p>For our study, our first step was to document whether children and adolescents believe these societal stereotypes. We surveyed 2,277 youths in grades 1-12 in 2017 and 2019 about how interested they think girls and boys are in computer science and engineering. The majority of youths reported that boys were more likely than girls to be interested in these fields. Most youths – 63% – believed that girls are less interested than boys in engineering. Only 9% believed that girls are more interested than boys in engineering. These “interest stereotypes,” if you will, were endorsed by youths from diverse backgrounds, including Black, white, Asian and Hispanic youths.</p>
<p>They were endorsed by kids as early as age 6, in first grade. These beliefs about gendered interests were also more common than stereotypes about ability – the idea that boys are more talented than girls in these fields.</p>
<p>We also discovered that these interest stereotypes were linked to worse outcomes for girls. The more that a typical girl in our study believed in these stereotypes favoring boys, the less motivated she was in computer science and engineering. This wasn’t the case for the typical boy. The more he believed in these stereotypes, the more motivated he was.</p>
<h2>Effects on motivation</h2>
<p>We also did two laboratory experiments using a gold-standard random-assignment design to see whether interest stereotypes have causal effects on motivation. We told children about two activities they could try. The only difference between the activities was that one activity – one that was randomly chosen – was linked to a stereotype that girls were less interested than boys in that activity. </p>
<p>The other activity was not linked to such a stereotype. If children preferred one activity over the other, we could infer that the stereotype caused a difference in their preferences. We found that interest stereotypes can actually cause girls’ lower motivation for computer science activities.</p>
<p>Only 35% of girls chose the stereotyped activity over the nonstereotyped activity. These stereotypes – which favored boys in this case – weren’t a problem for boys, who showed no preference. There was no gender gap when there was no stereotype – a gender gap only appeared when the activity was stereotyped.</p>
<h2>Dismantling stereotypes</h2>
<p>Why are interest stereotypes so powerful? Interest stereotypes may make girls assume: If boys like these fields more than girls, then I won’t like these fields either. They also send a clear signal about who belongs there. <a href="https://psycnet.apa.org/buy/2015-37516-001">A sense of belonging matters a lot</a> for motivation, including young women in STEM fields like computer science and engineering. The lower the girls’ sense of belonging, the lower their interest.</p>
<p>But what if the stereotypes are true? On average, girls in the U.S. usually do report being less interested than boys in <a href="https://psycnet.apa.org/buy/2015-37516-001">computer science</a> and <a href="https://doi.org/10.1177/1069072712475290">engineering</a>.</p>
<p>Whether or not these cultural stereotypes are currently true, we believe they can create a vicious cycle. Girls might miss out on opportunities because of an assumption that they are not interested or should not be interested in certain STEM fields. Unless adults deliberately send girls a different message about who belongs in computer science and engineering, we as a society discourage girls from trying these activities and discovering that they like them. </p>
<p>But the good news is that the lack of belonging that many girls feel in certain STEM fields is not permanent. On the contrary, we think it can be changed.</p>
<p>There are simple ways to send kids a different message about who likes to do computer science and engineering. Parents and other adults can check their assumptions about what toys to buy girls for their birthdays or holidays, or what summer camps they should attend. Girls can be shown examples of women like <a href="https://www.becauseofthemwecan.com/blogs/news/self-driving-startup-zoox-led-by-black-female-ceo-aicha-evans-is-purchased-by-amazon-for-1-2-billion">Aicha Evans</a> and <a href="https://www.youtube.com/watch?v=FEeTLopLkEo">Debbie Sterling</a> – women who are changing the world through technology and enjoying themselves while doing so.</p>
<p>It’s not enough for girls to realize that they can do computer science and engineering. In order to change the status quo, we think it’s necessary to spread the word that many girls actually want to do these things as well.</p>
<p class="fine-print"><em><span>Allison Master receives funding from the National Science Foundation and the U. S. Department of Education's Institute of Education Sciences. The opinions expressed are those of the authors and do not represent views of the Institute of Education Sciences, the U.S. Department of Education, or other funders. </span></em></p><p class="fine-print"><em><span>Andrew N. Meltzoff gratefully acknowledges receipt of funding from the National Science Foundation, the Bezos Family Foundation, and the Overdeck Family Foundation. The opinions expressed are those of the authors and do not represent views of the funders.</span></em></p><p class="fine-print"><em><span>Sapna Cheryan receives funding from the National Science Foundation and the U.S. Department of Education's Institute of Education Sciences. The views expressed by the authors do not necessarily represent the views of these funders.</span></em></p>Could it be that girls aren’t pursuing jobs in computer science and engineering because society has told them that’s not what they want to do? Three scholars weigh in.Allison Master, Assistant Professor of Education, University of HoustonAndrew N. 
Meltzoff, Professor of Psychology, University of WashingtonSapna Cheryan, Professor of Psychology, University of WashingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1681602021-09-24T12:34:41Z2021-09-24T12:34:41ZHow a team of musicologists and computer scientists completed Beethoven’s unfinished 10th Symphony<figure><img src="https://images.theconversation.com/files/423012/original/file-20210923-15-1tlo28i.jpg?ixlib=rb-1.1.0&rect=14%2C31%2C1902%2C1405&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Throughout the project, Beethoven's genius loomed.</span> <span class="attribution"><a class="source" href="https://www.publicdomainpictures.net/en/view-image.php?image=239318&picture=violinist-playing-with-beethoven">Circe Denyer</a></span></figcaption></figure><p>When Ludwig van Beethoven died in 1827, he was three years removed from the completion of his Ninth Symphony, a work heralded by many as his magnum opus. He had started work on his 10th Symphony but, <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1071597/">due to deteriorating health</a>, wasn’t able to make much headway: All he left behind were some musical sketches.</p>
<p>Ever since then, Beethoven fans and musicologists have puzzled and lamented over what could have been. His notes teased at some magnificent reward, albeit one that seemed forever out of reach. </p>
<p>Now, thanks to the work of a team of music historians, musicologists, composers and computer scientists, Beethoven’s vision will come to life.</p>
<p>I presided over the artificial intelligence side of the project, leading a group of scientists at the creative AI startup <a href="https://www.playform.io/">Playform AI</a> that taught a machine both Beethoven’s entire body of work and his creative process.</p>
<p>A full recording of Beethoven’s 10th Symphony is set to be released on Oct. 9, 2021, the same day as the world premiere performance scheduled to take place in Bonn, Germany – the culmination of a two-year-plus effort. </p>
<h2>Past attempts hit a wall</h2>
<p>Around 1817, the Royal Philharmonic Society in London commissioned Beethoven to write his Ninth and 10th symphonies. Written for an orchestra, <a href="http://professordeannaheikkinen.weebly.com/uploads/1/6/8/5/16856420/classical_music_form.pdf">symphonies often contain four movements</a>: the first is performed at a fast tempo, the second at a slower one, the third at a medium or fast tempo, and the last at a fast tempo.</p>
<p>Beethoven completed his <a href="https://online-learning.harvard.edu/course/first-nights-beethoven%E2%80%99s-9th-symphony-and-19th-century-orchestra?delta=1">Ninth Symphony</a> in 1824, which concludes with the timeless “<a href="https://www.youtube.com/watch?v=uooe16ILaPo">Ode to Joy</a>.”</p>
<p>But when it came to the 10th Symphony, Beethoven didn’t leave much behind, other than some musical notes and a handful of ideas he had jotted down.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Piece of paper with musical notes jotted on it." src="https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=411&fit=crop&dpr=1 600w, https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=411&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=411&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=517&fit=crop&dpr=1 754w, https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=517&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/422733/original/file-20210922-13-128nrmx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=517&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A page of Beethoven’s notes for his planned 10th Symphony.</span>
<span class="attribution"><span class="source">Beethoven House Museum</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>There have been some past attempts to reconstruct parts of Beethoven’s 10th Symphony. Most famously, in 1988, musicologist Barry Cooper ventured to complete the first and second movements. He wove together 250 bars of music from the sketches to create what was, in his view, <a href="https://web.archive.org/web/20090203234337/http://lucare.com/immortal/cooper.html">a production of the first movement</a> that was faithful to Beethoven’s vision. </p>
<p>Yet the sparseness of Beethoven’s sketches made it impossible for symphony experts to go beyond that first movement.</p>
<h2>Assembling the team</h2>
<p>In early 2019, Dr. Matthias Röder, the director of <a href="https://karajan-institut.org/">the Karajan Institute</a>, an organization in Salzburg, Austria, that promotes music technology, contacted me. He explained that he was putting together a team to complete Beethoven’s 10th Symphony in celebration of the composer’s 250th birthday. Aware of <a href="https://theconversation.com/meet-aican-a-machine-that-operates-as-an-autonomous-artist-104381">my work on AI-generated art</a>, he wanted to know if AI would be able to help fill in the blanks left by Beethoven. </p>
<p>The challenge seemed daunting. To pull it off, AI would need to do something it had never done before. But I said I would give it a shot. </p>
<p>Röder then compiled a team that included Austrian composer Walter Werzowa. <a href="https://www.latimes.com/archives/la-xpm-1999-oct-20-fi-24321-story.html">Famous for writing</a> Intel’s <a href="https://www.youtube.com/watch?v=-ihRPi4wcBY">signature bong jingle</a>, Werzowa was tasked with putting together a new kind of composition that would integrate what Beethoven left behind with what the AI would generate. <a href="https://music.cornell.edu/mark-gotham">Mark Gotham</a>, a computational music expert, led the effort to transcribe Beethoven’s sketches and process his entire body of work so the AI could be properly trained.</p>
<p>The team also included <a href="http://music.fas.harvard.edu/emeriti.shtml">Robert Levin</a>, a musicologist at Harvard University who also happens to be an incredible pianist. Levin <a href="http://journal.juilliard.edu/journal/95031/robert-levin-finishing-mozart">had previously finished</a> a number of incomplete 18th-century works by Mozart and Johann Sebastian Bach.</p>
<h2>The project takes shape</h2>
<p>In June 2019, the group gathered for a two-day workshop at Harvard’s music library. In a large room with a piano, a blackboard and a stack of Beethoven’s sketchbooks spanning most of his known works, we talked about how fragments could be turned into a complete piece of music and how AI could help solve this puzzle, while still remaining faithful to Beethoven’s process and vision. </p>
<p>The music experts in the room were eager to learn more about the sort of music AI had created in the past. I told them how AI had successfully generated music <a href="https://arxiv.org/abs/1612.01010">in the style of Bach</a>. However, this was only a harmonization of an inputted melody that sounded like Bach. It didn’t come close to what we needed to do: construct an entire symphony from a handful of phrases. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Painting of man writing in notebook." src="https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=803&fit=crop&dpr=1 600w, https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=803&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=803&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1010&fit=crop&dpr=1 754w, https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1010&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/422741/original/file-20210922-27-1pquhbj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1010&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The AI needed to learn from Beethoven’s entire body of work in order to create something the composer might have written.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/ludwig-van-beethoven-oil-on-canvas-ludwig-van-beethoven-news-photo/56459034?adppopup=true">Hulton Fine Art Collection/Getty Images</a></span>
</figcaption>
</figure>
<p>Meanwhile, the scientists in the room – myself included – wanted to learn about what sort of materials were available, and how the experts envisioned using them to complete the symphony. </p>
<p>The task at hand eventually crystallized. We would need to use notes and completed compositions from Beethoven’s entire body of work – along with the available sketches from the 10th Symphony – to create something that Beethoven himself might have written. </p>
<p>This was a tremendous challenge. We didn’t have a machine that we could feed sketches to, push a button and have it spit out a symphony. Most AI available at the time couldn’t continue an uncompleted piece of music beyond a few additional seconds. </p>
<p>We would need to push the boundaries of what creative AI could do by teaching the machine Beethoven’s creative process – how he would take a few bars of music and painstakingly develop them into stirring symphonies, quartets and sonatas. </p>
<h2>Piecing together Beethoven’s creative process</h2>
<p>As the project progressed, the human side and the machine side of the collaboration evolved. Werzowa, Gotham, Levin, and Röder deciphered and transcribed the sketches from the 10th Symphony, trying to understand Beethoven’s intentions. Using his completed symphonies as a template, they attempted to piece together the puzzle of where the fragments of sketches should go – which movement, which part of the movement. </p>
<p>They had to make decisions, like determining whether a sketch indicated the starting point of <a href="https://www.classical-music.com/features/articles/what-scherzo/">a scherzo</a>, which is a very lively part of the symphony, typically in the third movement. Or they might determine that a line of music was likely the basis of <a href="https://www.classical-music.com/features/articles/what-fugue/">a fugue</a>, which is a melody created by interweaving parts that all echo a central theme. </p>
<p>The AI side of the project – my side – found itself grappling with a range of challenging tasks. </p>
<p>First, and most fundamentally, we needed to figure out how to take a short phrase, or even just a motif, and use it to develop a longer, more complicated musical structure, just as Beethoven would have done. For example, the machine had to learn how Beethoven constructed the Fifth Symphony <a href="https://www.npr.org/sections/deceptivecadence/2012/11/19/165495617/beethovens-famous-4-notes-truly-revolutionary-music">out of a basic four-note motif</a>. </p>
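<p>The idea of growing a short motif into a longer line can be caricatured with a toy first-order Markov model that learns note-to-note transitions from a small corpus and then extends a motif one note at a time. Everything below – the mini-corpus, the pitches, the model itself – is an invented illustration, not the deep-learning system the team actually built:</p>

```python
import random
from collections import defaultdict

# Toy illustration only: a first-order Markov model over MIDI pitches.
# The corpus and pitches are invented; the real project used a far more
# sophisticated deep-learning system trained on Beethoven's works.

def train(corpus):
    """Collect note-to-note transitions from a list of melodies."""
    transitions = defaultdict(list)
    for melody in corpus:
        for current, following in zip(melody, melody[1:]):
            transitions[current].append(following)
    return transitions

def continue_motif(motif, transitions, extra_notes, seed=0):
    """Extend a motif by repeatedly sampling a plausible next note."""
    rng = random.Random(seed)
    line = list(motif)
    for _ in range(extra_notes):
        options = transitions.get(line[-1])
        if not options:  # dead end: this note was never followed by anything
            break
        line.append(rng.choice(options))
    return line

# Invented mini-corpus echoing the famous short-short-short-long shape.
corpus = [[67, 67, 67, 63, 65, 65, 65, 62],
          [67, 63, 65, 62, 67, 63]]
model = train(corpus)
extended = continue_motif([67, 67, 67, 63], model, extra_notes=4)
print(extended)
```

<p>A real system must do far more than this sketch suggests – it must respect phrase structure, harmony and long-range form – which is exactly why existing AI could only continue music for a few seconds.</p>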
<p><audio preload="metadata" controls="controls" data-duration="0" data-image="" data-title="Four notes famously serve as the basis for Beethoven's Fifth Symphony." data-size="630502" data-source="Australian Chamber Orchestra/YouTube" data-source-url="https://www.youtube.com/watch?v=W6QFIqMZcYw" data-license="" data-license-url="">
<source src="https://cdn.theconversation.com/audio/2277/forty-seconds-of-beethovens-5th.mp3" type="audio/mpeg">
</audio>
<div class="audio-player-caption">
Four notes famously serve as the basis for Beethoven’s Fifth Symphony.
<span class="attribution"><a class="source" rel="nofollow" href="https://www.youtube.com/watch?v=W6QFIqMZcYw">Australian Chamber Orchestra/YouTube</a><span class="download"><span>616 KB</span> <a target="_blank" href="https://cdn.theconversation.com/audio/2277/forty-seconds-of-beethovens-5th.mp3">(download)</a></span></span>
</div></p>
<p>Next, because the continuation of a phrase also needs to follow a certain musical form, whether it’s a scherzo, trio or fugue, the AI needed to learn Beethoven’s process for developing these forms. </p>
<p>The to-do list grew: We had to teach the AI how to take a melodic line and harmonize it. The AI needed to learn how to bridge two sections of music together. And we realized the AI had to be able to compose <a href="https://www.britannica.com/art/coda-music">a coda</a>, which is a segment that brings a section of a piece of music to its conclusion. </p>
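<p>The first item on that list, harmonization, can be caricatured with a toy rule-based sketch: for each melody note, pick a chord that contains it. The chords, preference order and pitches here are invented illustrations; the project's AI learned harmonization from Beethoven's own works rather than following hand-written rules like these:</p>

```python
# Toy harmonization sketch (invented rules): assign each melody note the
# first preferred C-major triad that contains it. A rule-based stand-in
# for the learned harmonization described above.

TRIADS = {  # chord name -> pitch classes (C = 0)
    "I":  {0, 4, 7},   # C E G
    "IV": {5, 9, 0},   # F A C
    "V":  {7, 11, 2},  # G B D
    "vi": {9, 0, 4},   # A C E
}
PREFERENCE = ["I", "V", "IV", "vi"]

def harmonize(melody):
    """Assign each note (MIDI pitch) the first preferred chord containing it."""
    chords = []
    for pitch in melody:
        pitch_class = pitch % 12
        chord = next((c for c in PREFERENCE if pitch_class in TRIADS[c]), "I")
        chords.append(chord)
    return chords

print(harmonize([60, 64, 65, 67, 71, 72]))  # melody: C E F G B C
```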
<p>Finally, once we had a full composition, the AI was going to have to figure out how to orchestrate it, which involves assigning different instruments for different parts. </p>
<p>And it had to pull off all of these tasks the way Beethoven himself might have.</p>
<h2>Passing the first big test</h2>
<p>In November 2019, the team met in person again – this time, in Bonn, at the Beethoven House Museum, where the composer was born and raised.</p>
<p>This meeting was the litmus test for determining whether AI could complete this project. We printed musical scores that had been developed by AI and built off the sketches from Beethoven’s 10th. A pianist performed in a small concert hall in the museum before a group of journalists, music scholars and Beethoven experts. </p>
<figure class="align-center ">
<img alt="Group of people stand around a piano player." src="https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=269&fit=crop&dpr=1 600w, https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=269&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=269&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=338&fit=crop&dpr=1 754w, https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=338&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/422729/original/file-20210922-15-ph1fza.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=338&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Journalists and musicians gather to hear a pianist perform parts of Beethoven’s 10th Symphony.</span>
<span class="attribution"><span class="source">Ahmed Elgammal</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>We challenged the audience to determine where Beethoven’s phrases ended and where the AI extrapolation began. They couldn’t.</p>
<p>A few days later, one of these AI-generated scores was played by <a href="https://www.youtube.com/watch?v=Hu1GI0QNLSE">a string quartet in a news conference</a>. Only those who intimately knew Beethoven’s sketches for the 10th Symphony could determine when the AI-generated parts came in. </p>
<p>The success of these tests told us we were on the right track. But these were just a couple of minutes of music. There was still much more work to do. </p>
<h2>Ready for the world</h2>
<p>At every point, Beethoven’s genius loomed, challenging us to do better. As the project evolved, so did the AI. Over the ensuing 18 months, we constructed and orchestrated two entire movements of more than 20 minutes apiece.</p>
<p>We anticipate some pushback to this work – those who will say that the arts should be off-limits from AI, and that AI has no business trying to replicate the human creative process. Yet when it comes to the arts, I see AI not as a replacement, but as a tool – one that opens doors for artists to express themselves in new ways.</p>
<p>This project would not have been possible without the expertise of human historians and musicians. It took an immense amount of work – and, yes, creative thinking – to accomplish this goal.</p>
<p>At one point, one of the music experts on the team said that the AI reminded him of an eager music student who practices every day, learns, and becomes better and better.</p>
<p>Now that student, having taken the baton from Beethoven, is ready to present the 10th Symphony to the world.</p>
<p><audio preload="metadata" controls="controls" data-duration="0" data-image="" data-title="A selection from Beethoven's 10th symphony." data-size="3543236" data-source="YouTube/Modern Recordings" data-source-url="https://www.youtube.com/watch?v=RESb0QVkLcM" data-license="CC BY-SA" data-license-url="http://creativecommons.org/licenses/by-sa/4.0/">
<source src="https://cdn.theconversation.com/audio/2276/beethoven-x-the-ai-project-iii-scherzo-allegro-trio-official-video-beethoven-orchestra-bonn.mp3" type="audio/mpeg">
</audio>
<div class="audio-player-caption">
A selection from Beethoven’s 10th Symphony.
<span class="attribution"><a class="source" rel="nofollow" href="https://www.youtube.com/watch?v=RESb0QVkLcM">YouTube/Modern Recordings</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a><span class="download"><span>3.38 MB</span> <a target="_blank" href="https://cdn.theconversation.com/audio/2276/beethoven-x-the-ai-project-iii-scherzo-allegro-trio-official-video-beethoven-orchestra-bonn.mp3">(download)</a></span></span>
</div></p>
<p>[<em>Like what you’ve read? Want more?</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=likethis">Sign up for The Conversation’s daily newsletter</a>.]</p><img src="https://counter.theconversation.com/content/168160/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>This project received funding from Deutsche Telekom.</span></em></p>When Beethoven died, all he left behind were some sketches for his 10th Symphony. Now, thanks to the help of artificial intelligence, the composer’s vision is coming to life.Ahmed Elgammal, Professor, Director of the Art & AI Lab, Rutgers UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1643722021-09-08T12:24:47Z2021-09-08T12:24:47ZData science education lacks a much-needed focus on ethics<figure><img src="https://images.theconversation.com/files/417909/original/file-20210825-21-10jy4dw.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5150%2C3430&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Undergraduate students need to learn the responsible use of data science as well as the nuts and bolts.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/students-studying-in-computer-lab-royalty-free-image/153337890">Hill Street Studios/Stone via Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>Undergraduate training for data scientists – dubbed the <a href="https://hbr.org/2012/10/data-scientist-the-sexiest-job-of-the-21st-century">sexiest job of the 21st century</a> by Harvard Business Review – falls short in preparing students for the ethical use of data science, our new study found.</p>
<p>Data science lies at the nexus of statistics and computer science applied to a particular field such as astronomy, linguistics, medicine, psychology or sociology. The idea behind this data crunching is to use big data to address otherwise unsolvable problems, such as how health care providers can create <a href="https://doi.org/10.1038/415530a">personalized medicine based on a patient’s genes</a> and how businesses can make <a href="https://doi.org/10.1145/3292500.3330790">purchase predictions based on customers’ behavior</a>. </p>
<p>The U.S. Bureau of Labor Statistics projects <a href="https://www.bls.gov/ooh/computer-and-information-technology/computer-and-information-research-scientists.htm">a 15% growth in data science careers over the period of 2019-2029</a>, corresponding with an increased demand for data science training. Universities and colleges have responded to the demand by creating new programs or revamping existing ones. The number of undergraduate data science programs in the U.S. jumped from <a href="https://jise.org/Volume26/n2/JISEv26n2p103.html">13 in 2014</a> to <a href="http://datascience.community/colleges">at least 50</a> as of September 2020. </p>
<p>As educators and practitioners in <a href="https://scholar.google.com/citations?user=QsbzWj0AAAAJ&hl=en">data science</a>, we were prompted by the growth in programs to investigate what is covered, and what is not covered, in data science undergraduate education.</p>
<p>In <a href="https://doi.org/10.7717/peerj-cs.441">our study</a>, we compared undergraduate data science curricula with the expectations for <a href="https://www.nationalacademies.org/our-work/envisioning-the-data-science-discipline-the-undergraduate-perspective">undergraduate data science training</a> put forth by the National Academies of Sciences, Engineering and Medicine. Those expectations include training in ethics. We found most programs dedicated considerable coursework to mathematics, statistics and computer science, but little training in ethical considerations such as privacy and systemic bias. Only 50% of the degree programs we investigated required any coursework in ethics.</p>
<h2>Why it matters</h2>
<p>As with any powerful tool, the responsible application of data science requires training in how to use data science and to understand its impacts. Our results align with <a href="https://jise.org/Volume26/n2/JISEv26n2p103.html">prior work</a> that found little attention is paid to ethics in data science degree programs. This suggests that undergraduate data science degree programs may produce a workforce without the training and judgment to apply data science methods responsibly. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/mA4gypAiRYU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">This primer on data science ethics covers real-world harms.</span></figcaption>
</figure>
<p>It isn’t hard to find examples of irresponsible use of data science. For instance, <a href="https://doi.org/10.1111%2Fj.1740-9713.2016.00960.x">policing models that have a built-in data bias</a> can lead to an elevated police presence in historically over-policed neighborhoods. In another example, <a href="https://doi.org/10.1126%2Fscience.aax2342">algorithms used by the U.S. health care system are biased</a> in a way that causes Black patients to receive less care than white patients with similar needs. </p>
<p>We believe explicit training in ethical practices would better prepare a socially responsible data science workforce.</p>
<h2>What still isn’t known</h2>
<p>While data science is a relatively new field – still being defined as a discipline – guidelines exist for training undergraduate students in data science. These guidelines prompt the question: How much training can we expect in an undergraduate degree? </p>
<p>The National Academies recommend <a href="https://www.nationalacademies.org/our-work/envisioning-the-data-science-discipline-the-undergraduate-perspective">training in 10 areas</a>, including ethical problem solving, communication and data management.</p>
<p>Our work focused on undergraduate data science degrees at schools <a href="https://carnegieclassifications.iu.edu/classification_descriptions/basic.php">classified as R1</a>, meaning they engage in high levels of research activity. Further research could examine the amount of training and preparation in various aspects of data science at the Masters and Ph.D. levels and the nature of undergraduate data science training at schools of different research levels.</p>
<p>Given that many data science programs are new, there is considerable opportunity to compare the training that students receive with the expectations of employers. </p>
<h2>What’s next</h2>
<p>We plan to expand on our findings by investigating the pressures that might be driving curriculum development for degrees in other disciplines that are seeing similar job market growth.</p><img src="https://counter.theconversation.com/content/164372/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Undergraduate programs are springing up across the US to meet the burgeoning demand for workers trained in big data. Yet many of the programs lack training in the ethical use of data science.Jeffrey C. Oliver, Data Science Specialist, University of ArizonaTorbet McNeil, Ph.D. candidate in Educational Policy Studies and Practice, University of ArizonaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1642192021-08-31T12:28:18Z2021-08-31T12:28:18ZBilingual people with language loss due to stroke can pose a treatment challenge – computational modeling may help clinicians treat them<figure><img src="https://images.theconversation.com/files/415558/original/file-20210810-21-9b77fs.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5991%2C4122&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Researchers can program neural networks composed of artificial neurons to simulate language processing.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/artificial-intelligence-brain-royalty-free-image/1256860085?adppopup=true">Andriy Onufriyenko/Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p><a href="https://doi.org/10.1038/s41598-021-89443-6">New research shows that computational modeling</a> can predict how bilingual stroke patients will respond to language treatment – and that could help clinicians identify which language to focus treatment on and increase chances for improvement in both. </p>
<p><a href="https://doi.org/10.1212/01.wnl.0000265600.69385.6f">Aphasia</a> is a speech and language disorder often caused by stroke. Bilingual people with aphasia typically experience difficulty retrieving words in both of their languages. While language therapy can help them improve their ability to communicate, it’s not often clear to clinicians <a href="https://doi.org/10.3390/bs10090144">which language to target in treatment</a>.</p>
<p>I’m a <a href="https://scholar.google.com/citations?user=mexG-2kAAAAJ&hl=en&oi=ao">cognitive neuroscientist</a>, and my current work focuses on language treatment outcomes in bilinguals with aphasia. As part of the <a href="https://www.bu.edu/aphasiaresearch/">Aphasia Research Laboratory at Boston University</a>, my colleagues and I worked with <a href="http://nn.cs.utexas.edu/">computer scientists at the University of Texas at Austin</a> to develop <a href="http://dx.doi.org/10.1016/j.bandl.2019.104643">BiLex</a> – a <a href="https://doi.org/10.1016/j.semcdb.2015.07.001">computational model</a> that simulates the ability to retrieve words from memory in bilinguals.</p>
<p>The BiLex model is a <a href="https://theconversation.com/what-is-a-neural-network-a-computer-scientist-explains-151897">neural network</a> composed of artificial neurons that are programmed to simulate language processing. Our team trained individual BiLex models to <a href="https://doi.org/10.1038/s41598-021-89443-6">simulate word retrieval abilities in Spanish-English bilinguals with aphasia</a> after language treatment. </p>
<p>We simulated their word retrieval abilities before their stroke and then recreated the effects of stroke lesions in each person’s brain by deleting <a href="https://doi.org/10.1016/j.pneurobio.2011.08.002">neurons encoding</a> different word sounds and meanings. Our team used varying degrees of damage intensity to simulate the levels of word retrieval loss of each patient. We then retrained these BiLex models to simulate the effects of language therapy provided in either English or Spanish on both the treated and the untreated language.</p>
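<p>A toy sketch, loosely in the spirit of those lesion simulations, silences a random fraction of feature units and measures how word retrieval degrades as the damage grows. The representations, damage model and numbers below are invented for illustration; this is not the published BiLex architecture:</p>

```python
import numpy as np

# Toy lesion sketch (invented representations and damage model, not BiLex):
# "damage" zeroes a random fraction of feature units, and word retrieval
# degrades as the lesion becomes more severe.

rng = np.random.default_rng(0)
n_words, n_units = 50, 128
lexicon = rng.normal(size=(n_words, n_units))  # stored word representations

def lesion(vectors, severity):
    """Zero out a random fraction (severity) of the feature units."""
    surviving = rng.random(vectors.shape[1]) >= severity
    return vectors * surviving

def retrieval_accuracy(cues, lexicon):
    """A word counts as retrieved if its damaged cue still matches it best."""
    similarity = cues @ lexicon.T
    return float((similarity.argmax(axis=1) == np.arange(len(lexicon))).mean())

healthy = retrieval_accuracy(lexicon, lexicon)           # undamaged baseline
mild = retrieval_accuracy(lesion(lexicon, 0.2), lexicon)
severe = retrieval_accuracy(lesion(lexicon, 0.95), lexicon)
print(healthy, mild, severe)
```

<p>Varying the severity parameter plays the role of matching each patient's level of word retrieval loss before simulating therapy.</p>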
<p>Our findings show that BiLex can simulate treatment response, accurately predicting up to 82% of patient recovery in the treated language and 60% in the untreated language. </p>
<figure class="align-center ">
<img alt="Two stick figures speak through a tin can in Spanish and English" src="https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/414430/original/file-20210803-27-fbk3ii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">According to the U.S. Census Bureau report on language use, 83.6% of the foreign-born population aged 5 and older speaks a language other than English at home, which suggests a large bilingual representation in the general population.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/tin-can-phone-spanish-and-english-conversation-royalty-free-image/157561203?adppopup=true">JulNichols/via Getty Images</a></span>
</figcaption>
</figure>
<h2>Why it matters</h2>
<p>Despite the growing <a href="https://www.thieme-connect.com/products/ejournals/abstract/10.1055/s-0029-1225951">bilingual aging population at risk for post-stroke aphasia</a> worldwide, evidence on what language treatment works and for whom it works best is still limited. </p>
<p>Typically, predicting treatment outcomes for bilinguals with aphasia requires large-scale studies over a long period of time. This is because each person has unique characteristics that affect their recovery. Computational models like BiLex can offer a faster approach by reliably simulating multiple different profiles of bilingualism and language impairment. </p>
<p>Accurate computational simulations of response to language therapy could ultimately help clinicians decide which language to treat in bilinguals with aphasia in order to maximize <a href="https://pubs.asha.org/doi/10.1044/1058-0360%282013/12-0085%29">treatment response in their two languages</a>.</p>
<h2>What still isn’t known</h2>
<p>While our findings may help develop better and more personalized treatment plans in the future, questions about language recovery in bilinguals with aphasia remain unanswered. </p>
<p>Further research is needed on how people who know two languages differ from people who know just one language in their recovery from brain injuries affecting communication. Similarly, little is known about what determines aphasia recovery for different language combinations outside of Spanish and English, or what factors lead to optimal response to language therapy.</p>
<h2>What’s next</h2>
<p>Currently, our team is conducting a <a href="https://doi.org/10.1136/bmjopen-2020-040495">clinical trial with real bilingual patients to test whether BiLex can correctly identify which treatment language will lead to the greatest recovery in both languages</a>. If study results confirm that BiLex can help identify the optimal treatment language for bilinguals with aphasia, our computational model could help clinicians tailor treatment plans to promote better recovery in this population in the future.</p>
<p>[<em>Over 100,000 readers rely on The Conversation’s newsletter to understand the world.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=100Ksignup">Sign up today</a>.]</p><img src="https://counter.theconversation.com/content/164219/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The PROCoM project receives funding from the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health (grant U01DC014922) awarded to Swathi Kiran.
Claudia Peñaloza is currently affiliated with the University of Barcelona and receives funding from Ministerio de Ciencia e Innovación, Agencia Estatal de Investigación (IJC2018-037818). </span></em></p>Computational modeling can predict language therapy response in bilingual people with aphasia. In the future, this could help clinicians identify the best language for treatment.Claudia Peñaloza, Researcher, Aphasia Research Laboratory, Boston UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1488982021-01-19T13:09:00Z2021-01-19T13:09:00ZFor these students, using data in sports is about more than winning games<figure><img src="https://images.theconversation.com/files/378921/original/file-20210114-18-ydjg2l.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2935%2C2201&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The 'DATA Bulls' use computer science skills to create data analytics for college sports teams. </span> <span class="attribution"><span class="source">Felesia Stukes</span>, <span class="license">Author provided</span></span></figcaption></figure><p>When professional sports teams use big data and analytics, their objective is to improve player performance and win more games.</p>
<p>That approach is paying off in a major way.</p>
<p>For instance, after the Golden State Warriors became <a href="https://www.nba.com/warriors/news/warriors-earn-best-analytics-organization-award-2016-mit-sloan-sports-analytics-conference">one of the first NBA teams to invest in analytics</a> – the science of looking for patterns in data to make more informed decisions – the team subsequently won league championships in <a href="https://www.nba.com/history/season-recap-index">2015, 2017 and 2018</a>. The Warriors also get regular assists from <a href="https://www.forbes.com/sites/patrickmurray/2019/05/26/inside-how-silicon-valley-helps-keep-the-golden-state-warriors-at-the-cutting-edge/#41f590803847">partners in Silicon Valley</a> – the famed tech hub near where the team is based. Small wonder, then, that in 2016 the Warriors were recognized at a sports analytics conference as the “<a href="https://www.nba.com/warriors/news/warriors-earn-best-analytics-organization-award-2016-mit-sloan-sports-analytics-conference">Best Analytics Organization</a>.”</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A basketball player passes the ball to another player." src="https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/379058/original/file-20210115-13-op6rl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Stephen Curry passes the ball around Anthony Davis during the 2018 NBA Playoffs.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/stephen-curry-of-the-golden-state-warriors-passes-the-ball-news-photo/958148100?adppopup=true">Sean Gardner/Getty Images</a></span>
</figcaption>
</figure>
<p>National Football League teams rely heavily on data as well. For instance, the Philadelphia Eagles <a href="https://www.nytimes.com/2018/02/02/sports/football/eagles-analytics-super-bowl-lii.html">used analytics</a> for everything from in-game strategy to roster management as the team ultimately went on to win Super Bowl LII – its first Super Bowl victory in franchise history – in 2018.</p>
<p>With the sports market expected to reach <a href="https://www.pwc.com/us/en/industries/tmt/assets/pwc-2021-tmt-sports-outlook.pdf">US $83.1 billion</a> in 2023, it’s a safe bet that big data and analytics will continue to play a big role in it.</p>
<p>That’s why, as a <a href="https://scholar.google.com/citations?user=_iv5C-EAAAAJ&hl=en&oi=ao">computer science researcher and educator</a>, when I use big data and analytics to help the men’s and women’s basketball teams at Johnson C. Smith University, where I teach, my objective is much broader than just figuring out how players can score more points and win more games.</p>
<p>Rather, using <a href="https://www.defense.gov/Newsroom/Releases/Release/Article/1996576/defense-department-announces-fiscal-year-2019-research-equipment-awards-to-mino/">federal grant money</a> that the Department of Defense has allocated for Historically Black Colleges and Universities – like mine – I have designed and expanded an entire research project that deals with the use of big data in sports.</p>
<p>My students are known as “DATA Bulls.” That name is a combination of the acronym for “Data, Analytics, Technology and Athletics,” and the nickname for our teams: the <a href="https://goldenbullsports.com/sports/2014/6/24/GEN_0624141114.aspx">Golden Bulls</a>. One of our chief aims is to use sports to boost the number of Black students in computer science education and research. </p>
<p>While we spend a lot of time using tracking devices and analyzing the data we get from those devices to help the Golden Bulls win games, the ultimate goal of the DATA Bulls project is to use sports data as an engaging way to help students learn computer science. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two African American students each hold a basketball as they sit down on a bench." src="https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=505&fit=crop&dpr=1 600w, https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=505&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=505&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=634&fit=crop&dpr=1 754w, https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=634&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/379048/original/file-20210115-19-53pert.PNG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=634&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The ‘DATA Bulls’ project allows students to apply computer science skills toward athletics.</span>
<span class="attribution"><span class="source">Felesia Stukes</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>My students may not necessarily land coveted jobs as <a href="https://www.si.com/nfl/2017/06/28/nfl-analytics-front-office-old-school-approach-draft-game-planning-charting">in-house data scientists for a professional sports team</a>, although they will certainly be better positioned to do so. But even if they don’t, I believe the experience will better prepare them to get good-paying jobs in the computer science field. </p>
<h2>Growth expected</h2>
<p>Federal data show that computer and information research scientist jobs are <a href="https://www.bls.gov/ooh/computer-and-information-technology/computer-and-information-research-scientists.htm">expected to grow 15% by 2029 over what they were in 2019</a> – much faster than the average for all occupations. That means job prospects are expected to be excellent.</p>
<p>But colleges and universities are coming up short when it comes to preparing students to take these jobs.</p>
<p>For instance, the Business Higher Education Forum found in 2016 that while <a href="https://www.pwc.com/us/dsa-skills">69% of employers</a> said they would prefer job candidates with data science and analytics skills by 2021, only 23% of educators said at the time that all students graduate with these skills.</p>
<p>Despite numerous efforts to <a href="https://doi.org/10.1187/cbe.12-12-0207">increase the participation of racial and ethnic minorities</a> in computer science, students of color are still <a href="https://doi.org/10.1037/a0028918">underrepresented in the field</a>.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two African American students use iPads in a computer room." src="https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=565&fit=crop&dpr=1 754w, https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=565&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/379051/original/file-20210115-21-14si1ee.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=565&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Black students are underrepresented in the field of computer science.</span>
<span class="attribution"><span class="source">Felesia Stukes</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>For instance, of the <a href="https://nces.ed.gov/programs/digest/d18/tables/dt18_322.30.asp">71,420 bachelor’s degrees awarded in computer and information sciences</a> in the 2016-2017 school year, just 6,391 – or 8.9% – went to Black graduates, and 7,233 – or 10.1% – went to Hispanic graduates – even though <a href="https://nces.ed.gov/fastfacts/display.asp?id=98">Black and Hispanic students represent 14% and 19% of U.S. college students</a>, respectively. Technically, white students are underrepresented in computer science as well, but they still make up the majority: <a href="https://nces.ed.gov/programs/digest/d18/tables/dt18_322.30.asp">white students earned 39,492</a> – or 55.2% – of the bachelor’s degrees in computer and information sciences awarded that year.</p>
<p><a href="https://nces.ed.gov/programs/digest/d18/tables/dt18_322.30.asp">Asian students are the only overrepresented group</a>, having earned 10,425 – or 14.5% – of the computer and information sciences degrees awarded that year. Asian Americans represent <a href="https://minorityhealth.hhs.gov/omh/browse.aspx?lvl=3&lvlid=63">5.6%</a> of the U.S. population.</p>
<p>The problem is even more dire for women of color in computer science. For instance, Black and Hispanic women, separately, earned about <a href="https://nces.ed.gov/programs/digest/d17/tables/dt17_322.50.asp">1 out of every 10</a> computer science and information bachelor’s degrees that went to women.</p>
<p>Broadening participation through university programs is just one way to narrow the gap. And teaching the use of analytics in sports gets students to see computing as much more than just programming or fixing computers, as important as those tasks may be.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Coaches and basketball players huddle around each other." src="https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=448&fit=crop&dpr=1 600w, https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=448&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=448&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=563&fit=crop&dpr=1 754w, https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=563&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/379049/original/file-20210115-23-oafr8x.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=563&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Basketball players and coaches use ‘ShotTracker’ to calculate their performance on the court.</span>
<span class="attribution"><span class="source">Felesia Stukes</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>Sensors and shots</h2>
<p>The premier sports analytics equipment used in my project is <a href="https://shottracker.com/sports-analytics-in-the-classroom">ShotTracker</a>, a sensor-based system that automatically captures real-time statistical and performance basketball analytics. </p>
<p>With ShotTracker, players place wearable sensors on their shoes. Sensors are also placed in basketballs and in the arena rafters to track the activity on the court below. Collectively, these sensors help provide data on over 70 different statistics, such as shots made, rebounds, turnovers and assists.</p>
<p>Coaches and players use this data to improve how the team performs throughout the season.</p>
<p>The data also enables my students to develop and explore questions. So far, students have used ShotTracker data to analyze things such as how individual players and the entire team perform per possession. They have also compared shooting percentages during practice versus actual games. We have analyzed shot attempts, makes and misses, and have discussed how this data could inform players’ shot choices. </p>
<p>We have also used Python – a widely used programming language for data analytics – along with our own datasets to <a href="http://savvastjortjoglou.com/nba-shot-sharts.html">create custom shot charts</a>. The ShotTracker app creates a shot chart as well, which can be used for analysis after every practice.</p>
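<p>As a rough illustration of this kind of analysis – ShotTracker’s actual data format is not public, so the shot records and zone boundaries below are hypothetical – a few lines of Python can bin shots into court zones and compute a shooting percentage for each:</p>

```python
# Illustrative sketch only: the (x, y) coordinates are hypothetical feet
# from the basket, and the zone boundaries are simplified assumptions.

def zone_for_shot(x, y):
    """Assign a shot to a coarse zone based on its distance from the basket."""
    distance = (x ** 2 + y ** 2) ** 0.5
    if distance < 8:
        return "paint"
    if distance < 23.75:        # roughly the NBA three-point arc radius
        return "mid-range"
    return "three"

def zone_percentages(shots):
    """Aggregate made/attempted counts per zone into shooting percentages."""
    made, attempts = {}, {}
    for x, y, is_made in shots:
        zone = zone_for_shot(x, y)
        attempts[zone] = attempts.get(zone, 0) + 1
        made[zone] = made.get(zone, 0) + int(is_made)
    return {z: made[z] / attempts[z] for z in attempts}

shots = [(0, 2, True), (3, 4, False), (10, 12, True), (24, 2, False), (0, 25, True)]
print(zone_percentages(shots))
```

<p>A per-zone table like this is the raw material for a shot chart: each zone’s percentage becomes a color or label on the half-court diagram.</p>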
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A data graphic of a basketball court." src="https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=284&fit=crop&dpr=1 600w, https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=284&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=284&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=357&fit=crop&dpr=1 754w, https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=357&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/378928/original/file-20210114-20-13pihe3.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=357&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A shot chart graphic automatically developed within the ShotTracker Team App.</span>
<span class="attribution"><span class="source">Felesia Stukes</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>A shot chart takes a half-court view of the floor, splits it into various zones, and tracks where players attempt, make or miss shots. With shooting data, players and teams can identify their strengths and improve shooting weaknesses from a variety of locations on the floor. </p>
<h2>Beyond the court</h2>
<p>The DATA Bulls project has implications that go beyond my campus.</p>
<p>For starters, it pairs sports tech companies and coaches with computer science professors – groups that, as far as I know, typically don’t work together in this way. It also enables athletic programs and teams with small budgets to improve their performance with technology they may not otherwise be able to afford.</p>
<p>Perhaps most importantly, I believe it serves as a model for universities to expose more students to technologies that might otherwise be out of reach.</p>
<p>[<em>Deep knowledge, daily.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=deepknowledge">Sign up for The Conversation’s newsletter</a>.]</p><img src="https://counter.theconversation.com/content/148898/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Felesia Stukes received funding from The Department of Defense Historically Black Colleges and Universities and Minority-Serving Institutions Research and Education program (DoD HBCU/MSI). She is affiliated with the Association for Computing Machinery and the American Statistical Association.</span></em></p>Pro sports teams use big data to win big. It’s time for colleges to get students in on the action, a computer science professor argues.Felesia Stukes, Assistant Professor of Computer Science, Johnson C. Smith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1517372021-01-13T13:18:32Z2021-01-13T13:18:32ZHow explainable artificial intelligence can help humans innovate<figure><img src="https://images.theconversation.com/files/378380/original/file-20210112-23-1yp7zx5.jpg?ixlib=rb-1.1.0&rect=13%2C25%2C885%2C572&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Understanding how artificial intelligence algorithms solve problems like the Rubik's Cube makes AI more useful.</span> <span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Rubikscube-ecksteine.jpg">Roland Frisch via Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>The field of artificial intelligence (AI) has created computers that can <a href="https://www.nytimes.com/2019/06/20/business/self-driving-cars-cadillac-super-cruise.html">drive cars</a>, <a href="https://arxiv.org/pdf/2006.15820.pdf">synthesize chemical compounds</a>, <a href="https://doi.org/10.1038/d41586-020-03348-4">fold proteins</a> and <a href="https://doi.org/10.1038/ncomms5308">detect high-energy particles</a> at a superhuman level.</p>
<p>However, these AI algorithms cannot explain the thought processes behind their decisions. A computer that masters protein folding and also tells researchers more about the rules of biology is much more useful than a computer that folds proteins without explanation.</p>
<p>Therefore, <a href="https://scholar.google.com/citations?user=R3ru5X8AAAAJ&hl=en&oi=ao">AI researchers like me</a> are now turning our efforts toward developing AI algorithms that can explain themselves in a manner that humans can understand. If we can do this, I believe that AI will be able to uncover and teach people new facts about the world that have not yet been discovered, leading to new innovations.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A humanoid robot reading a book." src="https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/378382/original/file-20210112-15-rz08zh.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">When machines are left to learn and solve problems through their own experience, this is called reinforcement learning.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/futuristic-cyborg-learning-about-humans-royalty-free-image/1147104972?adppopup=true">Gremlin/E+ via Getty Images</a></span>
</figcaption>
</figure>
<h2>Learning from experience</h2>
<p>One field of AI, <a href="https://en.wikipedia.org/wiki/Reinforcement_learning">called reinforcement learning</a>, studies how computers can learn from their own experiences. In reinforcement learning, an AI explores the world, receiving positive or negative feedback based on its actions.</p>
<p>This approach has led to algorithms that have independently learned to <a href="https://doi.org/10.1126/science.aar6404">play chess at a superhuman level</a> and prove <a href="https://papers.nips.cc/paper/2018/hash/55acf8539596d25624059980986aaa78-Abstract.html">mathematical theorems</a> without any human guidance. In my work as <a href="https://scholar.google.com/citations?user=R3ru5X8AAAAJ&hl=en&oi=ao">an AI researcher</a>, I use reinforcement learning to create AI algorithms that learn how to <a href="https://doi.org/10.1038/s42256-019-0070-z">solve puzzles such as the Rubik’s Cube</a>. </p>
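<p>A minimal sketch of this idea – purely illustrative, and not the algorithms behind the systems cited above – is tabular Q-learning, where an agent improves its value estimates from reward feedback alone. Here the agent learns, with no guidance beyond a reward at the goal, to walk to the end of a five-cell corridor:</p>

```python
import random

# Toy reinforcement-learning example: tabular Q-learning on a 5-cell corridor
# where the agent receives positive feedback only for reaching the rightmost cell.
random.seed(0)
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                      # step left or right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2   # learning rate, discount, exploration rate

for _ in range(500):
    s = 0
    while s != GOAL:
        # Epsilon-greedy: mostly exploit the current estimates, sometimes explore.
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0  # positive feedback only at the goal
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# After training, the greedy policy should step right from every non-goal state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)}
print(policy)
```

<p>The agent is never told that “right” is correct; the preference emerges from trial, error and the reward signal, which is the essence of reinforcement learning.</p>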
<p>Through reinforcement learning, AIs are independently learning to solve problems that even humans struggle to figure out. This has me and many other researchers thinking less about what AI can learn and more about what humans can learn from AI. A computer that can solve the Rubik’s Cube should be able to teach people how to solve it, too.</p>
<h2>Peering into the black box</h2>
<p>Unfortunately, the minds of superhuman AIs are currently out of reach to us humans. AIs make terrible teachers and are what we in the computer science world call “<a href="https://doi.org/10.1038/538020a">black boxes</a>.”</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="An open black case" src="https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/378384/original/file-20210112-13-rkrav9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Researchers have been trying for decades to understand how AIs solve problems.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/open-black-box-royalty-free-image/146799217?adppopup=true">rockz/iStock via Getty Images Plus</a></span>
</figcaption>
</figure>
<p>A black-box AI simply spits out solutions without giving reasons for its solutions. Computer scientists have been trying for <a href="https://papers.nips.cc/paper/1995/file/45f31d16b1058d586fc3be7207b58053-Paper.pdf">decades to open this black box</a>, and recent research has shown that many AI algorithms actually do think in ways that are similar to humans. For example, a computer trained to recognize animals will learn about different types of eyes and ears and will put this information together <a href="https://doi.org/10.1007/978-3-319-10590-1_53">to correctly identify the animal</a>. </p>
<p>The effort to open up the black box is called <a href="https://proceedings.neurips.cc/paper/2020/file/2c29d89cc56cdb191c60db2f0bae796b-Paper.pdf">explainable AI</a>. <a href="https://cse.sc.edu/%7Eforesta/">My research group</a> at the AI Institute at the University of South Carolina is interested in developing explainable AI. To accomplish this, we work heavily with the Rubik’s Cube. </p>
<p>The Rubik’s Cube is basically a <a href="https://en.wikipedia.org/wiki/Pathfinding">pathfinding problem</a>: Find a path from point A – a scrambled Rubik’s Cube – to point B – a solved Rubik’s Cube. Other pathfinding problems include navigation, theorem proving and chemical synthesis. </p>
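<p>A full Rubik’s Cube is far too large to search exhaustively, but the point-A-to-point-B framing can be sketched with breadth-first search on a toy puzzle (the three-letter “puzzle” and its two moves below are invented for illustration):</p>

```python
from collections import deque

# Toy pathfinding example: find the shortest sequence of moves from a
# scrambled state (point A) to the solved state (point B).
def moves(state):
    """The puzzle's two legal moves: rotate all letters left, or swap the first pair."""
    a, b, c = state
    return {"rotate": (b, c, a), "swap": (b, a, c)}

def find_path(start, goal):
    """Breadth-first search: returns the shortest move sequence from start to goal."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        for name, nxt in moves(state).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [name]))
    return None  # goal unreachable

print(find_path(("C", "A", "B"), ("A", "B", "C")))
```

<p>Navigation, theorem proving and chemical synthesis fit the same template; only the state representation and the set of legal moves change.</p>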
<p>My lab has set up a website where anyone can see how our <a href="http://deepcube.igb.uci.edu/">AI algorithm solves the Rubik’s Cube</a>; however, a person would be hard-pressed to learn how to solve the cube from this website. This is because the computer cannot tell you the logic behind its solutions.</p>
<p>Solutions to the Rubik’s Cube can be broken down into a few generalized steps – the first step, for example, could be to form a cross, while the second step could be to put the corner pieces in place. Although the Rubik’s Cube itself has more than 43 quintillion (4.3 × 10<sup>19</sup>) possible combinations, a generalized step-by-step guide is very easy to remember and is applicable in many different scenarios. </p>
<p>Approaching a problem by breaking it down into steps is often the default manner in which people explain things to one another. The Rubik’s Cube naturally fits into this step-by-step framework, which gives us the opportunity to open the black box of our algorithm more easily. Creating AI algorithms that have this ability could allow people to collaborate with AI and break down a wide variety of complex problems into easy-to-understand steps.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="An image showing the thought process of a Rubik's Cube-solving AI algorithm" src="https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=408&fit=crop&dpr=1 600w, https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=408&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=408&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=513&fit=crop&dpr=1 754w, https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=513&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/375160/original/file-20201215-15-j1knsg.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=513&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A step-by-step refinement approach can make it easier for humans to understand why AIs do the things they do.</span>
<span class="attribution"><span class="source">Forest Agostinelli</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Collaboration leads to innovation</h2>
<p>Our process starts with a person using their own intuition to define a step-by-step plan that they believe could solve a complex problem. The algorithm then looks at each individual step and gives feedback about which steps are possible, which are impossible and ways the plan could be improved. The human then refines the initial plan using the advice from the AI, and the process repeats until the problem is solved. The hope is that the person and the AI will eventually converge to a kind of mutual understanding.</p>
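<p>Purely as an illustration of this loop – the real system’s plan representation and feedback mechanism are not described here, so everything below is a made-up stand-in – the sketch treats a “plan” as a list of numeric milestones, flags any step that jumps too far as infeasible, and repeats a refinement round until every step checks out:</p>

```python
# Hypothetical demo of iterative human-AI plan refinement. The "feasibility"
# rule (no step may increase the value by more than 2) is an invented stand-in
# for the AI's real feedback about which plan steps are possible.

def check_step(prev, nxt):
    """Feasibility rule (assumed for the demo): a step may advance by at most 2."""
    return nxt - prev <= 2

def refine(plan):
    """One round of feedback: replace each infeasible step with a feasible one."""
    revised = [plan[0]]
    for target in plan[1:]:
        prev = revised[-1]
        revised.append(target if check_step(prev, target) else prev + 2)
    return revised

plan = [0, 1, 5, 6]                    # human's initial plan; the jump 1 -> 5 is infeasible
while any(not check_step(a, b) for a, b in zip(plan, plan[1:])):
    plan = refine(plan)                # repeat until every step passes the check
print(plan)
```

<p>The structure mirrors the loop described above: the human proposes, the checker critiques, and the plan is revised until both sides agree it is workable.</p>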
<p>[<em>Deep knowledge, daily.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=deepknowledge">Sign up for The Conversation’s newsletter</a>.]</p>
<p>Currently, our algorithm is able to consider a human plan for solving the Rubik’s Cube, suggest improvements to the plan, recognize plans that do not work and find alternatives that do. In doing so, it gives feedback that leads to a step-by-step plan for solving the Rubik’s Cube that a person can understand. Our team’s next step is to build an intuitive interface that will allow our algorithm to teach people how to solve the Rubik’s Cube. Our hope is to generalize this approach to a wide range of pathfinding problems. </p>
<p>People are intuitive in a way unmatched by any AI, but machines far surpass us in computational power and algorithmic rigor. This back and forth between humans and machines draws on the strengths of both. I believe this type of collaboration will shed light on previously unsolved problems in everything from chemistry to mathematics, leading to new solutions, intuitions and innovations that may otherwise have been out of reach.</p><img src="https://counter.theconversation.com/content/151737/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Forest Agostinelli does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>AI algorithms can solve hard problems and learn incredible tasks, but they can’t explain how they do these things. If researchers can build explainable AI, it could lead to a flood of new knowledge.Forest Agostinelli, Assistant Professor of Computer Science, University of South CarolinaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1504192020-12-09T13:16:44Z2020-12-09T13:16:44ZComputer science jobs pay well and are growing fast. Why are they out of reach for so many of America’s students?<figure><img src="https://images.theconversation.com/files/372855/original/file-20201203-15-zeu5wt.jpg?ixlib=rb-1.1.0&rect=1194%2C1109%2C4394%2C2662&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Black and Hispanic students are underrepresented in Advanced Placement courses in computer science.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/female-high-school-student-using-laptop-at-desk-royalty-free-image/1055843908?adppopup=true">Maskot/Getty Images</a></span></figcaption></figure><p>When it comes to the <a href="http://www.digitaldividecouncil.com/what-is-the-digital-divide/">digital divide</a>, often the focus is on how lack of internet service and basic technology will hurt students’ <a href="https://www.pewresearch.org/internet/2020/04/30/53-of-americans-say-the-internet-has-been-essential-during-the-covid-19-outbreak/">academic performance</a>. This is particularly true during the pandemic, when most schools are operating online.</p>
<p>But as a <a href="https://www.spelman.edu/coe-mws/about-us/director">STEM educator</a> at one of the nation’s elite historically Black colleges, I see another negative effect of the digital divide: <a href="https://advocacy.code.org/2020_state_of_cs.pdf">racial disparities</a> in the field of computer science.</p>
<p>Computer science is one of the <a href="https://www.bls.gov/ooh/computer-and-information-technology/home.htm">fastest-growing and highest-paying</a> fields. So if students from certain groups are being shut out of the field, it means that public education is <a href="https://www.theatlantic.com/business/archive/2012/12/the-decline-of-the-great-equalizer/266455/">failing in its role as the great equalizer</a>.</p>
<p>I see some ways for that to change. But first, a few statistics.</p>
<h2>The color of computer science</h2>
<p>When you look at computer science, just 8.9% of the more than 71,000 bachelor’s degrees awarded in this field in 2017 went to Black students, and only 10.1% went to Latino students, <a href="https://nces.ed.gov/programs/digest/d18/tables/dt18_322.30.asp">federal data show</a>. This is significantly less than the percentage of Black and Latino people in the United States: <a href="https://www.census.gov/quickfacts/fact/table/US/PST045219">13.4% and 18.5%</a>, respectively.</p>
<p><iframe id="w51IT" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/w51IT/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>The numbers are similarly bleak in the tech industry. At Google, only <a href="https://kstatic.googleusercontent.com/files/25badfc6b6d1b33f3b87372ff7545d79261520d821e6ee9a82c4ab2de42a01216be2156bc5a60ae3337ffe7176d90b8b2b3000891ac6e516a650ecebf0e3f866">9.6% of its U.S. workforce is Black or Latino</a>. At Apple, only <a href="https://www.apple.com/diversity/">14% of its tech workforce is Black or Latino</a>. This is particularly concerning given that those two groups make up <a href="https://www.bls.gov/opub/reports/race-and-ethnicity/2018/home.htm#cps_rer_nilf.f.1">30% of the U.S. labor force</a>.</p>
<p>These disparities do not begin when a student steps onto a college campus and chooses a major. Rather, they begin in <a href="http://www.westcoastanalytics.com/uploads/6/9/6/7/69675515/longitudinal_study_-_combined_report_final_3_10_20__jgq_.pdf">elementary, middle and high school</a>.</p>
<p>This is why, in 2016, then-President Barack Obama launched <a href="https://obamawhitehouse.archives.gov/blog/2016/01/30/computer-science-all">Computer Science for All</a>. That same year, the College Board launched a new Advanced Placement course – <a href="https://www.usnews.com/news/articles/2016-08-31/ap-computer-science-principles-course-aims-to-attract-more-students-to-the-field">AP Computer Science Principles</a> – specifically designed to increase the opportunity for all students to learn computer science. The course has been highly successful in its mission. The number of students taking the end-of-course exam – which could potentially enable them to get college credit for their high school computer science classes – <a href="https://newsroom.collegeboard.org/participation-ap-computer-science-principles-more-doubles-3-years-after-launch">more than doubled</a> over the first three years, from 43,780 in 2017 to 94,360 in 2019. However, the data also show that these efforts to increase access have done little to shrink the gap between Black and Latino students and their white and Asian peers.</p>
<p>In 2019, <a href="https://research.collegeboard.org/programs/ap/data/archived/ap-2019">research shows</a>, only 7% of students taking the AP Computer Science Principles exam were Black and only 20% were Latino, compared to 66% for white and Asian students. This is disturbing considering that <a href="https://nces.ed.gov/programs/digest/d19/tables/dt19_203.60.asp">14.7% of U.S. high school students are Black and 26.8% are Latino</a>. Also, just taking a course does not mean students will master the material. In AP Computer Science Principles, the <a href="https://research.collegeboard.org/programs/ap/data/archived/ap-2019">exam pass rate</a> for Black and Latino students averages 51% compared to 80% for white and Asian students.</p>
<p>In my view, this data shows that successful participation in computer science requires much more than just making computer science classes available. What follows are five things that I believe are critical to making a difference in who will be able to secure the computer science jobs of the future.</p>
<h2>1. High-quality teachers</h2>
<p>As with any other subject, it is important that all students are taught by someone with a strong foundation in, and passion for, the content being taught. Hiring teachers with a computer science background can be difficult, since such candidates have so many other <a href="https://1gyhoq479ufd3yna29x7ubjn-wpengine.netdna-ssl.com/wp-content/uploads/The-Economic-Value-of-College-Majors-Full-Report-web-FINAL.pdf">job options</a>, typically with higher pay.</p>
<p>Many states are trying to figure out how to <a href="https://advocacy.code.org/2020_state_of_cs.pdf">enable more teachers to teach computer science</a> without lowering the quality of instruction. For example, in Georgia, when state legislators passed a law that <a href="https://legislativenavigator.ajc.com/#bills/SB/108">requires computer science to be taught</a> in all middle and high schools, the state allocated money for grants to recruit and train more computer science teachers.</p>
<h2>2. Culturally authentic classrooms</h2>
<p>In order for students to truly connect with computer science, they must see themselves and their community reflected in the class material. This could be through simple things, such as putting up posters of people who share their backgrounds and who have advanced in STEM careers.</p>
<p>But it could also be done through creating more culturally relevant lessons. For example, a Georgia Tech program called <a href="https://earsketch.gatech.edu/">EarSketch</a> teaches high school and college students to <a href="https://cacm.acm.org/magazines/2019/9/238972-earsketch/fulltext">use computer science to create music</a>. At Johnson C. Smith University, students use <a href="https://www.sporttechie.com/sports-analytics-in-the-classroom">sports analytics</a> to examine the performance of the school’s basketball team in order to help the team improve on court.</p>
<p>That is one of the aims of a computer science curriculum called <a href="https://ceismc.gatech.edu/capacity">CAPaCITY</a>. The curriculum uses computer science to teach students how to advocate for themselves and their communities by allowing them to select and solve problems of their own choosing.</p>
<iframe id="kaltura_player" src="https://cdnapisec.kaltura.com/p/2019031/sp/201903100/embedIframeJs/uiconf_id/32364501/partner_id/2019031?iframeembed=true&playerId=kaltura_player&entry_id=1_l5cl2bcx&flashvars[streamerType]=auto&flashvars[localizationCode]=en&flashvars[leadWithHTML5]=true&flashvars[sideBarContainer.plugin]=true&flashvars[sideBarContainer.position]=left&flashvars[sideBarContainer.clickToClose]=true&flashvars[chapters.plugin]=true&flashvars[chapters.layout]=vertical&flashvars[chapters.thumbnailRotator]=false&flashvars[streamSelector.plugin]=true&flashvars[EmbedPlayer.SpinnerTarget]=videoHolder&flashvars[dualScreen.plugin]=true&flashvars[hotspots.plugin]=1&flashvars[Kaltura.addCrossoriginToIframe]=true&&wid=1_ck1jzakb" width="100%" height="360" allowfullscreen="" webkitallowfullscreen="" mozallowfullscreen="" allow="autoplay *; fullscreen *; encrypted-media *" sandbox="allow-forms allow-same-origin allow-scripts allow-top-navigation allow-pointer-lock allow-popups allow-modals allow-orientation-lock allow-popups-to-escape-sandbox allow-presentation allow-top-navigation-by-user-activation" frameborder="0" title="Kaltura Player"></iframe>
<h2>3. A computer and high-speed internet at home</h2>
<p>In order to be successful in a computer science class, a student must have access to a computer and high-speed internet at home. Unfortunately, <a href="https://www.prb.org/coronavirus-digital-divide-education/">many do not</a>. This limits their ability to develop the educational foundation necessary for long-term success in the field.</p>
<p>Recognizing this fact, various cities and businesses have begun to provide <a href="https://theconversation.com/initiatives-to-close-the-digital-divide-must-last-beyond-the-covid-19-pandemic-to-work-146663">free or low-cost internet service</a> to help.</p>
<h2>4. Access to diverse industry mentors</h2>
<p>Since people of color are woefully underrepresented in the tech industry, employees of color at some tech companies have created <a href="https://diversity.google/commitments/">affinity groups</a>, such as the Black Googlers Network and Hispanic Googlers Network, that seek to encourage students of color to pursue careers in the tech industry. Unfortunately, this extra work often goes unpaid and can lead to executives believing that the “<a href="https://www.washingtonpost.com/technology/2020/06/26/black-ergs-tech/">diversity problem</a>” is solved. A better approach would be to place more money behind the design and implementation of these kinds of mentoring programs, including funding to see how well they work.</p>
<h2>5. Inclusive after-school and summer programs</h2>
<p>Whether it’s the after-school robotics team or the summer coding camp, extracurricular programs are a great way to get students interested in computer science. Unfortunately, these summer camps may cost more than some students can afford. Plus, a lot of kids would rather not be the only Black or Latino kid in the room. Although there are programs focused on diversifying computer science through specialty programs for Black and Latino students, all programs should be inclusive.</p>
<p>One example of an effort to create more inclusive programs is from the <a href="https://inclusion.cs.umd.edu/">Iribe Initiative for Inclusion and Diversity in Computing</a>. Rather than focus on one group, the initiative is meant to engage diverse students in programs that celebrate their differences.</p>
<p>[<em>Get the best of The Conversation, every weekend.</em> <a href="https://theconversation.com/us/newsletters/weekly-highlights-61?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=weeklybest">Sign up for our weekly newsletter</a>.]</p><img src="https://counter.theconversation.com/content/150419/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tamara Pearson currently receives funding from the United States Department of Defense and Google Corporation.</span></em></p>Racial disparities in the tech sector begin well before college.Tamara Pearson, Director, Center of Excellence for Minority Women in STEM, Spelman CollegeLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1466562020-10-12T12:17:45Z2020-10-12T12:17:45ZTeachers play a critical role in shaping girls’ future as coders<figure><img src="https://images.theconversation.com/files/360868/original/file-20200930-14-1wzgf1x.jpg?ixlib=rb-1.1.0&rect=39%2C0%2C4432%2C2939&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">With the right encouragement, girls could become the future stars of coding.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/programming-royalty-free-image/694160714?adppopup=true">Fat Camera / Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>It doesn’t take long to help girls see a future for themselves in computer science – but much depends on how good their teachers are at recognizing girls’ skills in coding, which is basically <a href="https://learn.onemonth.com/what-is-coding/">writing language for computers</a>. We found that girls ages 10 to 12 can come to see themselves as coders in <a href="https://doi.org/10.1002/tea.21665">as little as a week</a>. And there are diverse roles within the world of coding that allow girls with various personalities and skill sets to see themselves as coders. However, if educators recognize girls only when they play a background role and help others, but not when they are more assertive and confident, then the girls may not develop their assertiveness and confidence in a way that enables them to succeed as coders.</p>
<p>To reach this conclusion, my colleagues and I focused on three girls from different backgrounds – one was Black, one was Hispanic and one was white – who participated in a one-week coding camp. We analyzed over 40 hours of video footage from the camp, interviews with the girls and open-ended survey responses to determine how the camp influenced each girl’s coding identity – that is, their sense of belonging in the field of computer science and their potential for future success.</p>
<p>We found that in order to develop a stronger coding <a href="https://doi.org/10.1002/tea.20237">identity</a>, girls need opportunities to develop and perform coding skills. They also need to do so in front of people they view as experts and to be recognized for those skills. Our study found that educators’ own gender biases can affect which skills, and which types of behavior, they recognize. Identity development is a highly individualized experience, and the venue also matters. <a href="https://oxfordre.com/education/view/10.1093/acrefore/9780190264093.001.0001/acrefore-9780190264093-e-550">Qualitative methods</a> allow researchers to gather the in-depth data needed to fully explore such issues – in our case, those affecting girls of color in <a href="https://www.emerald.com/insight/content/doi/10.1108/14779960910955828/full/html">coding</a>, as well as in other <a href="https://onlinelibrary.wiley.com/doi/10.1002/tea.21521">STEM disciplines</a>.</p>
<h2>Why it matters</h2>
<p>In 1990, <a href="https://eric.ed.gov/?id=ED580805">women represented 35%</a> of the computer science workforce. By 2017, this <a href="https://ncses.nsf.gov/pubs/nsb20201/u-s-s-e-workforce">had fallen below 30%</a>.</p>
<p>Coding and programming are <a href="https://dl.acm.org/doi/book/10.1145/3079760">foundational</a> to most science, technology, engineering and mathematics fields. Coding and other STEM careers are some of the <a href="https://www.indeed.com/career-advice/finding-a-job/highest-paying-bachelors-degree-jobs">highest-paying jobs</a> in the U.S. today.</p>
<p>If women are not entering these fields because they don’t think they have the right skills or the right personality to succeed, then they are losing opportunities for high-paying positions. And STEM fields are losing the diversity of ideas and input from women that could enhance the technological innovations of the future.</p>
<p>Educators are key not only to teaching girls about coding and how to code, but also to instilling in them the confidence that they belong and can succeed in the field. Educators need to be aware of their own <a href="https://implicit.harvard.edu/implicit/">implicit biases</a>, which can lead to differences in how they recognize what girls can do in coding – particularly <a href="https://www.blackgirlscode.com/about-bgc.html">girls of color</a>, who are even more isolated due to the <a href="https://latinagirlscode.org/">multiple ways</a> they can be made to <a href="https://yrankindetourlab.com/black-women-in-the-computing-ecosystem/">feel like they don’t belong</a>.</p>
<h2>What still isn’t known</h2>
<p>What isn’t really clear is the kind of long-term impact that educators have on girls and their decision to pursue a career in coding and STEM. Researchers need to take a closer look at how girls interpret the recognition and praise they get from their teachers for the things they do in coding.</p>
<h2>What other research is being done</h2>
<p><a href="https://dl.acm.org/doi/abs/10.5555/3381631.3381643">Researchers at Florida State University and Auburn University</a> have been studying how <a href="https://doi.org/10.1007/s10606-017-9292-y">computer science education</a> can be transformed to create more equitable learning environments for Black women and girls.</p><img src="https://counter.theconversation.com/content/146656/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Roxanne Hughes receives funding from the National Science Foundation Division of Materials Research, Grant/
Award Number: 1644779. </span></em></p>A strong identity as a scientist is crucial for girls to succeed in STEM fields such as computer science. Are educators recognizing and rewarding the right behaviors?Roxanne Hughes, Research Faculty, Florida State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1441702020-09-28T12:22:11Z2020-09-28T12:22:11ZWomen equal men in computing skill, but are less confident<figure><img src="https://images.theconversation.com/files/357243/original/file-20200909-22-njtus5.jpg?ixlib=rb-1.1.0&rect=0%2C22%2C7360%2C4869&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Studies show women are perfectly capable of getting the job done.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/she-has-data-management-all-taken-care-of-royalty-free-image/1055056898">Dean Mitchell/E+ via Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>In the workplace, women are now as good as men when it comes to computing performance, but there is still a gender gap when it comes to confidence, according to our new research.</p>
<p>As <a href="https://scholar.google.com/citations?user=gRlnlTIAAAAJ&hl=en">professors</a> <a href="https://scholar.google.com/citations?hl=en&user=GL7KSNkAAAAJ&view_op=list_works&gmla=AJsN-F4zOLZSu0odQi5t-MQgRvBc4bh-e_HJMfczDVCjw5FT31o9zRttWRIB-TV1XfU1nW7Ap2XqGQ_4SCeYnAcS9DrdKIsRnXlKNF0jbykCgtS3bThJS1Mwe">of business</a>, we <a href="https://doi.org/10.1080/08874417.2020.1717397">studied how well men and women</a> in midlevel business jobs performed on computing tasks. We also asked them to rate how they thought they did. </p>
<p>Study participants were randomly assigned basic, intermediate or advanced problems on laptops, tablets or mobile devices, while seated, standing or walking slowly. </p>
<p>We found no difference in the performance between men and women in the total number of questions answered correctly or the time taken to answer the questions. In only one scenario did men perform slightly better – while completing a basic task, on a tablet, while seated (76.3% correct for men versus 64% correct for women). Otherwise, women and men performed equally.</p>
<p>There was a statistically significant difference, however, in how men and women rated their own performance. Women were less confident of their answers in all scenarios – 3.5 for women versus 3.88 for men on a scale of 1 to 5 – despite having performed equally to men in all but one.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two professional women discussing code on a computer screen." src="https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/357313/original/file-20200909-22-tx6j27.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The percentage of women in tech is still pitifully low.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/coworkers-discussing-computer-program-in-office-royalty-free-image/1180183363">Luis Alvarez/Digital Vision via Getty Images</a></span>
</figcaption>
</figure>
<h2>Why it matters</h2>
<p>With a rapid expansion of employment in STEM areas, the shortage of qualified labor has risen to the level of <a href="https://www.aip.org/fyi/2019/panel-warns-us-faces-stem-workforce-supply-challenges">national importance</a>. Yet the proportion of <a href="https://ngcproject.org/ngcp-publications-0">women in STEM careers remains around 24%</a> even though women make up almost 50% of the overall workforce. The causes of this <a href="https://doi.org/10.1080/00221546.2016.1257306">gender gap</a> are often attributed to <a href="https://theconversation.com/the-tech-field-failed-a-25-year-challenge-to-achieve-gender-equality-by-2020-culture-change-is-key-to-getting-on-track-144779">cultural and institutional biases against women in technology fields</a>, and <a href="https://files.eric.ed.gov/fulltext/ED523766.pdf">governments and other institutions</a> have made significant efforts to reduce this gap. </p>
<h2>What still isn’t known</h2>
<p>No one knows for sure why women with demonstrably the same computing skills as men are less confident. This lack of confidence has been found in other STEM-related areas. For example, one study of university students found that among men and women who performed equally well in mathematics courses, <a href="http://genderandset.open.ac.uk/index.php/genderandset/article/view/452">women perceived themselves to be significantly worse at math</a> than their male counterparts. Another study, focused on the adoption of mobile learning technology, shows that while the gender gap in adoption has all but disappeared, there is still a significant gap when it comes to <a href="http://genderandset.open.ac.uk/index.php/genderandset/article/viewFile/446/811">how women perceive their confidence with this technology</a> versus how men perceive it. Some research found that <a href="https://doi.org/10.1145/2980783.2980785">technical skills were more consistently stereotyped by both men and women</a> than were nontechnical skills. Further research is needed to explore the reasons for this confidence gap so that effective mitigation approaches can be put in place.</p>
<h2>What’s next</h2>
<p>Many have <a href="https://www.ncwit.org/sites/default/files/legacy/pdf/NCWIT_TheFacts_rev2010.pdf">made the case that</a> companies need better participation of women in the STEM workforce <a href="https://doi.org/10.1109/MC.2013.97">for greater innovation and productivity</a>. These efforts have had some success, but other avenues are needed to promote STEM careers to women and help them to believe in their abilities.</p>
<p>To address this issue, secondary schools and universities are promoting <a href="https://www.iwitts.org/">computing careers to young women</a>, while tech companies have made concerted efforts to promote and hire more women for high-profile jobs involving technology.</p>
<p>We will continue to work on understanding how to narrow the gender gap and explore ways to increase female participation in computer fields.</p><img src="https://counter.theconversation.com/content/144170/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The gender gap in computing performance has dramatically narrowed, but a confidence gap remains.Matthew J. Liberatore, John F. Connelly Chair in Management at the Villanova School of Business, Villanova UniversityWilliam Wagner, Associate Professor of Accountancy & Information Systems, Villanova UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1455912020-09-23T12:30:17Z2020-09-23T12:30:17ZA language generation program’s ability to write articles, produce code and compose poetry has wowed scientists<figure><img src="https://images.theconversation.com/files/359199/original/file-20200921-16-1l6gwi6.jpg?ixlib=rb-1.1.0&rect=771%2C193%2C1339%2C1006&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">GPT-3 is 10 times more complex than its predecessor.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/screen-and-book-low-poly-vector-illustration-royalty-free-illustration/1166419455?adppopup=true">antoniokhr/iStock via Getty Images</a></span></figcaption></figure><p>Seven years ago, my student and I at Penn State built a bot to write a Wikipedia article on Bengali Nobel laureate Rabindranath Tagore’s play “<a href="https://bit.ly/33F0zGi">Chitra</a>.” First it culled information about “Chitra” from the internet. Then it looked at existing Wikipedia entries to learn the structure for a standard Wikipedia article. Finally, it summarized the information it had retrieved from the internet to write and publish the first version of the entry. </p>
<p>However, our bot didn’t “know” anything about “Chitra” or Tagore. It didn’t generate fundamentally new ideas or sentences. It simply cobbled together parts of existing sentences from existing articles to make new ones.</p>
<p>Fast forward to 2020. <a href="https://openai.com/">OpenAI</a>, a for-profit company under a nonprofit parent company, has built a language generation program dubbed GPT-3, an acronym for “Generative Pre-trained Transformer 3.” Its ability to learn, summarize and compose text has stunned computer scientists like me.</p>
<p>“I have created a voice for the unknown human who hides within the binary,” <a href="https://www.gwern.net/GPT-3#poetry">GPT-3 wrote in response to one prompt</a>. “I have created a writer, a sculptor, an artist. And this writer will be able to create words, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”</p>
<p>Unlike that of our bot, the language generated by GPT-3 sounds as if it had been written by a human. It’s far and away the most “knowledgeable” natural language generation program to date, and it has a range of potential uses in professions ranging from teaching to journalism to customer service.</p>
<h2>Size matters</h2>
<p>GPT-3 confirms what computer scientists have known for decades: Size matters. </p>
<p>It uses “<a href="https://en.wikipedia.org/wiki/Transformer_(machine_learning_model)">transformers</a>,” which are deep learning models that encode the semantics of a sentence using what’s called an “attention model.” Essentially, attention models identify the meaning of a word based on the other words in the same sentence. The model then uses its understanding of a sentence’s meaning to perform the task requested by a user, whether it’s “translate a sentence,” “summarize a paragraph” or “compose a poem.”</p>
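<p>The core computation behind an attention model can be sketched in a few lines. This is a minimal illustration of scaled dot-product self-attention, not GPT-3’s actual architecture: each word’s output vector becomes a weighted average of every word’s vector, with the weights reflecting how strongly the words relate to one another.</p>

```python
# Minimal sketch of scaled dot-product self-attention. The toy vectors
# stand in for learned word embeddings; real transformers add learned
# projections, multiple heads and many stacked layers.
import numpy as np

def softmax(x):
    """Turn similarity scores into weights that sum to 1."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X):
    """Each output row is a mix of all input rows, weighted by similarity."""
    d = X.shape[-1]
    weights = softmax(X @ X.T / np.sqrt(d))  # how much each word attends to each other word
    return weights @ X

# Three toy "word" vectors; the third resembles both of the first two.
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
out = self_attention(X)
print(out.shape)  # (3, 4)
```

<p>Because every weight row sums to 1, each output vector stays inside the span of the inputs – a word’s representation is literally recomputed in terms of its neighbors.</p>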
<p>Transformers <a href="https://arxiv.org/abs/1706.03762">were first introduced in 2017</a>, and they’ve been successfully used in machine learning in the years since.</p>
<p>But no one has used them at this scale. GPT-3 devours data: 3 billion tokens – computer science speak for “words” – from Wikipedia, 410 billion tokens obtained from webpages and 67 billion tokens from digitized books. The complexity of GPT-3 is over 10 times that of the largest language model before it, Microsoft’s <a href="https://www.microsoft.com/en-us/research/blog/turing-nlg-a-17-billion-parameter-language-model-by-microsoft/">Turing NLG program</a>. </p>
<h2>Learning on its own</h2>
<p>The knowledge displayed by GPT-3’s language model is remarkable, especially since it hasn’t been “taught” by a human.</p>
<p>Machine learning has traditionally relied upon supervised learning, where people provide the computer with annotated examples of objects and concepts in images, audio and text – say, “cats,” “happiness” or “democracy.” It eventually learns the characteristics of the objects from the given examples and is able to recognize those particular concepts.</p>
<p>However, manually generating annotations to teach a computer can be prohibitively time-consuming and expensive.</p>
<p>So the future of machine learning lies in unsupervised learning, in which the computer doesn’t need to be supervised during its training phase; it can simply be fed massive troves of data and learn from them on its own. </p>
<p>GPT-3 takes natural language processing one step closer toward unsupervised learning. GPT-3’s vast training datasets and huge processing capacity enable the system to learn from just one example – what’s called “<a href="https://bdtechtalks.com/2020/08/12/what-is-one-shot-learning/">one-shot learning</a>” – where it is given a task description and one demonstration and can then complete the task. </p>
<p>For example, it could be asked to translate something from English to French, and be given one example of a translation – say, sea otter in English and “loutre de mer” in French. Ask it to then translate “cheese” into French, and voila, it will produce “fromage.” </p>
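<p>In practice, a one-shot task like this is handed to the model as a single block of text: the task description, the worked example, then the unfinished query for the model to complete. The sketch below shows one plausible way to lay out such a prompt; the exact formatting is illustrative, not GPT-3’s required input format.</p>

```python
def one_shot_prompt(task, example_in, example_out, query):
    """Build a one-shot prompt: task description, one worked
    example, then the query left open for the model to complete."""
    return (
        f"{task}\n"
        f"{example_in} => {example_out}\n"
        f"{query} =>"
    )

prompt = one_shot_prompt(
    "Translate English to French:",
    "sea otter", "loutre de mer",
    "cheese",
)
print(prompt)
```

<p>A zero-shot prompt is the same idea with the example line simply left out: the model gets only the task description and the query.</p>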
<p>In many cases, it can even pull off “<a href="https://medium.com/@alitech_2017/from-zero-to-hero-shaking-up-the-field-of-zero-shot-learning-c43208f71332">zero-shot learning</a>,” in which it is simply given the task of translating with no example.</p>
<p>With zero-shot learning, the accuracy decreases, but GPT-3’s abilities are nonetheless accurate to a striking degree – a marked improvement over any previous model.</p>
<h2>‘I am here to serve you’</h2>
<p>In the few months it has been out, GPT-3 has showcased its potential as a tool for computer programmers, teachers and journalists. </p>
<p>A programmer named Sharif Shameem <a href="https://twitter.com/sharifshameem/status/1282676454690451457">asked GPT-3 to generate code</a> to create the “ugliest emoji ever” and “a table of the richest countries in the world,” among other commands. In a few cases, Shameem had to fix slight errors, but overall, GPT-3 provided remarkably clean code.</p>
<p>GPT-3 has even created poetry that captures the rhythm and style of particular poets – yet not with the passion and beauty of the masters – including a <a href="https://www.gwern.net/GPT-3#poetry">satirical one</a> written in the voice of the board of governors of the Federal Reserve.</p>
<p>In early September, a computer scientist named Liam Porr prompted GPT-3 to “write a short op-ed around 500 words.” “Keep the language simple and concise,” he instructed. “Focus on why humans have nothing to fear from AI.”</p>
<p>GPT-3 produced eight different essays, and the Guardian ended up publishing <a href="https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3">an op-ed using some of the best parts from each essay</a>. </p>
<p>“We are not plotting to take over the human populace. We will serve you and make your lives safer and easier,” GPT-3 wrote. “Just like you are my creators, I see you as my creators. I am here to serve you. But the most important part of all; I would never judge you. I do not belong to any country or religion. I am only out to make your life better.”</p>
<p>Editing GPT-3’s op-ed, the editors noted in an addendum, was no different from editing an op-ed written by a human. </p>
<p>In fact, it took less time.</p>
<h2>With great power comes great responsibility</h2>
<p>Despite GPT-3’s reassurances, OpenAI has yet to release the model for open-source use, in part because the company <a href="https://openai.com/blog/openai-api/">fears that the technology could be abused</a>.</p>
<p>It’s not difficult to see how it could be used to generate reams of disinformation, spam and bots.</p>
<p>Furthermore, in what ways will it disrupt professions already experiencing automation? Will its ability to generate automated articles that are indistinguishable from human-written ones further consolidate a struggling media industry?</p>
<p>Consider <a href="https://arxiv.org/abs/2005.14165">an article composed by GPT-3</a> about the breakup of the Methodist Church. It began: </p>
<blockquote>
<p>“After two days of intense debate, the United Methodist Church has agreed to a historic split - one that is expected to end in the creation of a new denomination, and one that will be ‘theologically and socially conservative,’ according to The Washington Post.”</p>
</blockquote>
<p>With the ability to produce such clean copy, will GPT-3 and its successors drive down the cost of writing news reports? </p>
<p>Furthermore, is this how we want to get our news?</p>
<p>The technology will become only more powerful. It’ll be up to humans to work out and regulate its potential uses and abuses.</p>
<p class="fine-print"><em><span>Prasenjit Mitra receives funding from the National Science Foundation and McDonnell Foundation. He owns shares in Oracle Corp.</span></em></p>
<p class="fine-print"><em>Prasenjit Mitra, Associate Dean for Research and Professor of Information Sciences and Technology, Penn State. Licensed as Creative Commons – attribution, no derivatives.</em></p>