tag:theconversation.com,2011:/africa/topics/programming-5878/articlesProgramming – The Conversation2024-02-06T13:29:09Ztag:theconversation.com,2011:article/2207572024-02-06T13:29:09Z2024-02-06T13:29:09ZAI helps students skip right to the good stuff in this intro programming course<figure><img src="https://images.theconversation.com/files/572925/original/file-20240201-19-gfnaj2.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5294%2C3960&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Letting AI do the dirty work of programming frees students to work on problem-solving.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/an-ai-robot-and-a-human-hand-engage-in-a-futuristic-royalty-free-image/1496700152">Issarawat Tattong/Moment via Getty Images</a></span></figcaption></figure><figure class="align-right ">
<img alt="Text saying: Uncommon Courses, from The Conversation" src="https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=375&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=375&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=375&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=471&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=471&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=471&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
</figcaption>
</figure>
<p><em><a href="https://theconversation.com/topics/uncommon-courses-130908">Uncommon Courses</a> is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.</em> </p>
<h2>Title of course:</h2>
<p>“Learn AI-Assisted Python Programming”</p>
<h2>What prompted the idea for the course?</h2>
<p>Generative AI is <a href="https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/unleashing-developer-productivity-with-generative-ai">really good at computer programming</a> – to the point where the way we teach and assess students who are learning to program must change.</p>
<p>We used to give students dozens or hundreds of small targeted programming tasks, drilling each aspect of the syntax – the words and symbols – of programming. That worked well as a starting point, except now generative AI tools can solve all of these problems. Educators can try to ban these tools (good luck with that!), or embrace them. We chose to embrace them in our new course, where students learn to program – supported by a generative AI assistant.</p>
<h2>What does the course explore?</h2>
<p>The course re-imagines what learning to program means now that generative AI is available to handle more of the low-level syntax issues that have historically slowed down and frustrated students. The more students struggle with finicky syntax details, the less time and energy they have to accomplish their programming-related goals like starting a business, writing apps for social good, or contributing to projects that are meaningful to them.</p>
<p>Generative AI clears the decks for us to focus on more valuable, high-level skills that students need to become effective programmers. For example, generative AI struggles to solve large problems; we still need humans to divide those problems into bite-sized chunks – a process known as problem decomposition – each of which AI can solve well. People are still needed to test code to ensure it’s doing what was intended, and to ensure that the code is used to help, not harm, society and its vulnerable groups.</p>
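<p>To make problem decomposition concrete, here is a short Python sketch – a hypothetical illustration, not taken from the course materials. A larger task (finding the most frequent words in some text) is split into small functions, each simple enough for an AI assistant to generate and for a student to read and verify:</p>

```python
# Hypothetical illustration of problem decomposition:
# one larger task split into small, independently testable functions.

def clean_words(text):
    """Lower-case the text and strip punctuation, returning a list of words."""
    return [w.strip(".,!?;:").lower() for w in text.split() if w.strip(".,!?;:")]

def count_words(words):
    """Count occurrences of each word."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

def top_n(counts, n):
    """Return the n most frequent (word, count) pairs, ties broken alphabetically."""
    return sorted(counts.items(), key=lambda item: (-item[1], item[0]))[:n]

# Each bite-sized piece can be generated by an AI assistant,
# checked by the student, then composed into the full solution.
print(top_n(count_words(clean_words("The cat saw the dog. The dog ran!")), 2))
# → [('the', 3), ('dog', 2)]
```

<p>The point is not the word-counting itself but the shape of the work: the human decides on the pieces and how they fit together; the assistant can help fill in each piece.</p>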
<h2>Why is this course relevant now?</h2>
<p>Professional programmers have already adopted generative AI tools in droves and are using them to be more efficient in their daily work. If the goal is to prepare students for these jobs, teachers need to train them in how to use these new tools.</p>
<p>Perhaps more importantly, these tools change what students can do in introductory courses. With a more powerful tool comes the ability <a href="https://doi.org/10.1016/j.caeai.2023.100147">to work at higher, more efficient levels</a> – and to save substantial time. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/7h732qLxtAk?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">AI code assistants are changing what it means to do computer programming.</span></figcaption>
</figure>
<h2>What’s a critical lesson from the course?</h2>
<p>A critical lesson is that generative AI is impressive, but fallible. You cannot simply ask it for code and assume the code it gives you is perfect. It may not do the right thing. It <a href="https://www.wired.com/story/fast-forward-power-danger-ai-generated-code/">may produce errors</a>, or bugs. It may cause security concerns. It may exclude underrepresented groups or discourses. You need to critically examine the code that you get from generative AI. </p>
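<p>One way to build that habit is to test generated code against cases worked out by hand. A minimal Python sketch – a hypothetical example, not taken from the course – checking a function an assistant might have produced:</p>

```python
# Hypothetical example: verifying AI-generated code with hand-worked cases.

def median(values):
    """Return the median of a non-empty list of numbers
    (as an AI assistant might generate it)."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Work out small cases by hand, then assert them.
# A failing assertion is the signal to read the generated code closely.
assert median([3, 1, 2]) == 2        # odd length
assert median([4, 1, 3, 2]) == 2.5   # even length: average of middle two
assert median([7]) == 7              # single element
```

<p>The tests are deliberately small: the goal is for the student, not the AI, to decide what correct behaviour looks like.</p>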
<h2>What materials does the course feature?</h2>
<p>The course is built on our new book “<a href="https://www.manning.com/books/learn-ai-assisted-python-programming">Learn AI-Assisted Python Programming</a>.” The book reconceptualizes an introductory programming course in the context of generative AI tools. </p>
<p>The main tool used in the book and in our course is called <a href="https://docs.github.com/en/copilot">GitHub Copilot</a>, which is like ChatGPT for programmers. Students use Copilot from day one. They build complete apps: apps to automate tedious, error-prone tasks; computer games; even an app to guess who wrote a novel whose author may be unknown. To ensure that students are still learning fundamentals, the book teaches them how to understand the code that the generative AI is creating. </p>
<h2>What will the course prepare students to do?</h2>
<p>Some students take an intro programming course to start their computer science major. For those students, we continue to teach evergreen skills like code reading and code testing, but now also introduce the higher-level skill of problem decomposition so students can solve larger tasks than ever before. </p>
<p>The majority of students in the course, though, are studying other disciplines like sociology, psychology, business, engineering and science. The course prepares those students to use generative AI to boost their careers through programming.</p>
<p class="fine-print"><em><span>Leo Porter receives compensation for sales of the book "Learn AI-Assisted Python Programming."</span></em></p><p class="fine-print"><em><span>Daniel Zingaro receives compensation for sales of the book "Learn AI-Assisted Python Programming." He also consults on books for Manning Publications.
</span></em></p>Learning to program requires mastering the nitty-gritty of code syntax. Generative AI turns out to be good at that. Adding AI to intro programming courses frees students to focus on problem-solving.Leo Porter, Teaching Professor of Computer Science and Engineering, University of California, San DiegoDaniel Zingaro, Associate Professor of Mathematical and Computational Sciences, University of TorontoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2198022023-12-14T04:06:45Z2023-12-14T04:06:45ZThe AI industry is on the verge of becoming another boys’ club. We’re all going to lose out if it does<figure><img src="https://images.theconversation.com/files/565707/original/file-20231214-23-2bm6wg.jpg?ixlib=rb-1.1.0&rect=63%2C63%2C5993%2C3968&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>A recent New York Times <a href="https://www.nytimes.com/2023/12/03/technology/ai-key-figures.html">article</a> published a list of people “behind the dawn of the modern artificial intelligence movement” – and not a single woman was named. It came less than a week after news that a fake, AI-generated woman had been listed as a speaker on the agenda <a href="https://apnews.com/article/tech-conference-fake-women-ai-generated-devternity-98ed551e90ec49e81589cc928715ae3c">for a software conference</a>.</p>
<p>Unfortunately, the omission of women from the history of STEM isn’t a new phenomenon. Women have been missing from these narratives for centuries.</p>
<p>In the wake of recent AI developments, we now have a choice: are we going to leave women out of these conversations as well – even as they continue to make massive contributions to the AI industry? </p>
<p>Doing so risks leading us into the same fallacy that established computing itself as a “man’s world”. The reality, of course, is quite different. </p>
<h2>A more accurate history</h2>
<p>Prior to computers as we know them, “computer” was the title given to people who performed complex mathematical calculations. These people <a href="https://www.smithsonianmag.com/science-nature/history-human-computers-180972202">were commonly women</a>.</p>
<p>English mathematician Ada Lovelace (1815–1852) is often referred to as <a href="https://www.newyorker.com/tech/annals-of-technology/ada-lovelace-the-first-tech-visionary">the first computer programmer</a>. She was the <a href="https://lemelson.mit.edu/resources/ada-lovelace">first person to realise</a> computers could do much more than just math calculations. Her work on <a href="https://www.britannica.com/technology/Analytical-Engine">the analytical engine</a> – a proposed automatic and fully programmable mechanical computer – dates back to the mid-1800s.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565705/original/file-20231214-29-49fdp7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A blue plaque in St James’s Square in London marks the location Ada Lovelace once lived.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>By the 1870s, a group of about 80 women worked as computers <a href="https://www.thecrimson.com/article/2019/9/26/women-computers-observatory/">at the Harvard Observatory</a>. They catalogued and analysed copious amounts of astronomical data for astronomer Edward Charles Pickering (who exploited the fact they’d work for less money than men, or even as volunteers).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=469&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=469&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=469&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=590&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=590&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565700/original/file-20231214-17-oqi67x.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=590&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 1886, Pickering put Williamina P.S. Fleming in charge of the Harvard computers. Over the course of her career she discovered 10 novae, 52 nebulae and hundreds of stars.</span>
<span class="attribution"><span class="source">Wikimedia</span></span>
</figcaption>
</figure>
<p>By the late 19th century, increased access to education meant there was an entire generation of women trained in maths. These woman computers were cheaper labour than men at the time, and so <a href="https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/">employing them</a> significantly reduced the costs of computation.</p>
<p>During the first world war, women were hired to <a href="https://cs.brown.edu/courses/cs1951i/lightWhenComputersWereWomen.pdf">calculate artillery trajectories</a>. This work continued into the second world war, when they were actively encouraged to <a href="https://www.history.com/news/coding-used-to-be-a-womans-job-so-it-was-paid-less-and-undervalued">take on wartime jobs</a> as computers in the absence of men. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=700&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=700&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=700&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=880&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=880&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565699/original/file-20231214-17-msi71b.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=880&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Former NASA mathematician Katherine Johnson was awarded the Presidential Medal of Freedom in 2015.</span>
<span class="attribution"><span class="source">NASA/Bill Ingalls</span></span>
</figcaption>
</figure>
<p>Women continued to work as computers into the early days of the <a href="https://education.nationalgeographic.org/resource/women-nasa/">American space program in the 1960s</a>, playing a pivotal role in advancing NASA’s space projects. One of these computers was <a href="https://www.nasa.gov/centers-and-facilities/langley/katherine-johnson-biography/">Katherine Johnson</a>, who was responsible for quality-checking the outputs of early IBM computers for an orbital mission in 1962. </p>
<p>Many women made significant contributions to computing, yet few were recognised for these contributions – let alone financially compensated. <a href="https://books.google.com.au/books?id=GWOIXDsLQWwC&printsec=frontcover&dq=Recoding+Gender:+Women%2527s+Changing+Participation+in+Computing&hl=en&sa=X&redir_esc=y#v=onepage&q=salary&f=false">According to</a> Virginia Tech professor Janet Abbate, by 1969 a female computer specialist’s median salary was US$7,763, compared to US$11,193 for a male computer specialist.</p>
<p>Woman computers worked behind the scenes, while their male counterparts received recognition, awards and publicity.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-your-money-is-helping-subsidise-sexism-in-academia-and-what-you-can-do-about-it-218347">How your money is helping subsidise sexism in academia – and what you can do about it</a>
</strong>
</em>
</p>
<hr>
<h2>Women in AI</h2>
<p>Computing and programming are the foundation of AI as we know it today. At a basic level, today’s generative and predictive AI systems work by analysing large amounts of data and <a href="https://medium.com/@stahl950/the-math-behind-predictions-in-ai-unraveling-the-magic-44b4fcb8af6">finding patterns in it</a>. </p>
<p>The women who pioneered computing from as early as the 1800s laid the foundations for this work. The work they were doing by hand for more than a century has now been replaced by machines capable of analysing much larger quantities of data in a much shorter time.</p>
<p>This transition does not diminish women’s contributions to the field of computing and, more recently, AI. Myriad women are doing pioneering work in the AI industry today, including the 12 women named in this recent <a href="https://medium.com/womenintechnology/ny-times-missed-these-12-trailblazers-meet-the-women-transforming-ai-ae522f52a8b7">Medium article</a>. </p>
<p>From Google’s ex-chief decision scientist Cassie Kozyrkov, to Canadian computer scientist Joy Buolamwini, to OpenAI’s CTO Mira Murati (pictured in this article’s banner image) – these women are helping make AI safer, more accurate, more accessible, more inclusive and more reliable.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=841&fit=crop&dpr=1 600w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=841&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=841&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1057&fit=crop&dpr=1 754w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1057&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/565702/original/file-20231214-27-42uznz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1057&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Joy Buolamwini is a Rhodes scholar, Fulbright fellow, Stamps scholar, Astronaut scholar and Anita Borg Institute scholar. Her work focuses on reducing bias in AI.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Joy_Buolamwini#/media/File:Joy_Buolamwini_-_Wikimania_2018_01.jpg">Wikimedia</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>And they’re taking these strides despite working in a heavily male-dominated industry. <a href="https://medium.com/element-ai-research-lab/estimating-the-gender-ratio-of-ai-researchers-around-the-world-81d2b8dbe9c3">One 2018 study</a> of 4,000 researchers who had been published in leading AI conferences found women made up just 12% of this group.</p>
<h2>The impact of omission</h2>
<p>The omission of women isn’t limited to the AI industry, or even to STEM. As historian Bettany Hughes notes, women occupy a <a href="https://www.english-heritage.org.uk/visit/inspire-me/blog/blog-posts/why-were-women-written-out-of-history-an-interview-with-bettany-hughes/#">meagre 0.5%</a> of recorded history. Clearly, a lack of gender diversity in the workforce is part of a much larger, systemic problem – one that affects many more people than the individuals being excluded. </p>
<p>In 1983, NASA engineers suggested packing 100 tampons on the <a href="https://prospect.org/culture/books/astronaut-sally-ride-burden-first/">Challenger space shuttle</a> for astronaut Sally Ride – for a trip that was one week long. Such an incident may seem harmless on the surface. But what happens when gender bias and stereotypes bleed into the design and development of AI? </p>
<p><a href="https://edition.cnn.com/2023/06/12/tech/facebook-job-ads-gender-discrimination-asequals-intl-cmd/index.html">Research reported in 2023</a> by international non-profit Global Witness found Facebook’s job ad platform, which uses algorithms to target users with ads, based its targeting on sexist stereotypes. For example, ads for mechanics were targeted mostly at men, while ads for preschool teachers were targeted mostly at women. </p>
<p>Another <a href="https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">2018 study</a> found computer vision systems reported higher error rates for recognising women, and in particular women with darker skin tones. </p>
<p>A <a href="https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/">lack of gender diversity</a> in AI has a demonstrated ability to harm and disadvantage women and, by extension, all of us. While many argue that improving AI training datasets could address the gender gap, others rightly point out that women should also be included in <a href="https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/?sh=35e1baed534f">data-collection processes</a>.</p>
<h2>Breaking the glass ceiling</h2>
<p>Speaking at the <a href="https://www.heforshe.org/en/join-us-heforshe-summit-2023">UN Women’s HeForShe summit</a> earlier this year, <a href="https://huggingface.co/">Hugging Face</a> research scientist Sasha Luccioni made a <a href="https://www.unwomen.org/en/news-stories/feature-story/2023/09/heforshe-summit-discusses-gender-bias-in-ai-and-how-to-encourage-male-feminist-allies">salient point</a>:</p>
<blockquote>
<p>AI bias doesn’t come from thin air – it comes from the patterns we perpetuate in our societies.</p>
</blockquote>
<p>The recent New York Times article is an example of how both media and industry play a role in reinforcing a status quo that disproportionately favours men. This form of bias does nothing to help close a persistent and problematic gender gap.</p>
<p>Despite <a href="https://www.smh.com.au/national/tie-research-funding-to-progress-on-diversity-stem-review-says-20230814-p5dw8j.html">millions of dollars</a> being spent to encourage women to take up careers in STEM, these fields are struggling to <a href="https://www.lgea.org.au/Scientists/News/2021_women_in_stem_report.aspx">retain woman workers</a>. </p>
<p>Women’s contributions to AI are not insignificant. Failing to acknowledge this can make the glass ceiling seem impossible to break through.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/chief-scientist-women-in-stem-are-still-far-short-of-workplace-equity-covid-19-risks-undoing-even-these-modest-gains-143092">Chief Scientist: women in STEM are still far short of workplace equity. COVID-19 risks undoing even these modest gains</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Zena Assaad does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>For decades, woman ‘computers’ worked behind the scenes while their male counterparts received recognition. The AI industry must not be an example of history repeating itself.Zena Assaad, Senior Lecturer, School of Engineering, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1939302022-12-08T23:04:04Z2022-12-08T23:04:04ZAda Lovelace’s skills with language, music and needlepoint contributed to her pioneering work in computing<figure><img src="https://images.theconversation.com/files/499373/original/file-20221206-10118-sz9tym.jpg?ixlib=rb-1.1.0&rect=0%2C14%2C2435%2C1657&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Ada King, Countess of Lovelace, was more than just another mathematician.</span> <span class="attribution"><a class="source" href="https://upload.wikimedia.org/wikipedia/commons/a/a4/Ada_Lovelace_portrait.jpg">Watercolor portrait of Ada King, Countess of Lovelace by Alfred Edward Chalon via Wikimedia</a></span></figcaption></figure><p>Ada Lovelace, known as the first computer programmer, was born on Dec. 10, 1815, more than a century before digital electronic computers were developed. </p>
<p>Lovelace has been hailed as a model for girls in science, technology, engineering and math (STEM). A dozen biographies for young audiences were published for the 200th anniversary of her birth in 2015. And in 2018, <a href="https://www.nytimes.com/interactive/2018/obituaries/overlooked-ada-lovelace.html">The New York Times added hers</a> as one of the first “missing obituaries” of women at the rise of the #MeToo movement. </p>
<p>But Lovelace – properly Ada King, Countess of Lovelace after her marriage – drew on many different fields for her innovative work, including languages, music and needlecraft, in addition to mathematical logic. Because her well-rounded education enabled her to accomplish work that was well ahead of her time, she can be a model for all students, not just girls. </p>
<p>Lovelace was the daughter of the scandal-ridden romantic poet George Gordon Byron, aka Lord Byron, and his highly educated and strictly religious wife Anne Isabella Noel Byron, known as Lady Byron. Lovelace’s parents separated shortly after her birth. At a time when women were not allowed to own property and had few legal rights, her mother managed to secure custody of her daughter.</p>
<p>Growing up in a privileged aristocratic family, Lovelace was educated by home tutors, <a href="https://blogs.bodleian.ox.ac.uk/adalovelace/2018/07/27/ada-lovelace-the-making-of-a-computer-scientist/">as was common for girls like her</a>. She received lessons in French and Italian, music and in suitable handicrafts such as embroidery. Less common for a girl in her time, she also studied math. Lovelace continued to work with math tutors into her adult life, and she eventually corresponded with mathematician and logician <a href="https://www.britannica.com/biography/Augustus-De-Morgan">Augustus De Morgan</a> at London University about symbolic logic. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="antique black-and-white photograph of a woman in an elaborate outfit" src="https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=750&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=750&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=750&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=942&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=942&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499374/original/file-20221206-8973-zv7gqi.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=942&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A rare photograph of Ada Lovelace.</span>
<span class="attribution"><a class="source" href="https://upload.wikimedia.org/wikipedia/commons/b/b7/Ada_Byron_daguerreotype_by_Antoine_Claudet_1843_or_1850_-_cropped.png">Daguerreotype by Antoine Claudet via Wikimedia</a></span>
</figcaption>
</figure>
<h2>Lovelace’s algorithm</h2>
<p>Lovelace drew on all of these lessons when she wrote her <a href="https://catalog.lindahall.org/discovery/delivery/01LINDAHALL_INST:LHL/12100178280005961#page=680">computer program</a> – in reality, it was a set of instructions for a mechanical calculator that had been built only in parts. </p>
<p>The computer in question was the <a href="https://www.computerhistory.org/babbage/engines/">Analytical Engine</a> designed by mathematician, philosopher and inventor <a href="https://www.britannica.com/biography/Charles-Babbage">Charles Babbage</a>. Lovelace had met Babbage when she was introduced to London society. The two related to each other over their shared love for mathematics and fascination for mechanical calculation. By the early 1840s, Babbage had won and lost government funding for a mathematical calculator, fallen out with the skilled craftsman building the precision parts for his machine, and was close to giving up on his project. At this point, Lovelace stepped in as an advocate. </p>
<p>To make Babbage’s calculator known to a British audience, Lovelace proposed to translate into English an article that described the Analytical Engine. The article was written in French by the Italian mathematician <a href="https://mathshistory.st-andrews.ac.uk/Biographies/Menabrea/">Luigi Menabrea</a> and published in a Swiss journal. Scholars believe that <a href="https://www.mhpbooks.com/books/adas-algorithm/">Babbage encouraged her to add notes of her own</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/J7ITqnEmf-g?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Ada Lovelace envisioned in the early 19th century the possibilities of computing.</span></figcaption>
</figure>
<p>In her notes, which ended up twice as long as the original article, Lovelace drew on different areas of her education. She began by describing how to code instructions onto cards with punched holes, like those used for the <a href="https://www.sciencehistory.org/distillations/the-french-connection">Jacquard weaving loom</a>, a device patented in 1804 that used punch cards to automate weaving patterns in fabric. </p>
<p>Having learned embroidery herself, Lovelace was familiar with the repetitive patterns used for handicrafts. Similarly repetitive steps were needed for mathematical calculations. To avoid duplicating cards for repetitive steps, Lovelace used <a href="https://dl.acm.org/doi/book/10.1145/28095230">loops, nested loops and conditional testing</a> in her program instructions.</p>
<p>The notes included instructions on how to calculate <a href="https://mathworld.wolfram.com/BernoulliNumber.html">Bernoulli numbers</a>, which Lovelace knew from her training to be important in the study of mathematics. Her program showed that the Analytical Engine was capable of performing original calculations that had not yet been performed manually. At the same time, Lovelace noted that the machine could only follow instructions and not “<a href="https://www.simonandschuster.com/books/The-Innovators/Walter-Isaacson/9781476708706">originate anything</a>.”</p>
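Her approach maps naturally onto modern code. As a rough modern sketch (in Python, not a transcription of her actual table of operations), the Bernoulli numbers follow a recurrence computed with exactly the kind of loop-within-a-loop structure her notes describe:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n via the standard recurrence:
    B_m = -1/(m+1) * sum_{j<m} C(m+1, j) * B_j, with B_0 = 1."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):          # outer loop over each number
        B[m] = -Fraction(1, m + 1) * sum(
            comb(m + 1, j) * B[j]      # nested loop forms each sum
            for j in range(m)
        )
    return B

print(bernoulli(8))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, ...
```

The exact-fraction arithmetic here stands in for the Engine's fixed-point columns; the point is the structure, not the notation.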
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a yellowed sheet of paper with spreadsheet-like lines" src="https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=407&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=407&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=407&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=512&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=512&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499815/original/file-20221208-7231-ctxrb1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=512&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ada Lovelace created this chart for the individual program steps to calculate Bernoulli numbers.</span>
<span class="attribution"><span class="source">Courtesy of Linda Hall Library of Science, Engineering & Technology</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Finally, Lovelace recognized that the numbers manipulated by the Analytical Engine could be seen as other types of symbols, such as musical notes. An accomplished singer and pianist, Lovelace was familiar with musical notation symbols representing aspects of musical performance such as pitch and duration, and she had manipulated logical symbols in her correspondence with De Morgan. It was not a large step for her to realize that the Analytical Engine could process symbols — not just crunch numbers — and even compose music. </p>
<h2>A well-rounded thinker</h2>
<p>Inventing computer programming was not the first time Lovelace brought her knowledge from different areas to bear on a new subject. For example, as a young girl, she was fascinated with flying machines. Bringing together biology, mechanics and poetry, she asked her mother for anatomical books to study the function of bird wings. She built and experimented with wings, and in her letters, she metaphorically expressed her longing for her mother in the <a href="https://books.google.com/books/about/Ada_the_Enchantress_of_Numbers.html?id=jCKmtAEACAAJ">language of flying</a>. </p>
<p>Despite her talents in logic and math, Lovelace <a href="https://link.springer.com/book/10.1007/978-3-030-78973-2">didn’t pursue a scientific career</a>. She was independently wealthy and never earned money from her scientific pursuits. This was common, however, at a time when freedom – including financial independence – was equated with the <a href="https://press.princeton.edu/books/paperback/9780691178165/leviathan-and-the-air-pump">capability to impartially conduct scientific experiments</a>. In addition, Lovelace devoted just over a year to her only publication, the translation of and notes on Menabrea’s paper about the Analytical Engine. Otherwise, in her life cut short by cancer at age 37, she vacillated between math, music, her mother’s demands, care for her own three children, and eventually a passion for gambling. Lovelace thus may not be an obvious role model as a female scientist for girls today.</p>
<p>However, I find Lovelace’s way of drawing on her well-rounded education to solve difficult problems inspirational. True, she lived in an age before scientific specialization. Even Babbage was a <a href="https://theconversation.com/nobel-prizes-most-often-go-to-researchers-who-defy-specialization-winners-are-creative-thinkers-who-synthesize-innovations-from-varied-fields-and-even-hobbies-186193">polymath</a> who worked in mathematical calculation and mechanical innovation. He also published a treatise on industrial manufacturing and another on religious questions of creationism. </p>
<p>But Lovelace applied knowledge from what we today think of as disparate fields in the sciences, arts and the humanities. A well-rounded thinker, she created solutions that were well ahead of her time.</p><img src="https://counter.theconversation.com/content/193930/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Corinna Schlombs does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Lovelace was a prodigious math talent who learned from the giants of her time, but her linguistic and creative abilities were also important in her invention of computer programming.Corinna Schlombs, Associate Professor of History, Rochester Institute of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1908362022-09-27T20:10:03Z2022-09-27T20:10:03Z‘Protestware’ is on the rise, with programmers self-sabotaging their own code. Should we be worried?<figure><img src="https://images.theconversation.com/files/486674/original/file-20220927-21-j7bai9.jpg?ixlib=rb-1.1.0&rect=0%2C153%2C6024%2C3589&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/KgLtFCgfC28">Alexander Sinn/Unsplash</a></span></figcaption></figure><p>In March 2022, the author of <a href="http://riaevangelist.github.io/node-ipc/">node-ipc</a>, a software library with <a href="https://www.npmjs.com/package/node-ipc">over a million weekly downloads</a>, deliberately <a href="https://techcrunch.com/2022/07/27/protestware-code-sabotage/">broke their code</a>. If the code discovers it is running within Russia or Belarus, it attempts to replace the contents of every file on the user’s computer with a heart emoji.</p>
<p>A software library is a collection of code other programmers can use for their purposes. The library node-ipc is used by <a href="https://vuejs.org/">Vue.js</a>, a framework that powers millions of websites for businesses such as Google, Facebook, and Netflix.</p>
<p>This <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-23812">critical security vulnerability</a> is just one example of a <a href="https://research.unimelb.edu.au/research-updates/the-emergence-of-political-protestware-in-the-software-ecosystem">growing trend</a> of programmers self-sabotaging their own code for political purposes. When programmers protest through their code – a phenomenon known as “protestware” – it can have consequences for the people and businesses who rely on the code they create.</p>
<h2>Different forms of protest</h2>
<p>My colleague <a href="https://raux.github.io/">Raula Gaikovina Kula</a> and I <a href="https://arxiv.org/abs/2208.01393">have identified</a> three main types of protestware.</p>
<p><strong>Malignant protestware</strong> is software that intentionally damages or takes control of a user’s device without their knowledge or consent.</p>
<p><strong>Benign protestware</strong> is software created to raise awareness about a social or political issue, but does not damage or take control of a user’s device.</p>
<p><strong>Developer sanctions</strong> are instances of programmers’ accounts being <a href="https://www.jessesquires.com/blog/2022/04/19/github-suspending-russian-accounts/">suspended</a> by the <a href="https://github.com/">internet hosting service</a> that provides them with a space to store their code and collaborate with others.</p>
<p>Modern software systems are prone to vulnerabilities because they rely on third-party libraries. These libraries consist of code, written by someone else, that performs particular functions. Using this code lets programmers add existing functions to their own software without having to “<a href="https://arxiv.org/abs/2005.12574">reinvent the wheel</a>”.</p>
<p>The use of third-party libraries <a href="https://arxiv.org/abs/2112.10165">is common</a> among programmers – it speeds up the development process and reduces costs. For example, libraries listed in the popular <a href="https://www.npmjs.com/">NPM registry</a>, which contains more than 1 million libraries, rely on an average of <a href="https://arxiv.org/abs/2205.13231">five to six</a> other libraries from the same <a href="https://link.springer.com/chapter/10.1007/978-981-13-7099-1_6">ecosystem</a>. It’s like a car manufacturer who uses parts from other manufacturers to complete their vehicles.</p>
<p>These libraries are typically maintained by one or a handful of volunteers and made available to other programmers for free under an open-source software license.</p>
<p>The success of a third-party library is based on its reputation among programmers. A library builds its reputation over time, as programmers gain trust in its capabilities and the responsiveness of its maintainers to reported defects and feature requests.</p>
<p>If third-party library weaknesses are exploited, it could give attackers access to a software system. For example, a <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=cve-2021-44228">critical security vulnerability</a> was recently discovered in the popular <a href="https://logging.apache.org/log4j/">Log4j</a> library. This flaw could allow a remote attacker to access sensitive information that was logged by applications using Log4j – such as passwords or other sensitive data.</p>
<p>What if vulnerabilities are not created by an attacker looking for passwords, but by the programmer themselves with the intention to make users of their library aware of a political opinion? The emergence of protestware is giving rise to such questions, and responses are mixed.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-log4j-a-cybersecurity-expert-explains-the-latest-internet-vulnerability-how-bad-it-is-and-whats-at-stake-173896">What is Log4j? A cybersecurity expert explains the latest internet vulnerability, how bad it is and what's at stake</a>
</strong>
</em>
</p>
<hr>
<h2>Ethical questions abound</h2>
<p>A <a href="https://blog.opensource.org/open-source-protestware-harms-open-source/">blog post</a> on the <a href="https://opensource.org/">Open Source Initiative site</a> responds to the rise of protestware, stating that “protest is an important element of free speech that should be protected”, but concludes with a warning:</p>
<blockquote>
<p>The downsides of vandalising open source projects far outweigh any possible benefit, and the blowback will ultimately damage the projects and contributors responsible.</p>
</blockquote>
<p>The main ethical question behind protestware is this: is it ethical to make something worse in order to make a point? The answer largely depends on the individual’s personal ethical beliefs.</p>
<p>Some people may see the impact of the software on its users and argue protestware is unethical if it’s designed to make life more difficult for them. Others may argue that if the software is designed to make a point or raise awareness about an issue, it may be seen as more ethically acceptable.</p>
<p>From a utilitarian perspective, one might argue that if a form of protestware is effective in bringing about a greater good (such as political change), then it can be morally justified.</p>
<p>From a technical standpoint, we are developing ways to automatically detect and counteract protestware. Protestware would be an <a href="https://arxiv.org/abs/1710.01943">unusual</a> or <a href="https://arxiv.org/abs/2204.07363">surprising</a> event in the change history of a third-party library. Mitigation is possible through redundancy – for example, falling back on similar or identical code in the same or different libraries.</p>
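As a rough illustration of the detection idea, a change that is wildly out of line with a library's history can be flagged statistically. This is a minimal sketch on hypothetical commit data, not a real detector:

```python
from statistics import median

def flag_unusual_commits(commits, threshold=10.0):
    """Flag commits whose change size is a strong outlier relative to the
    library's own history, using the median absolute deviation (MAD).

    `commits` is a list of (commit_id, lines_changed) pairs -- a stand-in
    for real change-history data."""
    sizes = [n for _, n in commits]
    med = median(sizes)
    mad = median(abs(n - med) for n in sizes) or 1  # avoid division by zero
    return [cid for cid, n in commits if abs(n - med) / mad > threshold]

# A typical history of small changes, plus one suspiciously large commit.
history = [("a1", 12), ("b2", 8), ("c3", 15), ("d4", 10), ("e5", 900)]
print(flag_unusual_commits(history))  # ['e5']
```

A real system would of course look at far richer signals than diff size, but the principle is the same: protest code tends to look nothing like the changes around it.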
<p>The rise of protestware is a symptom of a larger social problem. When people feel they are not being heard, they may resort to different measures to get their message across. In the case of programmers, they have the unique ability to protest through their code.</p>
<p>While protestware may be a new phenomenon, it is likely here to stay. We need to be aware of the ethical implications of this trend and take steps to ensure software development remains a stable and secure field.</p>
<p>We rely on software to run our businesses and our lives. But every time we use software, we’re putting our trust in the people who wrote it. The emergence of protestware threatens to destabilise this trust if we don’t take action.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-does-the-optus-data-breach-mean-for-you-and-how-can-you-protect-yourself-a-step-by-step-guide-191332">What does the Optus data breach mean for you and how can you protect yourself? A step-by-step guide</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Christoph Treude does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Major companies around the world rely on third-party code. What happens when a programmer has a political point to make?Christoph Treude, Senior Lecturer in Software Engineering, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1859572022-06-29T19:56:36Z2022-06-29T19:56:36ZSo this is how it feels when the robots come for your job: what GitHub’s Copilot ‘AI assistant’ means for coders<figure><img src="https://images.theconversation.com/files/471517/original/file-20220629-24-n3q489.jpeg?ixlib=rb-1.1.0&rect=0%2C0%2C5562%2C3705&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/female-programmer-typing-source-codes-coffee-705197296">Shutterstock</a></span></figcaption></figure><p>I love writing code to make things: apps, websites, charts, even
<a href="https://benswift.me/livecoding/">music</a>. It’s a skill I’ve worked hard at for
more than 20 years. </p>
<p>So I must confess <a href="https://github.blog/2022-06-21-github-copilot-is-generally-available-to-all-developers/">last week’s news</a>
about the release of a new “AI assistant” coding helper called <a href="https://copilot.github.com">GitHub Copilot</a> gave me complicated feelings.</p>
<p>Copilot, which spits out code to order based on “plain English” descriptions, is a remarkable tool. But is it about to put coders like me out of a job?</p>
<h2>Trained on billions of lines of human code</h2>
<p><a href="https://github.com/features/copilot/#faq-human-oversight">GitHub</a> (now <a href="https://news.microsoft.com/2018/06/04/microsoft-to-acquire-github-for-7-5-billion/">owned by Microsoft</a>) is a collaboration platform and social network for coders. You can think of it as something like a cross between Dropbox and Instagram, used by everyone from individual hobbyists through to highly paid software engineers at big tech companies. </p>
<p>Over the past decade or so, GitHub’s users have uploaded tens of billions of lines of code for more than 200 million apps. That’s a lot of <code>if</code>s and <code>for</code>s and
<code>print("hello world")</code> statements.</p>
<p>The Copilot AI works like many other machine learning tools: it was “trained” by scanning through and looking for patterns in those tens of billions of lines of code written and uploaded by members of GitHub’s coder community.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A screenshot of computer code produced by Copilot." src="https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=330&fit=crop&dpr=1 600w, https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=330&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=330&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=415&fit=crop&dpr=1 754w, https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=415&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/471504/original/file-20220629-18-u21rpx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=415&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Copilot produces code from instructions in plain English (the pale blue text).</span>
<span class="attribution"><a class="source" href="https://github.com/features/copilot/">GitHub</a></span>
</figcaption>
</figure>
<p>The training can take many months, require hundreds of millions of dollars’ worth of computing equipment, and consume enough electricity to run a house for a decade. Once it’s done, though, human coders can then write a description (in plain English) of what they want their code to do, and the Copilot AI helper will write the code for them.</p>
<p>Based on the <a href="https://openai.com/blog/openai-codex/">Codex “language model”</a>, Copilot is the next step in a long line of “intelligent auto-completion” tools. However, these have been far more limited in the past. Copilot is a significant improvement.</p>
<h2>A startlingly effective assistant</h2>
<p>I was given early “preview” access to Copilot about a year ago, and I’ve been using it on and off. It takes some practice to learn exactly how to frame your requests in English so the Copilot AI gives the most useful code output, but it can be startlingly effective.</p>
<p>However, we’re still a <em>long</em> way from “Hey Siri, make me a million-dollar iPhone app”. It’s still necessary to use my software design skills to figure out what the different bits of code should do in my app. </p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1539594049753153536"}"></div></p>
<p>To understand the level Copilot is working at, imagine writing an essay. You can’t just throw the essay question at it and expect it to produce a useful, well-argued piece. But if you figure out the argument and maybe write the topic sentence for each paragraph, it will often do a pretty good job at filling in the rest of each paragraph automatically. </p>
<p>Depending on the type of coding I’m doing, this can sometimes be a huge time- and brainpower-saver.</p>
<h2>Biases and bugs</h2>
<p>There are some open questions with these sorts of AI coding helper tools. I’m a bit worried they’ll introduce, and reinforce, winner-takes-all dynamics: very few companies have the data (in this case, the billions of lines of code) to build tools like this, so creating a competitor to Copilot will be challenging. </p>
<p>And will Copilot itself be able to suggest new and better ways to write code and build software? We have seen AI systems <a href="https://www.wired.com/2016/03/two-moves-alphago-lee-sedol-redefined-future/">innovate</a> before. On the other hand, Copilot may be limited to doing things the way we’ve always done them, as AI systems <a href="https://www.wired.com/story/ai-biased-how-scientists-trying-fix/">trained on past data</a> are prone to do.</p>
<p>My experiences with Copilot have also made me very aware my expertise is still needed, to check the “suggested” code is actually what I’m looking for. </p>
<p>Sometimes it’s trivial to see that Copilot has misunderstood my input. Those are the easy cases, and the tool makes it easy to ask for a different suggestion. </p>
<p>The trickier cases are where the code looks right, but it may contain a subtle bug. The bug might be because this AI code generation stuff is <em>hard</em>, or it might be because the billions of lines of human-written code that Copilot was trained on contained bugs of their own. </p>
<p><div data-react-class="Tweet" data-react-props="{"tweetId":"1539731632931803137"}"></div></p>
<p>Another concern is <a href="https://fossa.com/blog/analyzing-legal-implications-github-copilot/">potential issues</a> about licensing and ownership of the code Copilot was trained on. GitHub has said it is <a href="https://github.com/features/copilot/#faq-human-oversight">trying to address these issues</a>, but we will have to wait and see how it turns out.</p>
<h2>More output from the same input</h2>
<p>At times, using Copilot has made me feel a little wistful. The skill I often think makes me at least a <em>little bit</em> special (my ability to write code and make things with computers) may be in the process of being “automated away”, like many other jobs have been at different times in human history. </p>
<p>However, I’m not selling my laptop and running off to live a simple life in the bush just yet. The human coder is still a crucial part of the system – though increasingly as curator rather than creator.</p>
<p>Of course, you may be thinking “that’s what a coder <em>would</em> say” … and you may be right. </p>
<p>AI tools like Copilot, OpenAI’s <a href="https://openai.com/blog/gpt-3-apps/">text generator GPT-3</a>, and Google’s <a href="https://imagen.research.google">Imagen text-to-image engine</a>, have seen huge improvements in the past few years.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/robots-are-creating-images-and-telling-jokes-5-things-to-know-about-foundation-models-and-the-next-generation-of-ai-181150">Robots are creating images and telling jokes. 5 things to know about foundation models and the next generation of AI</a>
</strong>
</em>
</p>
<hr>
<p>Many in the white-collar “creative industries” that deal in text and images are starting to wrestle with their fears of being (at least partially) automated away. Copilot shows some of us in the tech industry are in the same boat.</p>
<p>Still, I’m (cautiously) excited. Copilot is a force multiplier in the most optimistic tool-building tradition: it provides more leverage, to increase the useful output for the same amount of input. </p>
<p>These new tools and the new leverage they provide are embedded in wider systems of people, technology and environmental actors, and I’m really fascinated to see how these systems reconfigure themselves in response. </p>
<p>In the meantime, it might help save my brain juice for the hard parts of my coding work, which can only be a good thing.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/artificial-intelligence-is-now-part-of-our-everyday-lives-and-its-growing-power-is-a-double-edged-sword-169449">Artificial intelligence is now part of our everyday lives – and its growing power is a double-edged sword</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>As mentioned in the article, I was given early beta access to the GitHub copilot AI tool.</span></em></p>A new AI tool that writes computer code on demand has programmers considering their future.Ben Swift, Educational Experiences team lead (Senior Lecturer), ANU School of Cybernetics, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1800872022-05-18T12:13:22Z2022-05-18T12:13:22ZNonprogrammers are building more of the world’s software – a computer scientist explains ‘no-code’<p>Traditional computer programming has a steep learning curve that requires learning a programming language, for example C/C++, Java or Python, just to build a simple application such as a calculator or Tic-tac-toe game. Programming also requires substantial debugging skills, which easily frustrates new learners. The study time, effort and experience needed often stop nonprogrammers from making software from scratch. </p>
<p>No-code is a way to program websites, mobile apps and games without writing code or scripts (sets of commands). People readily <a href="https://guides.lib.unc.edu/visual-literacy/learning">learn from visual cues</a>, which led to the development of “what you see is what you get” (<a href="https://www.merriam-webster.com/dictionary/WYSIWYG">WYSIWYG</a>) document and multimedia editors as early as the 1970s. WYSIWYG editors allow you to work in a document as it appears in finished form. The concept was extended to software development in the 1990s.</p>
<p>There are many no-code development platforms that allow both programmers and nonprogrammers to create software through drag-and-drop graphical user interfaces instead of traditional line-by-line coding. For example, a user can drag a label and drop it onto a website. The no-code platform will show how the label looks and create the corresponding HTML code. No-code development platforms generally offer templates or modules that allow anyone to build apps.</p>
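To make the label example concrete, the mapping a builder performs from a dropped component to generated markup can be sketched in a few lines of Python. The component names and templates here are illustrative, not any particular platform's actual output:

```python
# Each draggable component kind maps to an HTML template.
TEMPLATES = {
    "label":   "<label>{text}</label>",
    "button":  "<button>{text}</button>",
    "textbox": '<input type="text" placeholder="{text}">',
}

def render_component(kind, text):
    """Return the HTML a builder might generate for a dropped component."""
    return TEMPLATES[kind].format(text=text)

print(render_component("label", "Welcome"))  # <label>Welcome</label>
```

The user only ever sees the rendered label; the platform keeps the generated HTML behind the scenes.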
<h2>Early days</h2>
<p>In the 1990s, websites were the most familiar interface to users. However, building a website required HTML coding and script-based programming that are not easy for a person lacking programming skills. This led to the release of early no-code platforms, including Microsoft FrontPage and Adobe Dreamweaver, to help nonprogrammers build websites. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a screenshot showing computer code" src="https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/463761/original/file-20220517-12-21r5xc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Traditional programming requires learning a programming language.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/williamismael/35024677024/">WILLPOWER STUDIOS/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Following the WYSIWYG mindset, nonprogrammers could drag and drop website components such as labels, text boxes and buttons without using HTML code. In addition to editing websites locally, these tools also helped users upload the built websites to remote web servers, a key step in putting a website online. </p>
<p>However, the websites created by these editors were basic static websites. There were no advanced functions such as user authentication or database connections. </p>
<h2>Website development</h2>
<p>There are many current no-code website-building platforms such as <a href="https://bubble.io/">Bubble</a>, <a href="https://www.wix.com/">Wix</a>, <a href="https://wordpress.org/">WordPress</a> and <a href="https://workspace.google.com/products/sites/">GoogleSites</a> that overcome the shortcomings of the early no-code website builders. Bubble allows users to design the interface by defining a workflow. A workflow is a series of actions triggered by an event. For instance, when a user clicks on the save button (the event), the current game status is saved to a file (the series of actions).</p>
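The event-to-actions idea behind workflow-based builders can be sketched in a few lines. The names below are illustrative only, not Bubble's actual API:

```python
# A workflow is a series of actions triggered by an event.
workflows = {}

def add_action(event, action):
    """Attach another action to the workflow for an event."""
    workflows.setdefault(event, []).append(action)

def trigger(event, state):
    """An event fires; run its series of actions in order."""
    for action in workflows.get(event, []):
        action(state)

# E.g. clicking "save" appends the game status to a log.
log = []
add_action("save_clicked", lambda state: log.append(f"saved {state['game']}"))
trigger("save_clicked", {"game": "level-3"})
print(log)  # ['saved level-3']
```

The no-code platform hides this plumbing: the user picks the event and the actions from menus, and the platform wires them together.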
<p>Meanwhile, Wix launched an <a href="https://www.techradar.com/news/internet/web/html5-what-is-it-1047393">HTML5</a> site builder that includes a library of website templates. In addition, Wix supports modules – for example, data analysis of visitor data such as contact information, messages, purchases and bookings; booking support for hotels and vacation rentals; and a platform for independent musicians to market and sell their music. </p>
<p>WordPress was originally developed for personal blogs. It has since been extended to support forums, membership sites, <a href="https://citl.illinois.edu/citl-101/teaching-learning/resources/teaching-across-modalities/teaching-tips-articles/teaching-tips/2021/08/05/what-is-a-learning-management-system">learning management systems</a> and online stores. Like WordPress, GoogleSites lets users create websites with various embedded functions from Google, such as YouTube, Google Maps, Google Drive, calendar and online office applications.</p>
<h2>Game and mobile apps</h2>
<p>In addition to website builders, there are no-code platforms for game and mobile app development. The platforms are aimed at designers, entrepreneurs and hobbyists who don’t have game development or coding knowledge. </p>
<p><a href="https://gamemaker.io/en">GameMaker</a> provides a user interface with built-in editors for raster graphics, game level design, scripting, paths and “<a href="https://www.gamedesigning.org/learn/shaders/">shaders</a>” for representing light and shadow. GameMaker is primarily intended for making games with 2D graphics and 2D skeletal animations. </p>
<p><a href="https://signup.buildbox.com/">Buildbox</a> is a no-code 3D game development platform. The main features of Buildbox include the image drop wheel, asset bar, option bar, collision editor, scene editor, physics simulation and even monetization options. While using Buildbox, users also get access to a library of game assets, sound effects and animations. In addition, Buildbox users can create the story of the game. Then users can edit game characters and environmental settings such as weather conditions and time of day, and change the user interface. They can also animate objects, insert video ads, and export their games to different platforms such as PCs and mobile devices. </p>
<p>Games such as <a href="https://www.minecraft.net/en-us">Minecraft</a> and <a href="https://www.ea.com/games/simcity">SimCity</a> can be thought of as tools for creating virtual worlds without coding. </p>
<h2>Future of no-code</h2>
<p>No-code platforms help <a href="https://www.zdnet.com/article/low-code-and-no-code-platforms-move-beyond-the-shiny-tools-stage/">increase the number of developers</a>, in a time of <a href="https://www.infoworld.com/article/3654480/demand-for-software-developers-doubled-in-2021.html">increasing demand for software development</a>. No-code is showing up in fields such as <a href="https://hackernoon.com/top-10-platforms-to-create-an-ecommerce-mobile-app-without-coding-z58z33po">e-commerce</a>, <a href="http://bweducation.businessworld.in/article/No-Code-A-Game-Changing-Technology-To-Enable-Digital-Education/08-09-2021-403529/">education</a> and <a href="https://www.beckershospitalreview.com/no-code-tech-and-healthcare-s-new-digital-front-door-4-experts-weigh-in.html">health care</a>.</p>
<p>I expect that no-code will play a <a href="https://www.nytimes.com/2022/03/15/technology/ai-no-code.html">more prominent role in artificial intelligence</a>, as well. Training machine-learning models, the heart of AI, requires time, effort and experience. No-code programming can help reduce the time to train these models, which makes it easier to use AI for many purposes. For example, one no-code AI tool allows nonprogrammers to <a href="https://juji.io/">create chatbots</a>, something that would have been unimaginable even a few years ago.</p>
<p class="fine-print"><em><span>Tam Nguyen receives funding from National Science Foundation, Lam Research, and NVIDIA.</span></em></p>Developing software used to require programming skills. Today, a growing number of people are building websites, games and even AI programs without writing a line of code.Tam Nguyen, Assistant Professor of Computer Science, University of DaytonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1574832021-04-06T15:09:30Z2021-04-06T15:09:30ZPerfecting self-driving cars – can it be done?<figure><img src="https://images.theconversation.com/files/390949/original/file-20210322-15-1nbaibc.jpg?ixlib=rb-1.1.0&rect=0%2C11%2C3867%2C2440&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/self-driving-electronic-computer-cars-on-586488983">posteriori/Shutterstock</a></span></figcaption></figure><p>Robotic vehicles have been used in dangerous environments for decades, from decommissioning the <a href="https://www.sciencemag.org/news/2016/03/how-robots-are-becoming-critical-players-nuclear-disaster-cleanup">Fukushima nuclear power plant</a> to inspecting <a href="https://orcahub.org/innovation/focus-areas/mapping-surveying-inspection">underwater energy infrastructure</a> in the North Sea. More recently, autonomous vehicles from <a href="https://militaryembedded.com/unmanned/isr/autonomous-sea-boats-demonstrated-with-bae-systems-royal-navy">boats</a> to <a href="https://www.bbc.co.uk/news/uk-england-northamptonshire-55076342">grocery delivery carts</a> have made the gentle transition from research centres into the real world with very few hiccups.</p>
<p>Yet the promised arrival of self-driving cars has not progressed beyond the testing stage. And in one test drive of an Uber self-driving car in 2018, <a href="https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html">a pedestrian was killed</a> by the vehicle. Although these accidents happen every day when humans are behind the wheel, the public holds driverless cars to far higher safety standards, interpreting one-off accidents as proof that these vehicles are too unsafe to unleash on public roads. </p>
<figure class="align-center ">
<img alt="A small trolley-like robot with a flag on a city street." src="https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/391394/original/file-20210324-17-11grj9o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">If only it were as easy as autonomous grocery delivery robots.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/west-lafayette-circa-august-2020-starship-1806786550">Jonathan Weiss/Shutterstock</a></span>
</figcaption>
</figure>
<p>Programming the perfect self-driving car that will always make the safest decision is a huge technical task. Unlike other autonomous vehicles, which are generally rolled out in tightly controlled environments, self-driving cars must function in the endlessly unpredictable road network, rapidly processing many <a href="https://ieeexplore.ieee.org/abstract/document/5940562">complex variables</a> to remain safe. </p>
<p>Inspired by the <a href="https://www.gov.uk/browse/driving/highway-code-road-safety">highway code</a>, we’re working on a set of rules that will help self-driving cars make the safest decisions in every conceivable scenario. Verifying that these rules work is the final roadblock we must overcome to get trustworthy self-driving cars safely onto our roads.</p>
<h2>Asimov’s first law</h2>
<p>Science fiction author Isaac Asimov penned the “three laws of robotics” in 1942. The first and most important law reads: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” When self-driving cars <a href="https://www.digitaltrends.com/cool-tech/most-significant-self-driving-car-crashes/">injure humans</a>, they clearly violate this first law.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/are-self-driving-cars-safe-expert-on-how-we-will-drive-in-the-future-128644">Are self-driving cars safe? Expert on how we will drive in the future</a>
</strong>
</em>
</p>
<hr>
<p>We at the <a href="https://www.hw.ac.uk/uk/research/the-national-robotarium.htm">National Robotarium</a> are leading research intended to guarantee that <a href="https://www.hw.ac.uk/news/articles/2020/national-robotarium-unveils-3m-research.htm">self-driving vehicles</a> will always make decisions that abide by this law. Such a guarantee would provide the solution to the very serious safety concerns that are preventing self-driving cars from taking off worldwide. </p>
<p>AI software is actually quite good at learning about scenarios it has never faced. Using “<a href="https://dl.acm.org/doi/10.1145/3414080.3414081">neural networks</a>” that take their inspiration from the layout of the human brain, such software can spot patterns in data, like the movements of cars and pedestrians, and then recall these patterns in novel scenarios.</p>
<p>But we still need to prove that any safety rules taught to self-driving cars will work in these new scenarios. To do this, we can turn to <a href="https://dl.acm.org/doi/10.1145/363235.363259">formal verification</a>: the method that computer scientists use to prove that a rule works <a href="https://link.springer.com/chapter/10.1007%2F978-3-030-64437-6_4">in all circumstances</a>. </p>
<p>In mathematics, for example, rules can prove that x + y is equal to y + x without testing every possible value of x and y. Formal verification does something similar: it allows us to prove how AI software will react to different scenarios without our having to exhaustively test every scenario that could occur on public roads.</p>
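This is exactly what proof assistants make mechanical. As an illustration, the x + y = y + x example can be stated and checked in Lean once for all natural numbers, with no individual values ever tested (the lemma used below is from Lean's standard library):

```lean
-- Holds for every pair of natural numbers x and y;
-- the proof never enumerates or tests particular values.
theorem add_comm_example (x y : Nat) : x + y = y + x :=
  Nat.add_comm x y
```

Verifying neural-network behaviour works on the same principle, though the properties and proof techniques involved are far more complex.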
<p>One of the more notable recent successes in the field is the verification of an AI system that uses neural networks to avoid collisions between <a href="https://ieeexplore.ieee.org/abstract/document/9081748">autonomous aircraft</a>. Researchers have successfully formally verified that the system will always respond correctly, regardless of the horizontal and vertical manoeuvres of the aircraft involved.</p>
<h2>Highway coding</h2>
<p>Human drivers follow a <a href="https://www.gov.uk/guidance/the-highway-code">highway code</a> to keep all road users safe, which relies on the human brain to learn these rules and apply them sensibly in innumerable real-world scenarios. We can teach self-driving cars the highway code too. That requires us to unpick each rule in the code, teach vehicles’ neural networks to understand how to obey each rule, and then <a href="https://link.springer.com/chapter/10.1007/978-3-319-63387-9_5">verify that</a> they can be relied upon to safely obey these rules in all circumstances.</p>
<p>However, the challenge of verifying that these rules will be safely followed is complicated when examining the consequences of the phrase “must never” in the highway code. To make a self-driving car as reactive as a human driver in any given scenario, we must program these policies in such a way that accounts for nuance, weighted risk and the occasional scenario where different rules are in direct conflict, requiring the car to ignore one or more of them. </p>
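One way to picture rule conflicts is a prioritised rule set, where a higher-priority rule overrides a lower one. The Python sketch below is a deliberately simplified illustration – the rule names, priorities and state fields are invented for this example, not drawn from any real driving policy:

```python
# A toy rule engine: each rule has a priority (lower number =
# higher priority). When rules conflict, the highest-priority
# rule that applies wins, and the others are ignored.
rules = [
    # (priority, condition, action)
    (1, lambda s: s["obstacle_ahead"], "brake"),
    (2, lambda s: s["crossing_line_avoids_collision"], "cross_line"),
    (3, lambda s: True, "keep_lane"),  # default behaviour
]

def decide(state):
    # Evaluate rules in priority order; the first that fires wins,
    # even if that means breaking a lower-priority rule.
    for _, condition, action in sorted(rules, key=lambda r: r[0]):
        if condition(state):
            return action

state = {"obstacle_ahead": True, "crossing_line_avoids_collision": True}
print(decide(state))  # braking outranks crossing the line
```

Verifying a real policy means proving properties of decision logic like this – only with far more rules, continuous sensor inputs and weighted risk rather than simple booleans.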
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/ixIoDYVfKA0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Robot ethicist Patrick Lin introducing the complexity of automated decision-making in self-driving cars.</span></figcaption>
</figure>
<p>Such a task cannot be left solely to programmers – it’ll require input from lawyers, security experts, system engineers and policymakers. Within our newly formed <a href="https://www.macs.hw.ac.uk/aisec/index.php/goals">AISEC project</a>, a team of researchers is designing a tool to facilitate the kind of interdisciplinary collaboration needed to create ethical and legal standards for self-driving cars.</p>
<p>Teaching self-driving cars to be perfect will be a dynamic process: dependent upon how legal, cultural and technological experts define perfection over time. The AISEC tool is being built with this in mind, offering a “mission control panel” to monitor, supplement and adapt the most successful rules governing self-driving cars, which will then be made available to the industry. </p>
<p>We’re hoping to deliver the first experimental prototype of the AISEC tool by 2024. But we still need to create <a href="https://dl.acm.org/doi/10.1145/3414080.3414081">adaptive verification methods</a> to address remaining safety and security concerns, and these will likely take years to build and embed into self-driving cars.</p>
<p>Accidents involving self-driving cars always create headlines. A self-driving car that recognises a pedestrian and stops before hitting them 99% of the time is a cause for celebration in research labs, but a killing machine in the real world. By creating robust, verifiable safety rules for self-driving cars, we’re attempting to make that 1% of accidents a thing of the past.</p>
<p class="fine-print"><em><span><a href="mailto:e.komendantskaya@hw.ac.uk">e.komendantskaya@hw.ac.uk</a> receives funding from EPSRC, NCSC, DSTL. </span></em></p><p class="fine-print"><em><span>Luca Arnaboldi and Matthew Daggitt do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The public holds self-driving cars to incredibly high safety standards – and we’re working to meet them.Ekaterina Komendantskaya, Professor, School of Mathematical and Computer Sciences, Heriot-Watt UniversityLuca Arnaboldi, Research Associate, School of Informatics, The University of EdinburghMatthew Daggitt, Research Associate, School of Mathematical and Computer Sciences, Heriot-Watt UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1279372019-12-08T07:14:52Z2019-12-08T07:14:52ZWhy all children must learn code<figure><img src="https://images.theconversation.com/files/304447/original/file-20191129-95226-21ys3b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Coding can enhance children’s creativity and their understanding of mathematics</span> <span class="attribution"><span class="source">wavebreakmedia/Shutterstock</span></span></figcaption></figure><p>Across the world, the conversion of information into a digital format – also called “digitalisation” – <a href="https://www.idrc.ca/sites/default/files/sp/Images/idl-57429_2.pdf">has increased</a> productivity in the public and private sectors. As a result, virtually every country in the world is working towards a digital economy. </p>
<p>As this new economy evolves, special skills like computer programming are needed. Programming is done in special languages, known as code, which <a href="https://www.edx.org/learn/computer-programming">allow people</a> to write instructions that are executed by computers. The goal is to create something: from a web page, to an image, to a piece of software. </p>
<p>Early coding languages emerged in the 1940s. These were basic in what they could do but complex to learn, requiring an advanced understanding of maths. By the 1990s – when universities, businesses and people started to connect over the internet – computing speed and memory had improved enough to support high-level coding languages. These became widely available on open source platforms, and online tutorials made it possible for many people to learn them and keep refining the languages so that they became simpler. Today languages like <a href="https://www.w3schools.com/js/">JavaScript</a> can easily be learnt by children. </p>
<p>Nobody can escape the touch of digital technologies. They are used in fields as diverse as hospital equipment, remote education delivery, the marketing of creative art and agricultural productivity. Code produces the software that can effectively deal with problems and challenges – for instance, because of coding, people who couldn’t get a bank account <a href="https://www.safaricom.co.ke/personal/m-pesa/do-more-with-m-pesa/m-pesa-and-your-bank">can now</a> keep, send and borrow money using mobile phones. It’s an important skill to have as countries develop. </p>
<p>In the past four decades, several studies have assessed the effect of learning code on primary school children – usually between the ages of six and 13. In each case, the findings <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.359.8341&rep=rep1&type=pdf">show</a> that it is beneficial to children, irrespective of their career path later on in life.</p>
<h2>Computer language</h2>
<p>Coding is just another language, and children <a href="https://www.foreignpolicyjournal.com/2010/11/17/children-learning-languages-faster-than-adults-the-argument-continues-in-vietnam/">are known</a> to learn new languages faster than older people. So starting young is a good idea. </p>
<p><a href="https://www.nytimes.com/2014/03/24/world/europe/adding-coding-to-the-curriculum.html">Several countries</a> – including Australia, Finland, Italy and England – have developed coding curricula for children between the ages of five and 16 years. </p>
<p>Coding ultimately works with zeros (0s) and ones (1s): strings of these numbers represent an alphabet. These translate into words and sentences, which trigger the computer or processor to initiate specific tasks – for example, print an image on a screen, open a document saved on the computer or play some music. </p>
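The mapping from characters to strings of zeros and ones can be seen in a couple of lines of Python, using the standard character codes built into the language:

```python
# Each character has a numeric code, which the computer stores
# as a string of zeros and ones (shown here as 8 bits).
for char in "Hi":
    code = ord(char)            # e.g. 'H' is 72
    bits = format(code, "08b")  # 72 becomes '01001000'
    print(char, code, bits)
```

Strings of these bit patterns are what the processor actually reads when it prints text, opens a file or plays music.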
<p>There are various coding languages. Some are so easy to understand and work with that even children can learn them. Visual programming languages – <a href="https://scratch.mit.edu/">like Scratch</a> – have been developed to help children learn code using images, signs and diagrams. Other programming languages that children can use include <a href="https://www.python.org">Python</a>, <a href="https://www.ruby-lang.org/en/">Ruby</a> and <a href="https://golang.org">Go</a>.</p>
<p>Most of these languages can be used to write a series of commands or to develop web applications. </p>
<h2>Benefits of coding</h2>
<p>Aside from giving them a head start for the future of work, learning code can enhance children’s creativity more than other numeric sciences can. </p>
<p>For instance, much of the teaching of math in Africa is <a href="http://aadcice.hiroshima-u.ac.jp/e/publications/sosho4_1-01_02.pdf">still done</a> through rote learning, a pedagogical method that is outdated and <a href="https://www.theguardian.com/teacher-network/2013/oct/15/play-creative-learning-roundtable">discourages creativity</a> in children. Rote learning is based on memorisation of information and repetition – “parroting”, so to speak. Research <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.911.2397&rep=rep1&type=pdf">shows</a> that rote learning isn’t effective because the learner rarely gets to understand the application of what they have learnt. </p>
<p>By comparison, coding <a href="https://www.researchgate.net/publication/282868061_Programming_Games_for_Logical_Thinking">builds logical thinking</a> as it requires a focus on solving a specific challenge. This teaches children to evaluate situations from different angles and come up with creative solutions. They also get to test these ideas and, if they don’t work, figure out what went wrong. </p>
<p><a href="https://pdfs.semanticscholar.org/671a/bf589392f7505255ede2a092804113a837a0.pdf">Some studies</a> have further suggested that coding enhances collaboration and communication, <a href="http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf">essential skills</a> for future jobs. </p>
<h2>Access to coding</h2>
<p>Broadband and digital devices – such as computers and smart phones – are key tools for learning how to code. Access and affordability are essential. Governments must invest in broadband so that large quantities of data can be transmitted at high speeds. They should also provide subsidies, or at least not tax information and communications technology (ICT) tools, so that more children can learn coding at home or at school. </p>
<p>Many African countries, like Kenya, Ethiopia, Ghana and Rwanda, have taken steps to reform the ICT sector and expand broadband capacity. </p>
<p>In Kenya the government is aware of the need for ICT education and <a href="https://en.unesco.org/news/ict-integration-education-kenya-roll-out-digital-literacy-programme">has started</a> to integrate ICT in the curriculum.
The government also <a href="http://www.ict.go.ke/digital-literacy-programmedlp/">rolled out</a> an ambitious <a href="http://pubdocs.worldbank.org/en/967221540488971590/Kennedy-Ogola-Entry-Digital-Literacy-Kenya.pdf">Digital Literacy Program</a> which would bring broadband to schools and try to integrate technology into learning. </p>
<p>But it may take some time before there are enough resources to extend the programme to every school in the country. In many African countries, even more developed ones like Kenya, there are still basic challenges to address – for instance, <a href="https://www.ictworks.org/12-challenges-facing-computer-education-kenyan-schools/#.XeTxwC2B00o">a lack of</a> infrastructure such as electricity, as well as resources, computers and teachers who know how to use the technology.</p>
<p>Fortunately there are informal ways in which children can learn to code. These include boot camps, <a href="http://www.imperial.ac.uk/computing/outreach/codelab/">codelabs</a>, holiday coding camps and after school coding groups. In Kenya, independent <a href="https://digikids.co.ke/">modular</a> coding programmes exist for children. There are also many free online learning tools that children can use, such as the Massachusetts Institute of Technology’s <a href="https://ocw.mit.edu/index.htm">OpenCourseWare</a> and <a href="https://www.codecademy.com/">Codecademy</a>. </p>
<p>Coding is no longer the preserve of computer scientists. Every profession in some way needs it. Like other subjects, it is always better introduced at an early age.</p>
<p class="fine-print"><em><span>Bitange Ndemo receives funding from the World Bank to conduct a review of the Digital Learning Program in Kenya.</span></em></p>Coding is beneficial to children, irrespective of their career path later on in life.Bitange Ndemo, Professor, University of NairobiLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/885712017-12-10T19:11:20Z2017-12-10T19:11:20ZFrom robots to board games, it’s easy to do science this Christmas<figure><img src="https://images.theconversation.com/files/198239/original/file-20171208-11282-nwp5xi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Beebots are robots that kids can easily program, with direct feedback seen in where the robot goes. </span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/arselectronica/16605101000/in/album-72157651296329862/">arselectronica/flickr </a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>We all want to spoil the children in our lives at Christmas time. Some of us like to sneak in a bit of learning too. </p>
<p>From an educational perspective, toys are an excellent way to engage all ages in STEM (<a href="https://theconversation.com/stem-education-in-primary-schools-will-fall-flat-unless-serious-issues-are-addressed-88017">science, technology, engineering and mathematics</a>). </p>
<p>Here we’ve put together some tips for those of you looking for yuletide shopping inspiration beyond just the typical array of toys marked “science and technology” on the shelf. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-were-building-a-climate-change-game-for-12-year-olds-85983">Why we're building a climate change game for 12-year-olds</a>
</strong>
</em>
</p>
<hr>
<p>But first, a quick wrap of <a href="http://www.educationcouncil.edu.au/site/DefaultSite/filesystem/documents/National%20STEM%20School%20Education%20Strategy.pdf">key terms</a>. </p>
<p>Science is about exploring the nature of things, and involves skills such as predicting (hypothesising), observing, collecting data, fair testing, explaining and communicating.</p>
<p>Technology is a process that builds over time, and involves meeting a need or solving a problem. It includes both design technologies and digital technologies.</p>
<p>Engineering, very simply put, involves how systems can be put together to produce the desired outcome. For young children this is almost the same as technological skills.</p>
<p>Mathematics should go beyond just measuring and counting, and incorporate problem-solving skills such as those involved in coding.</p>
<p>Under the STEM umbrella, it’s also important to consider what are termed “<a href="http://www.oecd.org/site/educeri21st/40756908.pdf">21st-century skills</a>”, such as critical thinking, problem-solving, creativity, innovation, communication and collaboration. </p>
<h2>I want a robot</h2>
<p>Many of us have access to smart phones and portable computing, and <a href="https://theconversation.com/why-digital-apps-can-be-good-gifts-for-young-family-members-85893">apps</a> – including those that feature augmented reality such as <a href="https://theconversation.com/gaming-in-the-classroom-what-we-can-learn-from-pokemon-go-technology-63766">Pokémon GO</a> – are incredibly popular and accessible. </p>
<p>But moving beyond just regular use of digital technology, coding and robotic products are available that teach children how coding works, where it can be applied, and what its limitations are. These include programmable toys that can be operated by simply pressing buttons or sequencing physical tokens to produce movement. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vdIla-6A6jA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">You have to tell a Sphero where you want it to go.</span></figcaption>
</figure>
<p>Devices such as <a href="https://core-electronics.com.au/brands/bee-bot-australia">Beebots</a> and <a href="https://www.primotoys.com/">Cubettos</a> allow children to use their imaginations to create scenarios – for example, tunnels, roads or bridges – that their robot can negotiate. Kids learn planning, algorithmic thinking, and mathematical reasoning. </p>
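The planning these toys teach can be mimicked in a few lines of Python. The toy grid robot below – its commands and grid are invented for illustration, not any vendor's actual interface – executes a planned sequence of moves and turns, much like pressing a Beebot's buttons:

```python
# A toy robot on a grid: it faces one of four directions and
# obeys a planned sequence of commands, Beebot-style.
DIRECTIONS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W

def run(commands, x=0, y=0, facing=0):
    for cmd in commands:
        if cmd == "forward":
            dx, dy = DIRECTIONS[facing]
            x, y = x + dx, y + dy
        elif cmd == "right":
            facing = (facing + 1) % 4
        elif cmd == "left":
            facing = (facing - 1) % 4
    return x, y

# Plan a route: two steps north, turn right, one step east.
print(run(["forward", "forward", "right", "forward"]))  # (1, 2)
```

Working out such a command sequence in advance – and debugging it when the robot ends up somewhere unexpected – is exactly the algorithmic thinking the toys are designed to exercise.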
<p>More sophisticated robots such as <a href="https://www.sphero.com">Spheros</a> or <a href="https://meetedison.com">Edisons</a> connect toys to tablets or computers. Children must negotiate the constraints and opportunities of the real world – for example, slopes, different surfaces, and wind – and test their code under different situations. These toys encourage children to think creatively when coming up with their solutions.</p>
<p>In the future we will see more toys building in augmented reality, with many companies now investing in the interactivity between physical and virtual worlds. The goal is to build products that allow children to use physical objects and real-world locations (via GPS) in conjunction with computational devices, not only on their own, but also in competitive and collaborative environments. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/sxUJKn6TJOI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Kids are capable of applying computational thinking to solve problems.</span></figcaption>
</figure>
<h2>Old-fashioned fun</h2>
<p>Even with the rise of digital technology, <a href="https://www.theguardian.com/technology/2016/sep/25/board-games-back-tabletop-gaming-boom-pandemic-flash-point">board and card games</a> and <a href="https://theconversation.com/blocks-are-still-the-best-present-you-can-buy-children-for-christmas-87171">building toys</a> are still very popular. Chosen carefully, these too allow players to learn about important aspects of science. </p>
<p>Popular collaborative board games such as Pandemic <a href="http://www.asset-scienceinsociety.eu/news/features/pandemic-legacy-what-game-can-teach-us-about-pandemics">are praised</a> for their accurate depiction of how health workers respond to disease outbreaks. Children see the real impact of disease, how it can spread across the world, and the role of science in bringing outbreaks to a resolution. Children also learn <a href="https://digitalcommons.usu.edu/itls_facpub/138/">collaborative problem-solving skills</a>, computational thinking, and the benefits of <a href="https://pdfs.semanticscholar.org/7173/50cdc56af6d5ddad9f351c4462c6529bdcce.pdf">planning and sequencing</a>.</p>
<p>Playing board games can also improve science literacy. Card games like <a href="http://organattack.com">Organ Attack!</a> give children the opportunity to learn about real medical conditions and the organs they affect – for example, <a href="https://www.healthline.com/health/hepatosplenomegaly">hepatosplenomegaly</a>, a condition in which both the spleen and the liver become enlarged. Amusing drawings – the game is based on the comic series <a href="http://theawkwardyeti.com">The Awkward Yeti</a> – depict the organs in an engaging and entertaining way, which adds to children’s understanding of their own body parts. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/198237/original/file-20171207-11303-z0vtc3.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The object of Organ Attack is to remove your opponents’ organs before they remove yours.</span>
<span class="attribution"><a class="source" href="https://www.kickstarter.com/projects/theawkwardyeti/organattack-a-card-game-by-the-awkward-yeti">Organ Attack</a></span>
</figcaption>
</figure>
<p>Construction sets are also useful for developing a range of STEM skills. </p>
<p>At a basic level, <a href="https://theconversation.com/blocks-are-still-the-best-present-you-can-buy-children-for-christmas-87171">simple wooden blocks</a> have been shown to bring many benefits to children’s development, including spatial reasoning and language. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/blocks-are-still-the-best-present-you-can-buy-children-for-christmas-87171">Blocks are still the best present you can buy children for Christmas</a>
</strong>
</em>
</p>
<hr>
<p>More <a href="https://shop.lego.com/en-AU/LEGO-MINDSTORMS-EV3-31313">complex building sets</a> can involve digital aspects such as designing, making and programming robotic toys. </p>
<p>Consider toys that give children the scope to go beyond simply putting the pieces together according to the instructions, and that encourage them to use their imaginations to turn the pieces into something unexpected. </p>
<h2>Tips to get it right</h2>
<p>It is worth noting that when children engage with toys and games with a STEM focus, they will not necessarily be aware of the knowledge and skills involved. <a href="http://onlinelibrary.wiley.com/doi/10.1002/sce.1035/abstract">Parents can support</a> their children’s scientific thinking, elaborate on scientific information, and help them structure meaning from their observations, using the following tips: </p>
<ul>
<li>question children about their ideas</li>
<li>gently highlight inconsistencies in their thinking that contradict the evidence in front of them</li>
<li>encourage them not to focus on only one piece of evidence at the cost of other relevant information.</li>
</ul>
<p>It can be difficult to get the balance right between digital technologies that involve individual use (and can isolate children) and those that focus on collaboration and conversation. To address this concern:</p>
<ul>
<li>look at how you may tap into the skills and knowledge that are learned</li>
<li>focus on apps that encourage multiple players, turn taking and collaboration.</li>
</ul>
<p>Toys and games that involve friends and family members are more than just fun: they can foster new skills, challenge children to work in a team and encourage thinking and idea development.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>George Aranda, Lecturer in Science Education, Deakin UniversityWendy Jobling, Lecturer, Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/676232017-02-21T01:19:18Z2017-02-21T01:19:18ZBuilding privacy right into software code<figure><img src="https://images.theconversation.com/files/157370/original/image-20170217-10200-1yp574g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Putting privacy right in the code.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/red-digital-keyhole-concept-cyber-security-518678134">Keyhole image via shutterstock.com</a></span></figcaption></figure><p>When I was 15, my parents did not allow me to use AOL Instant Messenger. All of my friends used it, so I had to find a way around this rule. I would be found out if I installed the software on my computer, so I used the <a href="https://www.lifewire.com/how-to-sign-in-aim-express-1949378">web browser version</a> instead. Savvy enough to delete my internet history every time, I thought my chatting was secret.</p>
<p>Then one day my mother confronted me with all the times I had gone on Instant Messenger in the past week. Whenever I visited the site, it had left a trail of cookies behind. Intended to make my user experience more convenient, <a href="http://computer.howstuffworks.com/internet/basics/question82.htm">cookies saved my login information for repeat visits</a>. Unfortunately, the cookies made my life less convenient: My mother knew how to inspect them to determine when I had been illicitly instant messaging.</p>
<p>Since then, I have been very interested in protecting user privacy. I studied computer science in college and ended up pursuing a career in the field. I became fascinated with programming languages, the construction materials for the information age. <a href="https://www.technologyreview.com/s/536356/toolkits-for-the-mind/">Languages shape how programmers think about software, and how they construct it</a>, by making certain tasks easier and others harder. For instance, some languages allow rapid website prototyping, but don’t handle large amounts of traffic very well.</p>
<p>Regarding my main interest, I discovered that many of today’s most common languages make it difficult for programmers to protect users’ privacy and security. It’s bad enough that this state of affairs means programmers have lots of opportunities to make privacy-violating errors. Even worse, it means we users have trouble understanding what computer programs are doing with our information – even as we increasingly rely on them in our daily lives.</p>
<h2>A history of insecurity</h2>
<p>As part of the first generation who <a href="https://doi.org/10.1177/1461444806059871">came of age on the internet</a>, I enjoyed the benefits of participating in digital life, like instant messaging my friends when I was supposed to be doing homework. I also knew there was the potential for unintended information leaks.</p>
<p>A then-crush once told me that he took advantage of a fleeting Facebook opportunity to discover that I was among his top five stalkers. For a brief period of time, when a user <a href="http://gawker.com/390004/whos-stalking-you-on-facebook">typed “.” into the search bar</a>, the autocompleted searches were the users who most searched for them. I was mortified, and avoided even casual browsing on Facebook for a while.</p>
<p>This small social crisis was the result of a programming problem, a combination of both human programmer error and a shortcoming of the language and environment in which that human worked. And we can’t blame the programmer, because the languages Facebook uses were not built with modern security and privacy in mind. They need the programmer to manage everything by hand.</p>
<h2>Spreading protections across the program</h2>
<p>As those older languages developed into today’s programming environments, security and privacy remained as add-ons, rather than built-in automatic functions. Though programmers try to keep instructions for different functions separate, code dedicated to enforcing privacy and security concerns gets mixed in with other code, and spread all throughout the software.</p>
<p>The decentralized nature of information leaks is what allowed my mother to catch me messaging. The web browser I used stored evidence of my secret chatting in more than one place – in both the history of what sites I visited and in the cookie trail I left behind. Clearing only one of them left me vulnerable to my mother’s scrutiny.</p>
<p>If the program had been built in such a way that all evidence of my activity was handled together, it could have known that when I deleted the history, I wanted the cookies deleted too. But it wasn’t, it didn’t and I got caught.</p>
<h2>Making programmers do the work</h2>
<p>The problem gets even more difficult in modern online systems. Consider what happens when I share my location – let’s say Disney World – on Facebook with friends who are nearby. On Facebook, this location will be displayed on my “timeline.” But it will also be used for other purposes: Visitors to Disney World’s Facebook page can see <a href="http://www.techlicious.com/tip/complete-guide-to-facebook-privacy-settings/">which of their friends have also been to the amusement park</a>. I can tell Facebook to limit who can see that information about me, so people I don’t know can’t go to Disney World’s page and see “Jean Yang checked in 1 hour ago.” </p>
<p>It is the programmer’s job to enforce these privacy restrictions. Because privacy-related code is scattered throughout all the programs Facebook uses to run its systems, the programmer must be vigilant everywhere. To make sure nobody finds out where I am unless I want them to, the programmer must tell the system to check my privacy settings everywhere it uses my location value, directly or indirectly. </p>
<p>Every time a programmer writes instructions to refer to my location – when displaying my profile, the Disney World page, the results of queries such as “friends at Disney World” and countless other places – she has to remember to include instructions to check my privacy settings and act accordingly.</p>
<p>This results in a tangle of code connecting the rules and their implementation. It is easy for programmers to make mistakes, and difficult for anybody else to check that the code is doing what it’s supposed to do.</p>
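<p>The error-prone pattern described above might look something like the following sketch. All names and helpers here are hypothetical, invented for illustration rather than taken from Facebook’s actual code: the point is that the same privacy check must be repeated by hand at every site that touches the location value.</p>

```python
# Hypothetical sketch of scattered, hand-written privacy checks.
# Every function that uses a location must remember to call the
# check itself -- forget one call and the page leaks locations.

def can_see_location(viewer, owner, settings):
    # settings maps each user to the set of viewers allowed
    # to see their location
    return viewer in settings.get(owner, set())

settings = {"jean": {"alice"}}  # only alice may see jean's location

def render_profile(viewer, owner, location, settings):
    if can_see_location(viewer, owner, settings):   # check #1
        return f"{owner} is at {location}"
    return f"{owner}'s location is hidden"

def friends_at_park(viewer, checkins, settings):
    # check #2 -- omit it here and the park page leaks locations
    return [owner for owner, place in checkins
            if place == "Disney World"
            and can_see_location(viewer, owner, settings)]

print(render_profile("alice", "jean", "Disney World", settings))
print(friends_at_park("bob", [("jean", "Disney World")], settings))
```

<p>Nothing ties the two checks together: each new page that displays locations adds another place where the programmer can forget the check.</p>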
<h2>Shifting the burden to computers</h2>
<p>The best way to avoid these problems is to take the task of privacy protection away from humans and entrust it to the computers themselves. We can – and should – develop programming models that allow us to more easily incorporate security and privacy into software. <a href="https://www.cs.cornell.edu/andru/papers/jsac/sm-jsac03.pdf">Prior research in what is called “language-based information flow”</a> looks at how to automatically check programs to ensure that sloppy programming is not inadvertently violating privacy or other data-protection rules.</p>
<p>Even with tools that can check programs, however, the programmer needs to do the heavy lifting of writing programs that do not leak information. This still involves writing those labor-intensive and error-prone privacy checks throughout the program. My work on a new programming model called “<a href="http://projects.csail.mit.edu/jeeves/">policy-agnostic programming</a>” goes one step further, making sloppy programming impossible. In these systems, programmers attach security and privacy restrictions directly to every data value.</p>
<p>For instance, they could label location as information requiring protection. The program itself would understand that my “Disney World” location should be shown only to my close friends. Those friends could see it not only on my own page, but on Disney World’s page.</p>
<p>But people I don’t know would be shown a less specific value in both places. Perhaps friends of my friends might see “away from home,” and total strangers could only learn that I was “in the United States.” Looking at my page, they wouldn’t be able to tell exactly where I am. And if they went to the Disney World page, I wouldn’t appear there either.</p>
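<p>A minimal sketch of this idea, assuming nothing about the actual policy-agnostic programming implementation: the policy travels with the data value itself (a “faceted” value), so every place the program uses the value, the runtime resolves it against the viewer. The class and function names below are illustrative, not a real API.</p>

```python
# Illustrative sketch of a value that carries its own privacy policy.
# The programmer never writes per-site checks; the view() call
# resolves the value for whoever is looking at it.

class Faceted:
    def __init__(self, facets, policy):
        self.facets = facets   # label -> value shown at that level
        self.policy = policy   # viewer -> label

    def view(self, viewer):
        return self.facets[self.policy(viewer)]

def location_policy(viewer):
    if viewer == "close_friend":
        return "precise"
    if viewer == "friend_of_friend":
        return "coarse"
    return "public"

location = Faceted(
    {"precise": "Disney World",
     "coarse": "away from home",
     "public": "in the United States"},
    location_policy,
)

# The same value, used anywhere -- profile page, park page, search
# results -- shows each viewer only what the policy allows:
print(location.view("close_friend"))      # Disney World
print(location.view("friend_of_friend"))  # away from home
print(location.view("stranger"))          # in the United States
```

<p>Because the policy is attached once, at the value, a new page that displays locations cannot forget to enforce it.</p>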
<p>With this type of structure, humans no longer need to write code to repeatedly check which information should be shared; the computer system handles that automatically. That means one less thing for programmers to think about. It also helps users feel more confident that neither some element of a complicated piece of software nor a simple human error will violate their personal privacy settings.</p>
<p>With software programs handling our driving, shopping and even <a href="http://www.lifehacker.co.uk/2015/02/16/tinderbox-bot-intelligently-swipes-tinder-matches-can-even-start-conversation">choosing potential dates</a>, we have much bigger problems than our mothers seeing our internet cookies. If our computers can protect our privacy, that would be a huge improvement to our rapidly changing world.</p>
<p class="fine-print"><em><span>Jean Yang is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. Jean receives funding from DARPA and the National Science Foundation. </span></em></p>Most of today’s computer languages make it hard for programmers to protect users’ privacy and security. The fix is to take those tasks out of human hands entirely.Jean Yang, Assistant Professor of Computer Science, Carnegie Mellon UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/717272017-01-23T18:02:27Z2017-01-23T18:02:27ZMobile phones offer a new way for Africa’s students to learn programming<figure><img src="https://images.theconversation.com/files/153854/original/image-20170123-8067-im6k7g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Students could learn how to program with the right applications on their mobile phones.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>It’s not easy for Computer Science students at most universities in Africa to practice and develop their programming skills. They have the ability to program, but access to desktop or laptop computers might be a problem. I experienced this first-hand while teaching programming at a Kenyan university.</p>
<p>Most African universities have public computer laboratories, but these tend to be used to teach various classes, hence limiting students’ access. Many institutions may also have very few computers for a large number of students. This means that students might need to access computers outside the classroom in order to practise programming. Yet, most people in developing countries <a href="http://www.pewglobal.org/2015/03/19/internet-seen-as-positive-influence-on-education-but-negative-influence-on-morality-in-emerging-and-developing-nations/technology-report-15/">do not</a> own computers at home.</p>
<p>Limited access to PCs aggravates the learning difficulties faced by programming students. This is especially true because programming is best learnt through practice. However, most students own mobile phones. Cell phones are the most <a href="http://www.pewglobal.org/2015/04/15/cell-phones-in-africa-communication-lifeline/">widely used</a> devices among students in developing countries – and, indeed, among Africans more generally. </p>
<p>I therefore set out to develop a solution that would enable students to learn programming using mobile phones. The biggest challenge was turning mobile phones into functional programming environments. After all, they aren’t designed with programming in mind. They have small screens and small keypads that impede their use as programming platforms.</p>
<p>So I designed what I called scaffolding – or supporting – techniques that allow for the effective construction of programs on mobile phones using the Java language. These techniques can also address new learners’ needs. <a href="https://open.uct.ac.za/handle/11427/16609">The results</a>, taken from my work with 182 students at four universities in South Africa and Kenya, are encouraging.</p>
<h2>Techniques for mobile phones</h2>
<p>The scaffolding techniques I designed can be used on Android platforms. They are specifically aimed at students learning <a href="https://docs.oracle.com/javase/tutorial/java/concepts/">Object Oriented Programming</a> using Java.</p>
<p>The technology works by offering three types of scaffolding techniques:</p>
<ol>
<li><p>Automatic scaffolding: supporting techniques automatically presented on the interface. These include instructions on which buttons to press, error prompts and suggestions to view an example while working on a program. These scaffolding techniques fade away as the student gets more familiar with the application.</p></li>
<li><p>Static scaffolding: supporting techniques that never fade away. I included two such techniques. One presents the layout of a Java program on the main interface, so the student always has a visual representation before interacting with the program. This technique is said particularly to <a href="http://web.media.mit.edu/%7Eedith/publications/1996-persp.taking.pdf">support</a> a new student’s learning. The second breaks the program into smaller parts and creates it one part at a time – an effective way to support the creation of a program on small-screen devices like mobile phones.</p></li>
<li><p>User-initiated scaffolding: supporting techniques that a student can activate, such as hints, examples and tutorials.</p></li>
</ol>
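<p>The “fading” behaviour of the automatic scaffolding can be sketched as follows. This is an illustrative model with invented names, not the study’s actual Android code: hints are shown to new learners and withdrawn once the student has completed enough programs.</p>

```python
# Illustrative sketch of fading automatic scaffolding: support is
# presented to beginners and removed as familiarity grows.

class AutomaticScaffold:
    def __init__(self, fade_after=3):
        self.fade_after = fade_after  # completions before hints fade
        self.completed = 0

    def hint(self, step):
        """Return a hint for a new learner, or None once faded."""
        if self.completed >= self.fade_after:
            return None
        return f"Hint: {step}"

    def record_completion(self):
        self.completed += 1

scaffold = AutomaticScaffold(fade_after=2)
print(scaffold.hint("Press 'New class' to start a program"))
scaffold.record_completion()
scaffold.record_completion()
print(scaffold.hint("Press 'New class' to start a program"))  # now None
```

<p>Static scaffolding, by contrast, would never reach the faded state, and user-initiated scaffolding would only return a hint when the student asks for one.</p>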
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=708&fit=crop&dpr=1 600w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=708&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=708&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=890&fit=crop&dpr=1 754w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=890&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/153853/original/image-20170123-8082-1wzg9c.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=890&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A student puts the scaffolding for mobile phones to the test.</span>
<span class="attribution"><span class="source">Dr Chao Mbogo</span></span>
</figcaption>
</figure>
<p>I tested these techniques on the students while they constructed Java programs on mobile phones. Their feedback was largely positive, and suggested that scaffolding techniques specifically designed for mobile phones, and based on students’ needs, could support the learning of programming on these devices. </p>
<h2>Findings and challenges</h2>
<p>Desktop programming environments are complex interfaces. Large screens make it possible for students to be exposed to large amounts of information in one sitting. Large screens also mean that students can be given support, in one place, without having to leave the interface. Providing all this functionality and support in one interface doesn’t work well on small screens.</p>
<p>But my research suggests that small screens have some advantages. Students told me that the simpler interface on a small screen helped them to focus on the task at hand. When they had to create a program one step at a time, they didn’t have to grasp a huge amount of information all at once. This may assist their learning in the long run. </p>
<p>Certainly, the study wasn’t perfect. The scaffolding I developed was only for Android platforms, which excludes users from other platforms such as Windows and iOS. And while mobile phones are far more common among students than private desktop or laptop computers, there are some students who do not have and cannot afford even these devices. </p>
<p>My research is not over yet. My next steps will take these problems into account. For example, the techniques I designed will be tested on other programming languages – such as C++ – and on other mobile platforms. I am also keen to investigate the design of such scaffolding for tablets which are becoming more common among African university students.</p>
<h2>Next steps</h2>
<p>The study I’ve described here relates to my PhD, which I was awarded at the University of Cape Town in December 2015. Since then a number of my peers have suggested other areas to explore and improve. From 2017 my programming students at Kenya Methodist University will use the prototype I tested in a longitudinal study. None of them have ever used a mobile phone to program, so this will be a new experience.</p>
<p>For the foreseeable future, African universities and other institutions offering programming subjects will continue to struggle with resources. As long as this situation persists and students’ access to mobile phones and tablets grows, the techniques I’m developing could offer a smart solution that allows the continent to keep producing young programmers.</p>
<p class="fine-print"><em><span>Chao Charity Mbogo received funding for her Ph.D. research, related fieldwork and related conference grants from Hasso Plattner Institute (HPI), Department of Computer Science at the University of Cape Town, Google, The International Network for Postgraduate Students in the area of ICT4D (IPID), ACM-W, and Schlumberger’s Faculty for the Future fellowship. </span></em></p>Computer programming is best learned through practice, but students in developing economies don’t always have access to desktop or laptop computers. Mobile phones may be the solution.Dr. Chao Mbogho, Researcher and Lecturer of Computer Science, Mentor, Kenya Methodist UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/643132016-08-25T20:21:33Z2016-08-25T20:21:33ZInterview: Steve Wozniak, Apple co-founder and inventor of the home computer<p>If you haven’t heard of Steve Wozniak, it is because he has been overshadowed by his fellow co-founder of Apple, Steve Jobs. This is despite the fact that he was the sole person behind the <a href="https://www.amazon.com/iWoz-Computer-Invented-Personal-Co-Founded/dp/0393330435">invention</a> and building of the Apple 1, the first home computer that used a keyboard and normal TV screen as a display. </p>
<p>Steve Jobs was arguably the force behind the creation of Apple, but the technology came from the mind of Wozniak.</p>
<p>Wozniak is currently on a speaking tour of Australia and appeared in Perth this week. I had the opportunity to talk to him before the show and ask a few questions.</p>
<p><em>You can see the full interview between David Glance and Steve Wozniak in the video below.</em></p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/920QAxtook8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Interview with Steve Wozniak.</span></figcaption>
</figure>
<p>Wozniak <a href="https://en.wikipedia.org/wiki/Steve_Wozniak">left Apple</a> in 1985 to finish a degree in electrical engineering and computer science at the University of California at Berkeley. He did this under the pseudonym of Rocky Raccoon Clark. After that time, he spent 10 years teaching computing to primary school children from Grades 5 to 9.</p>
<p>In terms of the disruption of education, Wozniak says that massive open online courses (<a href="https://theconversation.com/au/topics/mooc-9120">MOOCs</a>) may be appropriate for older, more educated students, but human interaction is always going to be more important at the primary and high school level. </p>
<p>In fact, this reflects Wozniak’s personal experience: he talks about the influence of his father in inspiring him into the area of electronics, and the encouragement and inspiration of teachers at his primary school. This contrasts with his relatively bad experiences at high school and university with teachers who were, in some cases, openly antagonistic.</p>
<p>Asked about Apple’s new app <a href="http://www.apple.com/au/swift/playgrounds/">Swift Playgrounds</a>, aimed at teaching entry-level programming in the language Swift, he thinks that there may have been better languages to start with. For Wozniak, one reason for developing the computer was to provide everyone with access to computing, along with all the benefits that would come from that, including the ability to program.</p>
<p>At Apple’s birth, nobody thought the concept of the home computer – like the one Wozniak had built – had a future. When Wozniak had worked at Hewlett-Packard, the company turned down his idea of developing a personal computer. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/135359/original/image-20160824-30246-nefaec.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Apple 1 Computer, a piece of history and art.</span>
<span class="attribution"><a class="source" href="https://www.charitybuzz.com/catalog_items/extremely-rare-celebration-apple-1-computer-1092901">CharityBuzz</a></span>
</figcaption>
</figure>
<p>In Australia, it is debatable whether Apple would have been funded by the current venture capitalist community. At the time Apple received its first investment from <a href="https://en.wikipedia.org/wiki/Mike_Markkula">Mike Markkula</a>, it was yet to turn a profit and had only one client. </p>
<p>When asked about the prospect of a new company like Apple starting today, Wozniak is more optimistic. He believes there is a huge amount of activity in the startup and innovation areas. </p>
<p>He has previously talked enthusiastically about the Queensland Government’s <a href="https://www.theguardian.com/sustainable-business/2016/aug/24/steve-wozniak-the-status-quo-doesnt-have-to-exist-we-can-come-up-with-solutions">investment</a> of A$405 million in the startup scene. </p>
<p>In terms of privacy, Wozniak feels that all technology companies talk about privacy being central to their product strategies, but that only Apple has actually delivered on this. The culture of privacy at Apple existed before CEO Tim Cook, but Cook has stood behind the importance of privacy when it came to handing over details of people’s devices to the FBI.</p>
<p>Wozniak has not really been following the dispute between the Australian banks and Apple, but he has really enjoyed using Apple Pay in Australia. He does say that he wishes the technology behind the payment system was an open standard, but he hasn’t thought about the problem deeply.</p>
<p>Wozniak says he always wanted to be an engineer. To a large extent, the ascendency of technological companies has elevated the status of the computer and electrical engineer in society. </p>
<p>Popular culture is now reflecting this with extremely popular TV series like <a href="http://www.imdb.com/title/tt2543312/">Halt and Catch Fire</a>, <a href="http://www.imdb.com/title/tt4158110/">Mr. Robot</a> and the comedy series that Wozniak has actually appeared in, <a href="http://www.imdb.com/title/tt0898266/?ref_=fn_al_tt_4">The Big Bang Theory</a>. Unlike Elon Musk, Wozniak has never appeared in The Simpsons, not even in the episode parodying Apple and “Steve Mobs”.</p>
<p>Despite all of this, Wozniak feels that “nerds and geeks” will still have a hard time at school, just as he did when he was a student.</p>
<p>During our chat, Wozniak expanded on the needs of entrepreneurship. What is critical he says, is the need for business and marketing along with engineering capability. </p>
<p>It is not good enough to have an idea that sounds great if nobody wants to buy it. It is the role of the marketing person to do what Apple has done: not only see what the public wants, but persuade them that they actually want the product you are selling, even if they didn’t know it.</p>
<p>In many ways, Wozniak is a stereotypical engineer. He is a very nice and largely self-effacing person who just happened to use his passion to create something that was truly great. </p>
<p>There is currently an Apple 1 computer that is being <a href="https://www.charitybuzz.com/catalog_items/extremely-rare-celebration-apple-1-computer-1092901?utm_medium=Call_To_Action_Button&utm_campaign=Apple_1_Auction_2016&utm_source=Apple_1_Landing_Page%3Cbr%20/%3E%0D%0A">auctioned</a> with manuals and tape cassettes. It is expected to sell for US$1 million. This is a far cry from the US$666.66 that this computer sold for in 1976. </p>
<p>The auction description presents the computer not only as a radical device that changed society dramatically, but also simply as a piece of art. In those terms Wozniak can let his art and its legacy speak for itself.</p>
<p class="fine-print"><em><span>David Glance owns shares in Apple. UWA sponsored the talk by Steve Wozniak in Perth.</span></em></p>David Glance sits down with Apple co-founder and inventor of the Apple 1 computer, Steve Wozniak, to talk about his life, his thoughts on Apple then and now and how technology is changing the world.David Glance, Director of UWA Centre for Software Practice, The University of Western AustraliaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/625332016-07-21T20:05:28Z2016-07-21T20:05:28ZApple is taking its first steps towards a more comprehensive post-PC world<figure><img src="https://images.theconversation.com/files/131024/original/image-20160719-13871-1ivan8w.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Apple-developed lessons help students learn to code on the same device the code will be used on.</span> <span class="attribution"><a class="source" href="http://www.apple.com/newsroom/2016/06/swift-playgrounds-app-makes-learning-to-code-easy-and-fun.html">Apple</a></span></figcaption></figure><p>Hands up if you’ve heard of Swift Playgrounds? No, it’s not some new start-up providing quick playdates for bedraggled parents, although that might be interesting.</p>
<p>Swift Playgrounds is the new <a href="http://www.apple.com/swift/playgrounds/">programming tool</a>, introduced by Apple in June at its annual <a href="http://developer.apple.com/wwdc/">Worldwide Developer Conference</a>, based on the Swift programming language the company introduced a few years ago. </p>
<p>What makes Swift Playgrounds interesting is that it provides a first-party computer programming platform that can be run entirely on an iPad, no computer required.</p>
<p>While Apple has been slowly <a href="http://www.imore.com/new-multitasking-features-are-coming-ipad-part-ios-9">adding features</a> to the iPad over the past few years, this represents a pretty significant step change for Apple.</p>
<p>It means the company is starting to acknowledge that these machines – famously called <a href="https://www.youtube.com/watch?v=XdSQbVFobu4">post-PC devices</a> by the late Apple CEO Steve Jobs – are now powerful enough to be used to write apps for use on the same device.</p>
<p>That means it may not be long before these devices can be used totally without a personal computer for everything, from writing content to developing apps.</p>
<h2>They might be Swift, but they’re not the first</h2>
<p>Of course, Apple is not the first company to launch programming tools for the iPad.</p>
<p>Universities such as MIT have been developing tools such as the <a href="https://scratch.mit.edu">Scratch visual programming language</a> for the iPad for a number of years. This gives primary school and middle school students a platform to develop their own games. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=436&fit=crop&dpr=1 600w, https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=436&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=436&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=548&fit=crop&dpr=1 754w, https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=548&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/131025/original/image-20160719-13868-li4ue3.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=548&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Creative experimentation in Swift Playgrounds.</span>
<span class="attribution"><a class="source" href="http://www.apple.com/newsroom/2016/06/swift-playgrounds-app-makes-learning-to-code-easy-and-fun.html">Apple</a></span>
</figcaption>
</figure>
<p>But what makes Swift Playgrounds significant is that it uses <a href="http://www.apple.com/swift/">Swift</a>, the same programming language that iPad apps themselves are developed in. It offers a glimpse of a future where iPad apps could be written, and published, on iPads themselves.</p>
<p>It’s not a great stretch to envision a future where digital natives develop and run entirely new apps using only post-PC devices, never having to touch a personal computer at all.</p>
<p>Not surprisingly, Steve Jobs, ever the visionary, predicted this possibility back in 2010 in an interview with technology journalist Walt Mossberg at the D8 conference. There, Jobs likened iPads to cars, usable by the majority of people, and PCs to trucks, required only by those with specialised needs. </p>
<p>With the introduction of Swift Playgrounds, Apple is acknowledging that more and more users only need a car, and that trucks are becoming increasingly rare.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/YfJ3QxJYsw8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Moving between digital devices</h2>
<p>This change is a good thing, because <a href="http://dl.acm.org/citation.cfm?id=2843349">research work</a> I conducted with colleagues at Central Queensland University shows that while many of our students, regardless of their age, are comfortable with technology, they are not as comfortable with changing between devices.</p>
<p>Specifically, our work found that digital competencies do not transfer well between devices: students’ comfort with one device does not translate into comfort with another.</p>
<p>The introduction of Swift Playgrounds and the potential for app development on iOS devices suggests that this preference will eventually be catered for, which is a good thing given our findings.</p>
<p>Perhaps we are finally pushing towards a world of truly pervasive computing. Rather than being locked behind a desk for some tasks, or finding ourselves desperately missing the keyboard we left at home, we are able to use <a href="https://theconversation.com/how-apple-watch-and-pervasive-computing-can-lure-you-into-leveling-up-your-fitness-59045">whatever computing device</a> we have at hand to complete whatever task we need to complete.</p>
<p>After all, as long as the device has the right buttons and the right inputs, why shouldn’t we be able to use it for anything we need to do?</p>
<p>What’s more, why don’t we make it so that our progress on tasks transfers seamlessly between devices? That way we can pick up any device and simply continue with the work that we started earlier.</p>
<p>Ben Thompson, of Stratechery, called this concept <a href="https://stratechery.com/2015/apple-watch-and-continuous-computing/">Continuous Computing</a> back in 2015 when he envisioned a world where we move seamlessly between devices to get our work done.</p>
<p>Apple’s announcements at WWDC this year certainly indicate this is the direction it is heading. That should be applauded, and it makes me hopeful for our digital native students. </p>
<p>While we can’t stop them from having an iPhone or an iPad continuously in their hand, it’s good to know we are working towards a world where they can drive these devices confidently to do what they need, moving seamlessly between devices as the need arises. They don’t need to find themselves behind the wheel of an unfamiliar truck-style PC. </p>
<p>Which raises the question: what will other tech giants such as Google and Microsoft do now to catch up and avoid being left behind in a post-PC world? After all, safer post-PC driving can only be a good thing!</p><img src="https://counter.theconversation.com/content/62533/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Michael Cowling does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If you’re creating an app for an iPad, then why not create it on an iPad too. Is Apple’s Swift move to do this just another step towards the end of the personal computer?Michael Cowling, Visiting Project Scientist in Informatics, University of California, IrvineLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/623282016-07-13T11:26:16Z2016-07-13T11:26:16ZWhy everyone should have to learn computer programming<figure><img src="https://images.theconversation.com/files/130217/original/image-20160712-9264-zt66ib.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">'All Greek to me, mate.'</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/cat.mhtml?lang=en&language=en&ref_site=photo&search_source=search_form&version=llv1&anyorall=all&safesearch=1&use_local_boost=1&autocomplete_id=&searchterm=python%20computer%20program&show_color_wheel=1&orient=&commercial_ok=&media_type=images&search_cat=&searchtermx=&photographer_name=&people_gender=&people_age=&people_ethnicity=&people_number=&color=&page=1&inline=288042365">dencg</a></span></figcaption></figure><p><a href="https://www.theguardian.com/world/2016/jul/04/latin-revival-cathedral-courses-find-new-fans-of-dead-language">News that</a> numerous cathedrals are offering short courses in Latin is a reminder of the long decline of the language over the years. It was a core subject in the British education system until fairly recently – and not because anyone planned to speak it, of course. It was believed to offer valuable training for intellectual composition, as well as skills and thinking that were transferable to other fields. </p>
<p>When it was ultimately decided that these advantages were outweighed by Latin being a dead language – perhaps rightly – we arguably lost that intellectual training in the process. This is why we want to make the case for moving another discipline, one that offers analogous benefits, to the centre of the curriculum: computer programming. And unlike Latin, it is anything but dead. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=699&fit=crop&dpr=1 600w, https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=699&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=699&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=878&fit=crop&dpr=1 754w, https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=878&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/130219/original/image-20160712-9274-wu6vhe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=878&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Noam lore.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/b-tal/65264349/in/photolist-6LuPp-64qbTH-9wPeCA-edtzz3-7fgXTV-9UoK6F-8rBQWn-cyix3N-8rBQUT-4b4jNJ-4aZiYa-r6dJW2-5ySUa-9CZGFw-9wPhWL-c9YyzQ-9UWHNi-9wPham-9wLhaZ-9A5VQ-9wPmCN-7WkAyC-aRMouc-aurdsy-8XAjhV-5LmwAx-7pCz9f-s7jJWw-rs66S8-9wPeWh-9CWKQe-9wLgLg-7xBniK-7xBn1D-9A5UX-9R1r37-9A59H-qDoZXq-9A4AE-9R1vhf-CKbWm-6fp3PF-7xFbDJ-7Whjsi-7Qg8Q4-9wLkGg-c2mMrL-6kBMSR-7xBn4n-bBqZGr">Brian Talbot</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>There are many computer languages for different purposes. C and C++ remain the fastest to execute and are used by the gaming industry, for instance. In the internet era, much of the web is built with the likes of JavaScript or PHP. Meanwhile Python has been rapidly gaining a reputation as a general-purpose language that is easy to learn. </p>
<p>There are many parallels between natural languages and programming languages like these. You must learn to express yourself within the rules of the language. There is a grammar to comprehend. And what you write must be interpretable by another human being. (Yes, it must be interpretable by a computer. But just as <a href="https://archive.org/stream/NoamChomskySyntcaticStructures/Noam%20Chomsky%20-%20Syntcatic%20structures_djvu.txt">Noam Chomsky’s example</a> of “colourless green ideas sleep furiously” is grammatically correct nonsense, you can write obfuscated computer code that no one else can decipher.)</p>
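<p>The parallel is easy to demonstrate in code. As a minimal illustration (the snippet and function names are ours, not drawn from any particular codebase), both of the following Python functions are syntactically valid and compute the same thing, but only one is written to communicate with a human reader:</p>

```python
# "Grammatically" valid but opaque: the interpreter accepts it,
# a human reader struggles to decipher the intent.
def f(x): return x > 1 and sum(i for i in range(2, x) if x % i == 0) == 0

# The same logic, written to be read.
def is_prime(n: int) -> bool:
    """A number is prime if it is greater than 1 and has no
    divisors between 2 and n - 1."""
    if n < 2:
        return False
    return all(n % i != 0 for i in range(2, n))
```

<p>Both return the same answer for every input; the difference, as with natural language, lies entirely in how well the writing communicates with other people.</p>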
<p>People who program can communicate with computers, which is becoming more and more important now that computers have a hand in almost everything. In today’s IT-literate world, we are all expected to be fluent in word processing and spreadsheets. The next logical step is to be able to program. </p>
<p>The younger generation are already exposed to computers almost from the day they are born, which helps explain, for example, Barclays bank’s <a href="http://www.barclays.co.uk/DigitalEagles/BarclaysCodePlayground/P1242686640999">recent launch</a> of Code Playground, an initiative to engage young children in the basics of programming via a colourful website.</p>
<h2>Problematis solvendis</h2>
<p>There is a myth that only maths geniuses are suited to programming. It is more accurate to say you need a logical approach and an ability to problem solve. Just as Latin constructs reinforce communication, programming constructs reinforce problem solving. It teaches you to break a problem into achievable chunks and to think very precisely. And once you have mastered the basics, it opens up great potential for creative thinking. </p>
<p>Then there are specific workplace benefits, such as for businesses that are building a bespoke piece of software. Errors sometimes occur when documents outlining in English how a program should work are translated into computer code. Those who have an appreciation of a programming language can write these more clearly. Indeed, businesses usually have to employ specialist analysts as intermediaries to help with this translation process. </p>
<p>As computers become more dominant, those who don’t know how to think in this way risk being increasingly left behind. We can foresee a time when greater numbers of people become interested in learning to program for themselves, but in the meantime there is a great case for making the basics of computer programming a core skill at school. </p>
<p>One candidate language would be <a href="https://www.python.org/doc/essays/blurb/">Python</a>: it’s freely available and one of the easier programming languages to learn – compared, say, to C/C++. It has grown in popularity in recent years, initially for this simplicity but lately because it has been adopted by the <a href="http://www.forbes.com/sites/lisaarthur/2013/08/15/what-is-big-data/#58e893c33487">big data community</a>. It is likely to be around for years to come, with no danger of becoming a dead language any time soon. There are <a href="https://www.mooc-list.com/tags/python?static=true">plenty of MOOCs</a> (online courses) to get you started.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/130218/original/image-20160712-9274-p9gk2h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Lighten your load with code …</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/cat.mhtml?lang=en&language=en&ref_site=photo&search_source=search_form&version=llv1&anyorall=all&safesearch=1&use_local_boost=1&autocomplete_id=&search_tracking_id=c5sEOdYoDST5PCRuuIyzqg&searchterm=python%20code&show_color_wheel=1&orient=&commercial_ok=&media_type=images&search_cat=&searchtermx=&photographer_name=&people_gender=&people_age=&people_ethnicity=&people_number=&color=&page=1&inline=236404216">Mclek</a></span>
</figcaption>
</figure>
<p>If a teacher walked into a classroom and told today’s students they were going to study a dead language, you can imagine the reaction. Imagine instead introducing them to an easy-to-use programming language which is probably already installed on their laptops. It can allow them to automate many boring tasks such as checking email and sending out pre-written responses; or receive custom notifications by text; or download files or copy text from a website whenever it updates. </p>
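<p>As a rough sketch of that last idea – assuming nothing beyond Python’s standard library, with URL and function names of our own choosing – “tell me when a website updates” comes down to a dozen readable lines:</p>

```python
import hashlib
import urllib.request


def fingerprint(content: bytes) -> str:
    """Reduce a page's contents to a short hash we can store and compare."""
    return hashlib.sha256(content).hexdigest()


def fetch(url: str) -> bytes:
    """Download the raw bytes of a web page."""
    with urllib.request.urlopen(url) as response:
        return response.read()


def has_changed(url: str, last_fingerprint: str) -> bool:
    """True if the page no longer matches the fingerprint we saved earlier."""
    return fingerprint(fetch(url)) != last_fingerprint
```

<p>Run on a schedule, a sketch like this replaces the chore of checking a page by hand – precisely the kind of everyday automation that makes the language an easy sell in a classroom.</p>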
<p>It’s time that those in charge of education policy recognised the shift in employability skills and the need for a new generation of problem solvers. We may have reached the point where the three Rs of education – reading, writing and ‘rithmetic – should become the four Rs, with the addition of programming. Or 'rogramming, as we would soon get used to calling it.</p><img src="https://counter.theconversation.com/content/62328/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>When we ditched high-school Latin we lost a great intellectual training. Here’s how to get it back without resorting to a dead language.John R. Woodward, Lecturer in Computer Science, University of StirlingMarwan Fayed, Lecturer in Computing, University of StirlingLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/548052016-03-28T10:07:01Z2016-03-28T10:07:01ZCould the language barrier actually fall within the next 10 years?<figure><img src="https://images.theconversation.com/files/116474/original/image-20160325-17840-l4cqce.jpg?ixlib=rb-1.1.0&rect=0%2C57%2C1280%2C869&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Pieter Bruegel the Elder's 'The Tower of Babel' (1563).</span> <span class="attribution"><a class="source" href="https://upload.wikimedia.org/wikipedia/commons/thumb/f/fc/Pieter_Bruegel_the_Elder_-_The_Tower_of_Babel_(Vienna)_-_Google_Art_Project_-_edited.jpg/1280px-Pieter_Bruegel_the_Elder_-_The_Tower_of_Babel_(Vienna)_-_Google_Art_Project_-_edited.jpg">Wikimedia Commons</a></span></figcaption></figure><p>Wouldn’t it be wonderful to travel to a foreign country without having to worry about the nuisance of communicating in a different language?</p>
<p>In a <em>Wall Street Journal</em> <a href="http://www.wsj.com/articles/the-language-barrier-is-about-to-fall-1454077968">article</a>, technology policy expert <a href="http://bit.ly/1o4BRIp">Alec Ross</a> argued that, within a decade or so, we’ll be able to communicate with one another via small earpieces with built-in microphones.</p>
<p>No more trying to remember your high school French when checking into a hotel in Paris. Your earpiece will automatically translate “Good evening, I have a reservation” to <em>Bonsoir, j’ai une réservation</em> – while immediately translating the receptionist’s unintelligible babble to “I am sorry, Sir, but your credit card has been declined.”</p>
<p>Ross argues that because technological progress is <a href="http://www.kurzweilai.net/the-law-of-accelerating-returns">exponential</a>, it’s only a matter of time.</p>
<p>Indeed, some parents are so convinced that this technology is imminent that they’re wondering if their kids should even learn a second language.</p>
<p>Max Ventilla, one of <a href="https://www.altschool.com/about-us#about-us">AltSchool</a> Brooklyn’s founders, <a href="http://www.newyorker.com/magazine/2016/03/07/altschools-disrupted-education">told <em>The New Yorker</em></a>: </p>
<blockquote>
<p>…if the reason you are having your child learn a foreign language is so that they can communicate with someone in a different language twenty years from now – well, the relative value of that is changed, surely, by the fact that everyone is going to be walking around with live-translation apps.</p>
</blockquote>
<p>Needless to say, communication is only one of the many advantages of learning another language (and I would argue that it’s not even the most important one).</p>
<p>Furthermore, while it’s undeniable that translation tools like <a href="https://www.bing.com/translator/">Bing Translator</a>, <a href="https://www.babelfish.com/">Babelfish</a> or <a href="https://translate.google.com/">Google Translate</a> have improved dramatically in recent years, prognosticators like Ross could be getting ahead of themselves. </p>
<p>As a language professor and translator, I understand the complicated nature of language’s relationship with technology and computers. In fact, language contains nuances that are impossible for computers to ever learn how to interpret.</p>
<h2>Language rules are special</h2>
<p>I still remember grading assignments in Spanish where someone had accidentally written that he’d sawed his parents in half, or where a student and his brother had acquired a well that was both long and pretty. Obviously, what was meant was “I saw my parents” and “my brother and I get along pretty well.” But leave it to a computer to navigate the intricacies of human languages, and there are bound to be blunders.</p>
<p>In 2016, when asked about <a href="https://twitter.com/">Twitter</a>’s translation feature for foreign language tweets, the company’s CEO <a href="https://en.wikipedia.org/wiki/Jack_Dorsey">Jack Dorsey</a> <a href="http://www.newyorker.com/magazine/2016/03/28/celebrities-fund-best-school-day">conceded</a> that it does not happen in “real time, and the translation is not great.”</p>
<p>Still, anything a computer can “learn,” it will learn. And it’s safe to assume that any finite set of data (like every single work of literature ever written) will eventually make its way into the cloud.</p>
<p>So why not log all the rules by which languages govern themselves?</p>
<p>Simply put: because this is not how languages work. Even if the Florida State Senate <a href="https://www.flsenate.gov/Session/Bill/2016/0468/BillText/e1/HTML">decided</a> that studying computer code is equivalent to learning a foreign language, the two could not be more different.</p>
<p>Programming is a constructed, formal language. Italian, Russian or Chinese – to name a few of the <a href="http://www.linguisticsociety.org/sites/default/files/how-many-languages.pdf">estimated 7,000 languages</a> in the world – are natural, breathing languages which rely as much on social convention as on syntactic, phonetic or semantic rules.</p>
<h2>Words don’t indicate meaning</h2>
<p>As long as one is dealing with a simple written text, online translation tools will get better at replacing one “signifier” – the name Swiss linguist <a href="https://en.wikipedia.org/wiki/Ferdinand_de_Saussure">Ferdinand de Saussure</a> gave to the idea that a sign’s physical form is distinct from its meaning – with another.</p>
<p>Or, in other words, an increase in the quantity and accuracy of the data logged into computers will make them more capable of translating <em>“No es bueno dormir mucho”</em> as “It’s not good to sleep too much,” instead of the faulty “Not good sleep much,” as <a href="https://translate.google.com/#es/en/No%20es%20bueno%20dormir%20mucho.">Google Translate</a> still does.</p>
<p>Replacing a word with its equivalent in the target language is actually the “easy part” of a translator’s job. But even this seems to be a daunting task for computers.</p>
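<p>To see why word replacement alone falls short, consider a deliberately naive sketch of a “translator” – the five-word dictionary and function name are ours, purely for illustration:</p>

```python
# A toy word-for-word "translator": exactly the naive signifier
# substitution described above, with a tiny hand-made lexicon.
LEXICON = {
    "no": "not", "es": "is", "bueno": "good",
    "dormir": "to sleep", "mucho": "much",
}


def word_for_word(sentence: str) -> str:
    """Replace each word with a target-language equivalent, keeping
    the source word when the lexicon has no entry."""
    return " ".join(LEXICON.get(w, w) for w in sentence.lower().split())


print(word_for_word("No es bueno dormir mucho"))
# → "not is good to sleep much": the words survive, but idiom
# and word order are lost.
```

<p>Real translation systems are vastly more sophisticated than this, of course, but the sketch shows why substitution is only the starting point: every word was “translated”, yet the meaning still limps.</p>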
<p>So why do programs continue to stumble on what seem like easy translations? </p>
<p>It’s so difficult for computers because translation doesn’t – or shouldn’t – involve simply translating words, sentences or paragraphs. Rather, it’s about translating <em>meaning</em>. </p>
<p>And in order to infer meaning from a specific utterance, humans have to interpret a multitude of elements at the same time.</p>
<p>Think about all the contextual clues that go into understanding an utterance: volume, pitch, situation, even your culture – all are as likely to convey as much meaning as the words you use. Certainly, a mother’s soft-spoken advice to “be careful” elicits a much different response than someone yelling “Be careful!” from the passenger’s seat of your car.</p>
<p>So can computers really interpret?</p>
<p>As the now-classic book <em><a href="http://press.uchicago.edu/ucp/books/book/chicago/M/bo3637992.html">Metaphors We Live By</a></em> has shown, languages are more metaphorical than factual in nature. Language acquisition often relies on learning abstract and figurative concepts that are very hard – if not impossible – to “explain” to a computer. </p>
<p>Since the way we speak often has nothing to do with the reality that surrounds us, machines are – and will continue to be – puzzled by the metaphorical nature of human communications.</p>
<p>This is why even a promising newcomer to the translation game like the website <a href="https://unbabel.com/">Unbabel</a>, which defines itself as an “AI-powered human-quality translation,” has to rely on an army of 42,000 translators around the world to fine-tune acceptable translations.</p>
<p>You need a human to tell the computer that “I’m seeing red” has little to do with colors, or that “I’m going to change” probably refers to your clothes and not your personality or your self.</p>
<p>If interpreting the intended meaning of a written word is already overwhelming for computers, imagine a world where a machine is in charge of translating what you say out loud in specific situations.</p>
<h2>The translation paradox</h2>
<p>Nonetheless, technology seems to be trending in that direction. Just as “intelligent personal assistants” like <a href="http://www.apple.com/ios/siri/">Siri</a> or <a href="http://www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E">Alexa</a> are getting better at understanding what you say, there is no reason to think that the future will not bring “personal assistant translators.”</p>
<p>But translating is an altogether different task than finding the nearest Starbucks, because machines aim for perfection and rationality, while languages – and humans – are always imperfect and irrational.</p>
<p>This is the paradox of computers and languages.</p>
<p>If machines become too sophisticated and logical, they’ll never be able to correctly interpret human speech. If they don’t, they’ll never be able to fully interpret all the elements that come into play when two humans communicate. </p>
<p>Therefore, we should be very wary of a device that is incapable of interpreting the world around us. If people from different cultures can offend each other without realizing it, how can we expect a machine to do better?</p>
<p>Will this device be able to detect sarcasm? In Spanish-speaking countries, will it know when to use “tú” or “usted” (the informal and formal personal pronouns for “you”)? Will it be able to sort through the many different forms of address used in Japanese? How will it interpret jokes, puns and other figures of speech?</p>
<p>Unless engineers actually find a way to <a href="http://www.jpost.com/Business/Business-Features/For-artificial-intelligence-pioneer-Marvin-Minsky-computers-have-soul-352076">breathe a soul into a computer</a> – pardon my figurative speech – rest assured that, when it comes to conveying and interpreting meaning using a natural language, a machine will never fully take our place.</p><img src="https://counter.theconversation.com/content/54805/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>David Arbesú does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>While translation technology has improved dramatically, there are some significant hurdles.David Arbesú, Assistant Professor of Spanish, University of South FloridaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/564182016-03-22T01:36:29Z2016-03-22T01:36:29ZWant to inspire kids to learn STEM? Get them to build a robot<figure><img src="https://images.theconversation.com/files/115934/original/image-20160322-32323-643h7f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Sometimes the best way to learn is to build a robot.</span> <span class="attribution"><span class="source">Chris Stacey</span>, <span class="license">Author provided</span></span></figcaption></figure><p>The music is pumping, the crowd is cheering and people are dancing. This is science, technology, engineering and maths (<a href="https://theconversation.com/au/topics/stem">STEM</a>), but not as you know it. </p>
<p>I’m at the Sydney Olympic Park Sports Centre as an invited judge for the 2016 Australia Regional <a href="https://firstaustralia.org/programs/first-robotics-competition/">FIRST (For the Inspiration and Recognition of Science and Technology) Robotics Competition</a>. </p>
<p>The competition is for students aged around 14-18 who, with the help of mentors and teachers, have six weeks (or significantly less in several cases) to design, build and program a robot for a designated challenge. This would be a difficult task even for seasoned engineers. </p>
<p>Forty-three teams from all around Australia, China, India, Singapore, Taiwan and the USA are here to take part, and the atmosphere is electric.</p>
<p>This year’s challenge is a medieval quest: the arena is designed as a castle, and teams must break through their opponents’ defences, weaken their tower with boulders (sponge balls) and try to capture it. </p>
<p>The teams have to work in an alliance with two other teams and develop a joint strategy to beat the opposing alliance. Things can go wrong, and when something fails it’s back to the pit to problem-solve and fix things under intense time pressure, all with the additional stress of judges pestering them with questions.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/115935/original/image-20160322-32327-lxxc8u.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The task for the robots is to knock down a castle wall.</span>
<span class="attribution"><span class="source">Paul Wright</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>Robots across the nation</h2>
<p>Every team I spoke to had an incredible story to tell. The perseverance and dedication of the students in both building their robots and getting here is overwhelming, and for some teams both have been a major struggle. </p>
<p>A Chinese team from Lanzhou travelled here on their own without their mentor and had to ask companies and universities in China if they could borrow equipment and space in their laboratories to build their robot. </p>
<p>The Narooma High School team, from New South Wales, raised funds by selling 300 cupcakes and by running a RoboCamp that helped 8-11 year olds learn the basics of robotics and computing. </p>
<p>Another team is <a href="http://www.thethunderdownunder.org/us/">Thunder Down Under</a>, which was established at Macquarie University and brings together mentors with students from schools across Sydney. It’s the first Australian FIRST Robotics Competition (FRC) team, and helped bring the competition to Australia.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=689&fit=crop&dpr=1 600w, https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=689&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=689&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=866&fit=crop&dpr=1 754w, https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=866&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/115932/original/image-20160322-32283-5zk5ng.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=866&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">One of the members of Thunder Down Under working on the team’s robot.</span>
<span class="attribution"><span class="source">Chris Stacey</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Since starting up in 2009, Thunder Down Under has brought robotics to rural and remote communities in Australia. It has provided no-interest loans for robotics kits so that teams can run RoboCamps and become self-sustaining. It’s partnered with another team to create <a href="http://www.ladiesinfirst.com/">FIRST ladies</a>, a network for girls in FIRST globally. It has helped start teams in China, and helped develop an underwater robot and a Lego-robotics-style water safety game that use technology to help save lives. </p>
<p>At the inspiring FIRST ladies’ breakfast on Friday morning, I spoke to Louise from the Kan-Bot Crew, a rookie team from Kaniva, a small Victorian farming town located about halfway between Adelaide and Melbourne. </p>
<p>Kaniva College has around 100 students of secondary age, and about 17% of them are taking part in the team – an accomplishment in itself. The team was supported through <a href="https://www.facebook.com/Robots-in-the-Outback-1544995949151306/">Robots in the Outback</a>, a Macquarie University and Google initiative. </p>
<p>The Kan-Bot Crew had just two and a half weeks to put their robot together and just one day with a mentor. They had difficulty finding local sponsorship due to a major drought last year, which placed financial stress on the small farming town. </p>
<p>They were unable to bring their two programmers to Sydney, and so three other teams, from Wollongong, Narooma and Ulladulla, have been lending them their programmers and other technical assistance to keep them up and running. For the Kaniva students this has been an extremely valuable opportunity to mix with like-minded peers.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/115692/original/image-20160320-4436-1ikgxxs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The winning alliance and their machines: Barker Redbacks (red shirts); House of Ulladulla, Game of Drones (green shirts); and Thunder Down Under (yellow shirts).</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>Education first</h2>
<p>What really surprised me is that FIRST Robotics is not just about STEM. The students learn lifelong skills in leadership, entrepreneurship and communication, as well as gaining confidence and meeting like-minded peers from around the world. </p>
<p>There is a real emphasis on teamwork and assisting those around you, and I don’t think I’ve ever seen such generosity of time and resources in the heat of intense competition. Teams go out of their way to assist each other through “gracious professionalism”, part of the ethos of FIRST.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/115936/original/image-20160322-32285-1lwwwm2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Prime Minister Malcolm Turnbull and Foreign Minister Julie Bishop stopped by to see the action.</span>
<span class="attribution"><span class="source">Chris Stacey</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Judging the competition was tough. We spent hours behind closed doors trying to narrow teams down to worthy award winners. All decisions needed to be unanimous and eventually we reached consensus, wrote the award scripts and headed out to the arena just in time to catch the semifinal and finals. </p>
<p>It is heartbreaking that some teams – especially the rookie teams – do not know how close they came to winning an award, or how long we agonised over the decisions. All teams were deserving of awards and should be proud of their efforts at the competition. But in the end, the winning alliance was made up of the Barker Redbacks, House of Ulladulla, Game of Drones and Thunder Down Under.</p>
<p>As a judge, I’m also an ambassador for FIRST Robotics, hoping to inspire students by sharing my love of science – especially my passion for volcanoes – and showing them what is possible through STEM. </p>
<p>However, at the end of the tournament, I am the one feeling truly inspired and uplifted after meeting such an ambitious, motivated, and brilliant set of young people.</p><img src="https://counter.theconversation.com/content/56418/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Heather Handley works for Macquarie University </span></em></p>The FIRST robotics competition brings school students together to build a robot to complete a challenge. And it’s an inspiring way to encourage interest in STEM.Heather Handley, Senior Lecturer in Geochemistry and Volcanology, Macquarie UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/542452016-03-02T23:04:23Z2016-03-02T23:04:23ZBesides feverish excitement, hackathons really can spur innovation<figure><img src="https://images.theconversation.com/files/113005/original/image-20160226-18094-1j5zlng.jpg?ixlib=rb-1.1.0&rect=486%2C0%2C2739%2C1633&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A 360 of a hackathon in full flight.</span> <span class="attribution"><span class="source">Carbon Visuals/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p><em>It’s 26 hours into the Random Hacks of Kindness hackathon, and I am on my ninth cup of coffee. We have just over five hours until “show and tell” time, when we need to have a working mobile app.</em> </p>
<p><em>Among the screwed-up butcher’s paper diagrams, leftover sandwiches and wafts of unshowered programmers in our corner of the office, one member of the group is staging an epic battle between jelly pythons, another is rewriting the video function of our app so that it will actually play video, and the third is having a change of heart entirely.</em> </p>
<p><em>“Do you think,” he asks, “we should maybe build a solar-powered coffee cart instead?”</em></p>
<p>This chaotic – but typical – scene occurred during a hackathon in Western Sydney last December. Focused on solving a small but challenging problem, hackathons involve intensive periods of brainstorming, coding, designing, testing – and often much coffee drinking. </p>
<p>Our event was one of the Random Hacks of Kindness (<a href="http://www.rhokaustralia.org/#what-is">RHoK</a>) series of hackathons also taking place in Melbourne, Sydney and Brisbane. Run over a weekend twice a year, the RHoK events focus on projects with social impact. </p>
<p>They invite “change makers” – community organisations, social enterprises and committed individuals – to pitch a problem to teams of volunteers. At the end of the weekend, a group of judges choose a winning team, and provide all the teams with feedback for developing their projects further.</p>
<h2>Igniting the ‘Ideas Boom’</h2>
<p>Hackathons fit neatly with Australia’s recent rhetorical pivot towards technology and innovation. Prime Minister Malcolm Turnbull’s “<a href="http://www.innovation.gov.au/system/files/case-study/National%20Innovation%20and%20Science%20Agenda%20-%20Report.pdf">Welcome to the ideas boom</a>” invites the country to “create a culture that backs good ideas and learns from taking risks and making mistakes” and develop “greater collaboration between universities and businesses”. </p>
<p>Academics have been turning attention to the mechanisms that might enable such culture and collaboration. For instance, QUT’s <a href="https://theconversation.com/profiles/marcus-foth-199317">Marcus Foth</a> has suggested that Australia needs a “<a href="https://theconversation.com/australia-needs-an-innovation-skunkworks-51326">skunkworks</a>”. These are spaces of creative collaboration outside of routine organisational procedures that “attract, house, support and unleash innovators, makers, thinkers and doers”. </p>
<p>From their origins as informal late nights run by a handful of technology enthusiasts, hackathons have evolved into well-coordinated, multi-project and sponsored workshops. They include social entrepreneurs, designers, researchers and other professionals as well as coders. They offer the temporal equivalent of a space for collaboration across research, government, industry and community sectors. </p>
<p>But the rise of the hackathon has also invited scepticism. Do these novel “<a href="http://nms.sagepub.com.ezproxy.uws.edu.au/content/early/2016/02/10/1461444816629467.full.pdf+htm">proto-publics</a>” function as incubators for creativity and collaboration, as they claim? Or are they more like a technologically habilitated form of the military boot camp, the corporate retreat or the cult centre? </p>
<p>Do the long hours, the sensory deprivation caused by constantly staring at screens, and a missionary zeal for technology instead induce a kind of collective faith in the object of a “working app”? And what responsibility do hackathons have for the translation of technical outcomes to the kinds of social change they look to produce? </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/113159/original/image-20160229-26687-1i9fnga.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Hard at work during the Random Hacks of Kindness held in Parramatta in 2015.</span>
<span class="attribution"><span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>Playing with fire</h2>
<p>Our own response in the aftermath of the event was one of mild and exhausted euphoria, accompanied by a sense that each of the change-maker teams had developed a tangible product that would be helpful to the groups they were intended for. </p>
<p>Under pressure of the tight timelines, each of the teams had, in their own way, “gelled” over the course of the two days. This was all the more remarkable since we had relatively few experienced hackathoners in our group. </p>
<p>In a research environment, where time for consultation with project stakeholders can be difficult to find, the results seemed especially surprising. The hackathon promotes the idea of more regular, if still sporadic, bursts of high-intensity collaboration across diverse disciplinary and institutional boundaries. It’s neither work nor hobby “<a href="https://www.academia.edu/6258154/Journal_of_Peer_Production_The_Ethic_of_the_Code_An_Ethnography_of_a_Humanitarian_Hacking_Community">but something of both</a>”. </p>
<p>On the other hand, in the days following the event, we also registered a certain degree of scepticism about the “<a href="http://twentyfive.fibreculturejournal.org/fcj-186-hack-for-good-speculative-labour-app-development-and-the-burden-of-austerity/">narrowing attention to issues that can be solved in a compressed timeframe</a>”. </p>
<p>In the cold light of day – during the working week that followed – we experienced a distinct anti-climax, as though the beneficial outcomes stemmed from a form of participant collusion during the event itself. It became increasingly difficult to convince others who had not attended the event about the technical and social miracles we had produced in such a short time and under duress. </p>
<p>Each project would also need to address the question of how it would advance the weekend’s work into a product: what would be the fate of these bursts of innovation and collaboration, produced under the effects of self-imposed confinement?</p>
<h2>After the fever</h2>
<p>Both euphoria and anticlimax are perhaps understandable responses to the “bootcamp” atmosphere. In this respect, hackathons share much in common with the “<a href="https://uk.sagepub.com/en-gb/eur/intensive-culture/book231038">intensive culture</a>” infusing research, media, markets and work. </p>
<p>This includes writing retreats, grant preparations, political campaigns, fitness and weight-loss programs, public talk competitions, agile work practices and “sprints” in the software industry. </p>
<p>As laboratories for translating ideas into implementation, hackathons hold exciting potential. To become enduring, they will need to prepare organisers, change makers and participants for the emotional rollercoasters that they invariably produce. </p>
<p>As the word “hack” itself suggests, they can involve the transgressive sense of breaking boundaries and making new links. Their role in building innovative cultures requires, though, more than feverish excitement and technological expertise. </p>
<p>Other researchers have begun to study their effects in more detail, charting some of their <a href="http://onlinelibrary.wiley.com/doi/10.1111/ropr.12074/abstract">challenges</a> and <a href="http://sth.sagepub.com/content/early/2015/04/07/0162243915578486.abstract">benefits</a>. Hackathons have become an “<a href="http://peerproduction.net/issues/issue-2/peer-reviewed-papers/diybio-in-asia/">invaluable unique form of life worthy of careful investigation</a>”. </p>
<p>In the current enthusiasm for building cultures of innovation, this investigation is now vital to ensure the broader “ideas boom” doesn’t prematurely go bust.</p><img src="https://counter.theconversation.com/content/54245/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Teresa Swist is a member of the national steering committee of Random Hacks of Kindness. Her role as Postdoctoral Research Fellow at the Institute for Culture and Society is funded by the Young and Well Cooperative Research Centre.</span></em></p><p class="fine-print"><em><span>Liam Magee is affiliated with Random Hacks of Kindness. He has previously received funding from the ARC, and through the ARC, a number of government and private organisations.</span></em></p><p class="fine-print"><em><span>Rachel Hendery works for Western Sydney University and is on the organising committee of the Random Hacks of Kindness Parramatta node. The Random Hacks of Kindness Parramatta Hackathon was supported by funding from Western Sydney University in 2015 and 2016.</span></em></p>Hackathons are all the rage, but if the participants follow through on the results, they can be a powerful instrument for generating innovation.Teresa Swist, Postdoctoral Research Fellow, Western Sydney UniversityLiam Magee, Senior Research Fellow, Digital Media, Western Sydney UniversityRachel Hendery, Senior Lecturer in Digital Humanities, Western Sydney UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/491392015-10-21T19:25:52Z2015-10-21T19:25:52ZToday’s smart machines owe much to Australia’s first computer<figure><img src="https://images.theconversation.com/files/98986/original/image-20151020-32241-1p7le9b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">CSIRAC: Australia's first computer has had a lasting impact.</span> <span class="attribution"><a class="source" href="http://gallery.eng.unimelb.edu.au/computing-history/">University of Melbourne</a>, <span class="license">Author provided</span></span></figcaption></figure><p>Australia’s first computer weighed two tonnes, filled a large room and had a tiny fraction of the capacity of today’s typical smartphone. 
But why would such a machine continue to be relevant today?</p>
<p>Originally designed and built by the Council for Scientific and Industrial Research (now known as CSIRO) in Sydney as the CSIR Mk1 in 1947-50, it was one of the very first computers to be completed and is the oldest computer that is still substantially intact.</p>
<p>It was relocated to the University of Melbourne in 1955 and <a href="http://www.cis.unimelb.edu.au/about/csirac/">relaunched as CSIRAC</a> (pronounced sigh-rack) on June 14, 1956 (just a few months before Sydney’s <a href="http://sydney.edu.au/engineering/it/silliac/history.shtml">SILLIAC</a>, which was launched in September 1956), and operated until 1964. It is now a <a href="http://museumvictoria.com.au/csirac/">permanent exhibit</a> at Museum Victoria.</p>
<p>The core design of CSIRAC is still the basis of computers today. It consists of a processor that executes instructions and storage used for both data and sequences of instructions – that is, programs.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=487&fit=crop&dpr=1 600w, https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=487&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=487&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=612&fit=crop&dpr=1 754w, https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=612&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/98987/original/image-20151020-32258-9aurh5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=612&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The original CSIR Mk 1, later renamed CSIRAC, was constructed by the Division of Radiophysics to the designs of Trevor Pearcey (pictured) and Maston Beard.</span>
<span class="attribution"><a class="source" href="http://scienceimage.csiro.au/tag/computer-hardware/i/1975/csir-mk1-computer//large">CSIRO Archives</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Huge in size, it was tiny in terms of computational capacity. Think of a smartphone as a “unit” of processing size (call it a “smart phone unit”, or SPU); CSIRAC’s capacity was then roughly a millionth of that – a microSPU. </p>
<p>Over its 14 years or so of operating life it did about the work that a smartphone today could do in a minute. Its storage was sufficient for rather less than one second of an MP3 music file.</p>
<p>But in terms of power, weight and size, it was 10,000 times larger, or, overall, ten billion times less efficient than today’s processors. Scaling up CSIRAC’s memory to that of a smartphone would fill the Melbourne Cricket Ground to the brim, and running it would consume all the power generated in Australia.</p>
<h2>More than a calculating machine</h2>
<p>If CSIRAC was so feeble, in SPUs, what set it (and its peers) apart from the calculating machines that preceded it? Many of the tasks it was put to were calculations more or less of the kind that had been done for decades by generations of dedicated calculating machines, both mechanical and electronic.</p>
<p>One might expect the difference to lie in the instructions the machine can execute. A first glance at CSIRAC’s instruction set could suggest that it was indeed just a kind of calculator; many of the operations are elementary arithmetic.</p>
<p>Other instructions concerned reading and writing of data to and from storage, and specifications for where in memory to find the next instruction to execute. Perhaps these could be seen as just feeding numbers to a calculating engine.</p>
<p>But these machines embodied something utterly revolutionary: the fact that instruction sequences were stored in memory, in contrast to the fixed, pre-determined structure of their predecessors. </p>
<p>A computer without an instruction sequence is no more than a box of components – useless and meaningless until assembled (that is, programmed). This meant that for the first time a new machine no longer required physical construction; it could be created just by altering the instruction sequence (that is, installing a new program). And the instruction sequences were themselves data – programs could manipulate programs.</p>
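<p>That revolutionary property can be sketched in a few lines of modern code. The toy machine below is illustrative only – it is not CSIRAC’s actual instruction set – but it shows the key idea: instructions (tuples) and data (plain integers) share one memory, so a running program can rewrite its own instructions.</p>

```python
# A toy stored-program machine (a sketch, not CSIRAC's real design).
# Instructions and data live side by side in one memory list, so the
# program can read -- and even overwrite -- its own instructions.

def run(memory):
    """Execute the instructions stored in memory until HALT."""
    pc = 0  # program counter: the cell holding the next instruction
    while True:
        op, a, b = memory[pc]
        if op == "HALT":
            return memory
        if op == "ADD":        # mem[b] = mem[b] + mem[a]
            memory[b] = memory[b] + memory[a]
        elif op == "COPY":     # mem[b] = mem[a]
            memory[b] = memory[a]
        pc += 1

prog = [
    ("ADD", 5, 6),   # 0: mem[6] += mem[5]
    ("COPY", 4, 3),  # 1: overwrite the *instruction* in cell 3 with cell 4
    ("ADD", 5, 6),   # 2: mem[6] += mem[5]
    ("ADD", 5, 6),   # 3: never runs -- replaced by HALT at step 1
    ("HALT", 0, 0),  # 4: the instruction that step 1 copies into cell 3
    7,               # 5: data
    0,               # 6: data (accumulator)
]
run(prog)  # mem[6] ends up 14, not 21: the program edited itself
```

<p>Changing what this machine does requires no rewiring – replacing the tuples in <code>prog</code> creates, in effect, a different machine.</p>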
<h2>Instructing the machine</h2>
<p>This fluidity leads to a property that is truly profound. The CSIRAC instruction set is simple and minimalistic, even primitive. But, critically, it is in a fundamental sense complete. </p>
<p>Just as multiplication can be defined in terms of a sequence of additions, the small CSIRAC instruction set can be used to define any more sophisticated instruction.</p>
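<p>That bootstrapping step is easy to make concrete. The sketch below (in modern Python, far above the level at which CSIRAC was actually programmed) builds multiplication from nothing but addition and a conditional loop – the ingredients a minimal instruction set provides.</p>

```python
# Multiplication defined using only addition -- the same bootstrapping
# that lets a small, complete instruction set express any more
# sophisticated operation.

def multiply(a, n):
    """Compute a * n for a non-negative integer n, using only addition."""
    total = 0
    count = 0
    while count != n:      # a conditional jump, in machine terms
        total = total + a  # the only arithmetic used is +
        count = count + 1
    return total

multiply(7, 6)  # 42, without ever invoking a multiply instruction
```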
<p>In terms of the computations it can undertake, the universes it can represent, the models it can build, the CSIRAC instruction set is as powerful as that of a smartphone or of a supercomputer, which today might be a million SPUs (or a trillion CSIRACs).</p>
<p>Thus even this very first generation of computers was universal. They were a new kind of thing not seen in the world before, a device whose function could be changed to do anything that could be written down, just by changing what sequences of instructions were entered; and that “anything” could be translated to run on any computer. </p>
<p>Many of the innovations trialled on these early, minuscule computers are as valuable today as when they were first invented.</p>
<p>And for some context, take a look at the computer in the 1970 movie <a href="http://www.imdb.com/title/tt0064177/">Colossus: The Forbin Project</a>. In the movie, the US computer Colossus and its USSR counterpart become self-aware. As a guess, from the look of the hardware, even though Colossus fills a mountain it may be no more than an SPU. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/t46Kjy-IJpY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Skynet, in <a href="http://www.imdb.com/title/tt0088247/">The Terminator</a> movie series, may have had less processing power than is in the pockets of a cinema full of teenagers today – demonstrating that the potential of computers could be seen long before they were large enough for this potential to be realised. </p>
<p>Our computers today are in fundamental ways no more powerful than their predecessors – just faster, smaller and more deeply embedded in our lives.</p>
<hr>
<p><em>Further reading: <a href="http://collections.museumvictoria.com.au/items/2155137">Last Of The First</a>, by Doug McCann and Peter Thorne, is a comprehensive overview of CSIRAC, published by the University of Melbourne, and now available free as a PDF.</em></p><img src="https://counter.theconversation.com/content/49139/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Justin Zobel does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>It may have been big, slow and lacking in much memory but almost seven decades on we have a lot to thank the creators of Australia’s first programmable computing machine.Justin Zobel, Head, Department of Computing & Information Systems, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/474092015-09-21T20:10:47Z2015-09-21T20:10:47ZWant your kids to learn another language? Teach them code<figure><img src="https://images.theconversation.com/files/95484/original/image-20150921-19274-1sds0yq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Coding: it's just another language to learn at school.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/colegioabg/16162384887/">Flickrabg_colegio</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>Among <a href="http://www.malcolmturnbull.com.au/media/transcript-vote-on-the-liberal-party-leadership">Malcolm Turnbull’s first words</a> as the newly elected leader of the Liberal Party, and hence heading for the Prime Minister’s job, were: “The Australia of the future has to be a nation that is agile, that is innovative, that is creative.”</p>
<p>And near the heart of the matter is the code literacy movement. This is a movement to <a href="https://theconversation.com/an-education-for-the-21st-century-means-teaching-coding-in-schools-42046">introduce all school children</a> to the concepts of coding computers, starting in primary school.</p>
<p>One full year after the computing curriculum was introduced by the UK government, a survey there found that six out of ten parents want their kids to learn <a href="http://www.ocadogroup.com/news-and-media/news-centre/2015/20150901python.aspx">a computer language instead of French</a>.</p>
<h2>The language of code</h2>
<p>The language comparison is interesting because computer languages are, first and foremost, languages. They are analogous to the written versions of human languages but simpler, requiring expressions without ambiguity.</p>
<p>They have a defining grammar. They come with equivalent dictionaries of nouns, verbs, adjectives and adverbs; with prepositions and phrase patterns, conjunctions, conditionals and clauses. Of course the dictionaries are less extensive than those of human languages, but the pattern-rendering nature of the grammars has much the same purpose.</p>
<p>Kids who code gain a good appreciation of computational thinking and logical thought, which helps them develop critical thinking skills. I’ve sometimes heard the term “<a href="http://dictionary.reference.com/browse/language+lawyer">language lawyer</a>” used as a euphemism for a pedantic programmer. Code literacy is good for their life-skills kit, never mind their career prospects.</p>
<p><a href="https://scratch.mit.edu/">Scratch</a> is one of a new generation of block programming languages aimed at teaching novices and kids as young as eight or nine to write code.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=285&fit=crop&dpr=1 600w, https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=285&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=285&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=358&fit=crop&dpr=1 754w, https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=358&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/95477/original/image-20150921-19299-uxazqe.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=358&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Scratch teaches code with movable instruction blocks.</span>
<span class="attribution"><a class="source" href="https://studio.code.org/hoc/15">Screenshot from code.org</a></span>
</figcaption>
</figure>
<p>The Scratch language uses coloured blocks to represent the set of language constructs in its grammar. A novice programmer can build up a new program by dragging-and-dropping from a palette of these blocks onto a blank canvas or workspace.</p>
<p>The individual shapes of the blocks are puzzle-like, such that only certain pieces can interlock. This visually enforces the grammar, allowing the coder to concentrate on the creativeness of their whole program.</p>
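<p>In text form, such a block script is just nested data. The sketch below is illustrative only – real Scratch is visual, and the block names here are invented for the example – but the nesting mirrors how the block shapes interlock: a “repeat” block can only enclose a list of command blocks, never a bare value.</p>

```python
# A Scratch-like script represented as nested data (an illustrative
# sketch; these block names are hypothetical, not real Scratch ones).

script = [
    ("when_flag_clicked",),
    ("repeat", 4, [            # an enclosing block wraps inner blocks
        ("move_steps", 100),
        ("turn_degrees", 90),
    ]),
]

def total_steps(blocks):
    """Walk the nested structure and add up how far the sprite moves."""
    steps = 0
    for block in blocks:
        if block[0] == "move_steps":
            steps += block[1]
        elif block[0] == "repeat":
            steps += block[1] * total_steps(block[2])
    return steps

total_steps(script)  # 400: four sides of a square, 100 steps each
```

<p>The puzzle-piece shapes do for young coders what the nested brackets do here: they make ungrammatical combinations impossible to express.</p>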
<p>The Scratch language and its derivatives are embedded in a number of different tools and websites, each dedicated to a particular niche of novice programmers. The <a href="https://code.org/">code.org</a> website is a prime example: it has a series of exercises using the block language to teach the fundamentals of computer science. </p>
<p>Code.org is a non-profit used by 6 million students, 43% of whom are female. It runs the <a href="https://hourofcode.com/au/en">Hour of Code</a> events each year, a global effort to get novices to try at least an hour of coding.</p>
<p>For a week in May this year, Microsoft Australia partnered with Code.org to run the <a href="https://news.microsoft.com/en-au/2015/05/15/microsoft-inspires-australian-students-to-start-computer-coding-now/">#WeSpeakCode</a> event, teaching coding to <a href="http://www.abc.net.au/news/2015-05-15/hundreds-of-students-learn-coding/6473344">more than 7,000 young Australians</a>. My local primary school in Belgrave South in Victoria is using Code.org successfully with grade 5 and 6 students.</p>
<p>Unlike prose in a human language, computer programs are most often interactive. The Scratch example in the screenshot above uses graphics from the popular <a href="http://www.popcap.com/games/plants-vs-zombies/online">Plants vs Zombies</a> game, one most kids have already played. They get to program some basic mechanics of what looks a little like the game.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=510&fit=crop&dpr=1 600w, https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=510&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=510&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=641&fit=crop&dpr=1 754w, https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=641&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/95493/original/image-20150921-22462-1okbuh7.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=641&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Hit the ‘Show Code’ button and it reveals the JavaScript language behind the coloured blocks.</span>
<span class="attribution"><a class="source" href="https://studio.code.org/hoc/15">Screenshot from code.org</a></span>
</figcaption>
</figure>
<p>But code.org has a ‘Show Code’ button that reveals the JavaScript code generated behind the coloured blocks (see above). This shows novices what they built with the blocks, translated into the formal syntax of a programming language widely used in industry.</p>
<h2>It’s not all about the ICT industry</h2>
<p>Both parents and politicians with an eye to the future see the best jobs as the creative ones. A forward-thinking nation should do more than dig up rocks, import, consume and service.</p>
<p>But teaching kids to code is not all about careers in computer programming, science and <a href="https://theconversation.com/a-bit-of-coding-in-school-may-be-a-dangerous-thing-for-the-it-industry-42259">software engineering</a>. Introducing young minds to the process of instructing a computer lets them go from “I swiped this” to “I made this” – from watching YouTube stars to showing schoolyard peers how they made their pet cat photo meow.</p>
<p>It opens up young minds to the creative aspects of programming. It widens not only the cohort who may go on to study computer science or another information and communications technology (ICT) profession, but also feeds into design, the creative arts, and fields of endeavour yet to emerge or be disrupted.</p>
<p>For most kids, teaching them to code is about opening their mind to a means to an end, not necessarily the end in itself.</p><img src="https://counter.theconversation.com/content/47409/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Steve Goschnick has received research funding from the Australian Research Council, Ericsson Australia Ltd (1998-2000), The University of Melbourne, and a Telstra Broadband Development Grant (2004). He has been the Managing Director of Solid Software Pty Ltd, a data modelling and software development consultancy, since 1998. </span></em></p>Computer coding should be thought of as teaching children another language. If they get the basics right at an early age, who knows where their new-found language skills can take them.Steve Goschnick, Adjunct Professor, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/458492015-08-13T05:56:33Z2015-08-13T05:56:33ZBig data algorithms can discriminate, and it’s not clear what to do about it<figure><img src="https://images.theconversation.com/files/91621/original/image-20150812-18104-12ryumv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's all just data – how can it be prejudiced?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/treyguinn/255880566">Trey Guinn</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><blockquote>
<p>“This program had absolutely nothing to do with race…but multi-variable equations.”</p>
</blockquote>
<p>That’s what Brett Goldstein, a former policeman for the Chicago Police Department (CPD) and current Urban Science Fellow at the University of Chicago’s School for Public Policy, <a href="http://www.ft.com/cms/s/0/200bebee-28b9-11e4-8bda-00144feabdc0.html">said</a> about a <a href="http://mathbabe.org/2014/08/25/gilian-tett-gets-it-very-wrong-on-racial-profiling/">predictive policing algorithm</a> he deployed at the CPD in 2010. His algorithm tells police where to look for criminals based on where people have been arrested previously. It’s a “heat map” of Chicago, and the CPD claims it helps them allocate resources more effectively.</p>
<p>Chicago police also recently collaborated with Miles Wernick, a professor of electrical engineering at Illinois Institute of Technology, to algorithmically generate a “<a href="http://directives.chicagopolice.org/directives-mobile/data/a7a57bf0-13fa59ed-26113-fa63-2e1d9a10bb60b9ae.html?ownapi=1">heat list</a>” of 400 individuals it claims have the <a href="http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist">highest chance of committing a violent crime</a>. In response to criticism, Wernick said the algorithm does not use “any racial, neighborhood, or other such information” and that the approach is “unbiased” and “quantitative.” By deferring decisions to poorly understood algorithms, industry professionals effectively shed accountability for any negative effects of their code.</p>
<p>But do these algorithms discriminate, treating low-income and black neighborhoods and their inhabitants unfairly? It’s the kind of question many researchers are starting to ask as more and more industries use algorithms to make decisions. It’s true that an algorithm itself is quantitative – it boils down to a sequence of arithmetic steps for solving a problem. The danger is that these algorithms, which are trained on data produced by people, may reflect the biases in that data, perpetuating structural racism and negative biases about minority groups.</p>
<p>There are a lot of challenges to figuring out whether an algorithm embodies bias. First and foremost, many practitioners and “computer experts” still don’t publicly admit that algorithms can easily discriminate. <a href="http://doi.org/10.1515/popets-2015-0007">More</a> and <a href="http://www.eurekalert.org/pub_releases/2015-04/uow-wac040915.php">more</a> evidence <a href="http://www.nationaljournal.com/tech/feds-investigate-discrimination-by-algorithm-20140915">supports</a> that not only is this possible, but it’s happening already. The law is unclear on the legality of biased algorithms, and even algorithms researchers don’t precisely understand what it means for an algorithm to discriminate.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/91623/original/image-20150812-18074-1xz8yn7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Is bias baked in?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/hieronymus/1481344016">Justin Ruckman</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Being quantitative doesn’t protect against bias</h2>
<p>Both Goldstein and Wernick claim their algorithms are fair by appealing to two things. First, the algorithms aren’t explicitly fed protected characteristics such as race or neighborhood as an attribute. Second, they say the algorithms aren’t biased because they’re “quantitative.” Their argument is an appeal to abstraction. Math isn’t human, and so the use of math can’t be immoral.</p>
<p>Sadly, Goldstein and Wernick are repeating a common misconception about data mining, and mathematics in general, when it’s applied to social problems. The entire purpose of data mining is to discover hidden correlations. So if race is disproportionately (but not explicitly) represented in the data fed to a data-mining algorithm, the algorithm can infer race and use race indirectly to make an ultimate decision.</p>
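<p>The mechanism is easy to reproduce. The following is a minimal, purely illustrative Python sketch – the zip codes, group labels and 90/10 skew are all invented – showing how a rule that never sees the protected attribute still singles out one group, because a correlated proxy carries the same information:</p>

```python
import random

random.seed(0)

# Hypothetical synthetic data: "zip" is never labelled as race, but it is
# strongly correlated with a hidden protected attribute ("group").
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    if group == "A":
        zip_code = "60601" if random.random() < 0.9 else "60653"
    else:
        zip_code = "60653" if random.random() < 0.9 else "60601"
    people.append({"group": group, "zip": zip_code})

# A "race-blind" rule that only ever looks at zip code...
flagged = [p for p in people if p["zip"] == "60653"]

# ...still concentrates its decisions on one group.
share_b = sum(p["group"] == "B" for p in flagged) / len(flagged)
print(f"{share_b:.0%} of flagged people are in group B")
```

<p>No step in this sketch references the protected attribute, yet roughly nine in ten flagged people belong to one group – exactly the indirect inference described above.</p>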
<p>Here’s a simple example of the way algorithms can produce biased outcomes based on what they learn from the people who use them. Look at how Google search suggests finishing a query that starts with the phrase “transgenders are”:</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=224&fit=crop&dpr=1 600w, https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=224&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=224&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=282&fit=crop&dpr=1 754w, https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=282&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/91631/original/image-20150812-18108-p6j6xq.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=282&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Taken from Google.com on 2015-08-10.</span>
</figcaption>
</figure>
<p>Autocomplete features are generally a tally. Count up all the searches you’ve seen and display the most common completions of a given partial query. While most algorithms might be neutral on their face, they’re designed to find trends in the data they’re fed. Carelessly trusting an algorithm allows dominant trends to cause harmful discrimination, or at least produce distasteful results. </p>
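<p>A tally of this kind takes only a few lines. The sketch below (with made-up queries) counts past searches and returns the most common completions of a prefix – whatever biases the query log contains pass straight through to the suggestions:</p>

```python
from collections import Counter

# Minimal sketch of a tally-based autocomplete (queries are invented).
past_queries = [
    "how to bake bread", "how to bake bread", "how to bake cookies",
    "how to tie a tie", "how to bake bread",
]

counts = Counter(past_queries)

def suggest(prefix, k=3):
    """Return the k most frequent past queries starting with prefix."""
    matches = Counter({q: n for q, n in counts.items() if q.startswith(prefix)})
    return [q for q, _ in matches.most_common(k)]

print(suggest("how to bake"))  # most common completions first
```

<p>The algorithm itself is just counting; any skew in the output is a direct reflection of the queries it was fed.</p>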
<p>Beyond biased data, such as Google autocompletes, there are other pitfalls, too. Moritz Hardt, a researcher at Google, describes what he calls the <a href="https://medium.com/@mrtz/how-big-data-is-unfair-9aa544d739de">sample size disparity</a>. The idea is as follows. If you want to predict, say, whether an individual will click on an ad, most algorithms optimize to reduce error based on the previous activity of users.</p>
<p>But if a small fraction of users consists of a racial minority that tends to behave in a different way from the majority, the algorithm may decide it’s better to be <em>wrong</em> for all the minority users and lump them in the “error” category in order to be more accurate on the majority. So an algorithm with 85% accuracy on US participants could err on the entire black sub-population and still seem very good. </p>
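<p>The arithmetic behind that 85% figure is worth making explicit. In this hypothetical split (the population sizes and error rates are invented for illustration), the model is wrong for every member of a 10% minority yet still reports a high headline accuracy:</p>

```python
# Hypothetical numbers: a model wrong on an entire minority subgroup
# can still post a high overall accuracy.
majority, minority = 9_000, 1_000           # 10% minority population

correct_majority = 0.94 * majority          # accurate on the majority...
correct_minority = 0.00 * minority          # ...wrong on every minority user

overall_accuracy = (correct_majority + correct_minority) / (majority + minority)
print(f"overall accuracy: {overall_accuracy:.1%}")  # 84.6%
```

<p>The headline number hides the fact that one subgroup receives a 0% accuracy rate.</p>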
<p>Hardt continues to say it’s hard to determine why data points are erroneously classified. Algorithms rarely come equipped with an explanation for why they behave the way they do, and the easy (and dangerous) course of action is not to ask questions.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/91653/original/image-20150812-18071-14n98nl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Those smiles might not be so broad if they realized they’d be treated differently by the algorithm.</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-233268265/stock-photo-two-colleagues-discussing-data-on-laptop.html">Men image via www.shutterstock.com</a></span>
</figcaption>
</figure>
<h2>Extent of the problem</h2>
<p>While researchers clearly understand the theoretical dangers of algorithmic discrimination, it’s difficult to cleanly measure the scope of the issue in practice. No company or public institution is willing to publicize its data and algorithms for fear of being labeled racist or sexist, or maybe worse, having a great algorithm stolen by a competitor.</p>
<p>Even when the Chicago Police Department was <a href="http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist">hit with a Freedom of Information Act request</a>, it did not release its algorithms or heat list, claiming a credible threat to police officers and to the people on the list. This makes it difficult for researchers to identify problems and potentially provide solutions.</p>
<h2>Legal hurdles</h2>
<p>Existing discrimination law in the United States isn’t helping. At best, it’s unclear on how it applies to algorithms; at worst, it’s a mess. Solon Barocas, a postdoc at Princeton, and Andrew Selbst, a law clerk for the Third Circuit US Court of Appeals, <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2477899">argued together</a> that US hiring law fails to address claims about discriminatory algorithms in hiring.</p>
<p>The crux of the argument is called the “business necessity” defense, in which the employer argues that a practice that has a discriminatory effect is justified by being directly related to job performance. According to Barocas and Selbst, if a company algorithmically decides whom to hire, and that algorithm is blatantly racist but even mildly successful at predicting job performance, this would count as business necessity – and not as illegal discrimination. In other words, the law seems to <em>support</em> using biased algorithms.</p>
<h2>What <em>is</em> fairness?</h2>
<p>Maybe an even deeper problem is that nobody has agreed on what it means for an algorithm to be fair in the first place. Algorithms are mathematical objects, and mathematics is far more precise than law. We can’t hope to design fair algorithms without the ability to precisely demonstrate fairness mathematically. A good mathematical definition of fairness will model biased decision-making in any setting and for any subgroup, not just hiring bias or gender bias. </p>
<p>And fairness seems to have two conflicting aspects when applied to a population versus an individual. For example, say there’s a pool of applicants to fill 10 jobs, and an algorithm decides to hire candidates completely at random. From a population-wide perspective, this is as fair as possible: all races, genders and orientations are equally likely to be selected. </p>
<p>But from an individual level, it’s as unfair as possible, because an extremely talented individual is unlikely to be chosen despite their qualifications. On the other hand, hiring based only on qualifications reinforces hiring gaps. Nobody knows if these two concepts are inherently at odds, or whether there is a way to define fairness that reasonably captures both. Cynthia Dwork, a Distinguished Scientist at Microsoft Research, and her colleagues <a href="http://arxiv.org/abs/1104.3913">have been studying</a> the relationship between the two, but even Dwork admits they have just <a href="http://www.nytimes.com/2015/08/11/upshot/algorithms-and-bias-q-and-a-with-cynthia-dwork.html">scratched the surface.</a></p>
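<p>The tension between the two notions can be seen in a toy simulation (all numbers invented): a lottery mirrors the population’s make-up among hires but almost never picks the single best candidate, while pure merit ranking guarantees the top scorer a job – and faithfully reproduces whatever gaps the scores encode:</p>

```python
import random

random.seed(1)

# Toy candidate pool: a group label and a qualification score.
candidates = [{"group": random.choice("AB"), "score": random.random()}
              for _ in range(1_000)]

# Population-fair: hire 10 completely at random. Group shares among
# hires mirror the population, but any one talented individual has
# only a 1% chance of selection.
lottery = random.sample(candidates, 10)

# Individual-fair: hire the 10 highest scorers. The best candidate is
# guaranteed a job, but score gaps between groups are reproduced exactly.
merit = sorted(candidates, key=lambda c: c["score"], reverse=True)[:10]

best = max(c["score"] for c in candidates)
print("top scorer hired by merit rule:", merit[0]["score"] == best)
print("top scorer hired by lottery:  ", any(c["score"] == best for c in lottery))
```

<p>Neither rule is obviously “the” fair one, which is exactly the open question Dwork and her colleagues are studying.</p>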
<h2>Get companies and researchers on the same page</h2>
<p>There are immense gaps on all sides of the algorithmic fairness issue. When a panel of experts at this year’s <a href="http://www.fatml.org">Workshop on Fairness, Accountability, and Transparency in Machine Learning</a> was asked what the low-hanging fruit was, they <a href="http://blog.geomblog.org/2015/07/the-2nd-workshop-on-fairness-accuracy.html">struggled</a> to find an answer. My opinion is that if we want the greatest progress for the least amount of work, then businesses should start sharing their data with researchers. Even with <a href="http://www.fatml.org/resources.html">proposed “fair” algorithms</a> starting to appear in the literature, without well-understood benchmarks we can’t hope to evaluate them fairly.</p><img src="https://counter.theconversation.com/content/45849/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jeremy Kun does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Math isn’t prejudiced, goes the argument. But these arithmetic programs can learn bias from the data fed into them by human beings, leading to unfair treatment and discrimination.Jeremy Kun, PhD Student in Mathematics, University of Illinois ChicagoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/427382015-06-11T11:07:05Z2015-06-11T11:07:05ZWithout teacher guidance, all the tech in the world will be quite useless<figure><img src="https://images.theconversation.com/files/84429/original/image-20150609-10720-14x7hpo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How can technology be harnessed to teach children in an effective way?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/fn-goa/4063885464/in/photolist-dZau8h-7c3Ldk-7c3HDr-7c7veN-7c3GgH-7c7tyL-7c7s8L-7c7unN-7c7sMo-7c7zvY-7c7wSY-7c7yzY-7c7rtC-94EcJg-6T2Wt9-9bBdFG-8nyNVd-3oxYh-9NELbX-5A6y2h-5mP2XB-5mThfW-7Mxqhz-7MBpFo-7wigCB-rcd28x-8Rz6N3-6xd89E-6vSCa-ttunps-68xxks-bNpR4v-8mVc4V-6xcFLL-8TntHw">Frederick Noronha fredericknoronha1@gmail.com</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>A couple of years ago, I taught an afterschool class at a Seattle nonprofit, the <a href="http://www.seattlefoundation.org/npos/Pages/TechnologyAccessFoundation.aspx">Technology Access Foundation (TAF),</a> which provides STEM education (science, technology, engineering, math) to children from less-privileged backgrounds. My students were 8-11 years old, and it was the first time that I had taught elementary school students.</p>
<p>The curriculum devised by TAF’s staff involves hands-on interaction with laptops to explore programming, robotics and audio editing. With a PhD in computer science and a range of experience teaching older students, I thought it would be easy. </p>
<p>It was anything but. </p>
<p>To allow students a lot of interaction with their devices, I avoided lectures and instead had the students work on their own while I went from table to table to help them individually. My hope was to give the children a chance to learn at their own pace.</p>
<p>The students, however, had other ideas. The minute I turned my attention to one, the others started playing video games. However nutritious the syllabus, they were drawn to the cognitive candy of flashy graphics and sound effects.</p>
<p>The problem I faced at TAF was a small version of the conundrum that confronts parents and schools everywhere: how do we prepare children for a technological world while avoiding the distractions of technology?</p>
<h2>Diversions in India</h2>
<p>I first encountered this problem about a decade ago in India. At the time, I was the head of a research team at <a href="http://research.microsoft.com/en-us/collaboration/global/india/">Microsoft Research</a> in Bangalore. My group explored ways in which computing technology could support poor communities. Education was one of our focuses.</p>
<p>Many Indian government schools boasted computer labs, but given limited funds, they often had no more than five or six PCs. With class sizes of 40 or more, this inevitably meant that crowds of children would huddle around each machine, with most of them unable to access the mouse or keyboard.</p>
<p>We tried an innovation in which a single PC was outfitted with multiple mice, each with an attendant cursor on screen. This customized educational software, called MultiPoint, allowed several students to interact simultaneously.</p>
<p>MultiPoint was a hit with students. <a href="http://dl.acm.org/citation.cfm?doid=1240624.1240864">A controlled trial</a> showed that for some exercises, students could learn as much when they were sitting five to a PC as when they had a PC all to themselves.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/84453/original/image-20150609-10675-dlp7ee.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A Technology Access Foundation student sneaking in a video game during an afterschool programming class.</span>
<span class="attribution"><span class="source">Kentaro Toyama</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>However, when we tried to take the idea to other schools, we were stymied. </p>
<p>One problem we often encountered was that teachers would be overwhelmed with the mechanics of the technology. Without a dedicated IT staff or significant training themselves, they’d spend the first 15-20 minutes of a 50-minute class fiddling with the PCs to set them up. </p>
<p>Whatever the technology’s potential, in actuality, time was diverted from learning. </p>
<h2>Technology’s law of amplification</h2>
<p>Similar things happened in dozens of other projects we ran in <a href="http://dl.acm.org/citation.cfm?id=2369220.2369248">education</a>, <a href="http://itidjournal.org/itid/article/view/327">agriculture</a>, <a href="http://dl.acm.org/citation.cfm?id=2369253">healthcare</a> and so on. Despite our best efforts at good design, computing technology did not, in and of itself, lower costs, improve pedagogy, or make organizations more efficient.</p>
<p>Teachers didn’t improve just by using digital content; administrators didn’t become better managers through clever gadgets; and budgets didn’t grow with the use of supposedly cost-saving machines. </p>
<p>Anurag Behar, CEO of a nonprofit we worked with, <a href="http://www.livemint.com/Opinion/Y3Rhb5CXMkGuUIyg4nrc3I/Limits-of-ICT-in-education.html">put it</a> succinctly: </p>
<blockquote>
<p>“At its best, the fascination with [digital technology] as a solution distracts from the real issues.”</p>
</blockquote>
<p>Contrary to Silicon Valley hype, machines don’t add a fixed benefit wherever they’re used. Instead, <a href="http://dl.acm.org/citation.cfm?id=1940772">technology amplifies underlying human forces</a> – the unproductive ones as much as the beneficial ones. My book, <a href="http://geekheresy.org">Geek Heresy: Rescuing Social Change from the Cult of Technology</a>, explains in detail why technology by itself doesn’t solve deep social problems. </p>
<p>Other researchers have found a similar pattern. University of California, Irvine researcher <a href="http://education.uci.edu/person/warschauer_m/warschauer_m_bio.php">Mark Warschauer</a>, along with colleagues <a href="http://www.montclair.edu/profilepages/view_profile.php?username=knobelm">Michele Knobel</a> and Leeann Stone, sums up this challenge in <a href="http://epx.sagepub.com/content/18/4/562.short">his paper</a>:</p>
<blockquote>
<p>Placing computers and internet connections in [low-income] schools, in and of itself, does little to address the serious educational challenges faced by these schools. To the extent that an emphasis on provision of equipment draws attention away from other important resources and interventions, such an emphasis can in fact be counterproductive.</p>
</blockquote>
<p>In other words, while digital tools can augment the efforts of a well-run learning environment, they harm dysfunctional schools by distracting them from their goals. </p>
<p>The amplification principle also applies at the individual level.</p>
<p>Children have both a drive to learn and an affinity for quick rewards – digital aids amplify both. Few people would imagine that children left in a room with an encyclopedia and enticing toys (even educational ones) could, on their own, summit the intellectual mountain that is a K-12 education. </p>
<p>Handing students a computing device and expecting them to teach themselves is the virtual equivalent of being left in such a room. Rigorous research by <a href="http://economics.ucsc.edu/faculty/singleton.php?&singleton=true&cruz_id=rfairlie">economists Robert Fairlie</a> and <a href="http://people.ucsc.edu/%7Ejmrtwo/">Jonathan Robinson</a> finds that laptops provided free to students result in no educational gains of any kind. </p>
<p>In other words, while technology can amplify good pedagogy, there is no way around quality adult guidance for real learning.</p>
<h2>People first, technology second</h2>
<p>At TAF, I was lucky to have a good manager and several terrific teachers as role models. They recommended that I set some rules. For example, I asked students to close their screens any time I was doing a demonstration. I prohibited free time with the laptops if they came early, so that they wouldn’t start off with games. And anyone caught playing video games during class was sent to my manager for a few words of discipline. </p>
<p>Implementing these rules was a challenge at first, but young children are mercifully responsive to firm adult direction. Within a couple of classes, the students got used to the new class culture, and they started focusing on the learning activities. </p>
<p>What I learned was that even in a class about computers, maximizing screen time wasn’t the goal. The first requirement is the proper mindset – focused motivation in students and capable adult supervision. </p>
<p>If technology amplifies human forces, then a good outcome with technology requires that the right human forces be in place first.</p><img src="https://counter.theconversation.com/content/42738/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Kentaro Toyama is affiliated with Digital Green (board chair), Village Health Works (board member), Humanosphere (board member), Innovations for Poverty Action(board member), Grameen Foundation (advisory board), IICD (advisory board).</span></em></p>How can we prepare children for a tech world while fighting the distractions it inevitably brings?Kentaro Toyama, Associate Professor, Technology and Global Development, University of MichiganLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/379612015-03-09T18:50:21Z2015-03-09T18:50:21ZTo stop the machines taking over we need to think about fuzzy logic<figure><img src="https://images.theconversation.com/files/74030/original/image-20150306-3317-w7ieka.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A model of the Terminator from the popular movie series where machines take over the world.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/shutterjunkie/3877277138">Flickr/Edwin Montufar</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span></figcaption></figure><p>Amid all the <a href="https://theconversation.com/is-stephen-hawking-right-could-ai-lead-to-the-end-of-humankind-34967">dire warnings</a> that machines run by artificial intelligence (<a href="https://theconversation.com/au/topics/artificial-intelligence">AI</a>) will one day take over from humans we need to think more about how we program them in the first place.</p>
<p>The technology may be too far off to seriously entertain these worries – for now – but much of the distrust surrounding AI arises from misunderstandings in what it means to say a machine is “thinking”.</p>
<p>One of the current aims of AI research is to design machines, algorithms, input/output processes or mathematical functions that can mimic human thinking as much as possible.</p>
<p>We want to better understand what goes on in human thinking, especially when it comes to decisions that cannot be justified other than by drawing on our “intuition” and “gut-feelings” – the decisions we can only make after learning from experience.</p>
<p>Consider the manager who hires you after first comparing you with other job applicants on work history, skills and presentation. This human-manager weighs those inputs and identifies the successful candidate.</p>
<p>If we can design a computer program that takes exactly the same inputs as the human-manager and reproduces the same outputs, then we can make inferences about what the human-manager really values, even if they cannot articulate their decision on whom to appoint other than to say “it comes down to experience”.</p>
<p>This kind of research is being <a href="http://www.sciencedirect.com/science/article/pii/S0165011406002612">carried out today</a> and applied to understand risk-aversion and risk-seeking behaviour of financial consultants. It’s also being looked at in the field of <a href="http://www.britannica.com/EBchecked/topic/400270/MYCIN">medical diagnosis</a>.</p>
<p>These human-emulating systems are not yet being asked to make decisions, but they are certainly being used to help guide human decisions and reduce the level of human error and inconsistency.</p>
<h2>Fuzzy sets and AI</h2>
<p>One promising area of research is to utilise the framework of <a href="https://www.calvin.edu/%7Epribeiro/othrlnks/Fuzzy/fuzzysets.htm">fuzzy sets</a>. Fuzzy sets and fuzzy logic were formalised by <a href="http://www.sciencedirect.com/science/article/pii/S1026309811000666">Lotfi Zadeh</a> in 1965 and can be used to mathematically represent our knowledge pertaining to a given subject.</p>
<p>In everyday language what we mean when accusing someone of “fuzzy logic” or “fuzzy thinking” is that their ideas are contradictory, biased or perhaps just not very well thought out.</p>
<p>But in mathematics and logic, “fuzzy” is a name for a research area that has quite a sound and straightforward basis.</p>
<p>The starting point for fuzzy sets is this: many decision processes that can be managed by computers traditionally involve truth values that are binary: something is true or false, and any action is based on the answer (in computing this is typically encoded by 0 or 1).</p>
<p>For example, our human-manager from the earlier example may say to human resources:</p>
<ul>
<li>IF the job applicant is aged 25 to 30</li>
<li>AND has a qualification in philosophy OR literature</li>
<li>THEN arrange an interview.</li>
</ul>
<p>This information can all be written into a hiring algorithm based on true or false answers, because an applicant either is aged between 25 and 30 or is not, and either has the qualification or does not.</p>
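The crisp version of this rule is easy to encode. A minimal sketch in Python (the function name and the set of qualifications are illustrative, not taken from any real hiring system):

```python
def arrange_interview(age: int, qualifications: set) -> bool:
    """Binary hiring rule: every condition is simply true or false,
    so the outcome is a crisp yes/no."""
    right_age = 25 <= age <= 30  # IF the applicant is aged 25 to 30
    # AND has a qualification in philosophy OR literature
    right_degree = bool({"philosophy", "literature"} & qualifications)
    return right_age and right_degree  # THEN arrange an interview

print(arrange_interview(27, {"philosophy"}))  # True
print(arrange_interview(27, {"chemistry"}))   # False
```

There is no middle ground here: a 24-year-old with two philosophy degrees is rejected just as firmly as someone with no qualifications at all.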
<p>But what if the human-manager is somewhat more vague in expressing their requirements? Instead, the human-manager says:</p>
<ul>
<li>IF the applicant is tall</li>
<li>AND attractive</li>
<li>THEN the salary offered should be higher.</li>
</ul>
<p>The problem HR faces in encoding these requests into the hiring algorithm is that it involves a number of subjective concepts. Even though height is something we can objectively measure, how tall should someone be before we call them tall?</p>
<p>Attractiveness is also subjective, even if we only account for the taste of the single human-manager.</p>
<h2>Grey areas and fuzzy sets</h2>
<p>In fuzzy sets research we say that such characteristics are fuzzy. By this we mean that whether something belongs to a set or not, whether a statement is true or false, can gradually increase from 0 to 1 over a given range of values.</p>
<p>One of the hardest things in any fuzzy-based software application is deciding how best to convert observed inputs (someone’s height) into a fuzzy degree of membership, and then establishing the rules governing connectives such as AND and OR for that fuzzy set.</p>
<p>To this day, and likely for years or decades to come, the rules for this transition are human-defined. For example, to specify how tall someone is, I could design a function that says a 190cm person is tall (with a truth value of 1) and a 140cm person is not tall (or tall with a truth value of 0).</p>
<p>Then from 140cm, for every increase of 5cm in height the truth value increases by 0.1. So a key feature of any AI system is that we, normal old humans, still govern all the rules concerning how values or words are defined. More importantly, we define all the actions that the AI system can take – the “THEN” statements. </p>
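That membership function, together with Zadeh’s original connectives (AND as minimum, OR as maximum – one common convention among several), can be sketched in a few lines of Python:

```python
def tall(height_cm: float) -> float:
    """Degree of membership in the fuzzy set 'tall': 0 at 140cm,
    rising by 0.1 for every 5cm, reaching 1 at 190cm."""
    return min(1.0, max(0.0, (height_cm - 140) / 50))

# Zadeh's classic fuzzy connectives: AND as minimum, OR as maximum.
def fuzzy_and(a: float, b: float) -> float:
    return min(a, b)

def fuzzy_or(a: float, b: float) -> float:
    return max(a, b)

print(tall(140))  # 0.0
print(tall(165))  # 0.5
print(tall(200))  # 1.0
```

Note that every number in this function – the 140cm floor, the 190cm ceiling, the linear slope between them – is a human design decision, which is exactly the point the article makes.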
<h2>Human–robot symbiosis</h2>
<p>An area called <a href="http://www.cs.berkeley.edu/%7Ezadeh/papers/What%20Computing%20with%20Words%20Means%20to%20Me-CIM%202010.pdf">computing with words</a>, takes the idea further by aiming for seamless communication between a human user and an AI computer algorithm.</p>
<p>For the moment, we still need to come up with mathematical representations of subjective terms such as “tall”, “attractive”, “good” and “fast”. Then we need to design a function for combining such comments or commands, followed by another mathematical definition for turning the result we get back into an output like “yes he is tall”.</p>
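The round trip described here – a measurement in, a fuzzy degree through the middle, words back out – might look like the following sketch, with the output thresholds invented purely for illustration:

```python
def describe_height(height_cm: float) -> str:
    """Map a measurement to a fuzzy membership degree in 'tall'
    (0 at 140cm, 1 at 190cm), then map that degree back to words.
    The 0.8 and 0.4 cut-offs are illustrative, not standard values."""
    degree = min(1.0, max(0.0, (height_cm - 140) / 50))
    if degree >= 0.8:
        return "yes, he is tall"
    if degree >= 0.4:
        return "he is fairly tall"
    return "no, he is not tall"

print(describe_height(195))  # yes, he is tall
print(describe_height(150))  # no, he is not tall
```

Both mappings – number to degree, and degree to phrase – are again defined by a human, which is what makes genuine “computing with words” such a hard research goal.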
<p>In conceiving the idea of computing with words, researchers envisage a time where we might have more access to base-level expressions of these terms, such as the brain activity and readings when we use the term “tall”.</p>
<p>This would be an amazing leap, although mainly in terms of the technology required to observe such phenomena (the number of neurons in the brain, let alone synapses between them, is somewhere near the number of galaxies in the universe).</p>
<p>Even so, designing machines and algorithms that can emulate human behaviour to the point of mimicking communication with us is still a long way off.</p>
<p>In the end, any system we design will behave as it is expected to, according to the rules we have designed and the program that governs it.</p>
<h2>An irrational fear?</h2>
<p>This brings us back to the big fear of AI machines turning on us in the future.</p>
<p>The real danger is not in the birth of genuine artificial intelligence – that we will somehow manage to create a program that can become self-aware such as HAL 9000 in the movie 2001: A Space Odyssey or Skynet in the Terminator series.</p>
<p>The real danger is that we make errors in encoding our algorithms or that we put machines in situations without properly considering how they will interact with their environment.</p>
<p>These risks, however, are the same that come with any human-made system or object.</p>
<p>So if we were to entrust, say, the decision to fire a weapon to AI algorithms (rather than just the guidance system), then we might have something to fear.</p>
<p>Not a fear that these intelligent weapons will one day turn on us, but rather that we programmed them – given a series of subjective options – to decide the wrong thing and turn on us.</p>
<p>Even if there is some uncertainty about the future of “thinking” machines and what role they will have in our society, a sure thing is that we will be making the final decisions about what they are capable of.</p>
<p>When programming artificial intelligence, the onus is on us (as it is when we design skyscrapers, build machinery, develop pharmaceutical drugs or draft civil laws), to make sure it will do what we really want it to.</p>
<p class="fine-print"><em><span>Simon James does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If machines run by artificial intelligence take over the world it’s only because we programmed them to do so. So how can fuzzy logic help us prevent that?Simon James, Lecturer in Mathematics, Deakin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/279482014-06-12T17:19:22Z2014-06-12T17:19:22ZHow the love of one teenager brought Tweetdeck to its knees<figure><img src="https://images.theconversation.com/files/50958/original/5jb7h36t-1402577304.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Not so tight Florian!</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/79586895@N00/948979876/sizes/o/">ladyb</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>TweetDeck, a Twitter app with millions of users, is back online after a rather surprising security scare. For several hours, the service was taken down all because a 19-year-old user tried to add a cute heart to his messages.</p>
<p>It seems that <a href="https://twitter.com/firoxl">Florian</a>, a budding young programmer from Austria, had run a small amount of code in the TweetDeck interface in an attempt to add a heart icon at the end of his tweets.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=327&fit=crop&dpr=1 600w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=327&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=327&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=411&fit=crop&dpr=1 754w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=411&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/50961/original/spv9dpfc-1402578181.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=411&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Florian reveals his discovery.</span>
<span class="attribution"><a class="source" href="https://twitter.com/firoxl">Twitter</a></span>
</figcaption>
</figure>
<p>Once Florian realised he had found a weakness in TweetDeck that would allow him to introduce a heart, he announced it triumphantly to the world. He says that he <a href="http://www.theregister.co.uk/2014/06/12/tweetdeck_xss_vuln_uncovered_by_heart_hunting_teenager/">tried to alert</a> Twitter, which owns the service, to the weakness but received no response.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=319&fit=crop&dpr=1 600w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=319&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=319&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=401&fit=crop&dpr=1 754w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=401&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/50966/original/ydj9fztk-1402580792.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=401&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A lot of retweets for @derGeruhn.</span>
<span class="attribution"><span class="source">Twitter</span></span>
</figcaption>
</figure>
<p>That worm sent out a line of code as a tweet from a Twitter account and caused tens of thousands of users to retweet it automatically without realising. The account behind the original tweet, @derGeruhn, is owned by a German student called <a href="http://www.washingtonpost.com/news/the-intersect/wp/2014/06/11/who-is-dergeruhn-the-twitter-account-that-40000-tweetdeck-users-just-involuntarily-retweeted/">Andy Perdana</a>. It’s not known whether he was deliberately involved or had his account hijacked.</p>
<p>TweetDeck picked up the tweet and retweeted it to anyone with the app open on their machine. It was then retweeted around 80,000 times, including by the BBC, which sent it to ten million followers.</p>
<p>It was just like the old days, when worms would infect systems and hog them to the point that they became unusable. In this case, Twitter stepped in, and switched off the function that allowed the messages to be retweeted.</p>
<h2>What’s up with the web?</h2>
<p>At the moment, security threats seem to emerge on some of the biggest sites every day, and this is at least in part due to how we run websites these days.</p>
<p>As more and more services are hosted in the cloud and more code is run on web servers, we are using HTML and JavaScript more than ever. In the past, software development teams would spend a great deal of time testing their programs to destruction to spot weaknesses, but HTML and JavaScript were never designed to be secure.</p>
<p>To make things worse, the teams writing web-based code often have little training in how to test their applications. Code that runs in Windows or Mac programs is rigorously tested, but code that runs on the web often is not. Programmers who test their own programs rarely exercise them in ways that will make them break, so they miss important problems.</p>
<p>The TweetDeck hack was about as simple as they come: it just exploited a flaw that had been overlooked in testing. As Florian himself pointed out, he should never have been allowed to introduce his loved-up code in the first place.</p>
<p>The code that runs on web servers is often quite messy, so security needs to be taught from day one. Software development teams must learn how to secure their code, especially by checking data input at the gate. They should know that users should never be allowed to add code without it being checked first.</p>
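As a minimal sketch of that “checking at the gate” – in Python for illustration, though TweetDeck itself runs JavaScript – escaping user-supplied text before it is embedded in a page turns any script tags it contains into harmless, displayed text. The function name and wrapper markup here are invented for the example:

```python
import html

def render_tweet(user_text: str) -> str:
    """Escape user-supplied text before embedding it in a page, so any
    markup it contains is displayed as text rather than executed
    by the browser."""
    return '<p class="tweet">' + html.escape(user_text) + "</p>"

payload = '<script>alert("worm")</script>'
print(render_tweet(payload))
# <p class="tweet">&lt;script&gt;alert(&quot;worm&quot;)&lt;/script&gt;</p>
```

The TweetDeck flaw was precisely the absence of a step like this: user input flowed into the page unescaped, so a tweet could carry code that every open client would run.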
<p>For some reason, we often don’t teach security to software developers – in particular, how to handle exceptions in user input or from external systems, and how to encrypt data. This lack of understanding often leads to passwords and user credentials not being stored in a secure way.</p>
<p>This has got to change. Luckily, in this case, there was no real damage done, but if a single teenager can prompt the collapse of one of the biggest names on the web, we should really be taking away a serious warning about security.</p>
<p class="fine-print"><em><span>Bill Buchanan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>TweetDeck, a Twitter app with millions of users, is back online after a rather surprising security scare. For several hours, the service was taken down all because a 19-year-old user tried to add a cute…Bill Buchanan, Head, Centre for Distributed Computing, Networks and Security, Edinburgh Napier UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/276852014-06-06T04:48:39Z2014-06-06T04:48:39ZSwift: how Apple’s new coding language lives up to its name<figure><img src="https://images.theconversation.com/files/50442/original/t68nkh6y-1402024164.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Developers will, according to Apple, be able to code faster and more efficiently than ever before, thanks to Swift.</span> <span class="attribution"><a class="source" href="http://www.flickr.com/photos/hackny/6890140478">HackNY.org/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>As Apple’s Worldwide Developers Conference (<a href="https://developer.apple.com/wwdc/">WWDC</a>) winds up in San Francisco today, 1,000 Apple engineers and 5,000 developers will return to their parts of the world armed with Apple’s own programming language.</p>
<p>In his keynote on Monday, Apple CEO Tim Cook unveiled – among other <a href="https://theconversation.com/what-apple-did-and-didnt-say-at-wwdc-2014-27446">new developments</a> – programming language <a href="https://developer.apple.com/swift/">Swift</a> and claimed it to be a significantly faster code for development across iOS and OSX.</p>
<p>Apple is the latest tech firm to produce its own programming language (Google and Microsoft also have their own) and Swift can be used by Apple developers as of today, with <a href="https://itunes.apple.com/us/book/the-swift-programming-language/id881256329?mt=11">677 pages of documentation</a> available in the iBooks store.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/l62x8Oq_QP4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A Swift programming demo at WWDC.</span></figcaption>
</figure>
<p>But why would a company want its own programming language – especially when existing, general-purpose languages such as <a href="https://developer.apple.com/library/mac/documentation/cocoa/conceptual/ProgrammingWithObjectiveC/Introduction/Introduction.html">Objective-C</a> and <a href="http://www.cprogramming.com/">C</a> have been used successfully for 20 years?</p>
<h2>So what’s so good about Swift?</h2>
<p>It pretty much comes down to speed.</p>
<p>While Apple (and other companies) supply the hardware, it is developers who ultimately draw the most value out of these technologies. The faster developers can code, the more apps can be created.</p>
<p>So let’s have a look at why Swift is the next big thing (and why developers should take the time to <a href="http://www.mobileinnovation-lab.com/">learn a new language</a>, as it were):</p>
<p><strong>Swift is much easier to code with.</strong> Swift looks much “cleaner” than traditional code. In addition to getting rid of nested brackets and semicolons (which make code look very complex and harder to maintain), programmers can now use <a href="https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/Swift_Programming_Language/TheBasics.html#//apple_ref/doc/uid/TP40014097-CH5-XID_418">inferred types</a>, which means that variables and constants can be declared without necessarily specifying the data type.</p>
<p>Developers can reduce the debugging time spent on mundane and trivial errors (if you’re interested in the nitty-gritty, Swift manages unsafe code by handling memory automatically, preventing overflows – in arrays, for example – and properly handling nil objects).</p>
<p>It also means that new developers can be spared the need to learn Objective-C’s complex and verbose syntax (though Swift will sit alongside existing Objective-C and C code).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/50444/original/gv4rj3cs-1402025300.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Young developers at WWDC with Apple CEO Tim Cook.</span>
<span class="attribution"><span class="source">Jimmy Ti</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p><strong>Swift is fast and powerful.</strong> Fast code is a key ingredient in Apple’s new hardware and software capabilities. Swift code is compiled using the same high-performance compiler and runs natively, combining the best features of Objective-C and C.</p>
<p>The presentation at WWDC included statistics showing complex algorithms running much faster in Swift than in Objective-C.</p>
<p><strong>Swift supports “interactive playgrounds”.</strong> These allow developers to immediately see the results of changing code and to keep track of progress timelines. This is particularly useful for debugging complex loops, algorithms and animations.</p>
<h2>Speaking of new developments …</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=760&fit=crop&dpr=1 600w, https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=760&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=760&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=955&fit=crop&dpr=1 754w, https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=955&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/50446/original/7dm33xb8-1402026046.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=955&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="http://www.flickr.com/photos/chealion/579303408">Michael J/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>As widely expected, Apple joins Google and Microsoft’s moves towards delivering health and home automation applications, as well as supporting stronger integration between native features (such as Siri and Notification View) and third-party apps and sensors.</p>
<p>The <a href="http://www.apple.com/ios/ios8/health/">Health</a> app joins Samsung’s Gear Fit, Nike and Fitbit to bring health and fitness data, measured by mobile and wearable devices, into our palms. </p>
<p>A new tool for developers called <a href="https://developer.apple.com/healthkit/">HealthKit</a> adds to the standard activity, heart rate and diet measurements by allowing developers to create third-party apps and sensors to measure factors such as blood pressure and sleep patterns. </p>
<p>Users can also create emergency cards with important health information such as allergies and blood types, accessible from the lock screen and emergency call screen. </p>
<p>Another development tool – <a href="https://developer.apple.com/homekit/">HomeKit</a> – will let us control aspects of our homes (such as lights and temperature) using our phones. </p>
<p>To enable natural interactions with our phones for home and health apps, iOS has evolved to allow Siri to be hands-free, similar to its Android counterpart, Google Now.</p>
<p>We could say: “Hey Siri, I’m ready for bed”, then the lights will automatically dim for sleep and the phone will go into “do not disturb” mode – perhaps even playing our favourite relaxing music. </p>
<p>With the introduction of Swift, we can expect to see more apps than ever – truly building upon Apple’s famous slogan, “There’s an app for that”.</p>
<p class="fine-print"><em><span>Dian Tjondronegoro is a member of IEEE and ACM.</span></em></p>As Apple’s Worldwide Developers Conference (WWDC) winds up in San Francisco today, 1,000 Apple engineers and 5,000 developers will return to their parts of the world armed with Apple’s own programming…Dian Tjondronegoro, Associate Professor of Mobile Multimedia, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/159682013-07-11T05:43:24Z2013-07-11T05:43:24ZKeep it creative to get kids into coding<figure><img src="https://images.theconversation.com/files/27250/original/f4jkh28d-1373468856.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Exciting times for school kids</span> <span class="attribution"><span class="source">Lupuca</span></span></figcaption></figure><p>A new subject is to be introduced in England to kick start our technological future. Instead of teaching ICT, the national curriculum published this week calls on schools to teach computing.</p>
<p>This new way of looking at information technology in schools embodies a significant shift that is taking place all over the world. There is much to welcome in the move to computing, which addresses <a href="http://www.guardian.co.uk/education/2012/mar/31/why-kids-should-be-taught-code">widely voiced</a> concerns that children are not learning the skills needed for an increasingly digital world. The intention is to empower children as creators of digital media rather than passive consumers.</p>
<p>Nevertheless, coding needs to be taught in a creative way to keep children interested and introducing it universally into classrooms needs to be approached with care. </p>
<p>As the electronics industry gets more powerful by the day, we have started to think of coding as inherently exciting and creative. Michael Gove had been extolling its virtues for a long time before finally introducing it into the curriculum this week.</p>
<p>Mathematics can also be exciting and creative, yet we know that many children would disagree. Making a robot move or an animation respond certainly can be an exciting experience. But for many, the subsequent steps of writing and debugging programs can be less so.</p>
<p>We currently have a “self-selection” bias: many of the children learning to code out of school have chosen to do so. Maintaining this excitement for all children as coding is translated into a compulsory curriculum subject will be a challenge. This is particularly pertinent when we consider how coding will be assessed. It is no secret that assessment procedures have diminished the potential for subjects such as maths to excite pupils.</p>
<p>Ultimately, the greatest influence in how children experience coding will be teachers. Being able to integrate coding creatively into the classroom will require confidence and a certain amount of subject knowledge. Yet it is not clear what support is on offer, particularly with the aim of introducing this curriculum in 2014. Whilst there are an increasing number of impressive learning tools such as <a href="http://scratch.mit.edu/">Scratch</a>, even these could lose their excitement if presented in a formulaic step-by-step style. It would be nice to see more creative solutions for how to support teachers with this major curriculum development, for example, drawing on the skills of older children.</p>
<p>With the emphasis on change, there also seems to be a lack of reflection on educational and research debates around programming from the past. For more than 30 years we have had tools such as <a href="http://logo.codeplex.com/">Logo</a> which offer children an accessible, creative environment in which to program instructions to an onscreen (and floor) robot. One of the questions raised was the extent to which children’s learning generalised to other areas – such as mathematics. This concern is highly relevant when considering the various claims we currently hear for programming. Who would not want to develop children’s logical reasoning, communication, thinking skills, literacy, or creativity? Yet there remains limited evidence that these broader abilities are developed through learning to code.</p>
<p>It would be interesting, therefore, to see how coding is related to other subject areas: <a href="http://ase.tufts.edu/devtech/publications/aera%20handout%20sequencing.pdf">research</a> has looked at the link to structuring a story, for example. This is particularly important because we don’t yet know how the programming skills to be taught in schools will actually be used by children in their future lives and careers. Indeed, an increasing number of tools, such as web creation packages, are available that allow us to be digitally creative without needing to know how to code. Coding, of course, does offer a powerful way to re-engineer these tools, yet we should continuously evaluate the need for this to be a universally taught skill. Here I draw on my own experience creating a <a href="http://plingtoys.com/magic_cloud.htm">digital product</a>. It was important to understand the principles of coding and to be able to communicate effectively with coders, but in truth, I am still hard pushed to create the simplest “hello world” program.</p>
<p>My greatest concern is that the wrong curriculum approach to coding could minimise its most appealing feature. Children are increasingly immersed in a programmed environment: from audio books and games to streetlights, taps and automatic doors. What is exciting is understanding how these digital artefacts have been designed to respond to the actions of those that use them. And what is empowering is the confidence to think about how these artefacts can be re-designed and told to respond differently to different actions. Computing as a subject has great potential to give children the understanding and confidence to think of changing the world around them from their earliest years. But this means that programming has to be meaningful. What a shame if this opportunity is lost through too great a focus on learning the procedural building blocks.</p>
<p class="fine-print"><em><span>Andrew Manches receives funding from Economic Social Research Council. He part owns and co-directs PlingToys Ltd.</span></em></p>A new subject is to be introduced in England to kick start our technological future. Instead of teaching ICT, the national curriculum published this week calls on schools to teach computing. This new way…Andrew Manches, ESRC Future Research Leader Fellow, The University of EdinburghLicensed as Creative Commons – attribution, no derivatives.