Information technology – The Conversation (feed updated 2024-02-27)

Maps shape our lives – showing us not just where we are, but who we are (2024-02-27)

<p>Maps and everyday life are now so intertwined for most people that it’s difficult to imagine a world without them. Most of us use at least one map every day. Some of us use many, especially now they have become one of the dominant interfaces of our digital society, alongside the scrolling screen, camera view and search engine.</p>
<p>We are also being mapped – subtly or overtly – through the GPS and location data traces we leave, the journeys we make, and the kinds of activities that we get up to as we go about our daily business.</p>
<p>Then there are the other, more analogue ways that maps are part of our lives: childhood pirate treasure maps and atlases that reveal a world ripe for adventure; maps on railway platforms or bike docking stations; and maps on the back of flyers posted through the door.</p>
<p>Maps have other, less practical uses too. They are proudly hung in our homes and offices, used to decorate things like coffee mugs and mouse pads, and even turned into <a href="https://www.bbc.co.uk/news/uk-england-york-north-yorkshire-36783325">fashion</a>.</p>
<p>Cartography has become one of the most successful technologies we have developed for understanding the world around us. At the same time, maps have become important cultural and artistic objects that we value greatly. They can be both useful and pragmatic, beautiful and poetic, political and powerful, meaningful as well as mundane.</p>
<h2>Shaping social and cultural life</h2>
<p>Over the last ten years, culminating in my book <a href="https://reaktionbooks.co.uk/work/all-mapped-out">All Mapped Out</a>, my work has led me to question what maps mean for people as they go about their daily lives, and in turn how maps shape their experiences.</p>
<p>Maps have received a lot of attention from researchers and industry over the years, mostly with the aim of producing the most accurate and usable map for a given purpose, or by studying how <a href="https://iai.tv/articles/maps-are-guided-by-power-not-truth-mike-duggan-auid-2703">powerful interests are reflected on maps</a>.</p>
<p>Professional cartographers, once working with pencil and paper and now with advanced geo-spatial technologies, aim to produce ever-more detailed maps for ever-more uses, while the sub-field of critical cartography has revealed that what ends up on a map <a href="https://acme-journal.org/index.php/acme/article/view/723">reflects the world views of their makers</a>.</p>
<p>But it is only relatively recently that work has <a href="https://www.routledge.com/The-Routledge-Handbook-of-Cartographic-Humanities/Rossetto-Lo-Presti/p/book/9781032355931">begun to explore</a> what they do to shape social and cultural life.</p>
<p>Maps and what we do with them cannot be defined universally. Ideals and ideas about maps frequently clash with the reality of how and why maps are used. By bringing together my own <a href="https://pure.royalholloway.ac.uk/en/publications/mapping-interfaces-an-ethnography-of-everyday-digital-mapping-pra">research</a> studying map users in London, and the work of others who have researched mapping practices around the world, I want to show how uses of maps are shaped by different cultures, communities, contexts and technology.</p>
<p>One way of exploring this is by looking at the impact that GPS technology has had on mapping our movements. Today, millions of people use this technology to reveal their exercise routines, which in turn supports an <a href="https://www.prnewswire.co.uk/news-releases/location-analytics-market-size-worth--49-12-billion-globally-by-2030-at-13-93-cagr-verified-market-research-301877524.html">industry worth billions</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/iWaZEXBbQL0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>But self-tracking is not just about maps and measurement. These maps become meaningful as objects inscribed with personal cartography. It feels good to see where we have been; it’s a sign that we have achieved something.</p>
<p>Some people have taken this further by using fitness-tracking devices as tools for artworks, harnessing their GPS functions to inscribe pictures and words on the map through movement across the land. <a href="https://www.nytimes.com/2022/09/24/technology/gps-art-strava-running.html">GPS art</a>, as it has come to be known, is growing in popularity as people realise self-tracking’s potential beyond mapping exercise for personal goals.</p>
<p>It began long before the proliferation of the smartphone and fitness-tracking apps, when in 2000 the artist <a href="http://www.gpsdrawing.com/">Jeremy Wood</a> set to work recording and mapping his movements using a handheld GPS device. This included tracing his daily travel and even recording his <a href="http://www.gpsdrawing.com/gallery/experiments/lawn/mowing.html">lawn-mowing routes through the seasons</a>. This reveals how a popular mapping technology – GPS – has many impacts beyond those it was intended for.</p>
<h2>Mapping contexts</h2>
<p>In my work there are several overlapping themes that chart how maps have become tied to culture and society. I want to do more than identify <a href="https://www.theatlantic.com/international/archive/2013/12/12-maps-that-changed-the-world/282666/">maps that have changed the world</a>, or lay out the history of <a href="https://www.uclpress.co.uk/products/108697">maps and society</a>. Instead, I want to show that all maps have the potential to change the world and shape society. It’s just a matter of where you look and whose world you are interested in.</p>
<p>With my book I hope to inspire another look at maps, first through the lens of navigation, perhaps the activity most strongly associated with maps, then through movement and how maps shape our perception of it.</p>
<p>I also look at the power and politics of maps that reveal whose interests are served by particular maps, and investigate the cultures of map-making today. With easy-to-use digital mapping tools now available online, alongside the proliferation of advanced mapping technologies now used by professionals, the power of map-making and the cultures that develop around maps are more diverse than ever.</p>
<p>That maps and map-makers are always changing makes studying what we do with maps an exciting area for development. It means that our understanding of maps must evolve with how they continue to shape society. </p>
<p>So it’s high time for a rethink. There remains a prevailing view that maps are neutral and objective, once paper and now digital, accurate and functional, despite the now well-used line that maps are arguments made about the world. Why is this? And how do we move beyond it?</p>
<p>My hope is to create a conversation – one that so far is only being had in a small corner of map studies – encouraging people to think beyond the assumptions society has about maps and how we use them.</p>
<p class="fine-print"><em><span>Mike Duggan receives funding from the British Academy and Leverhulme Trust, the EPSRC and King's College London. He is affiliated with the Livingmaps Network.</span></em></p>

Cartography has become one of the most successful technologies for understanding the world around us. But like the world itself, maps and map-making are constantly evolving.

Mike Duggan, Lecturer in Digital Culture, Society and Economy, King's College London

Licensed as Creative Commons – attribution, no derivatives.

How better and cheaper software could save millions of dollars while improving Canada’s health-care system (2024-02-04)

<figure><img src="https://images.theconversation.com/files/570446/original/file-20240120-27145-3rmndw.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5613%2C3681&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A Canada-wide health information technology system based on open-source software could save billions for the health-care system.</span> <span class="attribution"><a class="source" href="https://www.pexels.com/photo/nurse-labeling-test-tubes-6285380/">(Gustavo Fring/Pexels)</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure>

<p>Billions of Canadian tax dollars have been funnelled to private companies <a href="https://doi.org/10.13162/hro-ors.v2i3.1179">to develop proprietary medical software</a>. More tax dollars were then paid to the same companies to use the software to run our medical system.</p>
<p>This might not have seemed like a big deal at a time when Canadians could easily get a doctor and our medical system <a href="https://nationalpost.com/news/canada-doctor-ratios-what-happened">had one of the best doctor-patient ratios in the world</a>. </p>
<p>Fast forward to today, when <a href="https://globalnews.ca/news/9901922/canadians-family-doctor-shortage-cma-survey/">one-fifth of Canadians cannot find a doctor and more than half “battle” for appointments</a>. You can now easily spend an <a href="https://globalnews.ca/news/10218446/canada-emergency-rooms-overwhelmed-cma/">entire day waiting when you visit the emergency room</a>. Wait times for surgeries and diagnostic tests such as MRIs are much longer now, and <a href="https://toronto.citynews.ca/2023/12/09/canadians-die-waiting-surgery-report/">over 17,000 Canadians died waiting</a> for health care in 2023.</p>
<p>The once-great Canadian health-care system is being <a href="https://globalnews.ca/news/10218446/canada-emergency-rooms-overwhelmed-cma/">pushed to its limits</a>, and as a result, is “<a href="https://www.cfpc.ca/en/canada-s-health-care-system-on-verge-of-collapse-family-doctors-warn">failing</a>.” Add Canada’s <a href="https://www.bbc.com/news/world-us-canada-65047436">recent population growth</a> into the equation, and you have an under-resourced system that is stretched too thin.</p>
<p>The health system might be better prepared for these challenges if literally billions of dollars had not been squandered on proprietary software development. A <a href="https://doi.org/10.1007/s10916-023-01949-w">new study</a> I wrote with my colleague Jack Peplinski at Western University shows how embracing open-source development saves millions and could help rescue Canada’s broken health-care system.</p>
<h2>Undoing waste</h2>
<figure class="align-center ">
<img alt="A woman in a white coat and stethoscope with an iPad" src="https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/572988/original/file-20240202-25-7vmjxi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">On top of the cost of development, with proprietary software, each doctor’s office as well as each hospital has to pay for its own electronic health record licence.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Although the Canadian federal government has invested <a href="https://doi.org/10.13162/hro-ors.v2i3.1179">over $2.1 billion developing health information technology (HIT)</a>, all 10 provinces still have their own separate HIT systems. Besides being an obvious source of redundancy and waste, these systems: </p>
<ul>
<li>do not work together, </li>
<li>are expensive and </li>
<li>are inconsistent. </li>
</ul>
<p>After first reviewing how these systems operate, <a href="https://doi.org/10.1007/s10916-023-01949-w">we analyzed</a> the economic costs and savings of integrating some of the functions of the software. We chose something easy and straightforward that all the provinces needed and settled on the common billing, lab results and diagnostic imaging (BLD) functions of these separate systems. </p>
<p>Then we proposed using a free and open-source software system called HermesAPI to provide BLD for Canada. Our results provide a glimmer of hope for our struggling health-care system.</p>
<h2>Proprietary software vs. open source</h2>
<p>To understand how money is best spent on software development, you have to understand a little bit about licensing. </p>
<p>The HIT software that has been bleeding Canada dry is proprietary. No one other than the company that made it knows how it works, and each province pays these companies a licence fee to use their software even if it originally paid to develop it. No one can share the software either (for example, Ontario cannot legally share the software it helped fund with Alberta, or vice versa).</p>
<p>That means each province must fund companies that pay employees to maintain nearly identical software, 10 times over. Each doctor’s office and each hospital has to pay for its own electronic health record licence. </p>
<p>Worse yet, the Canada Health Act states that health care should be portable, <a href="https://doi.org/10.1503/cmaj.181647">but because these HIT systems are separate, it is not</a>. <a href="https://www.cbc.ca/news/canada/toronto/ehealth-scandal-a-1b-waste-auditor-1.808640">The Auditor General of Ontario’s 2009 report on electronic health records (EHRs) found more than a billion dollars of waste</a>. </p>
<p>Another approach that immediately eliminates that waste is called <a href="https://itsfoss.com/what-is-foss/">free and open source software</a> (FOSS). FOSS is available in source code (open source) form, and can be used, studied, copied, modified and redistributed with restrictions that only ensure that further recipients have the same rights as those under which it was obtained. </p>
<p>That last bit is the core viral idea of open source development: if anyone makes an improvement to the software, they must share it back with the community. This is how FOSS evolves, and the rapid pace of <a href="https://doi.org/10.1109/MRA.2016.2646748">innovation</a> in a <a href="https://www.appropedia.org/Create,_Share,_and_Save_Money_Using_Open-Source_Projects">wide array of areas</a> is the result.</p>
<p>Not surprisingly, industry loves open source. Ninety per cent <a href="https://fortune.com/2013/05/06/how-linux-conquered-the-fortune-500/">of the Fortune Global 500 use open-source development</a>. In fact, today, open-source software is the dominant way to develop software in industry because it tends to be <a href="https://www.doi.org/10.1257/0895330054048678">technically superior</a> and <a href="https://doi.org/10.1109/52.951496">more secure</a>.</p>
<p>The evidence for this is that FOSS is in <a href="https://www.zdnet.com/article/supercomputers-all-linux-all-the-time/">100 per cent of the world’s supercomputers</a>, <a href="https://www.rackspace.com/en-gb/blog/realising-the-value-of-cloud-computing-with-linux">90 per cent of cloud servers</a>, <a href="https://www.idc.com/promo/smartphone-market-share">82 per cent of smartphones</a> and <a href="https://spectrum.ieee.org/open-source-ai">most artificial intelligence</a>. </p>
<p>Every internet company you use, from Facebook to Amazon to Wikipedia, is built on a stack of open source software.</p>
<h2>A better way to develop medical software</h2>
<p>Currently, eight provincial governments representing over 95 per cent of Canada’s population allow private companies to create their own electronic medical record (EMR) systems and integrate them with provincial BLD systems.</p>
<p>Our study found the cost to develop and maintain HermesAPI would be about $610,000, but it would prevent $120,000 per software development company per province in development costs, for savings of $6.4 million. This approach would lower barriers to entry in the HIT industry, increasing competition and improving the quality of HIT products and, ultimately, patient care.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/L1yGlEGfH4g?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Jack Peplinski, software engineering and business administration student at Western University and co-author of a study on the potential impact of open-source health software, describes HermesAPI.</span></figcaption>
</figure>
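The savings figure above is simple arithmetic: a one-time cost to build and maintain the shared component, set against the per-vendor, per-province development costs it avoids. The sketch below uses the two dollar figures from the study; the number of vendor-province integrations is an illustrative assumption, not a figure taken from the paper.

```python
# Shape of the savings calculation (the integration count is illustrative).
HERMES_COST = 610_000              # estimated cost to develop and maintain HermesAPI
AVOIDED_PER_INTEGRATION = 120_000  # development cost avoided per vendor, per province

integrations = 58  # assumed number of vendor-province pairs across the 8 provinces
gross = integrations * AVOIDED_PER_INTEGRATION
net = gross - HERMES_COST
print(f"gross savings ≈ ${gross:,}; net ≈ ${net:,}")
```

The point of the arithmetic is that one shared, open codebase turns a cost paid dozens of times over into a cost paid once.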
<p>The real secret of open-source software is that it encourages market competition. FOSS prevents vendor lock-in and monopolistic companies, both of which are common with our current proprietary software model. For example, <a href="https://doi.org/10.1503/cmaj.109-5765">90 per cent of EMRs in Canada</a> are the products of just three United States-based companies.</p>
<p>Our study looked just at BLD, but there are many other such opportunities in our health-care system. The open-source approach is one option towards building a more interoperable, less expensive and more consistent HIT system for Canada. </p>
<p>Yes, this means we will be sending less money to prop up American software companies, but the return on investment of open source is likely to be very high. Fifteen years ago, Ontario’s Auditor General found that by implementing a unified medical records system, we could save at <a href="https://www.cbc.ca/news/canada/toronto/ehealth-scandal-a-1b-waste-auditor-1.808640">least $6 billion</a>. It is far more than that now.</p>
<p>This time, we could do it right: instead of subsidizing proprietary U.S. companies, we could ensure every Canadian dollar invested in software goes to open source, saving our loonies for doctors, nurses and hospital beds to keep up with our burgeoning population.</p>
<p class="fine-print"><em><span>Joshua M. Pearce has received funding for research from the Natural Sciences and Engineering Research Council of Canada, the Canada Foundation for Innovation, Mitacs, the U.S. Department of Energy (DOE) and the Advanced Research Projects Agency-Energy (ARPA-E), U.S. Department of Defense, The Defense Advanced Research Projects Agency (DARPA), and the National Science Foundation (NSF). In addition, his past and present consulting work and research is funded by the United Nations, the National Academies of Science, Engineering and Medicine, many non-profits and for-profit companies. He has no direct conflicts of interest.</span></em></p>

Canada has spent billions on health-care software that does not even communicate province to province. Free and open-source software would be a technically superior and far less expensive option.

Joshua M. Pearce, John M. Thompson Chair in Information Technology and Innovation and Professor, Western University

Licensed as Creative Commons – attribution, no derivatives.

Artificial intelligence is already in our hospitals. 5 questions people want answered (2023-11-30)

<figure><img src="https://images.theconversation.com/files/560122/original/file-20231117-23-mms70g.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C1000%2C666&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/female-face-matrix-digital-numbers-artifical-2268966863">Shutterstock</a></span></figcaption></figure>

<p>Artificial intelligence (AI) is already being used in health care. AI can look for patterns in <a href="https://journal.achsm.org.au/index.php/achsm/article/view/861">medical images</a> to help diagnose disease. It can help predict who in a hospital ward might <a href="https://www.jmir.org/2021/9/e28209">deteriorate</a>. It can <a href="https://elicit.com/">rapidly summarise</a> medical research papers to help doctors stay up to date with the latest evidence.</p>
<p>These are examples of AI making <a href="https://theconversation.com/artificial-intelligence-wont-replace-a-doctor-any-time-soon-but-it-can-help-with-diagnosis-83353">or shaping</a> decisions health professionals previously made. More applications are being developed.</p>
<p>But what do consumers think of using AI in health care? And how should their answers shape how it’s used in the future?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-is-already-being-used-in-healthcare-but-not-all-of-it-is-medical-grade-207912">AI is already being used in healthcare. But not all of it is 'medical grade'</a>
</strong>
</em>
</p>
<hr>
<h2>What do consumers think?</h2>
<p>AI systems are trained to look for patterns in large amounts of data. Based on these patterns, AI systems can make recommendations, suggest diagnoses, or initiate actions. They can potentially learn continually, becoming better at tasks over time.</p>
<p>If we draw together <a href="https://www.sciencedirect.com/science/article/pii/S0277953623007141#appsec1">international</a> evidence, including <a href="https://www.uow.edu.au/the-arts-social-sciences-humanities/research/acheev/artificial-intelligence-in-health/">our own</a> <a href="https://journal.achsm.org.au/index.php/achsm/article/view/861">and that</a> <a href="https://humanfactors.jmir.org/2022/3/e34514/authors">of others</a>, it seems most consumers accept the potential value of AI in health care. </p>
<p>This value could include, for example, increasing the <a href="https://www.jmir.org/2022/8/e37611/">accuracy of diagnoses</a> or improving <a href="https://mental.jmir.org/2019/11/e12942/">access to care</a>. At present, these are largely potential, rather than proven, benefits. </p>
<p>But consumers say their acceptance is conditional. They still have serious concerns.</p>
<p><strong>1. Does the AI work?</strong></p>
<p>A baseline expectation is AI tools should work well. Often, consumers say AI should be at least as good as a <a href="https://journal.achsm.org.au/index.php/achsm/article/view/861">human doctor</a> at the tasks it performs. They say we should not use AI if it will lead to more incorrect diagnoses or medical errors.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ai-chatbots-are-still-far-from-replacing-human-therapists-201084">AI chatbots are still far from replacing human therapists</a>
</strong>
</em>
</p>
<hr>
<p><strong>2. Who’s responsible if AI gets it wrong?</strong></p>
<p>Consumers also worry that if AI systems generate decisions – such as diagnoses or treatment plans – without human input, it may be unclear who is responsible for errors. So people often want clinicians to remain responsible for the final decisions, and for <a href="https://www.nature.com/articles/s41746-021-00509-1">protecting patients</a> from harms.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/who-will-write-the-rules-for-ai-how-nations-are-racing-to-regulate-artificial-intelligence-216900">Who will write the rules for AI? How nations are racing to regulate artificial intelligence</a>
</strong>
</em>
</p>
<hr>
<p><strong>3. Will AI make health care less fair?</strong></p>
<p>If health services are <a href="https://theconversation.com/ms-dhu-coronial-findings-show-importance-of-teaching-doctors-and-nurses-about-unconscious-bias-60319">already discriminatory</a>, AI systems can learn these patterns from data and <a href="https://www.science.org/doi/10.1126/science.aax2342">repeat or worsen</a> the discrimination. So AI used in health care can make health inequities worse. In our studies consumers said this <a href="https://journals.sagepub.com/doi/pdf/10.1177/20552076231191057">is not OK</a>.</p>
<p><strong>4. Will AI dehumanise health care?</strong></p>
<p>Consumers are concerned AI will take the “human” elements out of health care, consistently saying AI tools should <a href="https://journals.sagepub.com/doi/full/10.1177/20552076221116772">support rather than replace</a> doctors. Often, this is because AI is perceived to lack important human traits, <a href="https://journals.sagepub.com/doi/full/10.1177/2055207619871808">such as empathy</a>. Consumers say the communication skills, care and touch of a health professional are especially important when feeling vulnerable.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/chatbots-for-medical-advice-three-ways-to-avoid-misleading-information-213266">Chatbots for medical advice: three ways to avoid misleading information</a>
</strong>
</em>
</p>
<hr>
<p><strong>5. Will AI de-skill our health workers?</strong></p>
<p>Consumers value human clinicians and their expertise. In our <a href="https://journals.sagepub.com/doi/full/10.1177/20552076231191057">research with women</a> about AI in breast screening, women were concerned about the potential effect on radiologists’ skills and expertise. Women saw this expertise as a precious shared resource: too much dependence on AI tools, and this resource might be lost.</p>
<h2>Consumers and communities need a say</h2>
<p>The Australian health-care system cannot focus only on the technical elements of AI tools. Social and ethical considerations, including high-quality engagement with consumers and communities, are essential to shape AI use in health care.</p>
<p>Communities need opportunities to develop <a href="https://theconversation.com/chatbots-for-medical-advice-three-ways-to-avoid-misleading-information-213266">digital health literacy</a>: <a href="https://www.goodthingsfoundation.org.au/the-digital-divide/digital-health/">digital skills</a> to access reliable, trustworthy health information, services and resources. </p>
<p>Respectful engagement with Aboriginal and Torres Strait Islander communities must be central. This includes upholding Indigenous data sovereignty, which the Australian Institute of Aboriginal and Torres Strait Islander Studies <a href="https://aiatsis.gov.au/publication/116530">describes as</a>:</p>
<blockquote>
<p>the right of Indigenous peoples to govern the collection, ownership and application of data about Indigenous communities, peoples, lands, and resources.</p>
</blockquote>
<p>This includes any use of data to create AI. </p>
<p>This critically important consumer and community engagement needs to take place before managers design (more) AI into health systems, before <a href="https://theconversation.com/who-will-write-the-rules-for-ai-how-nations-are-racing-to-regulate-artificial-intelligence-216900">regulators</a> create guidance for how AI should and shouldn’t be used, and before clinicians consider buying a new AI tool for their practice.</p>
<p>We’re making some progress. Earlier this year, we ran a <a href="https://www.uow.edu.au/the-arts-social-sciences-humanities/research/acheev/artificial-intelligence-in-health/">citizens’ jury on AI in health care</a>. We supported 30 diverse Australians, from every state and territory, to spend three weeks learning about AI in health care, and developing recommendations for policymakers.</p>
<p>Their recommendations, which will be published in an upcoming issue of the Medical Journal of Australia, have informed a recently released <a href="https://aihealthalliance.org/">national roadmap</a> for using AI in health care.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/worried-about-ai-you-might-have-ai-nxiety-heres-how-to-cope-205874">Worried about AI? You might have AI-nxiety – here's how to cope</a>
</strong>
</em>
</p>
<hr>
<h2>That’s not all</h2>
<p>Health professionals also need to be upskilled and supported to use AI in health care. They need to learn to be critical users of digital health tools, including understanding their pros and cons.</p>
<p>Our <a href="https://pubmed.ncbi.nlm.nih.gov/37071804/">analysis</a> of safety events reported to the Food and Drug Administration shows the most serious harms reported to the US regulator came not from a faulty device, but from the way consumers and clinicians used the device.</p>
<p>We also need to consider when health professionals should tell patients an AI tool is being used in their care, and when health workers should seek informed consent for that use.</p>
<p>Lastly, people involved in every stage of developing and using AI need to get accustomed to asking themselves: do consumers and communities agree this is a justified use of AI? </p>
<p>Only then will we have the AI-enabled health-care system consumers actually want.</p>
<p class="fine-print"><em><span>Stacy Carter receives funding from the National Health and Medical Research Council, the National Breast Cancer Foundation and the Medical Research Futures Fund.</span></em></p>
<p class="fine-print"><em><span>Emma Frost receives funding from the Australian Government Research Training Program and the National Health and Medical Research Council.</span></em></p>
<p class="fine-print"><em><span>Farah Magrabi receives funding from the National Health and Medical Research Council, the Digital Health CRC and Macquarie University. She is Co-Chair of the Australian Alliance for AI in Healthcare's Safety, Quality and Ethics Working Group.</span></em></p>
<p class="fine-print"><em><span>Yves Saint James Aquino receives funding from the National Health and Medical Research Council (CRE 2006-545 - WiserHealthcare). He is affiliated with Bellberry Limited, a not-for-profit organisation providing scientific and ethical review of human research projects.</span></em></p>

Before AI becomes widespread in health care, we need to ask what matters to consumers.

Stacy Carter, Professor and Director, Australian Centre for Health Engagement, Evidence and Values, University of Wollongong
Emma Frost, PhD candidate, Australian Centre for Health Engagement, Evidence and Values, University of Wollongong
Farah Magrabi, Professor of Biomedical and Health Informatics at the Australian Institute of Health Innovation, Macquarie University
Yves Saint James Aquino, Research Fellow, Australian Centre for Health Engagement, Evidence and Values, University of Wollongong

Licensed as Creative Commons – attribution, no derivatives.

How will AI affect workers? Tech waves of the past show how unpredictable the path can be (2023-06-22)

<figure><img src="https://images.theconversation.com/files/531702/original/file-20230613-17-juejmq.jpg?ixlib=rb-1.1.0&rect=45%2C122%2C4254%2C2720&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Personal computers started an information technology revolution. Will AI bring similarly dramatic changes?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/philadelphia-pa-irs-workers-check-through-a-pile-of-1040-news-photo/515361036?adppopup=true">Bettmann via Getty Images</a></span></figcaption></figure>

<p>The explosion of interest in artificial intelligence has drawn attention not only to the astonishing capacity of algorithms to mimic humans but to the reality that these algorithms could displace many humans in their jobs. The economic and societal consequences could be nothing short of dramatic.</p>
<p>The route to this economic transformation is through the workplace. A <a href="https://www.goldmansachs.com/intelligence/pages/generative-ai-could-raise-global-gdp-by-7-percent.html">widely circulated Goldman Sachs study</a> anticipates that about two-thirds of current occupations could be affected over the next decade, and that a quarter to a half of the work people do now could be taken over by an algorithm. Up to 300 million jobs worldwide could be affected. The consulting firm McKinsey <a href="https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier#introduction">released its own study</a> predicting an AI-powered boost of US$4.4 trillion to the global economy every year. </p>
<p>The implications of such gigantic numbers are sobering, but how reliable are these predictions?</p>
<p>I lead a research program called <a href="https://sites.tufts.edu/digitalplanet/">Digital Planet</a> that studies the impact of digital technologies on lives and livelihoods around the world and how this impact changes over time. A look at how previous waves of such digital technologies as personal computers and the internet affected workers offers some insight into AI’s potential impact in the years to come. But if the history of the future of work is any guide, we should be prepared for some surprises. </p>
<h2>The IT revolution and the productivity paradox</h2>
<p>A key metric for tracking the consequences of technology on the economy is growth in <a href="https://www.bls.gov/k12/productivity-101/content/what-is-productivity/what-is-labor-productivity.htm">worker productivity</a> – defined as how much output of work an employee can generate per hour. This seemingly dry statistic matters to every working individual, because it ties directly to how much a worker can expect to earn for every hour of work. Said another way, higher productivity is expected to <a href="https://insight.kellogg.northwestern.edu/article/worker-productivity-minimum-wage-increase">lead to higher wages</a>. </p>
<p>Generative AI products are capable of producing written, graphic and audio content or software programs with minimal human involvement. Professions such as advertising, entertainment and creative and analytical work could be among the first to feel the effects. Individuals in those fields may worry that companies will use <a href="https://www.washingtonpost.com/technology/2023/06/02/ai-taking-jobs/">generative AI to do jobs they once did</a>, but economists see great potential to boost productivity of the workforce as a whole. </p>
<p>The Goldman Sachs study predicts productivity will grow by 1.5% per year because of the adoption of generative AI alone, which would be <a href="https://www.bls.gov/opub/mlr/2021/article/the-us-productivity-slowdown-the-economy-wide-and-industry-level-analysis.htm#:%7E:text=In%20the%20years%20since%202005,percent%20from%202010%20to%202018">nearly double the rate from 2010 to 2018</a>. McKinsey is even more aggressive, saying this technology and other forms of automation will usher in the “<a href="https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier#introduction">next productivity frontier</a>,” pushing it as high as 3.3% a year by 2040.</p>
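The gap between those annual rates is easier to appreciate when compounded over time. As a rough back-of-the-envelope sketch (my own illustration, not from either report; the function is hypothetical, with only the rates taken from the studies cited above):

```python
# Back-of-the-envelope illustration (not from either report): how steady
# annual productivity growth compounds over a decade.

def compound_growth(rate: float, years: int) -> float:
    """Cumulative gain after `years` of constant annual growth at `rate`."""
    return (1 + rate) ** years - 1

# Goldman Sachs scenario: 1.5% per year for 10 years
goldman = compound_growth(0.015, 10)   # ~0.16, i.e. about a 16% cumulative gain

# McKinsey scenario: 3.3% per year for 10 years
mckinsey = compound_growth(0.033, 10)  # ~0.38, i.e. about a 38% cumulative gain

print(f"1.5% a year for a decade: {goldman:.1%} total")
print(f"3.3% a year for a decade: {mckinsey:.1%} total")
```

A seemingly small difference in the annual rate more than doubles the cumulative gain over ten years, which is why forecasters treat these decimal points as consequential.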
<p>That sort of productivity boost, which would approach rates of previous years, would be welcomed by both economists and, in theory, workers as well. </p>
<p>If we were to trace the 20th-century history of productivity growth in the U.S., it galloped along at <a href="https://www.imf.org/external/pubs/ft/fandd/2016/06/gordon.htm#:%7E:text=Measures%20and%20mismeasures%20of%20progress&text=The%20growth%20rate%20of%20labor,extends%20from%201970%20to%202014.">about 3%</a> annually from 1920 to 1970, lifting real wages and living standards. Interestingly, productivity growth slowed in the 1970s and 1980s, coinciding with the introduction of computers and early digital technologies. This “<a href="https://cs.stanford.edu/people/eroberts/cs181/projects/productivity-paradox/index.html">productivity paradox</a>” was famously captured in a <a href="https://www.brookings.edu/articles/the-solow-productivity-paradox-what-do-computers-do-to-productivity/">comment from MIT economist Bob Solow</a>: “You can see the computer age everywhere <a href="http://ccs.mit.edu/papers/CCSWP130/ccswp130.html">but in the productivity statistics</a>.” </p>
<p><iframe id="i96wK" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/i96wK/10/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Digital technology skeptics blamed “unproductive” time spent on social media or shopping and argued that earlier transformations, such as the introductions of electricity or the internal combustion engine, had a <a href="https://www.nytimes.com/2016/01/31/books/review/the-powers-that-were.html">bigger role in fundamentally altering the nature of work</a>. Techno-optimists disagreed; they argued that new digital technologies <a href="https://wwnorton.com/books/The-Second-Machine-Age/">needed time to translate</a> into productivity growth, because other complementary changes would need to evolve in parallel. Yet others <a href="https://www.brookings.edu/articles/the-solow-productivity-paradox-what-do-computers-do-to-productivity/">worried that productivity measures were not adequate</a> in capturing the value of computers.</p>
<p>For a while, it seemed that the optimists would be vindicated. In the second half of the 1990s, around the time the World Wide Web emerged, productivity growth in the U.S. <a href="https://www.epi.org/publication/webfeatures_viewpoints_l-t_growth_lessons/">doubled</a>, from 1.5% per year in the first half of that decade to 3% in the second. Again, there were disagreements about what was really going on, further muddying the waters as to whether the paradox had been resolved. Some <a href="https://www.epi.org/publication/webfeatures_viewpoints_l-t_growth_lessons/">argued</a> that, indeed, the investments in digital technologies were finally paying off, while an <a href="https://www.mckinsey.com/featured-insights/employment-and-growth/whats-right-with-the-us-economy">alternative view</a> was that managerial and technological innovations in a few key industries were the main drivers. </p>
<p>Regardless of the explanation, just as mysteriously as it began, that late 1990s surge was short-lived. So despite massive corporate investment in computers and the internet – changes that transformed the workplace – how much the economy and workers’ wages benefited from technology remained uncertain.</p>
<h2>Early 2000s: New slump, new hype, new hopes</h2>
<p>While the start of the 21st century coincided with the <a href="https://www.goldmansachs.com/our-firm/history/moments/2000-dot-com-bubble.html">bursting of the so-called dot-com bubble</a>, the year 2007 was marked by the arrival of another technology revolution: <a href="https://www.youtube.com/watch?v=x7qPAY9JqE4">the Apple iPhone</a>, which consumers bought by the millions and which companies deployed in countless ways. Yet labor productivity growth started stalling again in the mid-2000s, <a href="https://www.nber.org/papers/w30267">ticking up briefly in 2009</a> during the Great Recession, only to return to a slump from 2010 to 2019. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A person looking at video of dog at desk in office" src="https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=377&fit=crop&dpr=1 600w, https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=377&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=377&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=473&fit=crop&dpr=1 754w, https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=473&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/531420/original/file-20230612-63747-rjscts.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=473&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Smartphones have led to millions of apps and consumer services but have also kept many workers more closely tethered to their workplaces.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/photos/smart-phone-workplace?assettype=image&agreements=&family=editorial&locations=61907&page=5&phrase=smart%20phone%20workplace&sort=newest">San Francisco Chronicle/Hearst Newspapers via Getty Images</a></span>
</figcaption>
</figure>
<p>Throughout this new slump, techno-optimists were anticipating new winds of change. AI and automation were becoming all the rage and were expected to transform work and worker productivity. Beyond traditional industrial automation, drones and advanced robots, capital and talent were pouring into many would-be <a href="https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/disruptive-technologies">game-changing technologies</a>, including autonomous vehicles, automated checkouts in grocery stores and even <a href="https://www.businessinsider.com/zume-pizza-robot-expansion-2017-6">pizza-making robots</a>. AI and automation were projected to push productivity growth <a href="https://www.mckinsey.com/featured-insights/regions-in-focus/solving-the-productivity-puzzle">above 2%</a> annually in a decade, up from the 2010-2014 lows of <a href="https://www.mckinsey.com/featured-insights/future-of-work/ai-automation-and-the-future-of-work-ten-things-to-solve-for">0.4%</a>. </p>
<p>But before we could get there and gauge how these new technologies would ripple through the workplace, a new surprise hit: the COVID-19 pandemic. </p>
<h2>The pandemic productivity push – then bust</h2>
<p>Devastating as the pandemic was, worker productivity <a href="https://www.ilo.org/wcmsp5/groups/public/---dgreports/---dcomm/documents/briefingnote/wcms_824092.pdf">surged after it began in 2020</a>; growth in output per hour worked globally hit 4.9%, the highest rate recorded since data became available. </p>
<p>Much of this steep rise was facilitated by technology: larger knowledge-intensive companies – inherently the more productive ones – switched to remote work, <a href="https://www.oecd-ilibrary.org/sites/54337c24-en/index.html?itemId=/content/component/54337c24-en">maintaining continuity</a> through digital technologies such as videoconferencing and communications technologies such as Slack, and <a href="https://www.cnbc.com/2022/11/29/remote-workers-reclaimed-60-million-hours-of-commuting-time.html">saving on commuting time and focusing on well-being</a>.</p>
<p>While it was clear digital technologies helped boost productivity of knowledge workers, there was an <a href="https://apnews.com/article/technology-business-health-coronavirus-pandemic-d935b29f631f1ae36e964d23881f77bd">accelerated shift to greater automation</a> in many other sectors, as workers had to remain home for their own safety and comply with lockdowns. Companies in industries ranging from meat processing to operations in restaurants, retail and hospitality <a href="https://www.latimes.com/politics/story/2021-05-04/covid-automation-robots-trends-effects-on-workers">invested in automation</a>, such as robots and automated order-processing and customer service, which helped boost their productivity. </p>
<p>But then there was yet another turn in the journey along the technology landscape. </p>
<p>The 2020-2021 surge in investments in the <a href="https://www.cnn.com/2022/11/14/investing/tech-stocks-faangs/index.html">tech sector collapsed</a>, as did the hype about autonomous vehicles and pizza-making robots. Other frothy promises, such as the <a href="https://hbr.org/2022/04/how-the-metaverse-could-change-work">metaverse’s revolutionizing remote work or training</a>, also seemed to fade into the background. </p>
<p>In parallel, with little warning, “generative AI” <a href="https://www.marketing-interactive.com/how-chatgpt-exploded-on-to-the-scene-with-so-little-marketing-spend">burst onto the scene</a>, with an even more direct potential to enhance productivity while affecting jobs – at massive scale. The hype cycle around new technology restarted. </p>
<h2>Looking ahead: Social factors on technology’s arc</h2>
<p>Given the number of plot twists thus far, what might we expect from here on out? Here are four issues for consideration. </p>
<p>First, the future of work is about more than just raw numbers of workers, the technical tools they use or the work they do; one should consider how AI affects factors such as workplace diversity and social inequities, which in turn have a profound impact on economic opportunity and workplace culture.</p>
<p>For example, while the broad shift toward remote work <a href="https://foreignpolicy.com/2022/01/30/big-tech-diversity-recruiting-silicon-valley/">could help</a> promote diversity with more flexible hiring, I see the increasing use of AI as likely to have the opposite effect. Black and Hispanic workers are <a href="https://doi.org/10.21033/wp-2023-06">overrepresented</a> in the 30 occupations with the highest exposure to automation and <a href="https://doi.org/10.21033/wp-2023-06">underrepresented</a> in the 30 occupations with the lowest exposure. While AI might help workers get more done in less time, and this increased productivity could increase wages of those employed, it could lead to a severe loss of wages for those whose jobs are displaced. A 2021 paper found that <a href="https://www.imf.org/en/Publications/WP/Issues/2021/01/15/Pandemics-and-Automation-Will-the-Lost-Jobs-Come-Back-50000">wage inequality tended to increase the most</a> in countries in which companies already relied a lot on robots and that were quick to adopt the latest robotic technologies. </p>
<p>Second, as the post-COVID-19 workplace seeks a balance between in-person and remote working, the effects on productivity – and opinions on the subject – will remain uncertain and fluid. A <a href="https://econofact.org/is-remote-work-working-out">2022 study</a> showed improved efficiencies for remote work as companies and employees grew more comfortable with work-from-home arrangements, but according to a separate 2023 study, managers and employees <a href="https://hbr.org/2023/01/research-where-managers-and-employees-disagree-about-remote-work">disagree</a> about the impact: The former believe that remote working reduces productivity, while employees believe the opposite.</p>
<p>Third, society’s reaction to the spread of generative AI could greatly affect its course and ultimate impact. Analyses suggest that generative AI can boost worker productivity on specific jobs – for example, one 2023 study found the staggered introduction of a generative AI-based conversational assistant <a href="https://www.nber.org/papers/w31161">increased productivity of customer service personnel by 14%</a>. Yet there are already <a href="https://www.safe.ai/statement-on-ai-risk">growing calls</a> to consider generative AI’s most severe risks and to take them seriously. On top of that, recognition of the astronomical <a href="https://www.washingtonpost.com/technology/2023/06/05/chatgpt-hidden-cost-gpu-compute/">computing</a> and <a href="https://theconversation.com/is-generative-ai-bad-for-the-environment-a-computer-scientist-explains-the-carbon-footprint-of-chatgpt-and-its-cousins-204096">environmental costs</a> of generative AI could limit its development and use. </p>
<p>Finally, given how wrong economists and other experts have been in the past, it is safe to say that many of today’s predictions about AI technology’s impact on work and worker productivity will prove to be wrong as well. Numbers such as 300 million jobs affected or $4.4 trillion annual boosts to the global economy are eye-catching, yet I think people tend to give them greater credibility than warranted.</p>
<p>Also, “jobs affected” does not mean jobs lost; it could mean jobs augmented or even a transition to new jobs. It is best to use analyses such as Goldman’s or McKinsey’s to spark our imaginations about plausible scenarios for the future of work and of workers. Then, in my view, we should proactively brainstorm the many factors that could affect which scenario actually comes to pass, look for early warning signs and prepare accordingly.</p>
<p>The history of the future of work has been full of surprises; don’t be shocked if tomorrow’s technologies are equally confounding.</p>
<hr>
<p><em>Learn what you need to know about artificial intelligence by <a href="https://memberservices.theconversation.com/newsletters/?nl=ai&source=inline-promo">signing up for our newsletter series of four emails</a> delivered over the course of a week. You can read all our stories on generative AI at <a href="https://theconversation.com/topics/generative-ai-133426">TheConversation.com</a>.</em></p>
<p class="fine-print"><em><span>Bhaskar Chakravorti founded and directs Fletcher's Institute for Business in the Global Context and its Digital Planet research program that has received funding from Mastercard, Microsoft, the Gates Foundation, Rockefeller Foundation and Omidyar Network. </span></em></p>
New digital technologies have been a constant for workers over the past few decades, with a mixed record on the economy and individuals’ daily lives. AI’s effect will likely be just as unpredictable.
Bhaskar Chakravorti, Dean of Global Business, The Fletcher School, Tufts University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/206511 2023-05-31T12:35:38Z
Remembering South Africa’s “Grand Geek” Barry Dwolatzky – engineer and programming pioneer
<figure><img src="https://images.theconversation.com/files/528576/original/file-20230526-21-3xadfe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Professor Barry Dwolatzky was passionate about innovation in all that he did.</span> <span class="attribution"><span class="source">Wits University</span></span></figcaption></figure><p>To some of his former students, Professor Barry Dwolatzky was the “Grand Geek” – a name of which he was very proud. But Barry, who passed away in Johannesburg, South Africa on 16 May 2023, was much more than a computer geek. He was also a leader and a visionary in the field of software engineering in South Africa.</p>
<p>At the time of his passing he was 71 years old. He was by then retired from academia and held the title of Emeritus Professor at the University of the Witwatersrand (Wits), where he spent much of his career. </p>
<p>But he didn’t really slow down: he remained the director of the Joburg Centre for Software Engineering (JCSE), a role he’d held since 2007. During the COVID lockdown in 2020, he started a podcast called <a href="https://iono.fm/c/4965">Optimizing – Leading Africa’s Digital Future</a> and produced eight episodes. He also wrote an autobiography called <a href="https://www.wits.ac.za/future/stories/looking-ahead-from-a-life-of-new-beginnings.html">Coded History – My Life of New Beginnings</a>, which was launched in November 2022.</p>
<h2>A pioneer in programming</h2>
<p>An alumnus of the School of Electrical and Information Engineering at Wits University, Barry graduated with a Bachelor of Science in Electrical Engineering in 1975. He then started a master’s degree, which he converted to a PhD.</p>
<p>After obtaining his PhD in 1979, he did post-doctoral research at the University of Manchester’s Institute of Science and Technology and at Imperial College in London. Thereafter, he worked as a senior research associate at the GEC-Marconi Centre in the UK.</p>
<p>I first met Barry in 1989 when he returned to South Africa as a senior lecturer in the School of Electrical Engineering at Wits. I was an undergraduate in his class that year. When I returned to Wits in 1998, he was my MSc supervisor and, when I was appointed as a lecturer in the School of Electrical Engineering, we were colleagues and friends.</p>
<p>When he joined the School, there was only one programming course, Engineering Applied Computing, taught to second-year electrical, civil and mechanical engineering students. Barry identified the growing importance of programming and information technology in engineering fields before anyone else in South Africa really had. Today, the School of Electrical & Information Engineering’s curriculum contains two second-year programming courses and a third-year course that is compulsory for all electrical and information engineering students. Barry was instrumental in introducing all these courses.</p>
<p>He was also the driving force behind the school’s name change: “Information Engineering” was added in the year 2000 with the introduction of a software stream that would be distinct from the electrical engineering stream.</p>
<p>The idea didn’t come out of the blue. Talking to people in various companies, Barry realised that most of the school’s graduates went into the information and communications technology (ICT) sector rather than into the classical electrical engineering fields like electrical generation, transmission and distribution, high voltage engineering and control engineering. </p>
<p>That’s what prompted the development and introduction of the software stream. At that time, computers were becoming more common in many industries and the mobile phone sector was starting to take off.</p>
<h2>Software to drive development</h2>
<p>In the late 1980s, the then CEO of Eskom, South Africa’s national electricity utility, announced a mass roll-out of electrification called Electricity for All. Between 1990 and 2000, about 2.5 million houses were connected to the national grid. At that time, Barry started working on a software program that would assist engineers in planning the electrification of townships, historically black urban residential areas. </p>
<p>A number of postgraduate students under his supervision worked on aspects of this software. He <a href="https://ieeexplore.ieee.org/document/624520">called the program CART</a> (Computer-Aided Reticulation of Townships). In 1997, he took a year-long sabbatical and worked full time on CART, developing it into a viable commercial product that was used to aid in the design of the electrification of many townships.</p>
<p>In 2005, Barry launched the <a href="https://jcse.org.za/">Joburg Centre for Software Engineering</a>. He became its director in 2007. It was the work he did through the centre that established him as an important thought leader in the software and IT space. Among other things, the centre hosted masterclasses with world renowned software experts.</p>
<h2>Innovation champion</h2>
<p>In 2012, Barry identified some old buildings owned by Wits University in Braamfontein, a high-rise downtown area of Johannesburg, as an ideal site for an innovation hub. Many people speak fondly of how Barry took them into a derelict disco with only the light from his mobile phone and enthusiastically explained how this was going to be a tech co-working space. He raised funding and transformed the rundown buildings into the innovation hub that is today one of the university’s flagship projects.</p>
<p>It is called the Tshimologong Digital Innovation Precinct. <a href="https://tshimologong.joburg/">Tshimologong</a> (a seTswana word for “place of new beginnings”) provides a space for digital start-ups, as well as training in digital technologies, and is used as a co-working space. Barry was Tshimologong’s first director and was honoured for this visionary project with the Vice Chancellor’s Award for Research and Teaching in 2016. </p>
<p>Even after retiring, Barry remained committed to and driven by the idea of innovation. He worked alongside Wits University’s deputy vice-chancellor, Professor Lynn Morris, to establish the <a href="https://www.wits.ac.za/innovation/wits-innovation-centre/">Wits Innovation Centre</a>. It was launched on 17 April 2023. </p>
<p>He passed away in a Johannesburg hospital on 16 May with his wife Rina and his children Leslie and Jodie at his side.</p>
<p class="fine-print"><em><span>Estelle Trengove does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Many speak fondly of how Barry Dwolatzky took them into a derelict disco and enthusiastically explained the tech co-working space he envisioned there.
Estelle Trengove, Associate professor in electrical engineering, University of the Witwatersrand
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/193701 2022-12-05T21:29:31Z
How hiring more women IT experts improves cybersecurity risk management
<figure><img src="https://images.theconversation.com/files/496840/original/file-20221122-22-gpfi1q.jpg?ixlib=rb-1.1.0&rect=284%2C25%2C5190%2C3207&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">When women are present on boards of directors, cyber risk management improves.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure>
<p>Despite the contributions women have made to the information and technology field, they continue to be underrepresented. <a href="https://www.famousscientists.org/ada-lovelace/">Ada Lovelace</a>, for example, was the world’s first computer programmer. <a href="https://www.famousscientists.org/grace-murray-hopper/">Grace Murray Hopper</a> developed the first compiler. And <a href="https://www.famousscientists.org/hedy-lamarr/">Hedy Lamarr</a> co-invented the <a href="https://www.sciencedirect.com/topics/engineering/spread-spectrum-communications">modern spread-spectrum communication technology</a>, which is found in Bluetooth, Wi-Fi and GPS technology.</p>
<p>Today, the leading figures in the IT field are all men. Although 39 per cent of the board members of Silicon Valley’s biggest tech companies are women, all the chairpersons and CEOs are men: <a href="https://www.apple.com/ca/newsroom/2011/11/15enCA-Apple-Names-Arthur-D-Levinson-Chairman-of-the-Board/">Arthur D. Levinson</a> and <a href="https://www.britannica.com/biography/Tim-Cook">Tim Cook</a> for Apple, <a href="https://www.britannica.com/biography/Satya-Nadella">Satya Nadella</a> for Microsoft, <a href="https://www.britannica.com/biography/Jeff-Bezos">Jeff Bezos</a> and <a href="https://ir.aboutamazon.com/officers-and-directors/person-details/default.aspx?ItemId=7601ef7b-4732-44e2-84ef-2cccb54ac11a">Andrew Jassy</a> for Amazon, <a href="https://www.britannica.com/biography/Mark-Zuckerberg">Mark Zuckerberg</a> for Meta, and <a href="https://hennessy.stanford.edu/biography/">John L. Hennessy</a> and <a href="https://www.britannica.com/biography/Sundar-Pichai">Sundar Pichai</a> for Google. </p>
<figure class="align-right ">
<img alt="A watercolour painting of a woman dressed in 19th century fashion" src="https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=862&fit=crop&dpr=1 600w, https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=862&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=862&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1083&fit=crop&dpr=1 754w, https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1083&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/496839/original/file-20221122-25-freza.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1083&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A watercolour portrait of Ada King, Countess of Lovelace, circa 1840.</span>
<span class="attribution"><span class="source">(Science Museum Group)</span></span>
</figcaption>
</figure>
<p>But progress is being made. A study from Osler, a business law firm, found that <a href="https://www.osler.com/osler/media/Osler/reports/corporate-governance/Osler-Diversity-Disclosure-Practices-report-2022.pdf">23 per cent of S&P/TSX 60 company board seats were held by women</a>. This is an increase from data we — as accounting researchers — collected <a href="https://doi.org/10.1007/s10551-020-04717-9">on Toronto Stock Exchange companies between 2014 and 2018</a> that found the following: 11.7 per cent of companies had one woman on the board of directors, 27.7 per cent had two women, and 56.3 per cent had at least three women. </p>
<p>But when it came to the number of women IT experts on boards, the number was even lower. Only 22 out of 683 board members in 2018 were women IT experts. Although this number had doubled since 2014, it remained very low. It’s important to increase the number of women working in IT — not just for equality reasons, but because women improve key organizational outcomes.</p>
<h2>Cybersecurity is key for success</h2>
<p>Our recent research on <a href="https://doi.org/10.1007/s10551-020-04717-9">the impact of board gender diversity on how corporations respond to cyber risk</a> shows that, when women are present on boards of directors, cyber risk management improves. Proper cyber risk management is key to the success of tech companies.</p>
<p>Cybersecurity involves taking appropriate actions and making ethical decisions to mitigate cyber risks. In particular, it addresses the financial and technical risk caused by <a href="https://home.kpmg/us/en/home/insights/2020/09/digital-acceleration.html">digital acceleration</a> — the increased rate of digital transformation caused by the pandemic. </p>
<p>Because of digital acceleration, organizations are more vulnerable to unethical uses of technology. Facebook and Google’s history of <a href="https://sloanreview.mit.edu/article/business-technology-and-ethics-the-need-for-better-conversations/">inappropriate and unethical uses or suppression of information</a> has shined a spotlight on the importance of an ethical approach to cybersecurity. The most high-profile example occurred when <a href="https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html">data harvested from Facebook was used by companies trying to influence the 2016 U.S. presidential election</a>.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ada-lovelace-and-others-inspire-women-in-tech-but-we-must-make-careers-worth-their-while-32975">Ada Lovelace and others inspire women in tech, but we must make careers worth their while</a>
</strong>
</em>
</p>
<hr>
<p>Organizations <a href="https://theconversation.com/a-unified-cybersecurity-strategy-is-the-key-to-protecting-businesses-182405">should build cybersecurity on ethical principles</a> concerning privacy; the collection, storage and use of data; and the development of artificial intelligence, algorithms and profiling.</p>
<p>One way to approach cybersecurity is through a <a href="https://global.oup.com/academic/product/corporate-governance-4e-9780198809869?q=bob%20tricker&lang=en&cc=ca">board of directors</a>. Boards represent stakeholder interests, monitor firm management and troubleshoot any problems that arise between the shareholders that own publicly listed firms and the firm’s management. They also have a duty to ensure their companies adopt appropriate and effective cybersecurity measures.</p>
<h2>Women improve cybersecurity</h2>
<p>Our study found a positive association between the level of cybersecurity disclosure and board gender diversity. In other words, the presence of women IT experts on boards resulted in improved cyber risk management — board monitoring, management supervision and corporate governance in particular. </p>
<p>Women brought new perspectives to the decision-making process and added a greater variety of skills and capabilities, which in turn, improved boards’ decision-making. </p>
<figure class="align-center ">
<img alt="A woman sitting at the head of a board meeting" src="https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/496838/original/file-20221122-11-qexdt7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The presence of women IT experts on boards resulted in improved cyber risk management.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Women were more informative, meaning they tended to value communication and disclosure more than men did, and collaborated better with stakeholders. Women also <a href="https://www.wolterskluwer.com/en/expert-insights/risk-appetite-and-risk-tolerance-whats-the-difference">had lower risk tolerance</a>, enhanced ethical practices and engaged less in fraudulent practices. </p>
<p>These specific skills, combined with their IT expertise, meant women improved the cybersecurity risk monitoring of their companies. Ultimately, having more women IT experts on boards could result in a more integrative cybersecurity approach that brings technological, business and ethical perspectives together.</p>
<h2>Suggestions for improving equality</h2>
<p>To close the gender gap, there must be a concerted effort to provide girls and women with IT-related education and skills. Firms should develop programs to promote the presence of women with IT skills and fund scholarships and grants for women. </p>
<p>Women should be encouraged to choose IT-related education and careers. At the earliest stage, schools should motivate tech-related curiosity and interest in children. While there are universities that offer graduate programs, diplomas and certificates in cybersecurity, more should be created. NGOs can also be a part of the solution by embracing and championing women IT experts. </p>
<p>Another way to close the gender gap is to promote more women to executive positions. As of 2020, the <a href="https://laws-lois.justice.gc.ca/eng/annualstatutes/2018_8/page-2.html#docCont">Canada Business Corporations Act requires public companies</a> to provide information on policies and practices related to diversity on boards and within senior management. More young women should be promoted to IT leadership positions to feed the pool of potential candidates for the board.</p>
<p>Updating the skills of existing board members should also be a priority, with ethics and cybersecurity as core training topics for all directors. <a href="https://corpgov.law.harvard.edu/2020/10/07/board-practices-quarterly-diversity-equity-and-inclusion/">Updating the ethics and cybersecurity skills of all board members</a> is a step towards improving the skills of women on boards.</p><img src="https://counter.theconversation.com/content/193701/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Camélia Radu receives funding from the Social Sciences and Humanities Research Council of Canada and the Canadian Academic Accounting Association. </span></em></p><p class="fine-print"><em><span>Nadia Smaili does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A new study finds that women improve cyber risk management by bringing new perspectives and skills to the decision-making process of company boards.Camélia Radu, Associate Professor in Accounting, Université du Québec à Montréal (UQAM)Nadia Smaili, Professor in Accounting (forensic accounting), Université du Québec à Montréal (UQAM)Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1793522022-04-14T12:14:16Z2022-04-14T12:14:16ZThe information age is starting to transform fishing worldwide<figure><img src="https://images.theconversation.com/files/457990/original/file-20220413-20-ppujj7.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5118%2C3399&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A researcher at the advocacy group Oceana uses GPS data to trace the activity of fishing boats. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/lacey-malarky-an-oceana-campaign-manager-on-illegal-fishing-news-photo/1149646104">Eric Baradat/AFP via Getty Images</a></span></figcaption></figure><p>People in the world’s developed nations live in a post-industrial era, working mainly in service or knowledge industries. Manufacturers increasingly rely on sensors, robots, artificial intelligence and machine learning to replace human labor or make it more efficient. 
Farmers can <a href="https://theconversation.com/farmers-of-the-future-will-utilize-drones-robots-and-gps-37739">monitor crop health via satellite</a> and <a href="https://www.businessinsider.com/agricultural-drones-precision-mapping-spraying">apply pesticides and fertilizers with drones</a>.</p>
<p>Commercial fishing, one of the oldest industries in the world, is a stark exception. <a href="http://www.oceansatlas.org/subtopic/en/c/1303/">Industrial fishing</a>, with <a href="https://www.britannica.com/technology/factory-ship">factory ships</a> and deep-sea trawlers that land thousands of tons of fish at a time, is still the dominant hunting mode in <a href="http://dx.doi.org/10.1126/science.aao5646">much of the world</a>. </p>
<p>This approach has led to <a href="https://www.fao.org/state-of-fisheries-aquaculture">overfishing, stock depletions</a>, <a href="https://www.amnh.org/explore/videos/biodiversity/will-the-fish-return/trawling-takes-a-toll">habitat destruction</a>, the senseless killing of unwanted <a href="https://www.fisheries.noaa.gov/insight/understanding-bycatch">by-catch</a> and wastage of as much as <a href="https://doi.org/10.1016/j.gloenvcha.2015.08.013">30% to 40% of landed fish</a>. Industrial fishing has <a href="https://www.seaaroundus.org/high-impact-fishing-dominates-catches-in-many-parts-of-the-world/#more-19241">devastated artisanal pre-industrial fleets</a> in Asia, Africa and the Pacific. </p>
<p>The end product is largely a commodity that travels around the world like a manufactured part or digital currency, rather than fresh domestic produce from the sea. An average fish <a href="https://slowfoodusa.org/slow-fish/">travels 5,000 miles before reaching a plate</a>, according to sustainable-fishing advocates. Some is frozen, shipped to Asia for processing, then <a href="https://www.usda.gov/media/blog/2016/12/05/tale-fish-two-countries">refrozen and returned to the U.S.</a></p>
<p>But these patterns are starting to change. In my new book, “<a href="https://islandpress.org/books/blue-revolution">The Blue Revolution: Hunting, Harvesting, and Farming Seafood in the Information Age</a>,” I describe how commercial fishing has begun an encouraging shift toward a less destructive, more transparent post-industrial era. This is true in the U.S., Scandinavia, most of the European Union, Iceland, New Zealand, Australia, South Korea, the Philippines and much of South America.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/ZizIpLCQ_oM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Sustainable fishing limits catches at or below levels that fisheries can replace at their natural reproductive pace.</span></figcaption>
</figure>
<h2>Fishing with data</h2>
<p>Changes in behavior, technology and policy are occurring throughout the fishing industry. Here are some examples: </p>
<ul>
<li><p><a href="https://globalfishingwatch.org">Global Fishing Watch</a>, an international nonprofit, monitors and creates open-access visualizations of global fishing activity on the internet with a 72-hour delay. This transparency breakthrough has led to the arrest and conviction of owners and captains of <a href="https://globalfishingwatch.org/transparency/the-capture-of-the-mv-nika-a-case-of-illicit-fishing-and-a-showcase-for-how-to-beat-it/">boats fishing illegally</a>. </p></li>
<li><p><a href="https://traceability-dialogue.org/what-is-the-global-dialogue/">The Global Dialogue on Seafood Traceability</a>, an international business-to-business initiative, creates voluntary industry standards for seafood traceability. These standards are designed to help harmonize various systems that <a href="https://fishchoice.com/traceability-providers">track seafood through the supply chain</a>, so they all collect the same key information and rely on the same data sources. This information lets buyers know where their seafood comes from and whether it was produced sustainably. </p></li>
<li><p>Fishing boats in New Bedford, Massachusetts – the <a href="https://www.wpri.com/news/local-news/se-mass/new-bedford-is-americas-most-lucrative-fishing-port-for-20th-straight-year/">top U.S. fishing port</a>, based on total catch value – are rigged with sensors to develop a <a href="https://www.sphericalanalytics.io/news/2019/5/21/spherical-analytics-launches-marine-databank-with-port-of-new-bedford">Marine Data Bank</a> that will give fishermen data on ocean temperature, salinity and oxygen levels. Linking this data to actual stock behavior and catch levels is expected to help fishermen target certain species and avoid unintentional bycatch.</p></li>
<li><p><a href="https://www.fisheries.noaa.gov/southeast/sustainable-fisheries/frequent-questions-annual-catch-limit-monitoring">Annual catch limits</a>, divvied up through individual quotas for each fisherman, have helped curb overfishing. Imposing catch shares can be <a href="https://www.npr.org/sections/thesalt/2017/04/05/522731573/the-race-to-fish-slows-down-why-thats-good-for-fish-fishermen-and-diners">highly controversial</a>, but since the year 2000, 47 U.S. stocks that were overfished and shut down have been <a href="https://media.fisheries.noaa.gov/2022-01/q4-2021-rebuilt-map.png">rebuilt and reopened for fishing</a>, thanks to policy judgments based on the best available science. Examples include Bering Sea snow crab, North Atlantic swordfish and red grouper in the Gulf of Mexico.</p></li>
<li><p>A growing “fishie” movement that mirrors the widespread “foodie” locavore movement has been gaining steam for more than a decade. Taking a page from agriculture, subscribers to <a href="https://marketyourcatch.msi.ucsb.edu/alternative-market-types/community-supported-fisheries-csfs">community-supported fisheries</a> pay in advance for regular deliveries from local fishermen. Such engagement between consumers and producers is beginning to shape buying patterns and introduce consumers to new types of fish that are abundant but not iconic like the cod of yore.</p></li>
</ul>
<h2>Growing fish on land</h2>
<p>Aquaculture is the fastest-growing form of food production in the world, led by China. The U.S., which has <a href="https://pubs.usgs.gov/ds/2006/182/basemaps/useez/useezmeta.htm">exclusive jurisdiction over 3.4 million square miles of ocean</a>, has a mere 1% share of the global market. </p>
<p>But aquaculture, mostly shellfish and kelp, <a href="https://www.fisheries.noaa.gov/new-england-mid-atlantic/aquaculture/aquaculture-new-england-and-mid-atlantic">is the third-largest fisheries sector in the Greater Atlantic region</a>, after lobsters and scallops. Entrepreneurs are also raising finfish – including salmon, branzino, barramundi, steelhead, eels and kingfish – mostly in large, land-based <a href="https://www.aquacultureid.com/recirculating-aquaculture-system/">recirculating systems</a> that reuse 95% or more of their water. </p>
<p>Industrial-scale ocean salmon farming in Norway in the 1990s was largely responsible for the perception that farmed fish were <a href="https://www.bbc.com/news/uk-scotland-48266480">bad for wild fish and ocean habitats</a>. Today this industry has moved to less dense <a href="https://www.theexplorer.no/solutions/ocean-farm-1--moving-fish-farms-out-to-sea/">deep-water offshore pens</a> or land-based recirculating systems.</p>
<p>Virtually all new salmon farms in the U.S. – in Florida, Wisconsin, Indiana, and several planned for Maine and California – are <a href="https://www.bbc.com/news/business-56829129">land-based</a>. In some cases, water from the fish tanks circulates through greenhouses to grow vegetables or hemp, a system called <a href="https://www.nal.usda.gov/legacy/afsic/aquaponics">aquaponics</a>.</p>
<p>There is <a href="https://modernfarmer.com/2021/11/offshore-aquaculture-bill/">heated debate</a> over proposals to open U.S. federal waters, between 3 and 200 miles offshore, for ocean aquaculture. Whatever the outcome, it’s clear that without a growing mariculture industry, the U.S. won’t be able to reduce and may even widen its <a href="https://www.fisheries.noaa.gov/national/aquaculture/us-aquaculture">$17 billion seafood trade deficit</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vsafviTKsqs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Vancouver, Canada-based Willowfield Enterprises raises coho salmon in recirculating tanks on land.</span></figcaption>
</figure>
<h2>A voracious China</h2>
<p>This kind of progress isn’t uniform throughout the fishing industry. Notably, China is the <a href="https://www.fao.org/documents/card/en/c/ca9229en/">world’s top seafood producer</a>, accounting for 15% of the global wild catch as well as 60% of aquaculture production. Chinese fishing <a href="https://e360.yale.edu/features/how-chinas-expanding-fishing-fleet-is-depleting-worlds-oceans">exerts huge influence on the oceans</a>. Observers estimate that China’s fishing fleet may be <a href="https://www.penguinrandomhouse.com/books/538736/the-outlaw-ocean-by-ian-urbina/">as large as 800,000 vessels</a> and its distant-water fleet may include up to 17,000 vessels, compared to 300 for the U.S.</p>
<p>According to a study by the nonprofit advocacy group <a href="https://oceana.org/">Oceana</a> using Global Fishing Watch data, between 2019 and 2021 Chinese boats carried out <a href="https://usa.oceana.org/blog/far-reaching-fishing-the-global-footprint-of-chinas">47 million hours of fishing activity</a>. More than 20% of this activity was on the high seas or inside the <a href="https://oceanservice.noaa.gov/facts/eez.html">200-mile exclusive economic zones</a> of more than 80 other nations. Fishing in other countries’ waters without authorization, as some Chinese boats do, <a href="https://www.theguardian.com/environment/2020/aug/25/can-anyone-stop-china-vast-armada-of-fishing-boats-galapagos-ecuador">is illegal</a>. Chinese ships often target <a href="https://e360.yale.edu/features/how-chinas-expanding-fishing-fleet-is-depleting-worlds-oceans">West African, South American, Mexican and Korean waters</a>. </p>
<p>Most Chinese distant-water ships are so large that they scoop up as many fish in one week as local boats from Senegal or Mexico <a href="https://www.penguinrandomhouse.com/books/538736/the-outlaw-ocean-by-ian-urbina/">might catch in a year</a>. Much of this fishing would not be profitable <a href="http://dx.doi.org/10.1126/sciadv.aat2504">without government subsidies</a>. Clearly, holding China to higher standards is a priority for maintaining healthy global fisheries.</p>
<h2>The ocean’s restorative power</h2>
<p>There is no shortage of gloomy information about how overfishing, along with other stresses like climate change, is <a href="https://news.un.org/en/story/2021/01/1081742">affecting the world’s oceans</a>. Nonetheless, I believe it bears emphasizing that over 78% of current marine fish landings <a href="https://www.fao.org/documents/card/en/c/ca9229en/">come from biologically sustainable stocks</a>, according to the United Nations. And overharvested fisheries often can rebound with smart management.</p>
<p>For example, the U.S. east coast scallop fishery, which was essentially defunct in the mid-1990s, is now a <a href="https://www.fisheries.noaa.gov/story-map/atlantic-sea-scallop-fishery-success-story">sustainable US$570 million a year industry</a>. </p>
<p>Another success story is <a href="http://www.cabopulmopark.com/maps.html">Cabo Pulmo</a>, a five-mile stretch of coast at the southeast end of Mexico’s Baja Peninsula. Once a vital fishing ground, Cabo Pulmo was barren in the early 1990s after intense overfishing. Then local communities persuaded the Mexican government to turn the area into a marine park where fishing was barred.</p>
<p>“In 1999, Cabo Pulmo was an underwater desert. Ten years later, it was a kaleidoscope of life and color,” ecologist <a href="https://www.nationalgeographic.org/find-explorers/enric-sala">Enric Sala</a>, director of National Geographic’s <a href="https://www.nationalgeographic.org/projects/pristine-seas/">Pristine Seas Project</a>, <a href="https://www.scribd.com/podcast/463577688/Let-s-turn-the-high-seas-into-the-world-s-largest-nature-reserve-Enric-Sala-Let-s-turn-the-high-seas-into-the-world-s-largest-nature-reserve-Enri">observed in 2018</a>. </p>
<p>Scientists say that thanks to effective management, marine life in Cabo Pulmo has recovered to a level that makes the reserve <a href="https://ocean.si.edu/conservation/solutions-success-stories/cabo-pulmo-protected-area">comparable to remote, pristine sites</a> that have never been fished. Fishing outside of the refuge has also rebounded, showing that conservation and fishing are not incompatible. In my view, that’s a good benchmark for a post-industrial ocean future.</p><img src="https://counter.theconversation.com/content/179352/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nicholas P. Sullivan is a member of the national Local Catch Network.</span></em></p>One of the oldest industries, fishing, is entering the world of advanced analytics and data-driven planning. With oceans under stress and key fish stocks dwindling, can precision fishing help?Nicholas P. Sullivan, Senior Research Fellow, Fletcher Maritime Studies Program, and Senior Fellow, Council on Emerging Market Enterprises, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1670172022-02-23T13:36:55Z2022-02-23T13:36:55ZHow AI is shaping the cybersecurity arms race<figure><img src="https://images.theconversation.com/files/447353/original/file-20220218-45245-1hgu9fk.jpg?ixlib=rb-1.1.0&rect=51%2C0%2C5700%2C3771&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Defending against cyberattacks increasingly means looking for patterns in large amounts of data – a task AI was made for.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/artificial-intelligence-robot-control-futuristic-royalty-free-image/1328784596">Yuichiro Chino/Moment via Getty Images</a></span></figcaption></figure><p>The average business receives <a href="https://www.fortinet.com/blog/industry-trends/overcoming-the-challenges-of-rapid-and-effective-incident-response">10,000 alerts every day</a> from the various software tools it uses to monitor for intruders, malware and other threats. Cybersecurity staff often find themselves inundated with data they need to sort through to manage their cyber defenses.</p>
<p>The stakes are high. Cyberattacks are increasing and affect <a href="https://www.verizon.com/about/news/verizon-2021-data-breach-investigations-report">thousands of organizations</a> and <a href="https://www.cisa.gov/be-cyber-smart/facts">millions of people</a> in the U.S. alone.</p>
<p>These challenges underscore the need for better ways to stem the tide of cyber-breaches. Artificial intelligence is particularly well suited to finding patterns in huge amounts of data. As a researcher who <a href="https://scholar.google.com/citations?user=jdFquF4AAAAJ&hl=en">studies AI and cybersecurity</a>, I find that AI is emerging as a much-needed tool in the cybersecurity toolkit.</p>
<h2>Helping humans</h2>
<p>There are two main ways AI is bolstering cybersecurity. First, AI can help automate many tasks that a human analyst would often handle manually. These include automatically detecting unknown workstations, servers, code repositories and other hardware and software on a network. It can also determine how best to allocate security defenses. These are data-intensive tasks, and AI has the potential to sift through terabytes of data much more efficiently and effectively than a human could ever do. </p>
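<p>As a toy illustration of the asset-discovery step above, unknown devices can be flagged by comparing what is observed on a network against a known inventory. The host names and inventory below are invented for the example; real AI-driven tools infer device types and risk from traffic patterns rather than simple name lists.</p>

```python
# A minimal sketch of automated asset discovery: flag observed hosts
# that are missing from the known inventory. All names are hypothetical.
known_assets = {"ws-101", "ws-102", "db-01", "web-01"}  # invented inventory

def detect_unknown(observed):
    """Return hosts seen on the network that are not in the asset inventory."""
    return sorted(set(observed) - known_assets)

# Hypothetical hosts observed by network monitoring:
observed_hosts = ["ws-101", "db-01", "rogue-laptop", "web-01", "ws-103"]
print(detect_unknown(observed_hosts))  # ['rogue-laptop', 'ws-103']
```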
<p>Second, AI can help detect patterns within large quantities of data that human analysts can’t see. For example, AI could detect the key linguistic patterns of hackers posting emerging threats in the <a href="https://theconversation.com/illuminating-the-dark-web-105542">dark web</a> and alert analysts.</p>
<p>More specifically, AI-enabled analytics can help discern the jargon and code words hackers develop to refer to their new tools, techniques and procedures. One example is using the name Mirai to mean botnet. Hackers developed the term to hide the botnet topic from law enforcement and cyberthreat intelligence professionals.</p>
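<p>The pattern-detection idea can be sketched in miniature with a simple statistical rule: values far from the norm are flagged for an analyst. The login counts and threshold here are invented for illustration; production systems use far richer models than a standard-deviation test.</p>

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=3.0):
    """Return indices of values more than `threshold` standard deviations
    from the mean -- a toy stand-in for large-scale AI pattern detection."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - mu) > threshold * sigma]

# Hypothetical hourly failed-login counts; the spike at hour 5 stands out.
logins = [12, 9, 11, 10, 13, 480, 11, 12, 10, 9, 11, 10]
print(flag_anomalies(logins))  # [5]
```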
<p>AI has already seen some early successes in cybersecurity. Increasingly, companies such as FireEye, Microsoft and Google are developing innovative AI approaches to detect malware, stymie phishing campaigns and monitor the spread of disinformation. One notable success is <a href="https://news.microsoft.com/cyber-signals/">Microsoft’s Cyber Signals</a> program that uses AI to analyze 24 trillion security signals, 40 nation-state groups and 140 hacker groups to produce cyberthreat intelligence for C-level executives. </p>
<p>Federal funding agencies such as the Department of Defense and the National Science Foundation recognize the potential of AI for cybersecurity and have invested tens of millions of dollars to develop advanced AI tools for extracting insights from data generated from the dark web and open-source software platforms such as <a href="https://github.com/">GitHub</a>, a global software development code repository where hackers, too, can share code.</p>
<h2>Downsides of AI</h2>
<p>Despite the significant benefits of AI for cybersecurity, cybersecurity professionals have questions and concerns about AI’s role. Companies might consider replacing their human analysts with AI systems but worry about how much they can trust automated systems. It’s also not clear whether and how the well-documented AI <a href="https://theconversation.com/ftc-warns-the-ai-industry-dont-discriminate-or-else-159622">problems of bias, fairness, transparency and ethics</a> will emerge in AI-based cybersecurity systems.</p>
<p>Also, AI is useful not only for cybersecurity professionals trying to turn the tide against cyberattacks, but also for malicious hackers. Attackers are using methods like reinforcement learning and <a href="https://developers.google.com/machine-learning/gan">generative adversarial networks</a>, which generate new content or software based on limited examples, to produce new types of cyberattacks that can evade cyber defenses.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/XOxxPcy5Gr4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Just as AI can generate realistic-looking fake faces from photos of real people, the software can be used to create new forms of malware based on existing code.</span></figcaption>
</figure>
<p>Researchers and cybersecurity professionals are still learning all the ways malicious hackers are using AI. </p>
<h2>The road ahead</h2>
<p>Looking forward, there is significant room for growth for AI in cybersecurity. In particular, the predictions AI systems make based on the patterns they identify will help analysts respond to emerging threats. AI is an intriguing tool that could help stem the tide of cyberattacks and, with careful cultivation, could become a required tool for the next generation of cybersecurity professionals.</p>
<p>The current pace of innovation in AI, however, indicates that fully automated cyber battles between AI attackers and AI defenders are likely years away.</p>
<img src="https://counter.theconversation.com/content/167017/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Sagar Samtani works for Indiana University. </span></em></p>Artificial intelligence is emerging as a key cybersecurity tool for both attackers and defenders.Sagar Samtani, Assistant Professor of Operations and Decision Technologies, Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1600542021-10-07T10:45:33Z2021-10-07T10:45:33ZInstagram Kids: tech development must move from usability to safety<p>Facebook <a href="https://www.theguardian.com/technology/2021/sep/27/facebook-pauses-instagram-kids-teen-mental-health-concerns">has announced</a> that it is halting development on its <a href="https://theconversation.com/instagrams-privacy-updates-for-kids-are-positive-but-plans-for-an-under-13s-app-means-profits-still-take-precedence-165323">Instagram Kids</a> project. This <a href="https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739">follows reports</a> that the social media giant had commissioned – and kept secret – internal research that found Instagram was detrimental to young people’s mental health. </p>
<p>The study’s findings, not to mention the fact that they were withheld, have only bolstered <a href="https://www.nytimes.com/2021/04/15/technology/Facebook-cancel-Instagram-children.html?action=click&module=RelatedLinks&pgtype=Article">the heavy criticism</a> the project initially came in for. “Instagram for kids,” <a href="https://www.theguardian.com/technology/shortcuts/2021/may/11/instagram-for-kids-the-social-media-site-no-one-asked-for">ran one headline</a> early on, “the social media site no one asked for”. </p>
<p>Quite who has asked for what, in information technology development, is an interesting question. In the late 1980s, <a href="https://www.princeton.edu/%7Ehos/mike/articles/hcht.pdf">research had already highlighted</a> that the history of computers was arguably one of creating demand more than responding to need. And social media is no different: it has gone from being the thing we didn’t know we wanted to being <a href="https://www.journalofdemocracy.org/articles/the-road-to-digital-unfreedom-three-painful-truths-about-social-media/">embedded</a> in all that we do. Research increasingly confirms it can be a source of <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0248406">harm</a> too. </p>
<p>Children are at the heart of this battle between usefulness and safety. They’re the future designers of our tech – they will inherit our messes – but they’re also using it right now. And they’re tech companies’ future customers. Head of Instagram Adam Mosseri has been quick to defend <a href="https://about.instagram.com/blog/announcements/pausing-instagram-kids">the value and importance</a> of a kids’ version of the app. But can we trust big tech to give us what we actually need as opposed to manipulating us into consuming what they need to sell?</p>
<h2>The advent of usability</h2>
<p>The concept of user experience now dominates information technology thinking. But the earliest <a href="https://www.historyextra.com/period/20th-century/a-brave-new-world-the-1980s-home-computer-boom/">home computers</a> were anything but useful, or usable, for the average person. That is primarily because they were still being designed for <a href="https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/usability-evaluation">trained specialists</a>: they assumed competence in whoever switched them on. </p>
<p>From the early 1980s, parents were encouraged to embrace the educational potential of home computing. They saw the devices as a boost to their children’s <a href="http://www.computinghistory.org.uk/det/182/Acorn-BBC-Micro-Model-B/">learning</a> and future <a href="https://www.washingtonpost.com/archive/business/1981/12/20/jobs-growth-in-80s-linked-to-computer/a93ec635-24f7-4bfe-b18d-adfb7e39105c/">employability</a>. But this uptake of early devices was still more conceptual than practical.</p>
<p>By the end of the 1980s, however, the idea of <a href="https://www.tandfonline.com/doi/full/10.1080/0144929X.2018.1541255">usability</a> started to gain traction. IT design started focusing more on how average people might effectively and efficiently use their products, with computer scientists homing in on <a href="https://dl.acm.org/doi/10.5555/523237">human-computer interaction</a> and <a href="https://link.springer.com/article/10.1007/BF02032391">user-centered</a> design. </p>
<h2>From user experience to user safety</h2>
<p>Technology, of course, now shapes how we live, how we communicate, how we interact, how we work. Households are filled with devices and applications which are usable, useful and being used. Indeed, keeping devices and all they contain in use is central to IT design: the user is a customer and the tech is designed to nurture – solicit, even – that custom. </p>
<p>Figuring out how to provide a meaningful and relevant experience for someone using a digital product or service, from devices to social media platforms, is what is known as <a href="https://www.interaction-design.org/literature/topics/ux-design">user experience</a> design. <a href="https://www.youtube.com/watch?v=-hxX_Q5CnaA&t=204s">Tech giants</a> talk about meeting our expectations even before we know them ourselves. And the way designers know what we want before we want it comes down to the data they collect on us – and our children. </p>
<p>A flurry of recent lawsuits, however, highlights the line – in terms of harm to the user – that such profit-driven digital innovation, shaped by our personal data, has crossed. These include the <a href="https://tiktokdataclaim.uk/">case</a> launched by the former children’s commissioner for England, Anne Longfield, against TikTok. </p>
<p>Longfield’s case alleges that the video-sharing platform harvests the personal information of its under-age users for targeted advertising purposes: from date of birth, email and phone number to location data, religious or political beliefs and browsing history.</p>
<p>The concern these days is that privacy is <a href="https://theconversation.com/instagrams-privacy-updates-for-kids-are-positive-but-plans-for-an-under-13s-app-means-profits-still-take-precedence-165323">under threat</a> because profits take precedence over safety. </p>
<p>The usability movement which started in the late 1980s therefore now needs to make way for what computer scientists term <a href="https://www.cc.gatech.edu/%7Ekeith/pubs/ieee-intro-usable-security.pdf">usable security</a>: human-centric design, where safety <a href="https://www.cybok.org/">takes precedence</a>. <a href="https://repository.cardiffmet.ac.uk/bitstream/handle/10369/11472/British_HCI_conference_positionPaper.pdf?sequence=3&isAllowed=y">Our research shows</a> that many online applications are not fit for use. They fail to find the balance between usability and security (and privacy). </p>
<p>We need to further explore the potential of <a href="https://theconversation.com/open-source-hardware-could-defend-against-the-next-generation-of-hacking-104473">open-source</a> designs – those not driven by profit – as alternatives. And we need to foster ethical awareness around technology in young minds: they are tomorrow’s programmers. As important as learning to code is understanding the ethical implications of what is being coded.</p><img src="https://counter.theconversation.com/content/160054/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Children are at the heart of the battle between usefulness and security. Can we trust Big Tech to find solutions to making computers, and life online, safer for them?Fiona Carroll, Reader in Human Computer Interaction, Cardiff Metropolitan UniversityAna Calderon, Senior Lecturer in Computer Science, Cardiff Metropolitan UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1536142021-03-02T13:24:24Z2021-03-02T13:24:24ZCOVID-19 revealed how sick the US health care delivery system really is<figure><img src="https://images.theconversation.com/files/380135/original/file-20210122-17-1q4kx0c.jpg?ixlib=rb-1.1.0&rect=8%2C62%2C5982%2C3925&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Many U.S. hospitals and clinics are behind when it comes to sharing information.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/men-point-to-digital-tablets-with-virtual-screen-royalty-free-image/1257966144?adppopup=true">Teera Konakan/Moment via Getty Images</a></span></figcaption></figure><p>If you got the COVID-19 shot, you likely received a little paper card that shows you’ve been vaccinated. Make sure you keep that card in a safe place. There is no coordinated way to share information about who has been vaccinated and who has not.</p>
<p>That is just one of the glaring flaws that COVID-19 has revealed about the U.S. health care system: It does not <a href="https://www.healthit.gov/topic/interoperability/">share health information</a> well. Coordination between public health agencies and medical providers <a href="https://www.healthit.gov/topic/health-it-health-care-settings/public-health/">is lacking</a>. Technical and regulatory restrictions <a href="https://doi.org/10.31478/201810a">impede use</a> of digital technologies. To put it bluntly, our health care delivery system is failing patients. <a href="https://www.healthaffairs.org/do/10.1377/hblog20200721.330502/full/">Prolonged disputes</a> about the Affordable Care Act and rising health care costs have done little to help; the problems go beyond insurance and access.</p>
<p>I have spent most of my career in information technology, IT-based innovation and systems engineering. As a <a href="https://scholar.google.com/citations?user=pQuX8m0AAAAJ&hl=en.">professor of health informatics</a>, I have focused on health care transformation. For two years, I served on the Health Innovation Committee at <a href="https://www.himss.org/">HIMSS</a>, the preeminent global health information and technology organization. In short, I have studied these problems for decades, and I can tell you that most of them aren’t about medicine or technology. Rather, they are about the inability of our delivery system to meet the evolving needs of patients.</p>
<h2>We need a high-performance system</h2>
<p>In reality, the U.S. health care sector is not a system at all. Instead, it is an underperforming conglomerate of <a href="https://www.nap.edu/catalog/11378/building-a-better-delivery-system-a-new-engineeringhealth-care-partnership/">independent entities</a>: hospitals, clinics, community health and urgent care centers, individual practitioners, small group practices, pharmacy and retail outlets, and more, most of which compete for profits and in some cases pay sky-high salaries to executives.</p>
<figure class="align-center ">
<img alt="A nurse making a computerized medical report." src="https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/380139/original/file-20210122-19-18dl21h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The U.S. transition to a high-performing health care delivery system has been a slow one.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/side-view-of-female-nurse-making-medical-report-in-royalty-free-image/1188432112?adppopup=true">Maskot via Getty Images</a></span>
</figcaption>
</figure>
<p>These entities often function in silos. Errors, gaps, duplication of services and poor patient outcomes <a href="https://www.annfammed.org/content/7/2/100">are often the result</a>. </p>
<p>Here’s an example: A heart surgery patient, still on oxygen and in intensive care just two days earlier, is referred to her primary care physician for follow-up, and to a rehabilitation center for therapy. Neither her doctor nor the facility knows the patient was even hospitalized, nor do they have access to her records or medication list. </p>
<h2>Shopping for doctors</h2>
<p>For patients, this might mean a <a href="https://www.annfammed.org/content/7/2/100">disjointed set of services</a> that don’t offer a coordinated plan of care or even a timely or comprehensive diagnosis of their health problems. Patients with chronic conditions often see more than 10 different doctors <a href="https://www.rand.org/content/dam/rand/pubs/tools/TL200/TL221/RAND_TL221.pdf">during dozens of office visits per year</a>.</p>
<p>The specialist may not even be aware when the patient does not return. Patient information is seldom shared; specialists are often associated with different medical systems that don’t share records. And even when they try, accurately matching patient IDs in different systems can be problematic. </p>
<p>The challenge now is to transform the status quo into a high-performance system, a true 21st-century health care delivery system. Bringing systems engineering and information technologies to medical practice can help make that happen, but doing that requires a holistic approach. </p>
<p>Let’s start with electronic health records. More than 20 years ago, the <a href="https://www.nap.edu/read/12709/chapter/2">Institute of Medicine</a> called for the transition from paper to digital health records. This would allow patients to easily share lab, imaging and other test results with different providers. Nearly a decade went by before action occurred on the recommendation. In 2009, the <a href="https://www.asha.org/practice/reimbursement/hipaa/hitech-act/#:%7E:text=The%20Health%20Information%20Technology%20for,(EHR)%20systems%20among%20providers.">HITECH Act</a> was passed, which provided US$30 billion of incentives for the transition. </p>
<p>Yet now, 12 years down the road, we’re still a long way from a patient’s electronic health records becoming universally available at the point of care. Connectivity across systems and networks remains fragmented, and a lack of trust between organizations, along with anti-competitive behavior, results in an unwillingness to share patient information. </p>
<figure class="align-center ">
<img alt="A patient talks to a doctor on her digital tablet." src="https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/380141/original/file-20210122-13-1ox81wj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Although telemedicine conferences have become popular during the pandemic, the technology is still not up to speed for many patients.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/she-doesnt-need-a-prescription-for-this-tablet-royalty-free-image/1062130130?adppopup=true">AJ_Watt/E+ via Getty Images</a></span>
</figcaption>
</figure>
<h2>Unsafe medical treatment</h2>
<p><a href="https://www.pewtrusts.org/-/media/assets/2018/09/healthit_enhancedpatientmatching_report_final.pdf">One failure</a> of the system is an inability to accurately identify and match patient records. Few standards exist for collecting patient information. With hundreds of vendors and thousands of hospitals, doctor’s offices, pharmacies and other facilities participating in the process, variation is huge. Is John Doe at 250 Park Ridge Drive the same as John E. Doe at 250 Parkridge? </p>
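<p>The ambiguity in that example is exactly what record-matching software must resolve. A minimal sketch, using Python’s standard-library string matcher (the normalization rules and the idea of a similarity threshold are illustrative assumptions, not any vendor’s actual algorithm), shows why near-matches like these are hard to adjudicate automatically:</p>

```python
from difflib import SequenceMatcher

def normalize(text: str) -> str:
    """Lowercase, drop punctuation and collapse whitespace so that
    strings like '250 Park Ridge Drive' and '250 Parkridge' compare fairly."""
    cleaned = "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

def similarity(a: str, b: str) -> float:
    """Similarity ratio between two normalized records, from 0.0 to 1.0."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

rec_a = "John Doe, 250 Park Ridge Drive"
rec_b = "John E. Doe, 250 Parkridge"

# Records like these score high but not perfect: any threshold-based
# matcher leaves an ambiguous middle zone that still needs human review.
print(f"similarity: {similarity(rec_a, rec_b):.2f}")
```

<p>Identical records score 1.0; unrelated ones score near zero. The hard cases are the ones in between, which is why the article’s variation problem cannot be solved by software alone.</p>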
<p>In 2017, the American Hospital Association estimated 45% of large hospitals <a href="https://www.aha.org/news/headline/2019-09-06-aha-urges-senate-appropriators-allow-funding-unique-patient-identifier/">reported difficulties in correctly identifying patients</a> across information technology systems. This means that, at least on occasion, clinicians are making decisions that increase the chances of misdiagnosis, unsafe medical treatment and duplicate testing.</p>
<p>During a public health emergency like COVID-19, accurately identifying patients is one of the most <a href="http://patientidnow.org/wp-content/uploads/2020/09/Patient-ID-Now-Provides-Testimony-on-Coronavirus-Response-Efforts.pdf">difficult operational issues that a hospital faces</a>. Accurate COVID-19 test results <a href="http://patientidnow.org/wp-content/uploads/2020/11/Senate-Appropriations-Statement-11.10.20.pdf">are hampered</a> when specimens sent to public health labs are accompanied by patient misidentification and inadequate demographic data. Results can be sent to the wrong patient or, at best, get backlogged. </p>
<p>These mistakes also are costly. More than one-third of all denied claims <a href="https://www.ponemon.org/">result directly</a> from inaccurate patient identification or information that’s wrong or incomplete. This costs the average U.S. health care facility <a href="http://promos.hcpro.com/pdf/2016-national-report-misidentification-report.pdf">$1.2 million per year</a>. </p>
<h2>Congress needs to act</h2>
<p>For nearly two decades, the Department of Health and Human Services has been restricted from spending federal dollars to adopt a unique health identifier for patients. To remedy the problem, the U.S. House of Representatives in July 2020 unanimously adopted an amendment allowing HHS to evaluate patient identification solutions that still protect patient privacy. But the Senate chose not to address the issue. Still, many health care leaders are advocating for the new Congress to take action. Health care proponents are hopeful the new Senate majority leader will be more receptive to addressing the issue.</p>
<p>A bright spot in all of this is that many health care systems saw the advantages of telemedicine during the pandemic. It’s convenient for patients, it saves money and it meets the needs of patients who have <a href="https://doi.org/10.1177/0025817220926926">difficulty traveling</a>. Telemedicine could be just the beginning; with an ever-growing array of mobile health devices, physicians can monitor a patient at home, rather than in an institution. More must be done, however. Throughout the pandemic, some patients, with a lack of broadband access or poor Wi-Fi, <a href="https://www.pewresearch.org/internet/2020/04/30/53-of-americans-say-the-internet-has-been-essential-during-the-covid-19-outbreak/">had something less</a> than a rich and uninterrupted visit. </p>
<p>Health IT advocates have long envisioned a health care system that seamlessly uses connected care to improve patient outcomes while costing less. When the pandemic subsides, the waivers and policies temporarily adopted will require not a sudden termination, but a transition to such a system. </p>
<p>Over the past year, doctors, nurses and health care systems have learned lessons out of necessity. Instead of abandoning our new knowledge, I believe we need to double down on a modern, stable and value-based health delivery system with equity for all. And at its heart must be one certainty: that accurate and comprehensive patient records are always available at the point of care.</p>
<p>[<em>Get facts about coronavirus and the latest research.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=coronavirus-facts">Sign up for The Conversation’s newsletter.</a>]</p><img src="https://counter.theconversation.com/content/153614/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Elizabeth A. Regan receives funding from the National Science Foundation. She is affiliated with the Health Information Management Systems Society (HIMSS) and its local South Carolina chapter.</span></em></p>With outdated delivery systems at many hospitals and clinics, mistakes can lead to costly duplication of services and poor patient outcomes. But there are ways to fix the current system.Elizabeth A. Regan, Dept. Chair Integrated Information Technology and Professor of Health Informatics, University of South CarolinaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1535392021-03-01T20:01:36Z2021-03-01T20:01:36ZOntario’s digital health program has a data quality problem, despite billions in spending<figure><img src="https://images.theconversation.com/files/386757/original/file-20210226-13-12g63gf.jpg?ixlib=rb-1.1.0&rect=395%2C12%2C6851%2C5039&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Digital health technology, such as electronic health records, is believed to enhance patient-centred care, improve integrated care and ensure financially sustainable health care.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>Digital health is about applying advanced information technologies to enable free flow of patient information across the circle of care. For patients, that means every health-care provider they see at different locations should be able to access relevant health record information quickly and efficiently. </p>
<p><a href="https://www.infoway-inforoute.ca/en/component/edocman/supporting-documents/partnership/3198-keynote-ontario-s-digital-health-strategy">Digital health technology</a>, such as electronic health records, is believed to enhance patient-centred care, improve integrated care and ensure financial sustainability of our health-care system. However, Ontarians are facing the tough reality that their health data are still fragmented, despite <a href="https://www.auditor.on.ca/en/content/news/16_newsreleases/2016news_3.03.pdf">billions of dollars spent over the last two decades</a> to enable fast and secure exchange of health information. The COVID-19 pandemic has brought to light even more data quality issues. </p>
<p>As noted in a recent <a href="https://nationalpost.com/news/canadas-public-data-on-covid-19-is-mostly-a-mess-heres-how-to-find-the-useful-info"><em>National Post</em> article</a>, much of the public data on COVID-19 is a mess. Not only are data on infected cases and deaths delayed, they are also incomplete. Ontario reportedly offered inconsistent counts between provincial medical officials and local public health units. No wonder the Ministry of Health admits that “<a href="http://health.gov.on.ca/en/pro/programs/connectedcare/oht/docs/dig_health_playbook_en.pdf">consistent standards are lacking across sectors — making it extremely difficult to integrate patient records or to integrate local systems with provincial ones</a>.” </p>
<p>It is a tough pill to swallow <a href="https://www.infoway-inforoute.ca/en/what-we-do/news-events/newsroom/2011-news-releases/137-infoway-invests-380-million-to-help-physicians-and-nurse-practitioners-implement-electronic-medical-record-emr-systems">after years of investment</a> aimed at enabling fast and secure health data exchange.</p>
<h2>Neither sustainable nor effective</h2>
<p>The Ontario government is taking two approaches to improving data quality, such as the accuracy and timeliness of data reported across different service providers. The first approach centres on improving health data exchange across heterogeneous systems (systems developed by different vendors and requiring different hardware and software configurations to operate) by using <a href="https://infocentral.infoway-inforoute.ca/en/standards/canadian/fhir">common communication standards</a>. </p>
<p>However, this approach is neither scalable nor sustainable: communication across these systems becomes increasingly complex, time-consuming and error-prone as more systems are added to the mix. The inconsistent counts of COVID-19 infections and deaths provided by different levels of government are a case in point. Not to mention that these standards evolve rapidly and <a href="https://orionhealth.com/media/4894/orion-health-interoperability-high-level-report-final-1.pdf">even previous versions of the same standard cannot be easily mapped and migrated to current ones</a>. </p>
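<p>The combinatorics behind that scalability claim can be sketched with a back-of-the-envelope calculation (the system counts are illustrative): the number of pairwise exchange paths among n systems grows quadratically, so every system added multiplies the opportunities for mapping errors and version mismatches.</p>

```python
def exchange_paths(n: int) -> int:
    """Pairwise communication paths among n heterogeneous systems
    (n choose 2): each path is a separate mapping that can go wrong."""
    return n * (n - 1) // 2

# Illustrative counts: growth is quadratic, not linear.
for n in (5, 20, 100):
    print(f"{n} systems -> {exchange_paths(n)} pairwise exchange paths")
```

<p>Five systems need 10 mappings; 100 systems need 4,950. Each mapping must also track evolving standard versions, which is the complexity the paragraph describes.</p>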
<p>The second approach relies on the minimum common data set proposed in the <a href="http://health.gov.on.ca/en/pro/programs/connectedcare/oht/docs/dig_health_playbook_en.pdf">Digital Health Playbook</a>, a resource intended to guide health-care organizations to build their digital systems. The minimum data set contains data classes (such as individual patients) and their corresponding elements (such as date of birth) for clinical notes, laboratory information, medications, vital signs, patient demographics and procedures, to name a few. </p>
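<p>In code, a minimum common data set of this kind amounts to agreed record shapes shared by every participating system. A hypothetical sketch (the field names are illustrative examples in the playbook’s spirit, not its actual schema):</p>

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # A "data class" with demographic "elements"; names are illustrative.
    patient_id: str
    given_name: str
    family_name: str
    date_of_birth: str   # ISO 8601 date, e.g. "1954-07-02"

@dataclass
class VitalSign:
    patient_id: str      # links the observation back to its Patient
    code: str            # e.g. "heart-rate"
    value: float
    unit: str            # e.g. "beats/min"
    recorded_at: str     # ISO 8601 timestamp

p = Patient("ON-0001", "Jane", "Doe", "1954-07-02")
v = VitalSign(p.patient_id, "heart-rate", 72.0, "beats/min", "2021-02-26T10:30:00")
```

<p>The point of the minimum set is that every system agrees on these shapes, so a record produced by one provider can be read by another without bespoke translation.</p>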
<figure class="align-center ">
<img alt="Illustration of a tablet showing patient information" src="https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=476&fit=crop&dpr=1 600w, https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=476&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=476&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=598&fit=crop&dpr=1 754w, https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=598&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/386758/original/file-20210226-15-m13tol.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=598&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Health-care providers need fast, secure access to medical records, including clinical notes, lab information, medications, vital signs, patient demographics and procedures.</span>
<span class="attribution"><span class="source">(Pixabay)</span></span>
</figcaption>
</figure>
<p>These data sets, while appropriate for the requirements of family physicians whose main responsibility is disease control and prevention, are not sufficient for treating complex patients who suffer from multiple health issues, which <a href="https://theconversation.com/good-governance-is-the-missing-prescription-for-better-digital-health-care-128375">demand a vast amount of health data from various health-care providers</a>. </p>
<p>These two approaches adopted by the Ontario government to address data quality issues are <a href="https://ipolitics.ca/2020/06/19/ontario-health-system-will-continue-to-fail-us-if-new-data-efforts-dont-translate-to-changes-for-patients-philpott/">neither sustainable nor effective</a>, so can hardly serve as a strategy guiding health digitalization. </p>
<p>As researchers focusing on IT in health governance, we propose that a data strategy encompass four pillars: </p>
<h2>1. Data quality standards</h2>
<p>First, data quality is an umbrella term that encompasses <a href="https://www.cihi.ca/sites/default/files/document/iqf-summary-july-26-2017-en-web_0.pdf">multiple dimensions</a>, including accuracy, accessibility and timeliness. And there are trade-offs among these dimensions. For example, prioritizing timely reporting can compromise comprehensiveness, since covering all the required data takes time. </p>
<p>While “fit for use” (meaning the quality of data fits the requirements of their intended users) is considered appropriate and pragmatic, it needs to be clearly spelled out which quality standards need to be enforced. Given the limited resources and increasing pressure to curb health-care costs, deciding which data quality standards to focus on is increasingly urgent. </p>
<h2>2. Sustainable, scalable, patient-centric platform</h2>
<p>Second, the health-care sector is not alone in dealing with decades-old systems and the low-quality data — such as <a href="https://ipolitics.ca/2020/06/19/ontario-health-system-will-continue-to-fail-us-if-new-data-efforts-dont-translate-to-changes-for-patients-philpott/">inaccurate COVID-19 case counts</a> — generated by these systems. Drawing on experiences from banks and other organizations, the health-care sector could <a href="https://www.himss.org/resources/electronic-health-record-data-governance-and-data-quality-real-world">create an open data platform</a> that enables data sharing across health-care providers and allows patients to share data from their social media and mobile and wearable devices. Countries such as <a href="https://www.ey.com/en_gl/health/how-will-you-design-information-architecture-to-unlock-the-power">the United Kingdom and Germany have started implementing the open data platform idea</a>.</p>
<h2>3. Measurable indicators of improvement</h2>
<p>Third, measurable outcomes pertaining to data quality improvement efforts need to be defined. Improvement efforts could include <a href="https://www.himss.org/resources/electronic-health-record-data-governance-and-data-quality-real-world">training programs on best practices</a> related to data entry, and introducing system features that enable data quality checking (for example, completeness or consistency). Measurable outcomes would ensure accountability and the achievement of the intended objectives, and inform future funding decisions. </p>
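<p>System features of this kind are straightforward to automate. A minimal sketch of completeness and consistency checks (the required fields and the single consistency rule are hypothetical examples, not Ontario’s actual rules):</p>

```python
# Hypothetical required fields for a lab report record.
REQUIRED_FIELDS = ("patient_id", "date_of_birth", "test_result", "reported_at")

def completeness(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def consistent(record: dict) -> bool:
    """One example rule: a result cannot be reported before the patient
    was born. ISO 8601 date strings compare correctly as plain strings."""
    dob, reported = record.get("date_of_birth"), record.get("reported_at")
    if not (dob and reported):
        return True  # nothing to cross-check
    return dob <= reported

record = {"patient_id": "A-17", "date_of_birth": "1954-07-02",
          "test_result": "negative", "reported_at": "2021-01-15"}
print(completeness(record), consistent(record))  # 1.0 True
```

<p>Scores like these are also exactly the measurable outcomes the pillar calls for: they can be tracked over time and tied to accountability and funding decisions.</p>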
<h2>4. Improvement process adopted by providers</h2>
<p>Lastly, a data strategy needs to clearly define a data quality improvement and monitoring process where the quality of the data is continuously monitored and assessed to ensure that data support patient care and research. Data quality <a href="https://www.healthit.gov/playbook/pddq-framework/data-quality/data-quality-planning/">is a shared responsibility</a>, so the quality assurance process needs to take place collectively across providers but also within each provider. </p>
<p>To define and implement the data strategy, meaningful engagement with all stakeholders is key. For example, patients and providers need to be involved to identify the data required to treat the diseases that claim most of our health-care budget, define quality dimensions of the data, and specify roles and responsibilities for maintaining the quality of data.</p>
<p>In contrast to the Band-Aid approach adopted by the Ontario government, the four-pillar data strategy is long-term, focused and holistic. It would ensure that data quality is placed front and centre in Ontario’s effort in health digitalization. Following the strategy, our health-care system would develop a sustainable mechanism and a scalable capability to continuously improve data quality. </p>
<p>Without such a data strategy, Ontarians will stand to lose another decade and billions more.</p><img src="https://counter.theconversation.com/content/153539/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Linying Dong is affiliated with Ryerson University, and volunteers at the Board of Directors of Carefirst.</span></em></p><p class="fine-print"><em><span>Karim Keshavjee is the CEO and majority shareholder of InfoClin Inc, an organization that provides consulting on data extraction, data quality and data analytics. He has received funding from the College of Family Physicians of Canada, Diabetes Action Canada, the University of Calgary, Ryerson University and McMaster University. </span></em></p>Digital health can improve care, but in Ontario, health data are still fragmented, despite billions of dollars spent over the last two decades to enable fast and secure exchange of health information.Linying Dong, Professor, Ted Rogers School of Information Technology Management, Toronto Metropolitan UniversityKarim Keshavjee, Assistant Professor, Institute of Health Policy, Management and Evaluation, University of TorontoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1542532021-02-19T03:30:08Z2021-02-19T03:30:08ZAustralia, fighting Facebook, is the latest country to struggle against foreign influence on journalism<figure><img src="https://images.theconversation.com/files/385174/original/file-20210218-23-ws2you.jpg?ixlib=rb-1.1.0&rect=49%2C65%2C5414%2C3571&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The New York Times Facebook site on Feb. 18, 2021 as seen in Melbourne, Australia: Empty. 
</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com.mx/detail/fotografía-de-noticias/in-this-photo-illustration-the-new-york-times-fotografía-de-noticias/1302738747?adppopup=true">Robert Cianflone/Getty Images</a></span></figcaption></figure><p>Facebook has <a href="https://edition.cnn.com/2021/02/17/media/facebook-australia-news-ban/index.html">barred Australians from finding or sharing news on its platform</a>, in response to an Australian government proposal to require social media networks to pay journalism organizations for their content. The move is already <a href="https://au.finance.yahoo.com/news/australian-news-sites-traffic-falls-004636684.html">reducing online readership</a> of Australian news sites.</p>
<p>Similar to what happened when <a href="https://www.theverge.com/2021/1/21/22242616/facebook-refers-decision-suspend-trump-oversight-board">Facebook suspended Donald Trump’s account in January</a>, the fight with Australia is again raising debate around social media networks’ enormous control over people’s access to information. Australia’s prime minister, Scott Morrison, says his country “<a href="https://www.bbc.com/news/world-australia-56109036">will not be intimidated</a>” by an American tech company.</p>
<p>My <a href="https://academic.oup.com/dh/article-abstract/45/1/72/6033978">research in the history of international media politics</a> has <a href="https://www.cambridge.org/core/journals/journal-of-global-history/article/abs/emancipation-of-media-latin-american-advocacy-for-a-new-international-information-order-in-the-1970s/6A12A3D8F1680E00E0709AF2CAB0191E">shown</a> that a handful of rich countries have long exerted undue influence over how the rest of the world gets its news.</p>
<p>Facebook has <a href="https://ourworldindata.org/rise-of-social-media">2.26 billion</a> users, and most of them live <a href="https://www.statista.com/statistics/268136/top-15-countries-based-on-number-of-facebook-users/">outside of the United States</a>, according to the company. India, Indonesia, Brazil, Mexico and the Philippines are home to the most Facebook users outside the U.S.</p>
<p>Facebook’s share of the global social media market is staggering, but the company is not alone. Eight <a href="https://ourworldindata.org/rise-of-social-media">of the world’s 11 most popular social media companies are based in the U.S.</a>. These include YouTube and Tumblr, as well as Instagram, which is owned by Facebook.</p>
<p>The geographic concentration of information technology puts these billions of non-American social media users and their government officials in a subservient position. </p>
<p>The business decisions of Big Tech can effectively dictate free speech around the world.</p>
<h2>Imperial origins of international news</h2>
<p>Reliance on foreign media has long been a problem in the Global South – so-called developing countries with a shared history of colonial rule.</p>
<p>It began, in many ways, 150 years ago, with the development of wire services — the news wholesalers that send correspondents around the world to deliver stories via wire feed to subscribers. Each service chronicled news in its home country’s <a href="https://muse.jhu.edu/article/572612/summary?casa_token=rEmZRuHG1jEAAAAA:LWjGZRDQ1XSwxohbb2kctqngVGHfu_SCpnO_wzQDFMt5y6GVfRtN9mlXUa8yNiNrUQtV1CKOh90">respective colonies or spheres of influence</a>, so Britain’s Reuters would file stories from Bombay and Cape Town, for example, and France’s Havas from Algiers. </p>
<p>The <a href="https://www.ap.org/about/our-story/">Associated Press</a>, based in the U.S., became a force in the global news business in the early 20th century.</p>
<p>These companies cornered the global market for news production, generating most of the content that people worldwide read in the international section of any newspaper. This meant, for example, that a Bolivian reading about events in neighboring Peru would typically receive the news from a U.S. or French correspondent.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Man in sunglasses and hat sits on a camel" src="https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=939&fit=crop&dpr=1 600w, https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=939&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=939&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1180&fit=crop&dpr=1 754w, https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1180&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/385187/original/file-20210219-18-1yhm4ik.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1180&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Foreign correspondents on a sightseeing tour in Egypt in 1953.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com.mx/detail/fotografía-de-noticias/fleet-street-journalist-hannen-swaffer-on-a-fotografía-de-noticias/3420279?adppopup=true">Ronald Startup/Picture Post/Hulton Archive/Getty Images</a></span>
</figcaption>
</figure>
<p>The news monopolies of former colonial powers continued into the 20th century. Some Latin American countries, such as Argentina and Mexico, developed their own strong newspapers that reported on local and national events, but they could not afford to send many correspondents abroad. </p>
<p>In the 1970s, North Atlantic wire services still provided as much as 75% of international news printed and broadcast in Latin America, according to <a href="https://www.cambridge.org/core/journals/journal-of-global-history/article/abs/emancipation-of-media-latin-american-advocacy-for-a-new-international-information-order-in-the-1970s/6A12A3D8F1680E00E0709AF2CAB0191E">my research</a>. </p>
<h2>Cold War problems</h2>
<p>Separately, many world leaders outside of the U.S. and Europe also worried that those foreign powers would intervene in their countries’ domestic affairs by covertly manipulating local media. </p>
<p>That happened during the Cold War. In the lead-up to a 1954 <a href="https://www.sup.org/books/title/?id=10654">CIA-supported coup</a> in Guatemala, the agency secretly used the Guatemalan radio waves and <a href="https://history.state.gov/historicaldocuments/frus1952-54Guat/d287">planted local news stories</a> to convince the Guatemalan military and public that the overthrow of their democratically elected president was inevitable.</p>
<p>After Guatemala, in the late 1950s and early 1960s, many leaders in the “third world” – countries that aligned with neither the U.S. nor the Soviet Union – began creating news and radio services of their own.</p>
<p>Cuban leader Fidel Castro established a state-run international news service, Prensa Latina, to allow Latin Americans “<a href="https://www.mitpressjournals.org/doi/full/10.1162/jcws_a_00895?mobileUi=0">to know the truth and not be victims of lies</a>.” He also created Radio Havana Cuba, which <a href="https://uncpress.org/book/9780807849231/radio-free-dixie/">broadcast</a> revolutionary programming across the Americas, including in the U.S. South. These were government agencies, not independent news organizations.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Black-and-white image of a young woman in fatigues reading the paper" src="https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=397&fit=crop&dpr=1 600w, https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=397&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=397&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=499&fit=crop&dpr=1 754w, https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=499&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/385177/original/file-20210218-17-dgj4lf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=499&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A Cuban soldier reads the government-run newspaper Granma.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com.mx/detail/fotografía-de-noticias/jeune-femme-soldat-lisant-le-journal-granma-fotografía-de-noticias/955427456?adppopup=true">Lily Franey/Gamma-Rapho via Getty Images</a></span>
</figcaption>
</figure>
<p>Global South leaders also wanted to shape the international portrayal of their countries. North Atlantic news services often depicted the third world as backward and chaotic, justifying the need for outside <a href="https://www.google.com/books/edition/Coups_and_Earthquakes/YCcIAAAAIAAJ?hl=en&gbpv=1&bsq=coups+and+earthquakes+journalism&dq=coups+and+earthquakes+journalism&printsec=frontcover">intervention</a>. </p>
<p>This tendency was so common that it earned the moniker “<a href="https://www.globalissues.org/article/420/a-bad-press">coups and earthquakes</a>” journalism.</p>
<h2>Taking control</h2>
<p>Global South leaders also lacked full access to communications technology, especially satellites, which were controlled by the U.S. and Soviet-dominated organizations.</p>
<p>In the 1970s, Global South leaders took their concerns about information inequities to <a href="https://www.cambridge.org/core/journals/journal-of-global-history/article/abs/emancipation-of-media-latin-american-advocacy-for-a-new-international-information-order-in-the-1970s/6A12A3D8F1680E00E0709AF2CAB0191E">UNESCO</a>, lobbying for binding United Nations regulations that would prohibit direct foreign broadcasts by satellite. It was a quixotic quest to persuade dominant powers to relinquish their control over communications technology, and they didn’t get far. </p>
<p>But those decades-old proposals recognized the imbalances in global information that remain in place today. </p>
<p>In recent decades, other countries have created their own news networks with the express aim of challenging biased representations of their regions. </p>
<p>One result is <a href="https://archive.org/details/aljazeerainsides00mile/page/346/mode/2up">Al Jazeera, created</a> in 1996 by the Qatari emir to challenge U.S. and British depictions of the Middle East.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two journalists sit in a newsroom with old desktop computers and a world map on the wall" src="https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/385179/original/file-20210218-24-jxhwzj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Doha, Qatar offices of Al-Jazeera, Oct. 10, 2001.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com.mx/detail/fotografía-de-noticias/al-jazeera-staff-work-at-the-satellite-channels-fotografía-de-noticias/51391252?adppopup=true">Joseph Barrak/AFP via Getty Images</a></span>
</figcaption>
</figure>
<p>Another is TeleSur, founded by Venezuela in partnership with other Latin American nations in 2005, which aims to counterbalance U.S. influence in the region. It was created after the 2002 coup attempt against Venezuelan president Hugo Chávez, which was supported by the <a href="https://www.theguardian.com/world/2002/apr/21/usa.venezuela">U.S. government</a> and <a href="https://mondediplo.com/2002/08/10venezuela">powerful Venezuelan broadcasters</a>.</p>
<p>[<em>Get the best of The Conversation, every weekend.</em> <a href="https://theconversation.com/us/newsletters/weekly-highlights-61?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=weeklybest">Sign up for our weekly newsletter</a>.]</p>
<h2>Why media matters</h2>
<p>State-sponsored media outlets have faced accusations – some well-founded – of <a href="https://newrepublic.com/article/143410/al-jazeera-do">coverage biased</a> in favor of their government sponsors. But their existence nonetheless underscores that it matters where media is produced, and by whom. </p>
<p>Research suggests this concern extends to social media. Facebook and Google, for example, produce algorithms and policies that reflect the ideas of their creators — who are primarily <a href="https://nyupress.org/9781479837243/algorithms-of-oppression/">white, male</a> and based in Silicon Valley, California. </p>
<p>One study found that this <a href="https://nyupress.org/9781479837243/algorithms-of-oppression/">can result in racist</a> or <a href="https://www.technologyreview.com/2021/01/29/1017065/ai-image-generation-is-racist-sexist/">sexist search engine</a> results. A 2016 <a href="https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race">ProPublica investigation</a> also discovered that Facebook allowed advertisers for housing to target users based on race, violating the Fair Housing Act of 1968.</p>
<p>All of this raises doubts about whether Facebook, or any international company, can make rules regulating speech that are equally appropriate in every country they operate in. Deep knowledge of national politics and culture is necessary to understand <a href="https://www.cigionline.org/articles/dangerous-inconsistencies-digital-platform-policies">which accounts are dangerous enough to suspend, for example, and what comprises misinformation</a>. </p>
<p>Facing such criticism, in 2020 Facebook <a href="https://www.newyorker.com/tech/annals-of-technology/inside-the-making-of-facebooks-supreme-court">assembled an independent oversight board</a>, colloquially referred to as its Supreme Court. Comprising media and legal experts from all over the world, the board has a truly diverse membership. But its mandate is to uphold a “constitution” designed by the American company by evaluating a handful of appeals to Facebook’s content removal decisions.</p>
<p>Facebook’s current fight with Australia suggests that equitable control of international news remains very much a work in progress.</p>
<p><em>Editor’s note: This story has been updated to more accurately characterize the U.S. social media companies that operate globally and the nature of Cuba’s government news services. It is published by The Conversation U.S., an independent media nonprofit, one of eight news organizations around the world that share a common mission, brand and publishing platform. <a href="https://theconversation.com/the-conversations-submission-to-the-australian-senate-inquiry-into-the-news-media-bargaining-code-153532">The Conversation Australia</a> has publicly lobbied in support of the Australian government’s proposal.</em></p><img src="https://counter.theconversation.com/content/154253/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Vanessa Freije does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The battle between media companies and foreign governments over who controls the news dates back some 150 years, to when European and US wire services dictated the world’s headlines.Vanessa Freije, Assistant Professor, Henry M. Jackson School of International Studies, University of WashingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1431182020-07-21T17:50:51Z2020-07-21T17:50:51ZPrivacy, perceptions and effectiveness: the challenges of developing coronavirus contact-tracing apps<figure><img src="https://images.theconversation.com/files/348649/original/file-20200721-29-1xh3gv2.jpg?ixlib=rb-1.1.0&rect=0%2C12%2C4240%2C2811&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Are you ready to start sharing your personal information with an app developed by Google and Apple?</span> <span class="attribution"><a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>To control the spread of the Covid-19 pandemic, more than <a href="https://www.technologyreview.com/2020/05/07/1000961/launching-mittr-covid-tracing-tracker/">50 countries</a> have implemented applications to trace the contacts of people who may be infected. </p>
<p>The installation and use of these applications are voluntary in the majority of countries, but in others they’re mandatory. <a href="https://futurism.com/contact-tracing-apps-china-coronavirus">China is a notable example</a>, but use of the application is also required in countries such as India, Indonesia and Vietnam. In Turkey, those who have been infected with the virus are <a href="https://www.hurriyetdailynews.com/virus-case-tracking-app-launched-in-turkey-154005">required to download the application</a>, which shares information with security forces. </p>
<p>Even in fully democratic countries that promise that users’ data will be kept private – for example in <a href="https://www.dutchnews.nl/news/2020/06/dutch-coronavirus-tracing-app-will-undergo-trials-in-twente/">the Netherlands</a> and <a href="https://www.rfi.fr/en/science-and-technology/20200527-france-coronavirus-stop-covid-mobile-phone-app-technology-controversy-law-vote-privacy-concerns-surveillance-human-rights">France</a> – there are concerns that the application could be used as a surveillance tool.</p>
<h2>Identity and privacy</h2>
<p>Beyond the cultural and political differences between countries, two main points are at stake when it comes to privacy: </p>
<ul>
<li><p><strong>Users’ identities</strong>: Most countries have implemented an anonymization or pseudonymization approach, which can be achieved via Bluetooth connections. The few countries that opted for an approach that does not respect privacy, such as Kuwait, have implemented a geolocation app. </p></li>
<li><p><strong>Data structure and storage</strong>: All applications require a data architecture – the internal structure for recording, storing and processing information – and developers must choose between a centralized or decentralized approach. With a centralized architecture, data is uploaded to a server controlled by the government health authority rather than stored locally on users’ devices.</p></li>
</ul>
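The pseudonymization approach can be illustrated with a short sketch, loosely inspired by the decentralized DP-3T design: a device keeps a daily secret key, ratchets it forward by hashing, and broadcasts short-lived pseudonyms derived from that key over Bluetooth. The names and parameters below (`next_day_key`, `ephemeral_ids`, 96 pseudonyms per day) are illustrative assumptions, not the actual protocol specification.

```python
import hashlib
import hmac
import os

def next_day_key(day_key: bytes) -> bytes:
    """Derive tomorrow's secret key by hashing today's (a one-way ratchet).

    Illustrative assumption: a compromised key reveals only future
    pseudonyms, never past ones, because hashing cannot be reversed."""
    return hashlib.sha256(day_key).digest()

def ephemeral_ids(day_key: bytes, per_day: int = 96) -> list[bytes]:
    """Generate the rotating pseudonyms a device would broadcast.

    Each ID is an HMAC of a counter under the daily key, so an observer
    cannot link two broadcasts to the same device without the key."""
    return [
        hmac.new(day_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(per_day)
    ]

# A device starts from a random secret and ratchets it once per day.
key = os.urandom(32)
today_ids = ephemeral_ids(key)
key = next_day_key(key)
```

Under this kind of design, a diagnosed user uploads only their past daily keys; other phones recompute the pseudonyms locally and check them against what they overheard, so the matching happens on the device rather than on a government server.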
<p>France’s StopCovid application uses a centralized architecture, while Germany ultimately adopted a decentralized approach developed by <a href="https://www.apple.com/covid19/contacttracing">Google and Apple</a>, inspired by the technique developed by the European consortium <a href="https://github.com/DP-3T/documents">Decentralized Privacy-Preserving Proximity Tracing</a> (DP-3T). The United Kingdom adopted a centralized approach but, facing increased criticism about the risk to privacy raised by its centralized app, <a href="https://www.bbc.com/news/technology-53095336">switched to the Google-Apple technique</a>. Countries such as Japan and Italy made the same choice. </p>
<p>A second question relates to Bluetooth-based applications’ effectiveness in fulfilling the objective pursued. For example, depending on environmental factors, a device could estimate that another is <a href="https://theintercept.com/2020/05/05/coronavirus-bluetooth-contact-tracing/">20 meters away… or 2</a>. Because accurately estimating physical proximity and contact time are essential, any additional level of uncertainty can greatly diminish such applications’ effectiveness. </p>
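The uncertainty comes from the physics of signal attenuation. A common textbook approach, the log-distance path-loss model, converts received signal strength (RSSI) into an estimated distance. The sketch below is a minimal illustration of that model, not the method used by any particular app; the function name, the default transmit power and the path-loss exponent are all assumptions for the example.

```python
def estimate_distance(rssi_dbm: float,
                      tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in metres from received signal strength.

    Log-distance path-loss model: RSSI = TxPower - 10 * n * log10(d),
    solved for d. TxPower is the RSSI expected at 1 metre; n depends
    heavily on the environment (walls, bodies, pockets, bags).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same reading yields wildly different estimates depending on n:
reading = -79.0
open_air = estimate_distance(reading, path_loss_exponent=2.0)   # 10.0 m
cluttered = estimate_distance(reading, path_loss_exponent=4.0)  # ~3.2 m
```

Because the exponent cannot be known for each encounter, two phones that were actually two metres apart can plausibly be scored as ten metres apart, or vice versa, which is exactly the ambiguity that undermines proximity scoring.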
<h2>Perceptions matter</h2>
<p>This is important not only in terms of functionality, but also because it has an impact on users’ perceptions. In countries where the applications are voluntary, the fewer people who believe an application is effective, the lower its adoption will be and, in turn, the lower its utility. If potential users’ perceptions are more positive, they’re more likely to install and use the application, and the more useful it will become for others. (This is known as the principle of <a href="https://en.wikipedia.org/wiki/Network_effect">network externalities</a>.) Like the telephone, even the best application cannot be useful and efficient if it is used by only one person. Instead, usefulness depends on the incentives and benefits that users can expect. Beyond the public health objective, users could have access to analytics or relevant information to protect themselves from the virus.</p>
<p>A third point is technical performance. Several apps were criticized for bugs, low performance or poor compatibility with iPhones, as with <a href="https://7news.com.au/lifestyle/health-wellbeing/virus-app-problems-identified-with-android-c-1019598">Australia’s COVIDSafe</a>. These and other issues were reviewed by the independent <a href="https://www.adalovelaceinstitute.org/our-work/covid-19/">Ada Lovelace Institute</a> in the United Kingdom.</p>
<p>Such challenges are well known to managers of corporate information systems. On one hand, they are the guarantors of secure access and consequently implement centralized information technologies. On the other, employees increasingly use their personal phones and applications that originated in the consumer market, such as WhatsApp, and the massive shift to remote work during the epidemic has amplified this trend.</p>
<p>IT managers want to maintain control over the devices and applications used for work, yet employees consider consumer-focused applications to be more efficient, better performing and more enjoyable than corporate technologies. As with contact-tracing applications, the main issue here is how to implement governance that takes individuals’ voices and concerns into account. Responding to employees’ needs and expectations, consumer-based technology now also provides availability across borders, and thus facilitates professional and personal mobility.</p>
<h2>GAFAM to the rescue?</h2>
<p>Control is indeed the main issue at stake in the current choices for coronavirus contact tracing applications. Regardless of the country, there are governments and health agencies that want to control or even centralize data. Thus, the majority of them have opted for centralized systems developed at the national level, in a similar way to corporate information systems. As noted, several applications have been criticized for this choice as well as for technical problems. </p>
<p>It is interesting that Google and Apple have proposed an application that is perceived as being more reliable and providing increased security and privacy through anonymization and decentralized architecture. Given the dominance and control that these GAFAM firms exert, one might expect them to be less concerned about users’ privacy than governments. Is this because Apple and Google are more virtuous than public health decision-makers? </p>
<p>In fact, the answer is more pragmatic: Apple and Google are talking to consumers, and therefore take their interests and needs into account – at least superficially. They know that the success of their applications relies on user adoption, and that this adoption would be compromised if the applications they developed competed with those implemented by governments. Thus, they offer access to their <a href="https://en.wikipedia.org/wiki/Application_programming_interface">application programming interfaces</a> (APIs) to each country that wishes to implement the application. This allows countries to configure them as they wish, particularly concerning privacy issues. </p>
<h2>Learning to listen to citizen-consumers</h2>
<p>Policymakers seem to have forgotten that they too must address their citizen-consumers – it’s not just about the centralized control of health data. Citizens’ fear of tracking by authorities or company managers may even be greater than the fear of tracking by Google and Apple for marketing purposes. Therefore, policymakers should reinvent the governance of apps for health and, more broadly, for social purposes. </p>
<p>It should be noted that when coronavirus contact-tracing apps are developed, it is generally experts in data privacy or representatives of various public bodies who are consulted. To my knowledge, only Switzerland conducted an opinion poll asking citizens what their attitudes were toward such an application, and <a href="https://www.swissinfo.ch/eng/covid-19_poll--70--of-residents-back--swisscovid--tracing-app/45783230">70% of respondents backed the application</a>.</p>
<p>Efforts to educate and learn from citizens and residents about the issues of such apps should be encouraged and used for decision-making. It’s also essential for policymakers to incorporate and emphasize customer experience in employee and citizen experiences. This is not only a matter of adoption but also, more generally, of establishing trust in decision-makers, be they in firms or the government.</p><img src="https://counter.theconversation.com/content/143118/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Roxana Ologeanu-Taddei has received funding from the following organizations: Fondation of Social Sciences (Fondation de France), Pikcio Services company, University of Montpellier, Métropole de Montpellier, Occitanie Region, French Ministry of Health. She is a member of the think tank Renaissance numérique.</span></em></p>In response to the Covid-19 epidemic, more than 50 countries have developed tracing applications to help alert citizens and authorities when outbreaks occur. But the process is anything but simple.Roxana Ologeanu-Taddei, Associate professor in management of information systems, Université de MontpellierLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1274492020-07-16T16:01:54Z2020-07-16T16:01:54ZSouth Africa would gain from co-operation among BRICS countries on beneficiation<figure><img src="https://images.theconversation.com/files/343480/original/file-20200623-188891-19556wo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Russian soldiers march during a Victory Day parade. The country makes guns and armoury as its main beneficiation output.</span> <span class="attribution"><span class="source">Photo by Dmitry Korotayev/Epsilon/Getty Images</span></span></figcaption></figure><p>South Africa needs a sounder mineral beneficiation policy to tackle the challenges the country faces, particularly rising unemployment. Its beneficiation policy is not very demanding and isn’t pursued with any vigour.</p>
<p>For example, South Africa should be pursuing trade cooperation with Brazil, Russia, India and China. Together with South Africa they make up the BRICS grouping, which was formed in 2010 with the purpose of pursuing economic development. </p>
<p>China and India are resource-rich economies. But they still need additional raw materials to supplement their production amid a faster industrialisation agenda. </p>
<p>In addition, the BRICS have a rich history of beneficiation. Beneficiation activities in China and Russia have had a positive bearing on the economy and industrialisation. Both countries have developed strong <a href="https://books.google.co.za/books?id=Jk8CXr5GSrIC&pg=PA228&dq=China%E2%80%99s+Military+Strategy.++2017.++Beijing+Review+(23),+4+June+2015&hl">military and ammunition capabilities</a>. For its part <a href="https://books.google.co.za/books?id=C5OHAwAAQBAJ&pg=PA49&dq=IT+in+india&hl=en&sa=X&ved=2ahUKEwjFp8Xy1qDqAhUTilwKHSWOCScQ6AEwAXoECAYQAg#v=onepage&q=IT%20in%20india&f=false">India has focused on information technology</a> and <a href="https://books.google.co.za/books?id=YMvWtOFb5rYC&pg=PA38&dq=brazil+transport+industry&hl=en&sa=X&ved=2ahUKEwij2fvr2aDqAhVTSsAKHc9KBScQ6wEwBHoECAMQAQ#v=onepage&q=brazil%20transport%20industry&f=false">Brazil on the transport industry</a>.</p>
<p>South Africa would benefit greatly from exchanging knowledge and skills with the other BRICS countries. Engagements that seek to do this could enhance cooperation among member countries. They could result in South Africa benefiting from what has worked – and what hasn’t worked – elsewhere. </p>
<p>There has been <a href="http://www.eurare.eu/docs/eres2014/firstSession/XiaoshengYang.pdf">research</a> on how China (a BRICS member state) has used beneficiation to drive economic development. But little has been done on how the BRICS member countries can collectively drive beneficiation and how it could benefit South Africa.</p>
<p>The BRICS countries have never jointly agreed to a beneficiation policy. But work has been done on how <a href="https://www.brandsouthafrica.com/investments-immigration/business/trends/global/mining-270313">BRICS collaboration</a> could be improved. These opportunities have included skills transfer, information sharing and investment in both downstream and upstream beneficiation activities.</p>
<p><a href="http://hdl.handle.net/10321/3286">For my previous research</a> I explored the effects of the BRICS partnership on mineral beneficiation in South Africa by investigating a partnership approach in beneficiation cooperation and commodity trade. </p>
<p>The aim of my study was to explore the effects of the BRICS partnership on mineral beneficiation in South Africa. I concluded by recommending a model which called for gradual beneficiation and experiments in South Africa with the support of the BRICS.</p>
<h2>Findings</h2>
<p>South Africa leads in the mining of platinum group metals (PGMs) and gold. In addition, the production of ferrous metals such as manganese and chrome is of world-class standard. The country’s iron ore exports have also been reported to be growing. </p>
<p>South Africa can boast 90% of the world’s platinum metal production, 80% of its manganese, 73% of its chrome, 45% of its vanadium and 41% of the gold extracted on earth. I conducted a survey among mining companies as part of my research. Around 80% were based in Gauteng (Johannesburg and Pretoria). Just under 60% of the companies involved were in the business of mining strategic minerals such as coal, diamonds, gold and platinum. </p>
<p>Skills transfer emerged as a major theme, with 90% of the participants stating that skills training was needed for downstream beneficiation. This pointed to the need for skills transfer among the BRICS. Examples included cutting and polishing of minerals and making craft jewellery. </p>
<p>It was also established that BRICS activities could be improved by collaborative synergies, financial resources provision and a favourable fiscal policy. </p>
<p>The study called for a conversation among stakeholders on the current mining charter and the national mineral beneficiation policy of South Africa. As BRICS countries are interested in commodity trade and mining activities, South Africa could tap into the knowledge of counterparts on beneficiation issues where success stories are visible. <a href="https://books.google.co.za/books?id=uy8iaN7J9vYC&pg=PT9&lpg=PT9&dq=Jones,+S.+2012.+BRICS+and+beyond:+executives+lessons+on+emerging+markets.+West+Sussex:+John+Wiley+%26+Sons">A diversification strategy</a> is increasingly being adopted by the biggest South American conglomerates.</p>
<p>My research found that South Africa was outperformed by its counterparts within the partnership on economic grounds. However, it scored higher on democratic institutions and performance of the banking sector. </p>
<h2>Next steps</h2>
<p>According to one interview participant, the World Trade Organisation is not in favour of a country forcing local beneficiation onto the market. This points to the need for South Africa to take a gradual approach.</p>
<p>It first needs to ensure steady economic growth, identify regional market opportunities and lobby its BRICS partners to improve its chances of winning support for a national beneficiation policy from the WTO and other stakeholders.</p>
<p>Other hurdles that need to be cleared include the supply of water and power. </p>
<p>BRICS countries should encourage synergies among themselves towards more responsible beneficiation practice. These synergies can be exerted as a form of “aggregate power” to change and transform the global economy.</p><img src="https://counter.theconversation.com/content/127449/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Byelongo Elisee Isheloke receives funding from NRF for his current Postdoctoral Research Fellowship at the University of Cape Town. The article is based on his PhD thesis, completed at the Durban University of Technology in September 2018, for which he received a DUT Scheme Scholarship for 2.5 years and an NRF Scholarship for the final year. In South Africa he has no political affiliation. </span></em></p>South Africa would benefit greatly from the rich beneficiation experience of Brazil, Russia, India and China.Byelongo Elisée Isheloke, Postdoctoral Research Fellow, University of Cape TownLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1382552020-06-07T11:20:23Z2020-06-07T11:20:23ZWith the increase in remote work, businesses need to protect themselves against cyberattacks<figure><img src="https://images.theconversation.com/files/339600/original/file-20200603-130961-oie7bf.jpg?ixlib=rb-1.1.0&rect=56%2C0%2C6240%2C4156&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">As employees work from home, companies are becoming more vulnerable to cyber attacks.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>The COVID-19 pandemic and subsequent lockdown have forever changed how we socialize and conduct business. More and more, our personal and professional lives will be online. </p>
<p>Paradoxically, our office towers sit empty. However, the amount of traffic in the virtual world continues to increase exponentially. Our physical borders are closed, but the virtual ones remain wide open, and relatively undefended. Cybercriminals — callous opportunists of the worst kind — take advantage of crises to engage in even more attempts to penetrate computer networks and extract data.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/coronavirus-pandemic-has-unleashed-a-wave-of-cyber-attacks-heres-how-to-protect-yourself-135057">Coronavirus pandemic has unleashed a wave of cyber attacks – here's how to protect yourself</a>
</strong>
</em>
</p>
<hr>
<p>Phishing, <a href="https://www.csoonline.com/article/3411439/smishing-and-vishing-how-these-cyber-attacks-work-and-how-to-prevent-them.html">smishing (SMS phishing) and vishing (voice phishing)</a> attacks are <a href="https://www.pcmag.com/news/phishing-attacks-increase-350-percent-amid-covid-19-quarantine">all on the rise</a>. Our tendency to click on infected emails has <a href="https://www.cbc.ca/news/technology/phishing-messages-surge-coronavirus-1.5513315">increased with the corresponding increase in email traffic</a> — a two-fold impact on the severity of the threat environment.</p>
<h2>New work spaces</h2>
<p>In the past, knowledge workers might have been centralized in one or a few locations, with controlled access to information. Now they are dispersed across thousands of sites that the enterprise has no control over. Face-to-face communications are taking place on open, web-based platforms like <a href="https://zoom.us/">Zoom</a>, <a href="https://help.blackboard.com/Collaborate">bbCollaborate</a>, <a href="https://www.bluejeans.com/">BlueJeans</a>, <a href="https://www.gotomeeting.com/en-ca">GoToMeeting</a>, <a href="https://apps.google.com/meet/">Google Meet</a> and many others, all vying for market share in an attempt to become the industry standard.</p>
<p>Concurrently, managers in organizations are dealing with unforeseen reductions in business volumes and making the difficult decisions of laying off employees and shutting down plants and stores, while somehow maintaining some presence and level of customer service in the hope of recovering losses once the pandemic response restrictions are eased.</p>
<p>The challenges for enterprises of all kinds, then, are many: How can they maintain service levels while managing cuts and workarounds?</p>
<p>How do they provide employees with the equipment, tools, resources and information to work from home?</p>
<p>How do they balance restrictions from the lockdown against recovery when it lifts?</p>
<p>How do they support employees and protect them from burnout, exhaustion and other mental health issues? This is especially true for administrative front-line workers like those in information technology (IT) who are now responsible for maintaining secure, fully operational and accessible virtual work environments.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=337&fit=crop&dpr=1 600w, https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=337&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=337&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/339597/original/file-20200603-130940-ye7nqu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Staff who work in information technology for businesses are dealing with extra demands on their time and expertise.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<h2>Adapting for cyber-resiliency</h2>
<p>The “start, stop, continue” approach offers a powerful structure to frame possible answers to the questions and dilemmas surrounding cybersecurity. Here, I offer three things to start, two to stop, and three to continue to ensure strong cyber-resilience is retained.</p>
<p><strong>START:</strong> The most important thing to start is to monitor internal and external security threats and incidents. A few months ago, most of us had not even heard of Zoom, much less used it on a daily basis for both work and social gatherings. Most of us were not used to working from home, accessing work files remotely, uploading and downloading gigabytes of data. Most of us did not have more than rudimentary security on our home routers and networks. Most of us only had a passing knowledge of the IT support staff at work (usually called in a panic). </p>
<p>For managers and executives, this means daily reports on security incidents, their sources (internal or external), their nature and whether new types of attacks and attackers have been observed.</p>
<p>Enterprises also need to start asking themselves about the impact this new work environment has had on customers, employees, suppliers and other stakeholders. Executives should monitor what is being adjusted, and how. For example, to what extent are access permissions (to databases, files, systems and information) being increased? Concurrently, to what extent are insider monitoring programs being deployed to ensure employees do not inadvertently, or deliberately leak confidential or proprietary information?</p>
<p>Finally, the time has come to start enhanced online security protocols and tools, like multi-factor authentication, <a href="https://www.lastpass.com/state-of-the-password/global-password-security-report-2019">which only 57 per cent of enterprises are using</a>.</p>
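<p>For readers curious about what one of these tools actually computes, the six-digit codes behind many multi-factor logins follow the time-based one-time password standard (RFC 6238). The sketch below is a minimal, illustrative Python implementation, not production code; the base32 secret in the example is the standard test value from the RFC, not a real credential.</p>

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    # Decode the shared secret (base32, as in most authenticator apps).
    key = base64.b32decode(secret_b32.upper())
    # Count how many 30-second intervals have elapsed since the Unix epoch.
    counter = int((time.time() if now is None else now) // interval)
    # HMAC-SHA1 over the big-endian counter, per RFC 4226/6238.
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: take 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: at Unix time 59 the 6-digit code is "287082".
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))
```

<p>Because client and server derive the same code independently from a shared secret and the clock, an intercepted code is useless within about a minute — which is what makes this second factor cheap yet effective.</p>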
<p><strong>STOP:</strong> In dealing with the new, distributed and virtual operating environment, organizations should first immediately stop or suspend any non-critical IT projects: this is not the time to continue with replacement of administrative systems, access systems, enterprise networking enhancements, application development or any other project aimed at changing or enhancing business processes. </p>
<p>There are two reasons for this. First, IT staff burnout increases exponentially <a href="https://www.techrepublic.com/article/employee-burnout-on-the-rise-since-covid-19/">in the current situation</a>. They are dealing with a deluge of requests to configure home systems, manage access, provide ad hoc and formal training and deal with emergency shutdowns, not to mention an increased risk of breaches. They are not only at <a href="https://www.cnbc.com/2020/03/16/biggest-mistake-employees-make-at-work-is-even-harder-to-avoid-in-coronavirus-age-says-ceo.html">risk of burning out</a>, but of making critical errors if they are also asked to continue non-essential development work. </p>
<p>The second reason is that hackers and other criminals will deliberately target organizations that are attempting to juggle remote staff support and IT development, perceiving these organizations to be weak, unfocused and inattentive. </p>
<p>The second thing to stop is the spread of shadow IT: <a href="https://www.cisco.com/c/en/us/products/security/what-is-shadow-it.html">information systems or applications that individuals or departments use without the knowledge or support of IT staff in the organization</a>. For example, a marketing manager may prefer to use privately sourced customer relationship management software that they find more accessible and modifiable, without the need to submit change requests to an IT department. The problem with shadow IT is that it has not been vetted for potential security vulnerabilities. In the event of a breach, system administrators may not be notified, or may be unable to contain the breach if it emerges from a shadow system. </p>
<p><strong>CONTINUE:</strong> Most organizations have well-developed crisis response plans as part of their enterprise risk frameworks. These documents need to be updated to reflect the new circumstances. Organizations need to contact their insurance providers — including for cyber-insurance — and third-party support providers to alert them to their new operating environment. Like the enterprises they serve, these insurers and providers are also trying to cope and may be temporarily overburdened. Finally, organizations must continue to rehearse and update these plans.</p>
<p>Executives need to continue monitoring resources in their organizations and, where necessary, rapidly adjust budgets, staffing levels and other resources, allocating them to the areas that most need them. This might mean re-allocating IT development budgets and staff to cybersecurity, or shifting plant and office maintenance resources to supporting remote work environments.</p>
<p>Finally, executives need to ensure that succession plans for key staff are current. This is especially true for IT and cybersecurity personnel.</p>
<h2>Preparing for the unknown</h2>
<p>COVID-19 will prove to be a generational event with long-lasting and as yet unknown effects on society. By critically considering and discussing what to Start, Stop, or Continue with regard to cyber-resilience, businesses and their employees will be in a better position to anticipate, mitigate and flourish in current conditions and beyond.</p><img src="https://counter.theconversation.com/content/138255/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Michael Parent does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Businesses need to develop their cyber-resiliency by examining their business practices, and stopping, continuing or starting cybersecurity measures.Michael Parent, Professor of Management Information Systems / Fellow - David and Sharon Johnston Centre for Corporate Governance, Rotman School of Management, University of Toronto, Simon Fraser UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1366542020-05-15T12:13:16Z2020-05-15T12:13:16ZThe lack of women in cybersecurity leaves the online world at greater risk<figure><img src="https://images.theconversation.com/files/334516/original/file-20200512-82379-8phx6n.jpg?ixlib=rb-1.1.0&rect=494%2C213%2C4974%2C3549&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Women bring a much-needed change in perspective to cybersecurity.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/rear-view-of-female-computer-hacker-coding-at-desk-royalty-free-image/1159379067?adppopup=true">Maskot/Maskot via Getty Images</a></span></figcaption></figure><p>Women are highly underrepresented in the field of cybersecurity. In 2017, women’s share in the U.S. cybersecurity field was <a href="https://www.pwc.com/us/en/services/consulting/cybersecurity/women-in-cybersecurity.html">14%, compared to 48% in the general workforce</a>. </p>
<p>The problem is more acute outside the U.S. In 2018, <a href="https://www.nature.com/articles/d41586-018-03327-w">women accounted for</a> 10% of the cybersecurity workforce in the Asia-Pacific region, 9% in Africa, 8% in Latin America, 7% in Europe and 5% in the Middle East. </p>
<p>Women are even less well represented in the upper echelons of security leadership. Only <a href="https://www.fifthdomain.com/workforce/2019/01/18/how-more-women-on-cybersecurity-teams-can-create-advantages/">1% of female internet security workers</a> are in senior management positions.</p>
<p><a href="https://scholar.google.com/citations?user=Qx3YMi4AAAAJ&hl=en&oi=ao">I study</a> <a href="https://www.springer.com/gp/book/9783642115219">online crime</a> and <a href="https://theconversation.com/blockchain-voting-is-vulnerable-to-hackers-software-glitches-and-bad-id-photos-among-other-problems-122521">security</a> issues facing <a href="https://ieeexplore.ieee.org/document/9034675">consumers</a>, <a href="https://ieeexplore.ieee.org/abstract/document/8666661">organizations</a> and <a href="https://www.springer.com/gp/book/9783319405537">nations</a>. In my research, I have found that internet security requires <a href="https://link.springer.com/book/10.1057/9781137021946">strategies beyond technical solutions</a>. Women’s representation is important because women tend to offer viewpoints and perspectives that are different from men’s, and these underrepresented perspectives are critical in addressing cyber risks. </p>
<h2>Perception, awareness and bias</h2>
<p>The low representation of women in internet security is linked to the broader problem of their low representation in the science, technology, engineering and mathematics fields. Only <a href="https://www.nsf.gov/news/news_summ.jsp?cntn_id=190924&WT.mc_id=USNSF_51&WT.mc_ev=click">30% of scientists and engineers in the U.S.</a> are women.</p>
<p>The societal view is that internet security is <a href="http://genderandset.open.ac.uk/index.php/genderandset/article/view/449">a job that men do</a>, though there is nothing inherent in gender that predisposes men to be more interested in or more adept at cybersecurity. In addition, the industry mistakenly gives potential employees the impression that <a href="https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/women-working-cybersecurity-gender-gap.aspx">only technical skills matter in cybersecurity</a>, which can give women the impression that the field is overly technical or even boring. </p>
<p>Women are also generally not presented with opportunities in information technology fields. In a survey of women pursuing careers outside of IT fields, <a href="https://www.computerweekly.com/news/450420822/PaloAlto-Networks-partners-with-US-Girl-Scouts-on-security-skills">69% indicated that</a> the main reason they didn’t pursue opportunities in IT was because they were unaware of them.</p>
<p>Organizations often fail to try to recruit women to work in cybersecurity. According to a survey conducted by IT security company Tessian, <a href="https://www.helpnetsecurity.com/2020/03/12/cybersecurity-gender-gap/">only about half of the respondents</a> said that their organizations were doing enough to recruit women into cybersecurity roles. </p>
<p>Gender bias in job ads further discourages women from applying. Online cybersecurity job ads <a href="https://www.csoonline.com/article/3490417/gender-diversity-in-cybersecurity-matters-to-the-business.html">often lack gender-neutral language</a>. </p>
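<p>Simple tooling can help here: recruiters can screen ad copy for masculine-coded words before posting. The Python sketch below is purely illustrative; the word list is a small hypothetical sample, not a validated lexicon from the research on gendered wording.</p>

```python
# Hypothetical sample of masculine-coded terms; a real screen would draw on a
# validated lexicon from research on gendered wording in job advertisements.
MASCULINE_CODED = {"aggressive", "competitive", "dominant", "fearless",
                   "ninja", "rockstar"}

def flag_gendered_terms(ad_text):
    # Normalise each word (strip punctuation, lowercase) and report any
    # that appear in the flag list.
    words = {w.strip(".,;:!?()").lower() for w in ad_text.split()}
    return sorted(words & MASCULINE_CODED)

print(flag_gendered_terms(
    "Seeking an aggressive security ninja to join our competitive team"))
```

<p>A flagged ad can then be reworded in neutral terms before it goes live — a low-cost change with a measurable effect on who applies.</p>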
<h2>Good security and good business</h2>
<p>Boosting women’s involvement in information security makes both security and business sense. Female leaders in this area tend to prioritize important areas that males often overlook. This is partly due to their backgrounds. Forty-four percent of women in information security fields <a href="https://www.nature.com/articles/d41586-018-03327-w">have degrees in business and social sciences</a>, compared to 30% of men. </p>
<p>Female internet security professionals put a <a href="https://1c7fab3im83f5gqiow2qqs2k-wpengine.netdna-ssl.com/wp-content/uploads/2019/03/Women-in-the-Information-Security-Profession-GISWS-Subreport.pdf">higher priority on internal training and education</a> in security and risk management. Women are also stronger advocates for online training, which is a flexible, low-cost way of increasing employees’ awareness of security issues. </p>
<p>Female internet security professionals are also <a href="https://1c7fab3im83f5gqiow2qqs2k-wpengine.netdna-ssl.com/wp-content/uploads/2019/03/Women-in-the-Information-Security-Profession-GISWS-Subreport.pdf">adept at selecting partner organizations</a> to develop secure software. Women tend to pay more attention to partner organizations’ qualifications and personnel, and they assess partners’ ability to meet contractual obligations. They also prefer partners that are willing to perform independent security tests. </p>
<p>Increasing women’s participation in cybersecurity is a <a href="https://www.scmagazine.com/home/sc-corporate-news/help-sc-honor-women-and-diversity-in-cybersecurity-with-your-recommendations/">business issue</a> as well as a gender issue. According to an Ernst & Young report, by 2028 women will control <a href="http://www.ey.com/GL/en/Issues/Driving-growth/Growing-Beyond---High-Achievers---Women-make-all-the-difference-in-the-world">75% of discretionary consumer spending worldwide</a>. Security considerations like encryption, fraud detection and biometrics are becoming important in <a href="http://www.itproportal.com/2016/05/11/high-profile-data-breaches-affecting-consumer-trust-in-big-brands/">consumers’ buying decisions</a>. Product designs require a trade-off between cybersecurity and usability. Female cybersecurity professionals can make better-informed decisions about such trade-offs for products that are targeted at female customers.</p>
<h2>Attracting women to cybersecurity</h2>
<p>Attracting more women to cybersecurity requires governments, nonprofit organizations, professional and trade associations and the private sector to work together. Public-private partnership projects could help solve the problem in the long run. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/334520/original/file-20200512-82383-19c1j21.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A computer science teacher, center, helps fifth grade students learn programming.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Girls-Tech-Scores/f038776721b740dcb797dce201f86061/10/0">AP Photo/Elaine Thompson</a></span>
</figcaption>
</figure>
<p>One example is Israel’s <a href="https://www.rashi.org.il/cybergirlz">Shift community</a>, previously known as the CyberGirlz program, which is jointly financed by the country’s Defense Ministry, the Rashi Foundation and Start-Up Nation Central. It identifies high school girls with the aptitude, desire and natural curiosity to learn IT and helps them develop those skills. </p>
<p>The girls participate in hackathons and training programs, and get advice, guidance and support from female mentors, some of them drawn from elite technology units of the country’s military. The participants learn hacking skills, network analysis and the Python programming language. They also practice simulating cyberattacks to find potential vulnerabilities. By 2018, <a href="https://www.jta.org/2018/09/26/israel/new-program-recruiting-israeli-girls-cyber-warfare-high-tech-futures">about 2,000 girls had participated</a> in the CyberGirlz Club and the CyberGirlz Community. </p>
<p>In 2017, cybersecurity firm Palo Alto Networks <a href="https://www.computerweekly.com/news/450420822/PaloAlto-Networks-partners-with-US-Girl-Scouts-on-security-skills">teamed up with the Girl Scouts of the USA</a> to develop cybersecurity badges. The goal is to foster cybersecurity knowledge and develop interest in the profession. The curriculum includes the basics of <a href="https://www.nbcnews.com/tech/tech-news/girl-scouts-fight-cybercrime-new-cybersecurity-badge-n852971">computer networks, cyberattacks and online safety</a>. </p>
<p>Professional associations can also foster interest in cybersecurity and help women develop relevant knowledge. For example, <a href="https://www.wics.es/proyectos/mentoring/">Women in Cybersecurity of Spain</a> has started a mentoring program that supports <a href="https://www.bbva.com/en/female-cybersecurity-experts-take-the-floor-at-bbva/">female cybersecurity professionals early in their careers</a>.</p>
<p>Some industry groups have collaborated with big companies. In 2018, Microsoft India and the Data Security Council of India launched the CyberShikshaa program in order to create <a href="https://cybersecurityventures.com/women-in-cybersecurity/">a pool of skilled female cybersecurity professionals</a>. </p>
<p>Some technology companies have launched programs to foster women’s interest in and confidence to pursue internet security careers. One example is <a href="https://www.forbes.com/sites/georgenehuang/2016/10/04/why-women-in-tech-should-consider-a-career-in-cybersecurity/#59522d033e6f">IBM Security’s Women in Security Excelling program</a>, formed in 2015. </p>
<p>Attracting more women to the cybersecurity field requires a range of efforts. Cybersecurity job ads should be written so that female professionals feel welcome to apply. Recruitment efforts should focus on academic institutions with high female enrollment. Corporations should ensure that female employees see cybersecurity as a good option for internal career changes. And governments should work with the private sector and academic institutions to get young girls interested in cybersecurity. </p>
<p>Increasing women’s participation in cybersecurity is good for women, good for business and good for society.</p>
<p>[<em>Insight, in your inbox each day.</em> <a href="https://theconversation.com/us/newsletters?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=insight">You can get it with The Conversation’s email newsletter</a>.]</p><img src="https://counter.theconversation.com/content/136654/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Nir Kshetri does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Women are underrepresented in technology fields, but especially so in cybersecurity. It’s not just a matter of fairness. Women are better than men at key aspects of keeping the internet safe.Nir Kshetri, Professor of Management, University of North Carolina – GreensboroLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1380682020-05-11T01:11:13Z2020-05-11T01:11:13ZCoronavirus: the first big test of the information age and what it could mean for privacy<figure><img src="https://images.theconversation.com/files/333401/original/file-20200507-49546-rm1oo9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/back-view-female-employee-speak-talk-1689338029">Shutterstock</a></span></figcaption></figure><p>The late Harvard sociologist <a href="https://www.britannica.com/biography/Daniel-Bell">Daniel Bell</a> long ago predicted the coming of the “information society”, which he said would soon replace industrial society. Bell foresaw scientific experts driving government policy, services taking over from manufacturing and computers becoming the main mode of interaction between people. “What counts”, he wrote in his classic <a href="https://www.basicbooks.com/titles/daniel-bell/the-coming-of-post-industrial-society/9780465097135">The Coming of Post-Industrial Society</a> (1973), “is not raw muscle power, or energy, but information”. The new society, about which he was mildly optimistic, would be a reality by about 2020. </p>
<p>The coronavirus crisis has thrown into sharp relief the trends Bell and other information society thinkers identified. In fact, I believe we are undergoing the first real test of the “information society thesis”. Whole populations have now been driven online. Amazon, Tesco and our favourite takeaways are open while most factories and industrial plants are mothballed. Essential services have been protected. Teaching and marking have migrated to cyberspace. Meetings are now “virtual”. Scientists are indeed calling the shots – onscreen via their laptops. </p>
<p>Information technology is enabling people to read, play and communicate – and, now, help the government <a href="https://www.countypress.co.uk/news/18424253.coronavirus-tracking-phone-app-trialled-isle-wight/">trace the path of the infection via an app</a>. The elderly and other vulnerable groups are under electronic as well as human care. My mother, living alone 400 miles away and recovering from the virus, is buoyed up all day long with face-time calls. My wife’s mother has Alexa at her side in her care home. </p>
<p>Meanwhile, 3D printers in private dwellings are churning out protective equipment. Health messages are getting through. The mass media have been joined by social media in a never-ending supply of information and opinion. There is much to cheer about, then, and not just on Thursday evenings.</p>
<p>All of which raises the question: had coronavirus struck before the mass adoption of information and communication technologies, might we be seeing a much less “United” Kingdom? Who knows what bored teenagers – without Netflix, Playstation and Instagram – would be up to? Perhaps something along the lines of <a href="https://www.anthonyburgess.org/a-clockwork-orange/">A Clockwork Orange</a>, Anthony Burgess’s deeply disturbing dystopian nightmare? So in many ways, the great information age test must be judged a success. The future is here, and it seems to work. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/SDobWSgj1Zc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Data and privacy</h2>
<p>However, we must add a cautionary note: Bell warned that the centralisation of data in the information society could prove a serious threat to privacy. <a href="https://www.britannica.com/biography/Theodore-Roszak">Theodore Roszak</a>, the Californian commentator who coined the term “counter-culture”, was much blunter. In <a href="https://www.ucpress.edu/book/9780520085848/the-cult-of-information">The Cult of Information</a> (1994), he wrote: </p>
<blockquote>
<p>Something very big, new, and threatening is permeating our political life. For the snoops, the sneaks, the meddlers, data glut is a feast. It gives them exactly what they require.</p>
</blockquote>
<p>Many contemporary scholars are saying much the same. Information society studies (as I christened the field back in 2000) has given rise to a whole subfield devoted to the issue – namely <a href="https://www.surveillance-studies.net/?page_id=2">surveillance studies</a>. The specialism’s founder, Professor <a href="https://www.sscqueens.org/people/david-lyon">David Lyon</a>, once told me that the surveillance society is just the flip-side of the information society.</p>
<p>But the issue is existential as well as academic. Cameras are everywhere; drones, facial recognition, roadblocks and the snoop-line numbers that some over-zealous police forces advertise have all become normalised. There can be no doubt that the infrastructure for complete <a href="https://fs.blog/2018/03/hannah-arendt-totalitarianism/">totalitarianism</a> is now in place, should power fall into the wrong hands.</p>
<p>I believe that surveillance has already gone much too far. As one reader recently commented on a newspaper website: “George Orwell’s 1984 is a warning, not a bloody instruction manual!” We must of course look after our senior citizens, the main victims of COVID-19, but we must also protect the rights for which their generation fought so bravely.</p>
<p>The most important reference point in all this is no longer 1984, but 9/11. Many of the emergency measures may have been rescinded, but there remains an extremely toxic legacy. Torture was <a href="https://theweek.com/articles/555720/how-george-w-bush-dick-cheney-brought-torture-america">brought back</a> by the George W Bush administration and <a href="https://www.scottishlegal.com/article/d-1">supported</a> by Tony Blair’s government as a “necessary” response to the “unprecedented” danger. Now torture is something the <a href="https://www.psychologytoday.com/us/blog/sex-dawn/201412/did-24-prime-the-pump-torture">“good guys” sometimes do</a>. On this issue, the entire world order is in a worse place than it was.</p>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=904&fit=crop&dpr=1 600w, https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=904&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=904&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1136&fit=crop&dpr=1 754w, https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1136&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/333390/original/file-20200507-49573-12m179j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1136&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Noam Chomsky.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/istanbul-turkey-october-10-reputed-author-125767208">Shutterstock</a></span>
</figcaption>
</figure>
<p>What, then, will be the permanent deposit of lockdown once this new tide goes back in? Greater state surveillance, in all probability. As the <a href="https://www.bbc.co.uk/news/av/technology-52401049/coronavirus-what-is-contact-tracing-and-how-does-it-work">contact-tracing app is rolled out</a>, a UK think tank has already <a href="https://www.bbc.co.uk/news/technology-52401763">proclaimed</a> that an increase in state surveillance is “a price worth paying”.</p>
<p>I do not subscribe to <a href="https://www.sciencedirect.com/topics/computer-science/technological-determinism">technological determinism</a> – which holds that the development of technology determines broad social changes – but <a href="https://www.schneier.com/essays/archives/2010/01/security_and_functio.html">history tends to show</a> that useful technologies end up being deployed far beyond their original functions. The coronavirus tracker apps spreading around the world may well be the proverbial slippery slope. Having meekly observed lockdown, populations are likely to become more submissive to tracking, regimenting and general snooping by the powers that be.</p>
<p>Coronavirus has revealed that much of life continues to function thanks to technology. Society has not fallen apart. Bell and the information society theorists would feel vindicated in their general projections. But we will all need to think harder about the implications for privacy. And to borrow the linguist and political thinker <a href="https://www.britannica.com/biography/Noam-Chomsky">Noam Chomsky’s</a> phrase, it is surely the “<a href="https://chomsky.info/19670223/">responsibility of intellectuals</a>” to lead the way.</p><img src="https://counter.theconversation.com/content/138068/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alistair S. Duff is affiliated with NO2ID Edinburgh.</span></em></p>Technology has made life under coronavirus workable and bearable for a great many. But will it mean further intrusions into our privacy that normally would be unacceptable?Alistair S. Duff, Professor of Information Policy, Edinburgh Napier UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1244302019-10-02T04:27:35Z2019-10-02T04:27:35ZAustralia’s digital competitiveness is slipping. Here’s how we can catch up<figure><img src="https://images.theconversation.com/files/294971/original/file-20191001-173369-3b4f70.jpg?ixlib=rb-1.1.0&rect=14%2C7%2C4648%2C2949&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Communications infrastructure and investing in skills will be vital to success in the 21st century.</span> <span class="attribution"><a class="source" href="https://pixabay.com/photos/smart-city-communication-network-4168483/">Tumisu / Pixabay</a></span></figcaption></figure><p>Australia’s ability to compete with other nations in a technology-enabled world is declining, according to a <a href="https://www.ceda.com.au/News-and-analysis/Media-releases/Australia-digital-competitiveness-slips">report</a> recently released by the Committee for Economic Development of Australia (CEDA).</p>
<p>In 2019 Australia dropped to 14th on the global league table of digital competitiveness, down from 13th last year and ninth in 2015. </p>
<p>The results, from the <a href="https://www.imd.org/wcc/world-competitiveness-center-rankings/world-digital-competitiveness-rankings-2019/">World Digital Competitiveness rankings</a> compiled by the Swiss-based International Institute for Management Development, show that Australia is becoming complacent in areas such as science education, information and communication infrastructure, and digital literacy. </p>
<h2>What is digital competitiveness?</h2>
<p>Digital competitiveness is a standardised measure of a country’s ability to develop cutting-edge digital technologies, together with its willingness to invest in research and development (R&D) and to promote digital literacy training that creates new knowledge. All of these are key drivers of economic development.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/CPbT8umgaTY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Proactive countries put money and effort into this process, regarding it as nation-building that hedges against future uncertainty. These countries score highly in the rankings. Countries further down the list tend to be reactive, sitting back and letting others go first. </p>
<h2>In what areas are we behind?</h2>
<p>The overall digital competitiveness score has three components: knowledge, technology, and future readiness. </p>
<p>Australia’s <a href="https://www.ceda.com.au/CEDA/media/ResearchCatalogueDocuments/PDFs/2019_AustraliaDigitalCompRanking.pdf">scores</a> across these categories show we need to try much harder in future readiness. Our scores are also falling in the sub-categories of adaptive attitudes, business agility, and IT integration. </p>
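<p>To see how a headline score of this kind is built from its components, consider a small sketch in Python. The country names, scores and equal weighting below are invented purely for illustration; IMD’s actual methodology aggregates many sub-factors and may weight them differently.</p>

```python
# Hypothetical composite ranking: average the three factor scores, then sort.
# All scores are invented for illustration; this is not IMD's methodology.
countries = {
    "Country A": {"knowledge": 90.0, "technology": 85.0, "future_readiness": 80.0},
    "Country B": {"knowledge": 70.0, "technology": 88.0, "future_readiness": 75.0},
}

def composite(scores):
    """Equal-weighted mean of the factor scores."""
    return sum(scores.values()) / len(scores)

# Rank countries from highest composite score to lowest.
ranking = sorted(countries, key=lambda c: composite(countries[c]), reverse=True)
print(ranking)  # ['Country A', 'Country B']
```

<p>A real league table works the same way in outline: a country can climb or slip in the overall ranking because of movement in any one factor, which is why a weak “future readiness” score drags down an otherwise strong performer.</p>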
<p>In a field of 63 countries, Australia comes 44th on current digital and technological skills and employers’ willingness to train their staff in these areas. </p>
<p><iframe id="gmF6Y" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/gmF6Y/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>Which countries are doing it right?</h2>
<p>The top ten countries in 2019 are the United States, Singapore, Sweden, Denmark, Switzerland, Netherlands, Finland, Hong Kong, Norway, and South Korea. </p>
<p>Looking at the strategic approach of the top five, all emphasise knowledge generation, but beyond that there are different approaches to digital competitiveness. The US and Sweden put equal emphasis on knowledge generation, creating a conducive environment for technology development, and fostering a willingness to innovate. Singapore, Denmark and Switzerland each place heavier emphasis on one or two of the factors.</p>
<h2>More STEM graduates</h2>
<p>At 53rd place, Australia ranks abysmally in the proportion of our university graduates in science and mathematics: the people who do research and development now and will continue to do it in the future. Our universities are among the best in the world, so that is not the problem. If jobs for these graduates existed, universities would be meeting the demand. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australia-must-prepare-for-massive-job-losses-due-to-automation-43321">Australia must prepare for massive job losses due to automation</a>
</strong>
</em>
</p>
<hr>
<p>Australia’s information and communication technologies, including internet infrastructure, also score very poorly at 54th. This will not surprise the many Australian businesses and individuals who put up with slow, patchy internet connections. With more computing services and data moving into the cloud, fast internet is essential. </p>
<p>The news is not all bad though. Australia rates highly as a desirable destination for international students. It also scores well on digital access to government services, and ease of starting a business. </p>
<h2>Why is Australia slipping?</h2>
<p>Australia has grown complacent in certain areas, and we have been unwilling to invest sufficiently in building our digital capability in the areas mentioned. “Sufficient” is the key word. The fact that we are falling behind other countries means we cannot say we are investing enough. </p>
<p>The CEDA report indicates that one key reason for the investment shortfall is the disparity between the public’s and employers’ perspectives on how much it is needed. Industry sees a greater need than the general public does, but government policy tends to align with public sentiment for electoral reasons. </p>
<p>Funding is limited and there are many voices competing for a share of government spending. It is the squeaky wheel that gets the oil.</p>
<h2>Building digital capability</h2>
<p>Nation-building projects at scale need a coordinated approach across public and private sectors. Building the physical infrastructure to meet future needs is no different in principle from building the nation’s digital capabilities, which include creating the communication technology, the means to develop new knowledge, and ways of applying it to good effect. This is no less important than roads, power stations and hospitals for the nation’s future. </p>
<h2>A national conversation</h2>
<p>Australia needs to have a long conversation in national, state and local forums about the importance of investing in our digital future. We need to talk about all the ways R&D can benefit the Australian community, and why businesses need to embrace cutting-edge technology. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/around-50-of-homes-in-sydney-melbourne-and-brisbane-have-the-oldest-nbn-technology-115131">Around 50% of homes in Sydney, Melbourne and Brisbane have the oldest NBN technology</a>
</strong>
</em>
</p>
<hr>
<p>If we don’t get consensus on staying competitive we will fall further and further behind as more proactive countries accelerate their efforts. In time the economy will suffer, unemployment will rise and quality of life decline. It is no legacy to leave our children. </p>
<p>We are indeed a lucky country with our resources, but that will take us only so far in the 21st century. For the sake of future generations we have to make a new kind of luck and level up our digital game.</p><img src="https://counter.theconversation.com/content/124430/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>David Tuffley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Australia is falling behind other countries in digital skills and infrastructure, but smarter planning and investment can turn things around.David Tuffley, Senior Lecturer in Applied Ethics & CyberSecurity, Griffith UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1063502018-11-12T19:01:16Z2018-11-12T19:01:16ZUnlocking Australia’s productivity paradox. Why things aren’t that super<figure><img src="https://images.theconversation.com/files/244997/original/file-20181112-38373-3448zu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">At least in the movies, Superman is getting less productive. We are scarcely any more productive than we were two years ago, and it is weighing on wages.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>2018 marks the 40 year anniversary of <a href="https://www.imdb.com/title/tt0078346/">the first Superman film</a>. Starring as Superman, Christopher Reeve fought foes and vanquished villains in an action-packed battle between good and evil. </p>
<p>Four decades on, Superman continues to feature in films, but often not alone.</p>
<p>He now stars alongside Batman, Wonder Woman, The Flash, Aquaman and other superheroes. For the fans of DC Comics, it is a delightful coming together of childhood favourites. </p>
<p>But for economists, it symbolises a worrying decline in productivity. </p>
<h2>Superman needs help</h2>
<p>Where once a single superhero was able to save the world, now two or more are required to complete the same task.</p>
<p>As Oscar Wilde once said, life often imitates art. </p>
<p>Back when the first Superman film was released, <a href="https://www.imf.org/en/Publications/Staff-Discussion-Notes/Issues/2017/04/03/Gone-with-the-Headwinds-Global-Productivity-44758">average annual total factor productivity growth</a> among advanced economies was almost 10 times what it fell to in 2016.</p>
<p>In Australia, it was <a href="http://www.abs.gov.au/AUSSTATS/abs@.nsf/Lookup/5260.0.55.002Main+Features12016-17?OpenDocument">three times higher</a> in 1995-96 than in 2016-17. </p>
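<p>Total factor productivity growth is conventionally measured as the Solow residual: the part of output growth not explained by growth in capital and labour inputs. The sketch below shows the arithmetic under a Cobb-Douglas production function with an assumed capital share of 0.3; the growth figures are invented for illustration, not ABS or IMF data.</p>

```python
# Solow residual (growth accounting): TFP growth is output growth minus
# the share-weighted contributions of capital and labour.
# All numbers are illustrative, not actual statistics.

def tfp_growth(gdp_growth, capital_growth, labour_growth, capital_share=0.3):
    """Residual productivity growth under a Cobb-Douglas production function."""
    return (gdp_growth
            - capital_share * capital_growth
            - (1 - capital_share) * labour_growth)

# Hypothetical economy: 3% output growth, 4% capital growth, 1.5% labour growth.
residual = tfp_growth(0.03, 0.04, 0.015)
print(f"TFP growth: {residual:.2%}")  # prints "TFP growth: 0.75%"
```

<p>The point of the residual is that it captures how much more output an economy squeezes from the same inputs, which is why near-zero TFP growth can coexist with rising employment and investment.</p>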
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australias-productivity-problem-why-it-matters-8584">Australia's productivity problem: why it matters</a>
</strong>
</em>
</p>
<hr>
<p>Real wage growth has been <a href="http://www.abs.gov.au/ausstats/abs@.nsf/mf/6345.0">close to zero in the past two years</a>, in line with <a href="https://www.pc.gov.au/inquiries/completed/productivity-review/report">close to zero productivity growth</a>.</p>
<p>But what is most striking about what has happened is when it has happened. The past 25 years have seen extraordinary advances in technology. </p>
<h2>We ought to be much more productive</h2>
<p>An extra 3.5 billion people have <a href="https://data.worldbank.org/indicator/IT.NET.USER.ZS">gained access to the internet</a>, the processing power of computers has skyrocketed, and we now have smartphones, with almost everything on them, and factories and warehouses that are automated in ways that would have once only been dreamt of. </p>
<p>The sharing economy promises to unlock the full potential of our idle cars, our unused bicycles and empty rooms and houses. The accumulated history of human knowledge is at our fingertips. </p>
<p>So where’s the resulting increase in productivity? </p>
<p>US economist Robert Solow once <a href="http://www.standupeconomist.com/pdf/misc/solow-computer-productivity.pdf">famously remarked</a> that “you can see the computer age everywhere but in the productivity statistics”.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-internet-has-done-a-lot-but-so-far-little-for-economic-growth-105294">The internet has done a lot, but so far little for economic growth</a>
</strong>
</em>
</p>
<hr>
<p>Economists have since put forward a variety of explanations for this paradox, but with little agreement. </p>
<h2>There’s little agreement about why we’re not</h2>
<p>Some, like 2018 Nobel Laureate William Nordhaus, <a href="https://www.nber.org/papers/w21547.pdf">point to</a> historical data showing long lag-times between technological advances and increases in productivity. </p>
<p>For them, a surge in productivity is just around the corner – 10 years away, <a href="https://ieeexplore.ieee.org/document/7951155">according to some estimates</a>. </p>
<p>Others, like Harvard’s Martin Feldstein, <a href="https://www.nber.org/papers/w23306">argue the</a> paradox is driven by measurement failures. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/budget-explainer-the-problem-with-measuring-productivity-56901">Budget explainer: the problem with measuring productivity</a>
</strong>
</em>
</p>
<hr>
<p>Others argue that the productivity improvements from technology have been crowded out by other factors, like the aftershocks of the <a href="https://www.imf.org/en/Publications/Staff-Discussion-Notes/Issues/2017/04/03/Gone-with-the-Headwinds-Global-Productivity-44758">global financial crisis</a>, <a href="https://www.un.org/development/desa/dpad/wp-content/uploads/sites/45/publication/dsp_policy_11.pdf">weak demand and investment</a>, <a href="https://www.un.org/development/desa/dpad/wp-content/uploads/sites/45/publication/dsp_policy_11.pdf">slowing trade</a>, <a href="https://www.ecb.europa.eu/pub/pdf/other/ebbox201707_01.en.pdf?1e019e8433fa8b79b19327c22c8a9286">stalling growth in global value chains</a>, <a href="https://www.imf.org/en/Publications/WP/Issues/2016/12/31/The-Impact-of-Workforce-Aging-on-European-Productivity-44450">ageing populations</a>, <a href="https://www.oecd.org/eco/growth/OECD-2015-The-future-of-productivity-book.pdf">reduced investment in education</a>, the <a href="https://www.nber.org/papers/w20941.pdf">impacts of automation</a> on demand and inequality, <a href="https://www.brookings.edu/blog/up-front/2018/04/05/todays-economic-puzzles-a-tale-of-weakening-competition/">weakening competition</a> and reduced <a href="https://www.oecd.org/economy/growth/The-Walking-Dead-Zombie-Firms-and-Productivity-Performance-in-OECD-Countries.pdf">business dynamism</a>.</p>
<p>Harvard’s Marc Melitz <a href="https://web.stanford.edu/%7Eklenow/Melitz.pdf">suggests that</a> an explanation for the paradox may lie at the firm-level. </p>
<h2>Productivity growth might be hidden</h2>
<p>While some firms have been highly productive, their effects have been offset by laggard firms. The OECD <a href="https://www.oecd.org/eco/growth/OECD-2015-The-future-of-productivity-book.pdf">found that</a> “frontier firms” have consistently achieved productivity growth six times that of laggard firms which have dragged down the average. </p>
<p><a href="https://www.oecd.org/economy/growth/The-Walking-Dead-Zombie-Firms-and-Productivity-Performance-in-OECD-Countries.pdf">Some attribute this</a> to the increased prevalence of “<a href="https://www.oecd.org/economy/growth/The-Walking-Dead-Zombie-Firms-and-Productivity-Performance-in-OECD-Countries.pdf">zombie firms</a>” – unproductive firms kept alive by cheap money, low interest rates and nervous investors. </p>
<p>It’s possible to see this at the industry level. John Fernald, from the Federal Reserve Bank of San Francisco, <a href="https://www.nber.org/papers/w20248">finds that</a> productivity gains from information and communications technology have been concentrated in specific industries, the benefits from which have been netted out by industries that have failed to adopt new technologies. </p>
<p>Northwestern University’s Robert Gordon, however, <a href="https://www.nber.org/papers/w18315">sees no paradox at all</a>.</p>
<h2>Or technology might be holding it back</h2>
<p>He says, as much as we might like them, the technological advances in recent decades have been no match for the really big advances between 1870 and 1970, such as electricity and the automobile.</p>
<p>Harvard’s Jeff Frankel goes further. </p>
<p>He <a href="https://www.project-syndicate.org/commentary/technological-innovation-hurting-productivity-by-jeffrey-frankel-2018-03?barrier=accesspaylog">points to evidence</a> suggesting the latest advances in technology might be actually cutting productivity by <a href="https://bankunderground.co.uk/2017/11/24/is-the-economy-suffering-from-the-crisis-of-attention/">distracting us and reducing our attention spans</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/why-we-should-approach-claims-of-a-productivity-crisis-with-caution-26000">Why we should approach claims of a productivity crisis with caution </a>
</strong>
</em>
</p>
<hr>
<p>Others are less pessimistic, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=942310">but still conclude</a> that as firms adopt more and more new technology, the extra returns from those extra investments shrink.</p>
<p>With all this uncertainty, what’s the best approach for an incoming or reelected government? </p>
<h2>So what should we do?</h2>
<p>It sounds trite, but the best approach is “flexibility”.</p>
<p>More precisely, it is well-functioning mechanisms that allow us to adjust things such as exchange rates, interest rates, government spending and industry settings.</p>
<p>On the whole we have these mechanisms. We will also need strong laws that encourage competition, laws that will enable new or suddenly productive firms to displace old ones that have grown used to large market shares.</p>
<p>If we do turn out to be on the cusp of a new productivity surge, a flexible, competitive economy will enable us to spread the benefits quickly.</p>
<h2>Allow good firms to grow, bad ones to die</h2>
<p>If instead we turn out to be on track for a low productivity future, or if the productivity gains from new technology are crowded out by other effects, then flexibility can also help, redirecting resources away from inefficient firms to more efficient ones. </p>
<p>If it turns out the productivity paradox is no paradox at all but merely a measurement failure, then it is yet another reason to properly fund organisations such as the Australian Bureau of Statistics.</p>
<p>The new government will need to watch, and to some extent it will need to wait. But it will need to be ready.</p>
<p>As American economist <a href="https://www.oecd.org/sdd/productivity-stats/40526851.pdf">Paul Krugman observed</a> a generation ago when productivity growth was much higher than it is today, “productivity isn’t everything, but in the long run it is almost everything”.</p><img src="https://counter.theconversation.com/content/106350/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Adam Triggs is a former advisor to the Hon. Andrew Leigh MP, the Shadow Assistant Treasurer and Shadow Minister for Competition and Productivity.</span></em></p>In the midst of the information technology revolution, Australia’s productivity growth has been slowing. It ought to have been the other way around.Adam Triggs, Research fellow, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1026122018-09-06T23:19:44Z2018-09-06T23:19:44ZMobile platforms can give refugees access to vital information when they arrive in Australia<figure><img src="https://images.theconversation.com/files/234696/original/file-20180903-41717-1lq765c.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The majority of refugees have access to smartphones.</span> <span class="attribution"><span class="source">Tür a Tür Digital Factory, 2017</span>, <span class="license">Author provided</span></span></figcaption></figure><p>Waves of asylum seekers emerging from conflict zones in Myanmar, Syria, Sudan, Iraq, Yemen and elsewhere are expected to add more than one million people to <a href="http://www.unhcr.org/protection/resettlement/593a88f27/unhcr-projected-global-resettlement-needs-2018.html">global resettlement needs</a> this year. </p>
<p>These refugees face a <a href="https://www.sbs.com.au/news/canadian-minister-talks-up-benefits-of-immigration-on-australia-visit">world of closing doors</a>, but they also offer economic opportunities and cultural enrichment to countries that welcome them. While some refugees <a href="https://theconversation.com/refugees-are-integrating-just-fine-in-regional-australia-101188">are integrating well in regional Australia</a>, others still face <a href="http://www.sydwestms.org.au/images/documents/promo-material/2016/Migration-Trajectories-Report_Final.pdf">significant challenges in the capital cities</a>. </p>
<p>As concerned researchers, we are interested in how information technologies could help refugees resettle. Our work with <a href="https://www.business.unsw.edu.au/research/research-strengths/digital-enablement/digital-platforms-for-refugees-sandbox">organisations assisting refugees</a> has shown that having access to timely information about Australian life is essential.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/i-teach-refugees-to-map-their-world-94160">I teach refugees to map their world</a>
</strong>
</em>
</p>
<hr>
<p>We’re in the early stages of building an ecosystem of digital services that aggregates and delivers this kind of information to refugees – and to the organisations involved in supporting, employing, educating and caring for them. To guide our work, and avoid reinventing the wheel, we’ve looked at comparable experiences in Germany, which has a <a href="http://www.bamf.de/SharedDocs/Anlagen/DE/Downloads/Infothek/Statistik/Asyl/aktuelle-zahlen-zu-asyl-april-2018.pdf?__blob=publicationFile">high intake of refugees</a>.</p>
<iframe src="https://datawrapper.dwcdn.net/pCcNX/1/" scrolling="no" frameborder="0" allowtransparency="true" width="100%" height="400"></iframe>
<h2>Information chaos</h2>
<p>In Germany, there are a number of national and international agencies that provide assistance for refugees, each with regulations and responsibilities that differ from region to region. Accessing basic services, such as the internet, money transfer, health care and schooling, presents a new challenge to already traumatised people. </p>
<p>The information refugees need is <a href="https://www.tandfonline.com/doi/abs/10.1080/02681102.2017.1335280?journalCode=titd20">distributed among</a> asylum counsellors, social assistance offices, youth welfare offices, local non-government organisations, volunteers and more. In some cases, this information is quickly outdated. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=582&fit=crop&dpr=1 600w, https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=582&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=582&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=731&fit=crop&dpr=1 754w, https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=731&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/234699/original/file-20180903-41717-1dgk1kd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=731&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Information chaos faced by refugees.</span>
<span class="attribution"><a class="source" href="https://www.tandfonline.com/doi/full/10.1080/02681102.2017.1335280">Schreieck et al. 2017 p.626</a>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Getting access to the right information in a timely manner is difficult given the multitude of information sources, language barriers and geographical complications. This situation encourages new refugees to seek information from those who have arrived earlier, leading to the spread of outdated or misinterpreted information.</p>
<h2>Going mobile</h2>
<p>One difference between this refugee crisis compared to earlier ones is the ubiquity of information technology. Because the overwhelming majority of refugees <a href="https://www.independent.co.uk/voices/comment/surprised-that-syrian-refugees-have-smartphones-well-sorry-to-break-this-to-you-but-youre-an-idiot-10489719.html">have access to smartphones</a>, a number of mobile initiatives have emerged to provide support. </p>
<p><a href="https://en.wikipedia.org/wiki/Hackathon">Hackathon</a> volunteers in Germany built a mobile guide for refugees called <a href="https://markenwerk.net/disziplinen/entwicklung/produkt/moin-refugee/project/">Moin</a>, as well as a tool that helps refugees with administrative processes called <a href="https://www.bureaucrazy.de/">bureaucrazy</a>. Unfortunately, these apps required volunteers to keep the information up to date, which was challenging over an extended period.</p>
<p>Still, some initiatives have produced sustainable outcomes by eliminating the need for third-party updates. Instead, these apps allow information providers to update information themselves. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-refugees-overcome-the-odds-to-become-entrepreneurs-85091">How refugees overcome the odds to become entrepreneurs</a>
</strong>
</em>
</p>
<hr>
<p>For example, the <a href="https://integreat-app.de/en/">Integreat</a> project is a mobile application for refugees living in a particular German municipality. It provides information on the asylum process, local points of contact and aspects of daily life. The municipality and local NGOs maintain Integreat’s information through a content management system accessible via web browsers. </p>
<p>The platform’s design means it can easily be extended to other municipalities, which can mirror existing content and reuse translations into different languages. This further reduces the effort required to gather and maintain relevant information, providing a helpful addition to asylum programs.</p>
<h2>Housing and employment matchmaking</h2>
<p>While applications such as Integreat can help refugees during their first few months in the host country, things get more complicated when refugees try to relocate to permanent housing. </p>
<p>In Germany, language barriers, high demand for apartments among locals and resistance from some property owners who don’t want to rent to refugees have made finding accommodation a significant problem. Some German municipalities have invested substantial effort in housing refugees by contacting landlords directly.</p>
<p>In some cases property owners would like to support refugees, but they do not know how to approach them. A digital platform that connects property owners and refugees, such as the Berlin-based digital platform <a href="http://www.fluechtlinge-willkommen.de/">Flüchtlinge-Willkommen</a> (Refugees Welcome), could help alleviate such problems.</p>
<p>Similar matchmaking services have been built to match German employers who have difficulty finding qualified employees with refugees who are looking for work.
<a href="https://workeer.de/">Workeer</a> is available in Germany, and <a href="https://refugeetalent.com/">refugeetalent</a> is a similar initiative operating in Australia.</p>
<p>But matchmaking is only one side of the story. German and Australian labour regulations limit the options for refugees, who might not be legally eligible to work straight away or hold qualifications that aren’t recognised in their new homeland. So digital platforms should also offer information for employers and refugees on labour regulations, vocational training and how to transfer qualifications.</p>
<h2>What else can be done?</h2>
<p>Everyone can help contribute to refugee resettlement solutions. Our work suggests the following actions would be helpful:</p>
<ul>
<li><p>governments should allocate more funding for IT projects that support the resettlement of refugees</p></li>
<li><p>researchers, organisations and volunteers should collaborate to create an ecosystem of digital services that connect and improve current solutions</p></li>
<li><p>information systems researchers should evaluate the impact of proposed solutions. The benefits of new technologies such as blockchain or machine learning, for example, could be evaluated with little risk</p></li>
<li><p>universities should engage with nonprofit refugee organisations to create opportunities for refugees who want to further their studies or skills</p></li>
<li><p>companies – particularly those in the IT industry – should engage in IT projects that support refugees, such as the <a href="https://handbookgermany.de/de.html">Handbook Germany</a>, which was initiated by German telecommunications company Deutsche Telekom.</p></li>
</ul>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-tech-can-bring-dignity-to-refugees-in-humanitarian-crises-94213">How tech can bring dignity to refugees in humanitarian crises</a>
</strong>
</em>
</p>
<hr>
<p>We take inspiration from stories like <a href="https://www.theguardian.com/australia-news/2018/apr/15/this-is-my-country-how-a-melbourne-suburb-defied-the-far-right-to-welcome-refugees">what happened in Eltham</a>. In this Melbourne suburb, residents welcomed the arrival of Syrian refugees and supported them in settling into a different culture, getting a job and learning English. </p>
<p>In doing so, Eltham’s residents created a positive experience for both the refugees and the Eltham community. There is room for hope in our humanitarian responses and we believe we can and should do more.</p>
<p class="fine-print"><em><span>Manuel Wiesche owns shares in the Tür an Tür Digital Factory gGmbH, a not-for-profit organization that develops IT services that support integration of refugees in Germany.
</span></em></p><p class="fine-print"><em><span>Maximilian Schreieck owns shares in the Tür an Tür Digital Factory gGmbH, a not-for-profit organization that develops IT services that support integration of refugees in Germany.</span></em></p><p class="fine-print"><em><span>Walter Daniel Fernandez does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Accessing the right information in a timely manner is essential, but difficult for newly arrived refugees. Information varies, plus there are language barriers and geographical complications.Walter Daniel Fernandez, Professor of Information Systems, UNSW SydneyManuel Wiesche, Postdoctoral research associate, Technical University of MunichMaximilian Schreieck, PhD student, Technical University of MunichLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/990822018-08-26T20:11:14Z2018-08-26T20:11:14ZThe digital divide: small, social programs can help get seniors online<figure><img src="https://images.theconversation.com/files/232413/original/file-20180817-165958-klc8tg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Older Australians are falling behind younger people in their capacity to access and make use of the internet.</span> <span class="attribution"><span class="source">www.shutterstock.com</span></span></figcaption></figure><p>In Australia, use of the internet is almost universal. At last count, <a href="http://www.abs.gov.au/ausstats/abs@.nsf/mf/8146.0">86%</a> of the population was digitally connected.</p>
<p>Despite the continuing stereotype that many older people are not technologically savvy, or that they can’t learn new things, in 2015 <a href="https://www.acma.gov.au/theACMA/engage-blogs/engage-blogs/Research-snapshots/Digital-lives-of-older-Australians">79% of people aged 65 and over</a> had used the internet compared to <a href="http://www.abs.gov.au/AUSSTATS/abs@.nsf/2f762f95845417aeca25706c00834efa/feff508f920ab48cca2570fe00198565!OpenDocument">6%</a> in 2001. </p>
<p>Increasingly, government services such as <a href="https://my.gov.au/LoginServices/main/login?execution=e1s1">myGov</a> and <a href="https://www.myagedcare.gov.au/">myagedcare</a> rely on individuals having access to computers and the internet, and the knowledge of how to use them. Older adults who do not have these skills may therefore miss out on the services and support they require.</p>
<p>Online resources are unlikely to help those who are not already online. But small, social, community-run programs that allow seniors to share learning and do so on their own terms have proved effective.</p>
<h2>Factors that might affect an older person’s ability to go online</h2>
<p>The <a href="https://digitalinclusionindex.org.au/wp-content/uploads/2016/08/Australian-Digital-Inclusion-Index-2017.pdf">Australian Digital Inclusion Index</a> (ADII) measures which social groups benefit the most from digital connection, and which ones are being left behind. The score is based on measures of access, affordability and digital ability. It shows how these dimensions change over time, according to people’s social and economic circumstances, as well as across geographic locations.</p>
<p>In 2017, Australia’s overall score improved from the previous year. But the age gap has grown steadily, indicating older people are falling behind younger people in their capacity to access and make use of the internet. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/232630/original/file-20180820-30599-tjp7tf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">There are many factors restricting older people from accessing the internet.</span>
<span class="attribution"><span class="source">www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>People aged 65 and over are among the least digitally included groups in Australia, particularly if they are women, on lower incomes or not living in a major city.</p>
<p>Someone in their 60s is more likely to be familiar with and use the internet than someone in their 70s, 80s or older, most likely because younger cohorts used such technology in their working lives.</p>
<p>Of course, there are many other variables which can greatly affect an older adult’s chance of being capable of accessing and using the web. These include gender, education, employment status/type, social and economic background, language skills and current health and/or disability. </p>
<p>Where older Australians live can also be a determining factor, as many <a href="https://digitalinclusionindex.org.au/wp-content/uploads/2016/08/Australian-Digital-Inclusion-Index-2017.pdf">regional, rural or remote locations</a> don’t have the digital infrastructure to ensure access to the internet equal to that of metropolitan centres. </p>
<h2>Programs currently available</h2>
<p>Interest in and ability to use digital technologies play a role in limiting or promoting access and use. For those older Australians who are not yet online but who want to be, there are now a number of community and online programs available that can help them become tech savvy. </p>
<p>One of these programs is <a href="https://www.telstra.com.au/tech-savvy-seniors">Telstra’s Tech Savvy Seniors program</a>. The online program contains a number of self-paced, self-directed learning modules, including step-by-step videos and instruction guides (in 11 languages as well as English) that give older adults the basic skills to use computers, the internet and smartphones. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/SVqMLkD2raI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Telstra instructional video: how to change settings on an Android device.</span></figcaption>
</figure>
<p>Other providers include the federal government’s <a href="https://beconnected.esafety.gov.au/index.php?redirect=0">Be Connected Program</a>, which aims to ensure “every” Australian is online and also provides online learning modules. So does the <a href="https://www.godigi.org.au/about">GoDigi Program</a>, delivered through a partnership between Australia Post and Infoxchange. A third option is the <a href="https://www.ascca.org.au/index.php/computer-clubs-list">Australian Seniors Computer Clubs Association</a>, an organisation that provides access to a large number of computer clubs around Australia to support older Australians in using computer technology.</p>
<p>The online components of all these programs assume that interested older adults already have basic access to the internet, or at least someone to help them with their initial set-up and going online.</p>
<p>For those older adults who don’t have this kind of support and who learn best with personal instruction, a number of face-to-face training programs are provided across all these organisations. In many cases these programs are held at local libraries, which play an important role as sites for hosting learning events, particularly in rural and regional contexts.</p>
<p>There are also various social enterprises emerging that aim to support older people for a fee, such as the <a href="https://lively.org.au/help-with-technology/">Lively organisation</a>. These programs can be great for those who would prefer in-home help. But they do little to address the <a href="https://theconversation.com/australias-digital-divide-is-not-going-away-91834">digital divide</a> for older people on low incomes, for whom affordability is one of the biggest barriers to online access. There is a real need to develop programs that are free for older people in order to address the affordability divide. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australias-digital-divide-is-not-going-away-91834">Australia's digital divide is not going away</a>
</strong>
</em>
</p>
<hr>
<p>Recently, home and community care organisations have recognised the benefits technology can provide to their clients. One provider implemented its <a href="https://www.amanaliving.com.au/news-publications/news/2018/05/fighting-loneliness-digital-inclusion">own technology program</a> to help promote connection and alleviate social isolation for its home, community and residential care recipients. The program includes an intergenerational initiative which teams care residents with local high school students who teach them how to use technology. The social aspects of the program for both students and residents were shown to be as beneficial as the knowledge gained. </p>
<h2>Communities of practice</h2>
<p>Another provider, <a href="http://www.umbrellacommunitycare.com.au/services/social-clubs-activities/">Umbrella Multicultural Community Care</a>, set up an “internet café”, in which older migrants attend the local community centre once or twice a week to learn how to better access the internet. Attendees used this opportunity to do online shopping, reconnect with family, places and friends from overseas, as well as to join online poetry groups or listen to music from their homeland. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/232414/original/file-20180817-165967-1jph6ep.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Community programs are helping older migrants connect to the internet.</span>
<span class="attribution"><span class="source">www.shutterstock.com</span></span>
</figcaption>
</figure>
<p>Programs aimed at enhancing the digital literacy of elderly migrants are <a href="https://doi.org/10.1016/j.puhe.2018.03.005">more likely</a> to be successful if delivered in socially supportive settings. The internet café provides an important opportunity to build and retain social networks across distance. </p>
<p>Another recent example of this successful method of learning can be seen in the <a href="https://thewest.com.au/news/peel-rockingham/tech-classes-for-rockingham-seniors-ng-ya-376016">iPad Seniors Group</a> in WA. It was started by a community volunteer who won an iPad and didn’t know how to use it. She set up a self-help group to teach herself and others, which has now been running for six years.</p>
<p>What these programs have in common is a willingness to support older adults in learning how to use the internet in a social environment, at their own pace, on their own terms and from each other. Expanding funding and awareness for these sorts of programs offers the best chance of bridging the digital divide.</p>
<p class="fine-print"><em><span>Sue Malta and Raelene Wilding are co-authors of a chapter entitled "Not so ubiquitous: Digital Inclusion and Older Adults in Australia" in the edited volume "Digital Inclusion: Be on the Right Side of the Digital Divide" due out in October 2018 by Lexington Books. </span></em></p><p class="fine-print"><em><span>Raelene Wilding receives funding from the Australian Research Council for the project, Ageing and New Media (DP160102552). </span></em></p><p class="fine-print"><em><span>Loretta Baldassar receives funding from the Australian Research Foundation for the project Ageing and New Media (DP160102552). She is affiliated with a number of residential aged care facilities in Western Australia where she conducts research for this project. </span></em></p>Important programs helping older adults learn how to use the internet are effective but limited.Sue Malta, Research Fellow, National Ageing Research Institute and School of Population and Global Health, University of Melbourne, and Adjunct Research Fellow, Swinburne University, The University of MelbourneRaelene Wilding, Associate Professor of Sociology, La Trobe UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/971412018-06-13T19:12:31Z2018-06-13T19:12:31ZBeware of shadows created by the cloud<figure><img src="https://images.theconversation.com/files/222651/original/file-20180611-191947-ppfc2k.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C1196%2C772&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cloud services are flexible, but are they secure?</span> <span class="attribution"><a class="source" href="https://pxhere.com/en/photo/1241325">Pxhere</a></span></figcaption></figure><p>As kids, we were all haunted by our shadows following us everywhere. Physics taught us that shadows are mostly present on sunny days, where they tend to hide from clouds. This rule, however, does not apply to today’s digital era. 
How, then, can we explain the strong presence of shadow IT activities amid the rise of cloud services?</p>
<h2>What is shadow IT?</h2>
<p>Shadow IT refers to IT projects and services managed by outside providers without the knowledge of the IT department. At its core, it means business units short-circuiting their IT department to acquire a new service. And because cloud technologies make it easy for business units to provision services themselves, firms are experiencing a growing number of shadow IT activities. </p>
<p>According to a survey conducted by <a href="https://www.eu.ntt.com/en/about-us/press-releases/news/article/Shadow_IT_Cloud_Usage_a_Growing_Challenge_for_CIOs.html">NTT communications</a> in 2016, 83% of respondents expected shadow IT activities to increase by 2018 – 87% of business-oriented participants and 80% of IT-oriented ones. What is the harm in business employees adopting cloud services without the consent of the IT department? Are IT employees aware of the causes of shadow IT? Are business employees attentive to its consequences? Would better communication between these departments reduce the presence of shadow IT?</p>
<h2>Root causes of shadow IT</h2>
<p>Shadow IT activities have gained popularity with the emergence of new technologies such as cloud computing. The benefits of cloud services (low costs, flexibility, easy access, ubiquity and so on) have driven their adoption in firms, especially by business units. But as adoption grows, so does shadow IT. <a href="https://www.ciphercloud.com/shadow-it-discovery">CipherCloud</a> sheds light on the strong presence of shadow IT in large firms. </p>
<p>Its statistics show that 80% of business employees short-circuit their IT departments and buy unsanctioned Software-as-a-Service (cloud-based software) directly from cloud service providers. When business employees are unsatisfied with the services provided by their IT units, the easy accessibility of cloud technologies enables them to acquire the services they need, bypassing the rigidity, control and inflexibility of traditional IT departments. In adopting cloud services, business units also develop their own IT skills, leaving them feeling increasingly independent of their IT departments.</p>
<h2>Impact of shadow IT</h2>
<p>Shadow IT activities lead to several negative consequences. For example, business employees are not always aware of the national and supranational regulations and compliance laws that bind their firm, exposing it to risk. According to a report by <a href="https://info.skyhighnetworks.com/WP-CARR-Q2-2015_Download_White.html?Source=website&LSource=website">SkyHigh Networks</a> in 2015, only 7% of adopted cloud services met firms’ security, governance and compliance requirements. Shadow IT thus generates a range of security issues. </p>
<p>In addition, <a href="https://www.gartner.com/smarterwithgartner/top-10-security-predictions-2016/?cm_mmc=social-_-rm-_-gart-_-swg">Gartner</a> predicted that by 2020, one third of the attacks experienced by firms would target their shadow IT resources. Even though cloud technologies offer a long list of benefits, they also create many threats. One of the biggest risks is security breaches facilitated by storing data in public clouds, where data are also prone to loss. <a href="https://www.emc.com/about/news/press/2014/20141202-01.htm">EMC</a> has put the cost of data loss at $1.7 trillion per year, suggesting that businesses remain unprepared for the cloud era. Excessive shadow IT also disrupts a firm’s controlled environment and creates resource conflicts that erode synergies between departments.</p>
<p>However, from the perspective of business units, shadow IT has some important positive consequences. Acquiring innovative services improves not only the firm’s innovativeness but also its productivity. Business employees work with up-to-date solutions that keep them competitive in today’s fast-evolving market, and the time saved lets them focus on their core business. If managed and governed correctly, then, shadow IT can have a positive impact on the firm. But how should firms govern and manage themselves in the presence of shadow IT activities?</p>
<h2>Solutions to shadow IT</h2>
<p>Governance is a top concern of every successful firm. Firms that are governed correctly and effectively are more likely to be competitive and successful. As seen earlier, shadow IT activities can lead to dangerous consequences, threatening the firm on many levels. To turn these threats into positive outcomes, firms need to raise awareness of the security, compliance and regulatory issues that shadow IT activities cause. </p>
<p>In addition, better synergies, more frequent communication and closer collaboration enable the strategic alignment of business and IT objectives. Through this alignment, the firm heads in one clear direction, transforming obstacles into strengths. Shadow IT activities have a strong impact on the relationship between business and IT units; when that relationship is handled carefully and governed effectively, it pushes the firm forward.</p>
<p class="fine-print"><em><span>Sabine Khalil does not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and has disclosed no affiliations other than her research organisation.</span></em></p>Outside IT projects managed without the knowledge of IT departments are on the rise. What are the risks and possible consequences?Sabine Khalil, Assistant Professor in Management of Information Systems, PropediaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/970622018-06-08T10:53:16Z2018-06-08T10:53:16ZStudents need IT skills to compete in the new economy<figure><img src="https://images.theconversation.com/files/222274/original/file-20180607-137285-wf455u.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Jobs that are 'IT intensive' have shown dramatic growth, new research shows.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/engineer-businesswoman-network-server-room-369371690">Mark Agnor/www.shutterstock.com</a></span></figcaption></figure><p>By 2026, employment in computer and information technology occupations is <a href="https://www.bls.gov/ooh/computer-and-information-technology/home.htm">projected to grow</a> 13 percent over what it was in 2016. Jobs in these fields will require skills in cloud computing, big data collection and storage, information security and more.</p>
<p>As I argue in <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3053127">a paper</a> in the Journal of Monetary Economics with <a href="https://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=1294107">Giovanni Gallipoli</a>, these information technology – or IT – skills are increasingly required if you want a job with upward mobility and autonomy.</p>
<p>A new “IT intensity” index that I developed illustrates this trend. Using data from the Bureau of Labor Statistics that measures tasks and skills across occupations, the index gauges how much occupations deal with information technology. I developed this IT intensity index as a <a href="https://www.christosmakridis.com/">labor economist</a> who studies <a href="https://scholar.google.com/citations?user=KKQXJ_8AAAAJ&hl=en&oi=ao">macroeconomic trends, policy and their interaction</a> with individuals and labor markets.</p>
<p>While the index is far from perfect, it allows us to distinguish jobs that require more interaction with computers – whether in software engineering or coding – from those that require less. </p>
<h2>Increase in IT jobs</h2>
<p>Using the IT intensity index, together with data tables from the <a href="https://www.bls.gov/oes/">Occupation Employment Statistics program</a>, I found that IT intensive occupations grew by 19.5 percent between 2004 and 2017, while less IT intensive occupations grew by only 2.4 percent – a growth rate more than eight times as large over that period.</p>
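<p>The "more than eight times" comparison follows directly from the two reported growth rates; a minimal arithmetic check (using only the 19.5% and 2.4% figures from the text – the dictionary keys are purely illustrative labels):</p>

```python
# Occupation growth between 2004 and 2017, as reported in the article (percent).
growth = {
    "IT intensive": 19.5,
    "less IT intensive": 2.4,
}

# How many times faster IT-intensive occupations grew
# compared with less IT-intensive ones.
ratio = growth["IT intensive"] / growth["less IT intensive"]

print(round(ratio, 1))  # prints 8.1, i.e. more than eight times as large
```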
<p>And yet, companies routinely <a href="https://www.usatoday.com/story/tech/talkingtech/2017/03/28/tech-skills-gap-huge-graduates-survey-says/99587888/">complain</a> about not being able to find enough workers. While the skills gap for <a href="https://www.burning-glass.com/research-project/digital-skills-gap/">digital and technical tasks</a> is large, some researchers argue it is largest for <a href="https://www.csoonline.com/article/3247708/security/research-suggests-cybersecurity-skills-shortage-is-getting-worse.html">cybersecurity</a>.</p>
<p>There is also concern about unmet demand for <a href="https://www.technologyreview.com/s/608707/the-myth-of-the-skills-gap/">coordination and communication skills</a>. </p>
<p>The fact that so many prospective job candidates lack these skills might <a href="https://economics.mit.edu/files/12763">help explain</a> the <a href="https://fred.stlouisfed.org/series/CIVPART">decline in labor force participation</a> and <a href="http://www.pewresearch.org/fact-tank/2014/10/09/for-most-workers-real-wages-have-barely-budged-for-decades/">stagnation of median hourly wages</a> over the past few decades.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222234/original/file-20180607-137301-18a3fxb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">‘IT intensive’ jobs rose at eight times the rate as other jobs from 2004 to 2017.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/developing-programming-coding-technologies-website-design-640115662">REDPIXEL.PL/www.shutterstock.com</a></span>
</figcaption>
</figure>
<p>On the other hand, <a href="http://www.nber.org/papers/w24001">new research</a> suggests that the economic benefits of technological advancements – such as the development of artificial intelligence – do not always show up right away in national measurements of productivity growth. The research blames <a href="http://www.nber.org/papers/w24001">“implementation lags”</a> in technology as a likely culprit.</p>
<h2>New models in higher education</h2>
<p>Educational institutions can help turn things around by equipping individuals with IT skills. Technology is changing at an increasing rate and a four-year degree may not give students the skills they need to remain competitive until retirement. Students today must become lifelong learners. To do that, universities need to provide their services to enough students to make an impact and focus on teaching relevant and tangible skills, particularly around data analysis, that are in increased demand. Several universities stand out as leaders in this regard.</p>
<p>The standouts include Arizona State University and Georgia State University, which rank first and fourth, respectively, as the <a href="https://www.usnews.com/best-colleges/rankings/national-universities/innovative">most innovative universities</a> in the United States, according to U.S. News. These schools are actively using <a href="https://www.npr.org/sections/ed/2016/10/30/499200614/how-one-university-used-big-data-to-boost-graduation-rates">big data</a> to help improve the delivery and scale of their services to improve student success rates.</p>
<p>For example, GSU is using data analytics to <a href="http://fortune.com/2017/02/18/big-data-college-completion/">help predict</a> how at-risk students might do in certain courses. This enables <a href="http://diverseeducation.com/article/102425/">better advising</a>, which in turn helps improve completion rates. <a href="https://www.nytimes.com/2018/05/15/us/georgia-state-african-americans.html">GSU has boosted its graduation rate</a> to 54 percent in 2017 – up from 32 percent in 2003.</p>
<h2>Benefits of completion</h2>
<p>What does a college degree have to do with equipping students with IT skills? Using my measure of IT jobs, together with data from the <a href="https://www.census.gov/programs-surveys/acs/">American Community Survey</a> between 2005 and 2016, I find that workers with a college degree are 36 percentage points more likely to work an IT job, controlling for other demographic factors, such as age, race and gender. Even if a student does not major in computer science, universities provide an environment to cultivate the skills needed to excel in IT jobs, which demand <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3053127">other interpersonal and noncognitive skills</a>. For instance, <a href="https://qz.com/945083/new-research-suggests-it-really-does-pay-to-get-a-double-major-in-college/">other research</a> that I’ve done suggests that students who double major in STEM and liberal arts tend to earn 10 percent higher annual salaries than their counterparts who double major in just one broad field.</p>
<p>ASU is another example of a university that has significantly boosted the number of students it educates. Specifically, ASU enrolled <a href="https://www.asu.edu/about/enrollment">more than 98,000 students</a> in 2016 – significantly more than the 73,000 it enrolled in 2012 – while simultaneously rising in <a href="https://www.asu.edu/rankings">international rankings</a>.</p>
<p>One of the things that ASU does particularly well is provide students with real-world experience. For example, ASU’s <a href="https://entrepreneurship.asu.edu/launch/edson-student-entrepreneur-initiative">Edson Student Entrepreneur Initiative</a> provides students with funding, mentoring and office space to co-found a startup. </p>
<p>Encouraging students to engage with real-world problems not only directly cultivates their problem-solving skills, but also indirectly exposes them to the limitless possibilities of IT. Access to technology can help a startup to <a href="https://www.entrepreneur.com/article/273841">stay competitive</a>. The same holds true for students.</p><img src="https://counter.theconversation.com/content/97062/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Christos Makridis receives funding from National Science Foundation, Institute of Humane Studies, Stanford University, Rutgers SMLR, Lincoln Institute.</span></em></p>More students must acquire IT skills in order to secure jobs with upward mobility, according to a researcher who developed an index that shows a dramatic growth in ‘IT intensive’ jobs.Christos A. Makridis, Economist, Massachusetts Institute of Technology (MIT)Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/941602018-05-18T10:42:14Z2018-05-18T10:42:14ZI teach refugees to map their world<figure><img src="https://images.theconversation.com/files/216696/original/file-20180427-135830-1aum3yd.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A scene from Zaatari refugee camp, Jordan.</span> <span class="attribution"><span class="source">Brian Tomaszewski</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>I first visited the Zaatari refugee camp in early 2015. Located in northern Jordan, the camp is home to more than 80,000 Syrian refugees. I was there as part of a <a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1427873">research study on refugee camp wireless and information infrastructure</a>. </p>
<p>It’s one thing to read about refugees in the news. It’s a whole different thing to actually go visit a camp. I saw people living in metal caravans, mixed with tents and other materials to create a sense of home. Many used improvised electrical systems to keep the power going. People are rebuilding their lives to create a better future for their families and themselves, just like any of us would if faced with a similar situation.</p>
<p><a href="https://scholar.google.com/citations?user=hcIpln4AAAAJ&hl=en">As a geographer</a>, I was quickly struck by how geographically complex Zaatari camp was. The camp management staff faced serious spatial challenges. By “spatial challenges,” I mean issues that any small city might face, such as keeping track of the electrical grid; understanding where people live within the camp; and locating other important resources, such as schools, mosques and health centers. Officials at Zaatari had some maps of the camp, but they struggled to keep up with its ever-changing nature. </p>
<p>An experiment I launched there led to up-to-date maps of the camp and, I hope, valuable training for some of its residents.</p>
<h2>The power of maps</h2>
<p>Like many other refugee camps, Zaatari developed quickly in response to a humanitarian emergency. In rapid onset emergencies, mapping often isn’t as high a priority as basic necessities like food, water and shelter. </p>
<p>However, <a href="https://doi.org/10.1515/jhsem-2014-0082">my research shows</a> that maps can be an invaluable tool in a natural disaster or humanitarian crisis. Modern digital mapping tools have been essential for locating resources and making decisions in a number of crises, from the <a href="https://reliefweb.int/map/haiti/haiti-earthquake-damage-map-january-12-2010">2010 earthquake in Haiti</a> to <a href="https://data2.unhcr.org/en/documents/download/62995">the refugee influx in Rwanda</a>.</p>
<p>This got me thinking that the refugees themselves could be the best people to map Zaatari. They have intimate knowledge of the camp’s layout, understand where important resources are located and benefit most from camp maps. </p>
<p>With these ideas in mind, <a href="https://www.rit.edu/gccis/geoinfosciencecenter/">my lab</a> teamed up with the United Nations High Commissioner for Refugees and Al-Balqa and Princess Sumaya universities in Jordan. </p>
<p>Modern maps are often made with a technology known as Geographic Information Systems, or GIS. Using funding from <a href="http://www.unhcr.org/innovation/tag/innovation-fund/">the UNHCR Innovation Fund</a>, we acquired the computer hardware to create a GIS lab. From corporate partner Esri, we obtained <a href="http://www.esri.com/nonprofit">low-cost, professional GIS software</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/216677/original/file-20180427-135840-wtt2af.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">RefuGIS team member Yusuf Hamad and his son Abdullah – who was born in Zaatari refugee camp – learning about GIS.</span>
<span class="attribution"><span class="source">Brian Tomaszewski</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Over a period of about 18 months, we trained 10 Syrian refugees. Students in the RefuGIS class ranged in age from 17 to 60. Their backgrounds in Syria ranged from math teacher to tour operator to civil engineer. I was extremely fortunate that one of my students, Yusuf Hamad, spoke fluent English and was able to translate my instructions into Arabic for the other students. </p>
<p>We taught concepts such as coordinate systems, map projections, map design and geographic visualization; we also taught how to collect spatial data in the field using GPS. The class then used this knowledge to map places of interest in the camp, such as the locations of schools, mosques and shops.</p>
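The projection concepts the class covered can be illustrated in a few lines of code. The sketch below is purely illustrative – it is not the Esri tooling the students actually used – and shows the forward Web Mercator projection that underlies most online maps, converting GPS-style latitude/longitude readings into planar map coordinates. The sample coordinate is a hypothetical point in northern Jordan, not surveyed data.

```python
import math

def web_mercator(lat, lon):
    """Project WGS84 latitude/longitude (degrees) to Web Mercator
    x/y coordinates in meters, the projection used by most web maps."""
    R = 6378137.0  # WGS84 semi-major axis, in meters
    x = R * math.radians(lon)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat) / 2))
    return x, y

# Hypothetical point of interest in northern Jordan (illustrative only)
x, y = web_mercator(32.29, 36.32)
print(f"x = {x:.0f} m, y = {y:.0f} m")
```

Professional GIS software handles hundreds of coordinate systems and datum transformations automatically; working through one projection by hand, as above, is a way to see why concepts like map projections matter before relying on the tools.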
<p>The class also learned how to map data using mobile phones. The data has been used to update camp reference maps and to support a wide range of camp activities. </p>
<p>I made a particular point to ensure the class could learn how to do these tasks on their own. This was important: No matter how well-intentioned a technological intervention is, it will often fall apart if the displaced community relies completely on outside people to make it work. </p>
<p>As a teacher, I found <a href="https://doi.org/10.1109/GHTC.2017.8239276">this class</a> to be my most satisfying educational experience. It was perhaps the finest group of GIS students I have taught in my 15 years of teaching. Within a relatively short amount of time, <a href="https://data2.unhcr.org/en/documents/download/55994">they were able to create professional maps</a> that now serve camp management staff and refugees themselves. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=424&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=424&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=424&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=533&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=533&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218500/original/file-20180510-34006-1gt8yv3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=533&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A map created with geographic information collected by students in the RefuGIS program.</span>
<span class="attribution"><span class="source">UNHCR</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Jobs for refugees</h2>
<p>My experiences training refugees and humanitarian professionals in Jordan <a href="https://reliefweb.int/report/rwanda/unhcr-rwanda-factsheet-february-2015">and Rwanda</a> have made me reflect upon the broader possibilities that GIS can bring to the over <a href="http://www.unhcr.org/figures-at-a-glance.html">65 million refugees in the world today</a>. </p>
<p>It’s challenging for <a href="https://reliefweb.int/report/jordan/refugee-livelihoods-jordan-september-2017">refugees to develop livelihoods at a camp</a>. Many struggle to find employment after leaving. </p>
<p>GIS could help refugees create a better future for themselves and their future homes. If people return to their home countries, maps – essential to activities like construction and transportation – can aid the rebuilding process. If they adopt a new home country, they may find they have marketable skills. The worldwide geospatial industry is worth an estimated <a href="https://www.geospatialworld.net/blogs/geospatial-industrys-value-world-economy/">US$400 billion</a> and <a href="https://www.doleta.gov/brg/indprof/geospatial.cfm">geospatial jobs</a> are expected to <a href="https://www.doleta.gov/brg/indprof/geospatial_profile.cfm">grow over the coming years</a>. </p>
<p>Our team is currently helping some of the refugees get <a href="https://www.esri.com/training/certification/">GIS industry certifications</a>. This can further expand their career opportunities when they leave the camp and begin to rebuild their lives. </p>
<p>Technology training interventions for refugees often focus on things like <a href="http://refugeecodeweek.org/">computer programming</a>, <a href="https://medium.freecodecamp.org/how-we-taught-dozens-of-refugees-to-code-then-helped-them-get-developer-jobs-fd37036c13b0">web development</a> and other traditional IT skills. However, I would argue that GIS should be given equal importance. It offers a rich and interactive way to learn about people, places and spatial skills – things that I think the world in general needs more of. Refugees could help lead the way.</p><img src="https://counter.theconversation.com/content/94160/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Brian Tomaszewski receives funding from UNHCR and US NSF. </span></em></p>Maps can be an invaluable tool in a natural disaster or humanitarian crisis. A pilot project trained Syrian refugees at a Jordan camp to create their own.Brian Tomaszewski, Associate Professor of Information Sciences and Technologies, Rochester Institute of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/942692018-04-24T10:51:46Z2018-04-24T10:51:46ZWomen in tech suffer because of American myth of meritocracy<figure><img src="https://images.theconversation.com/files/214651/original/file-20180413-540-n4d3kp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Will they disrupt the tech sector? </span> <span class="attribution"><span class="source">Reuters/Eduardo Munoz</span></span></figcaption></figure><p>The <a href="http://journals.sagepub.com/doi/full/10.1177/2053168016672101">American dream is built</a> on the notion that the U.S. is a meritocracy. Americans believe success in life and business can be earned by anyone willing to put in the hard work necessary to achieve it, or so they say. </p>
<p>Thus, Americans commonly believe that those who are successful deserve to be so and those who aren’t are equally deserving of their fate – despite growing evidence that widening inequalities in <a href="https://www.routledge.com/Income-Inequality-in-America-An-Analysis-of-Trends-An-Analysis-of-Trends/Ryscavage/p/book/9781315703541">income</a>, <a href="http://goodtimesweb.org/industrial-policy/2014/SaezZucman2014.pdf">wealth</a>, <a href="http://www.jstor.org/stable/pdf/30034640.pdf?casa_token=1kXUGe2PvdQAAAAA:CHiX5oT5xeHEXYK4u5IhmroVwpu-EaDxmjOFhFBvND41PwFfZWKAuuoxPEvW999NmzaN-YaJCDIH1ZIZEAvPY62Cf_uzw9-KXV6Btm5w9Yk3nQ25ut0">labor</a> and <a href="https://ideas.repec.org/b/oxp/obooks/9780198779971.html">gender</a> play a major role in who makes it and who doesn’t.</p>
<p>And this very fact – that Americans believe their society is a meritocracy – is the biggest threat to equality, particularly when it comes to gender, as my own research and that of others shows. </p>
<h2>The meaning of ‘meritocracy’</h2>
<p><a href="https://ideas.repec.org/b/oxp/obooks/9780198779971.html">Gender inequality</a> is pervasive in American society. </p>
<p>Women in the U.S. continue to experience <a href="http://dro.dur.ac.uk/16470/1/16470.pdf">gender bias</a>, <a href="https://link.springer.com/chapter/10.1007/978-94-017-9897-6_6">sexual harassment</a> and little progress in relation to equitable <a href="https://ideas.repec.org/b/oxp/obooks/9780198779971.html">wages</a>. Top positions in government and the business sector remain <a href="https://www.theguardian.com/global-development/2014/sep/29/women-better-off-far-from-equal-men">stubbornly male</a>.</p>
<p>At the same time, <a href="https://archive.nytimes.com/www.nytimes.com/packages/html/national/20050515_CLASS_GRAPHIC/index_01.html">75 percent of Americans</a> say they believe in meritocracy. This belief persists despite evidence that we tend to use it to <a href="https://www.researchgate.net/profile/John_Jost/publication/270539170_Working_for_the_System_Motivated_Defense_of_Meritocratic_Beliefs/links/55b2a23608ae9289a0858e2f.pdf">explain actions</a> that preserve the status quo of gender discrimination rather than reverse it. </p>
<p>This myth is so powerful, it influences our behaviors.</p>
<h2>‘Work harder’</h2>
<p>Entrepreneurship is an area where the myths and realities of the American meritocracy come to a head. </p>
<p>In the U.S., <a href="https://www.nawbo.org/resources/women-business-owner-statistics">women own 39 percent</a> of all privately owned businesses but receive only around 4 percent of venture capital funding. Put another way, male-led ventures receive 96 percent of all funding. </p>
<p>Yet the meritocracy myth, which <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2982414">my research shows</a> has a stronghold in the world of entrepreneurship, means that women are constantly told that all they have to do to get more of that <a href="https://nvca.org/pressreleases/total-venture-capital-dollars-invested-2017-track-reach-decade-high/">$22 billion or so in venture capital funding</a> is <a href="https://doi.org/10.1177/1042258717728028">make better pitches</a> or be more assertive. </p>
<p>The assumption is that women aren’t trying hard enough or doing the right things to get ahead, not that the way venture capitalists offer funding is itself unfair. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=432&fit=crop&dpr=1 600w, https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=432&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=432&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=543&fit=crop&dpr=1 754w, https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=543&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/214543/original/file-20180412-570-19se1zt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=543&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Ellen Pao, center, sued her venture capital firm for allegedly discriminating against her because she was a woman.</span>
<span class="attribution"><span class="source">AP Photo/Jeff Chiu</span></span>
</figcaption>
</figure>
<h2>‘Pipeline’ problem</h2>
<p>Another explanation for the lack of funding for women is pinned on the “pipeline” problem. That is, women just aren’t interested in the fields that form the backbone of the industry – science, technology, engineering and math. </p>
<p>Thus, if more women entered <a href="https://theconversation.com/us/topics/stem-8868">STEM fields</a>, there would be more women entrepreneurs, and more money would flow to them. Pipeline explanations assume that there are no obstacles <a href="https://theconversation.com/us/topics/women-in-stem-20447">preventing women</a> from becoming entrepreneurs in technology.</p>
<p>Yet we know the opposite is true. As technology historian Marie Hicks documents in her book “Programmed Inequality,” <a href="https://mitpress.mit.edu/books/programmed-inequality%22%22">women in tech were pushed out by men</a>. </p>
<p>Research I’ve conducted with management professor Susan Clark Muntean on entrepreneur support organizations, such as accelerators, shows that they often engage in outreach and recruitment tactics that <a href="https://doi.org/10.1111/gwao.12225">benefit men rather than women</a>. This is further supported by <a href="https://www.techstars.com/content/blog/diversity-at-techstars-companies/">survey data from Techstars</a>, one of the best-known and respected tech accelerators in the world. About 4 in 5 companies that have gone through their programs are white and almost 9 in 10 are male. </p>
<h2>‘Gender-neutral’ myth</h2>
<p>And yet these tech accelerators are guided by an implicit understanding that gender-neutral outreach and recruitment practices rather than targeted ones will bring in the “best” people. This notion is often expressed as <a href="https://onlinelibrary.wiley.com/doi/epdf/10.1111/gwao.12225">“Our doors are open to everyone”</a> to indicate that they do not discriminate.</p>
<p>Ironically, many organizations in the tech sector <a href="http://icic.org/wp-content/uploads/2016/05/ICIC_JPMC_Incubators_post.pdf">adopt this idea</a> because they believe it is gender-neutral and, thus, unbiased. </p>
<p>Yet claiming to be gender-neutral prevents organizations from recognizing that their practices are actually biased. Most outreach and recruitment takes place through word-of-mouth, alumni referrals and personal networks of accelerator leadership, which are <a href="https://doi.org/10.1111/gwao.12225">predominantly male</a>.</p>
<p>These approaches often bring in more of the same: white male entrepreneurs rather than diverse professionals. As a result, women do not have equal access to resources in entrepreneurial ecosystems.</p>
<p>And all this is despite the fact that data on returns show venture-backed tech startups with women at the helm <a href="https://www.womenwhotech.com/startupinfographic">outperform</a> those led by men. </p>
<h2>Being ‘gender-aware’</h2>
<p>The first step to solving this problem is for tech startups, investors and accelerators to realize that what they call meritocracy is in fact itself gender-biased and results in mostly white men gaining access to resources and funding. As long as they continue to believe in meritocracy and maintain the practices associated with it, gender equality will remain a distant goal. </p>
<p>The next step is to move away from gender-neutral approaches and instead adopt <a href="https://www.weforum.org/agenda/2018/01/metoo-sexual-harassment-what-experts-say/">“gender-aware,” proactive measures</a> to change unfair practices. This includes setting concrete goals to achieve gender balance, examining the gender composition of boards, committees and other influential groups in the organization, and assessing the tools and channels used for outreach, recruitment and support of entrepreneurs.</p>
<p>The return on investment in <a href="http://reports.weforum.org/global-gender-gap-report-2015/the-case-for-gender-equality/">gender equality</a> is clear: Supporting and investing in businesses started by half the world’s population will create thriving societies and sustainable economies. And it starts with male allies who want to be part of the solution and recognize that meritocracy, as society currently defines it, isn’t the way to go.</p><img src="https://counter.theconversation.com/content/94269/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Banu Ozkazanc-Pan receives funding from The Ewing Marion Kauffman Foundation.</span></em></p>Americans’ widespread belief that they live in a meritocracy where anyone can get ahead actually makes inequality even worse, particularly in terms of gender.Banu Ozkazanc-Pan, Visiting Associate Professor of Engineering, Brown UniversityLicensed as Creative Commons – attribution, no derivatives.