Technology backlash – The Conversation

Machines can’t ‘personalize’ education, only people can (2021-05-17)
<figure><img src="https://images.theconversation.com/files/400572/original/file-20210513-16-1h6flwv.jpg?ixlib=rb-1.1.0&rect=834%2C46%2C2989%2C1804&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Schools are facing accelerated COVID-19 pressures to integrate technology into children's education, and how they do so has far-reaching implications. </span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>In the past year, COVID-19 abruptly disrupted schooling and forced the question of <a href="https://globalnews.ca/news/7726753/covid-19-online-in-person-school-choice-2021-2022/">how much kindergarten to Grade 12 education should or will rely on online teaching in the near and distant future</a>. Education has taken a decided technological turn in its massive adaptation to online learning. This is precipitating a critical debate in education right now, one whose outcome is far from certain and on which much depends. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ontarios-choice-of-fully-online-school-would-gamble-on-children-for-profit-158292">Ontario's ‘choice’ of fully online school would gamble on children for profit</a>
</strong>
</em>
</p>
<hr>
<p>One key concern when considering both online learning and the tech platforms teachers may rely on in classrooms is a long-standing issue of how education should accommodate student individuality. For at least 150 years, education in the western world has been <a href="https://books.google.ca/books/about/The_Underground_History_of_American_Educ.html?id=p55tQgAACAAJ">conflicted over this issue</a>. </p>
<p>Education advocates like homeschooling champion <a href="https://simplycharlottemason.com/what-is-the-charlotte-mason-method/">Charlotte Mason</a> and <a href="https://www.britannica.com/biography/John-Dewey">education reformer John Dewey</a> advocated for recognition of students as unique persons whose interests and backgrounds shaped them in particular ways. Writing in 1897, Dewey argued it was <a href="https://books.google.ca/books?id=EgwVAAAAIAAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false">critical for educators to note and consider students’ unique qualities when designing curriculum</a>. </p>
<p>Mason’s and Dewey’s philosophies and the schooling approaches they advocated helped spur <a href="https://journal.jctonline.org/index.php/jct/article/view/807">educational debates about the meaning of “personalized learning.”</a> These also pitted them against others like scientific management guru <a href="https://www.bl.uk/people/frederick-winslow-taylor">Frederick Taylor</a> who argued for mass standardization in education. </p>
<p>This conflict remains central to education debates unfolding today. For example, while some proponents of remote learning argue <a href="https://www.d2l.com/en-apac/blog/personalize-learning-digital-classroom/">teachers can still offer personalized learning online</a>, there are also industries built around the notion that <a href="https://www.edweek.org/technology/q-a-the-promise-and-pitfalls-of-artificial-intelligence-and-personalized-learning/2019/11">AI can “personalize” student experiences</a>. But machines aren’t persons.</p>
<p>Emerging research <a href="https://edsource.org/2020/disappointing-grades-technology-glitches-and-glimpses-of-learning-fun/641615">shows wide variability in student experiences</a> across technology-based approaches and platforms. Even when particular teachers are successful in delivering remote learning with students’ personal <a href="https://www.transformativelearningfoundation.org/faculty/michael-maser-v2/">and holistic interests</a> in mind, they are working in an educational context with <a href="https://theconversation.com/tax-pandemic-profiteering-by-tech-companies-to-help-fund-public-education-155705">increased marketing, uptake and profiting from educational technologies</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/teachers-have-been-let-down-by-a-decade-of-inaction-on-digital-technologies-142938">Teachers have been let down by a decade of inaction on digital technologies</a>
</strong>
</em>
</p>
<hr>
<p>Specific tech “solutions” like buying particular software for schools are often “Taylorist” insofar as the school or classroom becomes committed to a particular way of interacting and learning. In some cases, <a href="https://www.nytimes.com/2019/04/21/technology/silicon-valley-kansas-schools.html">school communities come to complain that personal contact has been replaced with computerization</a>. </p>
<p>Technology surely has a role in education, but determining what it will be, and whose interests it will really serve, is a critical public debate. To this end, here are three thinkers who can help guide parents, educators and administrators in considering how education can adapt to changing technological circumstances while centering students as people and fostering caring human communities. </p>
<h2>1. Nel Noddings</h2>
<p>In her <a href="https://www.ucpress.edu/book/9780520275706/caring">ground-breaking book, <em>Caring</em></a>, educational ethicist Nel Noddings describes the importance of seeing and “confirming” students as persons. Noddings says such “confirmation” elicits a practice of dialogue in which educators “see and receive the other” as they really are, as a teaching and moral responsibility. </p>
<p>I believe that truly “seeing” and acknowledging students is a feasible response in videoconferencing environments like Zoom and should be recognized as a best practice. The same is also true for how educators direct students to apps that let them pursue learning activities reflecting personal choices: for example, platforms like DIY.org, Khan Academy, YouTube and others. Teachers can and should validate students’ particular interests as they engage these sources.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/vkmYzbwrufg?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Interview with Iain McGilchrist on the divided brain and the search for meaning.</span></figcaption>
</figure>
<h2>2. Iain McGilchrist</h2>
<p>In a recent text, <a href="https://www.taylorfrancis.com/chapters/ways-attending-iain-mcgilchrist/e/10.4324/9781003049876-2">“Ways of attending: How our divided brain constructs the world</a>,” Scottish neuroscientist Iain McGilchrist asserts that technological thinking and <a href="https://www.youtube.com/watch?v=dFs9WO2B8uI">compartmentalization have come to dominate human thinking</a>. </p>
<p>This is thinking rooted in the brain’s left hemisphere and exemplified by mathematical reasoning and rationalization. He says the brain’s right hemisphere, responsible for whole-person, big-picture thinking, and moral decision-making, plays a secondary role. McGilchrist contends that new digital technologies driven by machine logic are effectively hijacking human attention, forcing us to become more machine-like. </p>
<p>McGilchrist advises everyone to study how we are interacting with technology to better understand how technology is influencing behaviours, including how it distracts us and channels our attention. If we don’t better perceive this, he warns, we risk becoming increasingly alienated from the feelings and moral decision-making that define our humanity. </p>
<h2>3. Ursula Franklin</h2>
<p>Scientist, <a href="https://alchetron.com/Ursula-Franklin">acclaimed humanitarian</a> and pacifist Ursula Franklin described in her <a href="https://www.cbc.ca/radio/ideas/the-humane-world-of-ursula-franklin-a-scientist-who-wanted-us-to-question-technology-1.5825485">1989 Massey Lecture series and book</a>, <a href="https://houseofanansi.com/products/the-real-world-of-technology-digital"><em>The Real World of Technology</em></a>, how the Industrial Revolution set in motion technological processes, like assembly lines, that ushered in sweeping societal changes.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A woman at a microphone." src="https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=869&fit=crop&dpr=1 600w, https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=869&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=869&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1092&fit=crop&dpr=1 754w, https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1092&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/400578/original/file-20210513-13-5o3tdl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1092&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ursula Franklin speaks in Ottawa in July 1970. Franklin argued that money spent by Canada on warfare research could be better spent on environmental research.</span>
<span class="attribution"><span class="source">CP PHOTO/Chuck Mitchell</span></span>
</figcaption>
</figure>
<p>She characterized such processes as “prescriptive” in how they engineered human behaviour through compliance and conditioning, resulting in an “enormous social mortgage.” Franklin contrasted prescriptive technologies with “holistic” technologies that are controlled by an individual user, like personal craftsmanship. </p>
<p>To Franklin, holistic technologies enable people to enact caring gestures, and are spontaneous and flexible, where prescriptive technologies are rigid and mechanistic. Franklin’s philosophy points to the idea that we should recognize the limits and power of technology. </p>
<p>Franklin’s insights should lead us to remember that while <a href="https://link.springer.com/chapter/10.1007%2F978-3-030-13743-4_9">collaboration amongst students can be enhanced in technological environments</a>, some education researchers also caution that technological tools themselves don’t create holistic, inclusive or creative communities. Only humans can do this. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/digital-platforms-alone-dont-bridge-youth-divides-121222">Digital platforms alone don't bridge youth divides</a>
</strong>
</em>
</p>
<hr>
<h2>Serving people</h2>
<p>The insights of Noddings, McGilchrist, Franklin and others urge us to deeply consider the technologies we choose to use in our schools and what role they play. This does not mean that we reject the integration of technology into education. I believe many educators have demonstrated it is possible to strike a healthy balance when integrating technology with educational goals. </p>
<p>But future educational paths will reflect choices we make now. In facing today’s unprecedented challenges, educators and school administrators must continue to support education as an endeavour that holds at its core the mission of serving all people.</p>
<p class="fine-print"><em><span>Michael Maser has previously received funding from Mitacs.</span></em></p>
Insights of neuroscientist Iain McGilchrist, philosopher Nel Noddings and physicist Ursula Franklin help centre students and our collective future in debates about education and technology.
Michael Maser, PhD candidate, Faculty of Education, Simon Fraser University
Licensed as Creative Commons – attribution, no derivatives.

E-scooters could disrupt travel as we know it – expect the car industry to fight back (2020-07-03)
<figure><img src="https://images.theconversation.com/files/345523/original/file-20200703-29-7cyc8v.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C7940%2C4595&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/young-man-night-electric-scooter-escooter-1457367524">R.Classen/Shutterstock</a></span></figcaption></figure><p>Does the road out of lockdown look like a motorway or a cycle path? With the UK government announcing <a href="https://www.gov.uk/government/consultations/legalising-rental-e-scooter-trials-defining-e-scooters-and-rules-for-their-use/legalising-rental-e-scooter-trials">a rental e-scooter trial</a> in <a href="https://www.bbc.co.uk/news/uk-england-tees-53272688">cities nationwide</a>, it’s possible that the transport system that emerges from the pandemic will look quite different from the one we had before.</p>
<p>From Saturday July 4 2020, rental e-scooters will be <a href="https://www.bbc.co.uk/news/uk-53219331">allowed on British roads</a>. The Department for Transport brought forward the trial by a year, hoping to help reduce congestion as the UK emerges from lockdown. There are also plans to relaunch an <a href="https://www.theverge.com/2020/7/3/21312011/lime-relaunches-jump-electric-bikes-london-scooters">electric bike-share service</a> in London.</p>
<p>Amid fears of COVID-19 transmission on public transport, <a href="https://theconversation.com/cars-transition-from-lockdown-is-a-fork-in-the-road-here-are-two-possible-outcomes-for-future-travel-139885">car use in the UK has surged</a>. It’s hoped that fleets of electric scooters and bikes in cities could help replace cars and address the “<a href="https://maas-alliance.eu/how-micro-mobility-solves-multiple-problems-in-congested-cities/">first mile-last mile</a>” problem, where users currently have to travel by car to reach their train station or bus stop, or get home from them. </p>
<p>But perhaps the automobile industry feels threatened. The French advertising standards authority recently banned a Dutch e-bike advert for creating “<a href="https://www.theguardian.com/media/2020/jul/01/france-bans-dutch-bike-tv-ad-for-creating-climate-of-fear">a climate of fear</a>” around cars.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/kMpqVfnuyII?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>If history is any guide, this spat between the hegemonic car industry and insurgent e-bike and scooter companies will be fought again and again over the coming years, both in public and in private. Ultimately, it’s a familiar battle between those who benefit from the status quo and those who, for various reasons, offer change.</p>
<h2>Battles over legitimacy</h2>
<p>Changes in the way a society organises things like transport, energy or food are sometimes called “socio-technical transitions”. Take sailing ships. These vessels became bigger and more sophisticated over the course of hundreds – if not thousands – of years and then, suddenly, steamships came in and replaced them within decades. The transition was complete by 1900. </p>
<p>Similar transitions around the provision of heat and light took place in the 19th and 20th centuries. The <a href="https://www.businessinsider.com/thomas-edison-light-bulb-publicity-stunt-2013-11?r=US&IR=T">famous exploits</a> of Thomas Edison show how those trying to drive a transition are keen to gain public approval for their technology. They can do this by allaying fears over their novel product, or by smearing the alternative. </p>
<p>When battling to create markets for electricity, Edison made light bulbs <a href="https://journals.sagepub.com/doi/abs/10.2307/3094872">in the shape of a flame</a> to mimic the already common gas light. He also <a href="https://www.businessinsider.com/edison-financed-the-electric-chair-2014-7?r=US&IR=T">launched a campaign</a> of publicly electrocuting animals to demonstrate that alternating current electricity, favoured by rival businessman George Westinghouse, was too dangerous for consumers to trust.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/345524/original/file-20200703-21-rwr8h2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Light bulbs look that way for a reason.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/glowing-edisons-light-bulbs-on-dark-370383596">Tereshchenko Dmitry/Shutterstock</a></span>
</figcaption>
</figure>
<p>The history of transport innovations is no different. In Britain in 1865, with the advent of mechanically propelled vehicles on public highways, the government passed the “<a href="https://www.britannica.com/topic/Locomotives-on-Highways-Act">Red Flag Act</a>”, which made sure a man with a red flag walked before road vehicles hauling multiple wagons, in order to warn passersby. It may sound ridiculous in our world of constant traffic, but the public needed to be eased into accepting the transition from horse and cart and footpaths to cars, lorries and motorways.</p>
<p>As their products are rolled out across the UK, e-scooter and e-bike businesses may have to answer <a href="https://www.bbc.co.uk/news/uk-england-london-51283256">questions about their safety</a> too. But even their analogue ancestors faced this kind of scrutiny, though not all of it reasonable. </p>
<p>A sexist medical scare in late Victorian Britain saw doctors warning women about the effect of vigorous cycling on their health, as “over-exertion… and the unconscious effort to maintain one’s balance” was thought to cause “<a href="https://www.vox.com/2014/7/8/5880931/the-19th-century-health-scare-that-told-women-to-worry-about-bicycle">bicycle face</a>”, a combination of “hard, clenched jaws and bulging eyes”.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=396&fit=crop&dpr=1 600w, https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=396&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=396&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=497&fit=crop&dpr=1 754w, https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=497&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/345525/original/file-20200703-21-lniqbg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=497&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">‘Bicycle face’ almost discredited bikes among female riders in Britain.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Bicycle#/media/File:Women_on_bicycles,_late_19th_Century_USA.jpg">JGKlein/Wikipedia</a></span>
</figcaption>
</figure>
<p>There’s even a story that the London Underground hired a one-legged man named William “Bumper” Harris in 1911 to <a href="https://www.mylondon.news/news/nostalgia/unbelievable-story-behind-first-ever-18070658">ride the new-fangled “moving staircase”</a> (what we’d nowadays call an escalator) up and down each day to reassure frightened commuters.</p>
<p>So, given what we know about how legitimacy battles over new technology have played out in the past, what can we expect today?</p>
<h2>Winning hearts and minds</h2>
<p>Viral stories about <a href="https://www.commentarymagazine.com/noah-rothman/driver-less-cars-moral-panic/">accidents</a> or <a href="https://www.bloomberg.com/news/articles/2019-01-10/when-electric-scooters-crash-who-s-to-blame">insurance woes</a> are likely, but market opponents might also take the opportunity to cast doubt on the <a href="https://theconversation.com/are-shared-e-scooters-good-for-the-planet-only-if-they-replace-car-trips-121166">green credentials of electric scooters and bikes</a>. Already e-scooter companies have tried to <a href="https://www.bloomberg.com/news/articles/2020-06-29/a-survival-plan-for-electric-scooter-startups">portray themselves</a> as both green and diverse in response.</p>
<p>Undoubtedly, some people might try to stoke privacy concerns, as businesses attempt to <a href="https://www.theverge.com/2020/6/8/21284490/aclu-ladot-mds-lawsuit-scooter-tracking-uber">capture tracking data</a> of people using scooter and bike-sharing services. Such real-time data of peoples’ movements is <a href="https://streetfightmag.com/2020/07/02/why-you-should-be-using-a-demand-side-platform-for-location-advertising/?mc_cid=19abf69295&mc_eid=87b988c129#.Xv3ONm1Kipp">a goldmine for advertisers</a>.</p>
<p>It’s important not to paint this simply as David versus Goliath though. Many of the <a href="https://www.volkswagen.co.uk/electric?adchan=sem&campaign=generic&adgroup=generic&publisher=google&adpl=google&country=GB&language=en&gclsrc=aw.ds&&gclid=EAIaIQobChMIqJylwpau6gIV2evtCh1lYgT9EAAYASAAEgKsqvD_BwE&mkwid=s_373447802874_%2Belectric%20%2Bvehicle_b_c&mtid=vdvv2y1xd0&slid=&product_id=">big automakers are actively exploring</a> how to tap into the market for smaller electric vehicles.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/345526/original/file-20200703-33939-1orjmab.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">E-bikes could be used widely by courier services.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/male-courier-bicycle-delivering-packages-city-1316331548">Halfpoint/Shutterstock</a></span>
</figcaption>
</figure>
<p>Sooner or later, the trials of e-scooters will end and a decision will be made. Not all <a href="https://link.springer.com/article/10.1016/j.thbio.2005.11.002">hyped technologies</a> succeed: the much-vaunted <a href="https://www.bbc.co.uk/news/business-53160518">Segway is ceasing production</a>. The same fate probably awaits some of the many electric personal transport devices on sale.</p>
<p>In the meantime, it may well be battles over policy that matter most. Pavements, cycle lanes and roads are “<a href="https://www.bbc.co.uk/news/av/uk-england-london-53261402/cars-mount-pavement-to-avoid-lewisham-road-barrier">all up for grabs</a>” as streets are reshaped in the post-lockdown world.</p>
<p>Whether e-scooters and other small electric vehicles are allowed to share this planned cycling infrastructure could be pivotal. <a href="https://www.taur.com/post/riding-electric-scooters-in-bike-lanes-makes-us-all-safer">Scooter companies want it</a>. Others <a href="https://www.reddit.com/r/londoncycling/comments/as06xa/scooters_in_cycle_lanes/">aren’t so sure</a>.</p>
<p>Whatever happens, expect <a href="https://www.cambridge.org/core/books/power-in-movement/cycles-of-contention/4822FECC7E62E9D1D067235F6F03B027">a bumpy ride</a>.</p>
<p class="fine-print"><em><span>Marc Hudson is a member of Climate Emergency Manchester, which has been campaigning for Manchester City Council to install pop-up cycle lanes as a social justice response to COVID-19 lockdown.</span></em></p>
E-scooters and e-bikes are coming to Britain’s streets, but it may be a bumpy ride.
Marc Hudson, Research Associate in Social Movements, Keele University
Licensed as Creative Commons – attribution, no derivatives.

Don’t ban new technologies – experiment with them carefully (2019-08-22)
<figure><img src="https://images.theconversation.com/files/284800/original/file-20190718-116586-1m4325e.jpg?ixlib=rb-1.1.0&rect=0%2C106%2C3968%2C2863&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's a mess, but is it all bad?</span> <span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Scooters_on_the_sidewalk.jpg">EHFXC/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>For many years, Facebook’s internal slogan was “<a href="https://www.wired.com/2014/04/zuckerberg-f8-interview/">move fast and break things</a>.” And that’s what the company did – along with most other Silicon Valley startups and the venture capitalists who fund them. Their general attitude is one of asking for forgiveness after the fact, rather than for permission in advance. Though this can allow for some bad behavior, it’s probably the right attitude, philosophically speaking.</p>
<p>It’s true that the try-first mindset has frustrated the public. Take the Lime scooter company, for instance. The company launched its scooter sharing service in multiple cities <a href="https://www.vox.com/2018/8/30/17690056/scooters-bird-lime-san-francisco-santa-monica-permits-uber-lyft">without asking permission</a> from local governments. Its electric scooters <a href="https://www.latimes.com/local/lanow/la-me-ln-bird-scooter-vandalism-20180809-story.html">don’t need base stations or parking docks</a>, so the company and its <a href="https://www.npr.org/2018/07/28/631812255/scooters-sidewalk-nuisances-or-the-future-of-public-transportation">customers can leave them anywhere</a> for the next person to pick up – even if that’s <a href="https://www.washingtonpost.com/business/economy/pedestrians-and-e-scooters-are-clashing-in-the-struggle-for-sidewalk-space/2019/01/11/4ccc60b0-0ebe-11e9-831f-3aa2c2be4cbd_story.html">in the middle of a sidewalk</a>. This <a href="https://arstechnica.com/tech-policy/2018/11/bird-sues-beverly-hills-argues-it-cant-ban-e-scooters-even-for-6-months/">general disruption</a> has led to <a href="https://wtop.com/dc/2019/06/dc-council-proposes-legislation-to-set-boundaries-on-e-scooter-companies/">calls to ban the scooters</a> in <a href="https://newschannel9.com/news/local/mayor-briley-decides-to-ban-electric-scooters-in-nashville">cities around the country</a>.</p>
<p>Scooters are not alone. <a href="https://qz.com/1084981/map-all-the-places-where-uber-is-partially-or-fully-banned/">Ridesharing services</a>, <a href="https://www.bloomberg.com/news/articles/2018-03-27/uber-s-autonomous-cars-suspended-by-arizona-after-fatal-crash">autonomous cars</a>, <a href="https://issues.org/perspective-should-artificial-intelligence-be-regulated/">artificial intelligence systems</a> and <a href="https://arstechnica.com/tech-policy/2019/03/sorry-amazon-philadelphia-bans-cashless-stores/">Amazon’s cashless stores</a> have also all been targets of bans (or proposed bans) in different states and municipalities before they’ve even gotten off the ground.</p>
<p>What these efforts have in common is what <a href="https://scholar.google.com/citations?user=6pDPD_gAAAAJ&hl=en">philosophers like me</a> call the “<a href="https://doi.org/10.1017/CBO9781139939652">precautionary principle</a>,” the idea that new technologies, behaviors or policies should be banned until their supporters can demonstrate that they will not result in any significant harms. It’s the same basic idea Hippocrates had in ancient Greece: Doctors should “<a href="https://www.health.harvard.edu/blog/first-do-no-harm-201510138421">do no harm</a>” to patients. </p>
<p>The precautionary principle entered the political conversation <a href="http://lawdigitalcommons.bc.edu/iclr/vol14/iss1/2">in the 1980s</a> in the context of environmental protection. Damage to the environment is hard – if not impossible – to reverse, so it’s prudent to seek to prevent harm from happening in the first place. But as I see it, that’s not the right way to look at most new technologies. New technologies and services aren’t creating irreversible damage, even though they do generate some harms.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/284805/original/file-20190718-116543-t7i244.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Environmental pollution is so harmful and hard to clean up that precautions are useful.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/lahad-datusabahmalaysia-taken-on-8-jjune-1453585367">imrankadir/Shutterstock.com</a></span>
</figcaption>
</figure>
<h2>Precaution has its place</h2>
<p>As a general concept, the precautionary principle is essentially conservative. It privileges existing technologies over new ones – even when the new technologies facing preemptive bans are safer overall. </p>
<p>This approach also runs counter to the most basic idea of liberalism, in which people are <a href="https://www.archives.gov/founding-docs/declaration">broadly allowed to do what they want</a>, unless there’s a rule against it. This is limited only when our right to free action <a href="https://quoteinvestigator.com/2011/10/15/liberty-fist-nose/">interferes with someone else’s rights</a>. The precautionary principle reverses this, banning people from doing what they want, unless it is specifically allowed.</p>
<p>The precautionary principle makes sense when people are talking about some issues, like the environment or public health. It’s easier to prevent air pollution or trash dumped in the ocean than to clean up afterward. Similarly, giving children drinking water that’s contaminated with lead has effects that aren’t reversible. The children simply must <a href="https://www.who.int/ceh/publications/leadguidance.pdf">deal with the health effects of their exposure</a> for the rest of their lives.</p>
<p>But as much of a nuisance as dockless scooters might be, they aren’t the same as poisoned water. </p>
<h2>Managing the effects</h2>
<p>Of course, dockless scooters, autonomous cars and a whole host of new technologies do generate real harms. A Consumer Reports investigation in early 2019 found <a href="https://www.consumerreports.org/product-safety/e-scooter-ride-share-industry-leaves-injuries-and-angered-cities-in-its-path/">more than 1,500 injuries from electric scooters</a> since the dockless companies were founded. That’s in addition to the more common nuisance of having to step over <a href="https://www.latimes.com/opinion/readersreact/la-ol-le-lime-bird-ant-scooters-20180818-story.html">scooters carelessly left</a> in the middle of the sidewalk – and the difficulties <a href="https://pilotonline.com/life/social-issues/article_d6f87e76-9cbd-11e9-a085-53857d29dd05.html">people using wheelchairs</a>, crutches, strollers or walkers may have in getting around them.</p>
<p>Those harms are not nothing, and can help motivate arguments for banning scooters. After all, they can’t hurt anyone if they’re not allowed. What’s missing from those figures, however, is how many of those people riding scooters would have gotten into a car instead. Cars are <a href="https://www.nsc.org/road-safety/safety-topics/fatality-estimates">far more dangerous</a> and far worse for the environment.</p>
<p>Yet the precautionary principle isn’t right for cars, either. As the number of <a href="https://theconversation.com/helping-autonomous-vehicles-and-humans-share-the-road-68044">autonomous cars</a> on the road climbs, they’ll be involved in an <a href="https://theconversation.com/after-fatality-autonomous-car-development-may-speed-up-63488">increasing number of crashes</a>, which will no doubt get lots of media attention.</p>
<p>It is worth keeping in mind that autonomous cars will have been a <a href="https://theconversation.com/redefining-safety-for-self-driving-cars-87419">wild technology success</a> even if they are involved in millions of crashes every year – so long as they improve on the <a href="https://www.iii.org/fact-statistic/facts-statistics-highway-safety">6.5 million crashes and 1.9 million serious injuries</a> recorded in 2017.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/alnDYYwAs74?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A look at the precautionary principle in environmental regulation.</span></figcaption>
</figure>
<h2>Disruption brings benefits too</h2>
<p>It may also be helpful to remember that dockless scooters and ridesharing apps and any other technology that displaces existing methods can really only become a nuisance if a lot of people use them – that is, if many people find them valuable. Injuries from scooters, and the number of scooters left lying around, have increased because the <a href="https://nacto.org/shared-micromobility-2018/">number of people using them has skyrocketed</a>. Those 1,500 reported injuries are <a href="https://nacto.org/shared-micromobility-2018/">from 38.5 million rides</a>.</p>
<p>This is not, of course, to say that these technologies and the firms that produce them should go unregulated. Indeed, a number of these firms have behaved quite poorly, and have legitimately created some harms, which should be regulated. </p>
<p>But instead of preemptively banning things, I suggest continuing to rely on the standard approach in the liberal tradition: See what kinds of harms arise, handle the early cases via the court system, and then consider whether a pattern of harms emerges that would be better handled upfront by a new or revised regulation. The <a href="https://www.cpsc.gov/">Consumer Product Safety Commission</a>, which looks out for dangerous consumer goods and holds manufacturers to account, is an example of this.</p>
<p>Indeed, laws and regulations already cover littering, abandoned vehicles, negligence and assault. New technologies may just introduce new ways of generating the same old harms, ones that are already reasonably well regulated. Genuinely new situations can of course arise: <a href="https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html">San Francisco’s ban on municipal use of facial recognition technologies</a> may well be sensible, as people quite reasonably can democratically decide that the state shouldn’t be able to track their every move. People might well decide that companies shouldn’t be able to either.</p>
<p>Silicon Valley’s CEOs aren’t always <a href="https://theconversation.com/what-the-google-gender-manifesto-really-says-about-silicon-valley-82236">sympathetic characters</a>. And “disruption” really can be <a href="https://www.theguardian.com/commentisfree/2019/jun/19/tim-cook-if-youve-built-a-chaos-factory-you-cant-dodge-responsibility-for-the-chaos">disruptive</a>. But liberalism is about innovation and experimentation and finding new solutions to humanity’s problems. Banning new technologies – even ones as trivial as dockless scooters – embodies a conservatism that denies that premise. A lot of new ideas aren’t great. A handful are really useful. It’s hard to tell which is which until we try them out a bit.</p>
<p class="fine-print"><em><span>Ryan Muldoon does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Ryan Muldoon, Associate Professor of Philosophy, University at Buffalo. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Bring on the technology bans!</h1>
<p class="fine-print"><em>Published 2019-08-19.</em></p>
<figure><img src="https://images.theconversation.com/files/284115/original/file-20190715-173342-1ji5sep.jpg?ixlib=rb-1.1.0&rect=65%2C19%2C4300%2C2873&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Is there still time to reach the 'off' button?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/emergency-red-stop-button-activated-by-341231540">Raul Topan/Shutterstock.com</a></span></figcaption></figure><p>In mid-July 2019, Oakland, California, <a href="https://www.vice.com/en_us/article/zmpaex/oakland-becomes-third-us-city-to-ban-facial-recognition-xz">became the third U.S. city</a> to ban municipal departments from using facial recognition technology. Meanwhile, <a href="https://oversight.house.gov/legislation/hearings/facial-recognition-technology-part-1-its-impact-on-our-civil-rights-and">Congress began hearings</a> on whether and how to regulate it on a national level. In a surprising moment of bipartisan consensus, the only thing <a href="https://thehill.com/opinion/technology/446726-facial-recognition-surveillance-in-congresss-crosshairs">lawmakers fought about</a> was how extensive restrictions ought to be.</p>
<p>This response to a powerful, potentially invasive technology is a sign of how the public and policymakers might respond to future technological developments – especially those using artificial intelligence. Not only does facial recognition allow Facebook to automate people-tagging in photos, but it also <a href="https://www.nytimes.com/2019/05/18/us/facial-recognition-police.html">supercharges law enforcement’s ability</a> to track down crime suspects. Ethical questions abound. As Georgetown’s Center on Privacy and Technology put it, facial recognition could lead to “a world where, once you set foot outside, <a href="https://www.americaunderwatch.com/">the government can track your every move</a>.” And it’s just the beginning.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/283898/original/file-20190712-173370-1lqmv8o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Cameras are already watching many American streets.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Surveillance/9a415a9f9a1a45039592943ccecf1e21/1/0">AP Photo/Matt Rourke</a></span>
</figcaption>
</figure>
<p>On the horizon is a flood of digital innovations that could be at least as powerful, wide-ranging and controversial: “<a href="https://theconversation.com/detecting-deepfake-videos-in-the-blink-of-an-eye-101072">deepfake</a>” videos showing people doing things they never did, the “<a href="https://theconversation.com/4-ways-internet-of-things-toys-endanger-children-94092">internet of things</a>” constantly monitoring private homes, <a href="https://theconversation.com/think-facebook-can-manipulate-you-look-out-for-virtual-reality-93118">manipulative virtual reality</a>, <a href="https://theconversation.com/safe-efficient-self-driving-cars-could-block-walkable-livable-communities-103583">self-driving cars overwhelming communities</a> and more.</p>
<p>I’m a researcher studying <a href="https://scholar.google.com/citations?user=JdxjEQIAAAAJ&hl=en&oi=ao">digital technology’s societal impacts</a>, and it’s my job to stay informed about upcoming technologies and to project future outcomes. But, with more and more innovation, there is less and less time to reflect on the consequences. Many of my colleagues feel the same.</p>
<p>To tame this onrushing tide, society needs dams and dikes. Just as has begun to happen with facial recognition, it’s time to consider legal bans and moratoriums on other emerging technologies. These need not be permanent or absolute, but innovation is not an unmitigated good. The more powerful a technology is, the more care it requires to operate safely.</p>
<h2>Little urgency</h2>
<p>There’s not a pressing need for most new digital technologies. Some innovations, of course, are almost completely positive: anesthesia, electric light, radio, vaccines. But today’s society often celebrates innovation for its own sake, even when the benefits are questionable – and, more and more, they are.</p>
<p>Is it really worth a <a href="https://theconversation.com/drones-to-deliver-incessant-buzzing-noise-and-packages-116257">crowded, buzzing sky filled with drones</a> to get <a href="https://www.thespectrum.com/story/news/local/mesquite/2019/06/24/amazon-start-drone-delivery-within-months/1546797001/">one-hour delivery</a> of consumer goods, instead of delivery in 24 hours, or even two days? Is virtual reality so great that children should, effectively, grow up with <a href="https://bigthink.com/kevin-dickinson/is-virtual-reality-dangerous-for-children">their eyes glued to video screens</a>? When governments can conduct hard-to-trace <a href="https://theconversation.com/losing-control-the-dangers-of-killer-robots-58262">assassinations by drone</a>, is anyone truly safe? Scanning <a href="https://gizmodo.com/10-ludicrously-advanced-technologies-we-can-expect-by-t-1788671727">lists of possible future technologies</a> can incite more fear than hope.</p>
<p>These types of innovations repeatedly fail to provide overall improvements in truly meaningful ways, like how deeply people love each other, how compassionately people care, how well society supports the less privileged, or how wisely humans steward the planet. If anything, technology appears to <a href="https://www.theatlantic.com/technology/archive/2011/03/technology-is-not-the-answer/73065/">amplify humans’ moral weaknesses</a> by coddling people with consumer comforts and echo chambers. The last half-century has seen a golden age of digital innovation, yet <a href="https://www.theatlantic.com/technology/archive/2011/03/technology-is-not-the-answer/73065/">rates of poverty have stagnated</a>, <a href="https://www.cnbc.com/2018/07/19/income-inequality-continues-to-grow-in-the-united-states.html">inequality has soared</a> and <a href="https://www.bp.com/content/dam/bp/business-sites/en/global/corporate/pdfs/energy-economics/statistical-review/bp-stats-review-2018-full-report.pdf">sustainability seems farther</a> out of reach. </p>
<p>Most of the technological advances in the works today won’t address those problems; they’ll tackle smaller annoyances that there’s simply no rush to relieve.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/283899/original/file-20190712-173370-kj7fxp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Plastic bottles sounded like a great idea, but they’re clogging oceans and beaches.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Ocean-Plastic-Business/075e081859de4acbb88cd37c7abaa633/11/0">AP Photo/Matt Dunham</a></span>
</figcaption>
</figure>
<h2>Harms nearly certain, but unclear</h2>
<p>New technologies always have <a href="http://www.edwardtenner.com/why_things_bite_back__technology_and_the_revenge_of_unintended_consequences_21108.htm">unintended consequences</a> – often negative – and innovators always underestimate how bad they’ll be. Pesticides have <a href="https://www.upress.pitt.edu/books/9780822954187/">caused public health scourges</a>. Plastic bottles have <a href="https://www.nationalreview.com/2018/09/plastic-bag-straw-bans-ocean-pollution/">polluted the oceans</a>. Smartphones are contributing to a <a href="https://theconversation.com/with-teen-mental-health-deteriorating-over-five-years-theres-a-likely-culprit-86996">teenage mental health crisis</a>.</p>
<p>Consider what an AI system might do if directed to do something obvious – like maximize profits, using all the information and tools at its disposal. It might hold embarrassing personal information for ransom to coerce users to purchase goods, or extort criminal actions from people with darker secrets. </p>
<p>Nothing has yet stopped online stores’ algorithms from <a href="https://www.nytimes.com/2019/06/24/technology/e-commerce-dark-patterns-psychology.html">lying to increase sales</a>, nor curbed <a href="https://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science">Facebook’s actual ability to manipulate users’ moods</a>. Tech companies routinely treat their <a href="https://venturebeat.com/2019/06/18/optimizely-raises-50-million-to-expand-its-a-b-testing-and-experimentation-platform/">customers as experimental guinea pigs</a>, and are already <a href="https://www.forbes.com/sites/forbesagencycouncil/2019/06/03/how-digital-agencies-can-adapt-to-an-ai-driven-industry/">applying artificial intelligence systems for a range of purposes</a>. </p>
<p>If these are just the known effects of tech companies’ efforts and innovations, imagine what unintended consequences might lurk. The premise of the popular game “<a href="http://www.decisionproblem.com/paperclips/index2.html">Universal Paperclips</a>” is that an AI focused on optimizing a business ends up <a href="https://www.theverge.com/tldr/2017/10/11/16457742/ai-paperclips-thought-experiment-game-frank-lantz">destroying the known universe</a>. Science fiction is rapidly becoming science fact.</p>
<h2>Difficult to go backwards</h2>
<p>Once unleashed, digital technologies are particularly difficult genies to put back in the bottle. In this respect, they differ from other advanced technologies. Soon after World War II, activists began to call for <a href="https://www.sup.org/books/title/?id=9646">bans on nuclear arms</a>, culminating in the <a href="https://theconversation.com/is-the-nuclear-nonproliferation-treaty-on-its-last-legs-119857">Non-Proliferation Treaty</a> in 1970. The treaty has been effective in keeping a decades-old technology limited to just eight or nine countries – an impressive feat, especially across the jagged history of global politics.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=406&fit=crop&dpr=1 600w, https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=406&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=406&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=511&fit=crop&dpr=1 754w, https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=511&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/283900/original/file-20190712-173360-5wdlgc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=511&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">U.S. and Soviet officials sign the Nuclear Non-Proliferation Treaty in 1968.</span>
<span class="attribution"><a class="source" href="https://no.usembassy.gov/obama-45th-anniversary-nuclear-non-proliferation-treaty/">US State Department</a></span>
</figcaption>
</figure>
<p>Nuclear weapons, however, require significant resources to design, build, test and deploy. By contrast, digital technologies are easy to share, making them even harder to control. Advanced hacking tools have been <a href="https://www.nytimes.com/2019/05/06/us/politics/china-hacking-cyber.html">stolen and shared online</a>: Techniques developed by the U.S. National Security Agency have been used in global cyberattacks by China, Russia and North Korea. That software is now available to anyone with an internet connection.</p>
<h2>An imbalance of power</h2>
<p>Technology companies pushing their advances have money, influence and time on their side. The <a href="https://www.nytimes.com/2019/06/05/us/politics/amazon-apple-facebook-google-lobbying.html">millions of lobbying dollars</a> they spend are pocket change when compared to their <a href="https://www.marketwatch.com/story/amazon-google-microsoft-and-intel-find-billions-more-in-profit-2017-10-26">multi-billion-dollar profits</a>, and they can keep the funding going indefinitely, waiting out news cycles and activist energy. </p>
<p>In my view, uncertainty about how new technologies will affect society overall means that skeptical forces deserve more support. Bans and moratoriums would mean that rich, powerful entities would have to seek legal and societal permission before unleashing their potential monsters onto the market. That doesn’t seem like too much to ask.</p>
<p>There are many reasons to continue to build new technologies – to remain globally competitive, to advance human knowledge and to prepare for potential future crises. Technology has its benefits. But slowing the pace of its advance would give society more time to think through the consequences and debate which aspects of new technologies are desirable, and which should be outlawed.</p>
<p class="fine-print"><em><span>Kentaro Toyama does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Kentaro Toyama, W. K. Kellogg Professor of Community Information, University of Michigan. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>3 myths to bust about breaking up ‘big tech’</h1>
<p class="fine-print"><em>Published 2019-07-17.</em></p>
<figure><img src="https://images.theconversation.com/files/284118/original/file-20190715-173342-1ji5sep.jpg?ixlib=rb-1.1.0&rect=45%2C0%2C5069%2C3376&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Before taking on tech giants, shatter a few misconceptions.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/iron-hammer-breaking-glass-window-340890053">W. Scott McGill/Shutterstock.com</a></span></figcaption></figure><p>As the public and government regulators around the world discuss <a href="https://www.npr.org/2019/06/09/731044346/big-tech-and-antitrust">whether and how</a> to manage the power of technology companies, one idea that keeps coming up is breaking up these large conglomerate corporations into smaller pieces. Public distrust for tech companies has shifted to talk of <a href="https://www.wsj.com/articles/justice-department-is-preparing-antitrust-investigation-of-google-11559348795">antitrust action</a> against them. Facebook, for instance, might then have to <a href="https://www.mercurynews.com/2018/05/21/facebook-owns-instagram-messenger-whatsapp-now-theres-a-call-to-break-it-all-up/">compete with Instagram for photo-sharing</a> and WhatsApp for messaging – rather than owning both. </p>
<p>The idea has managed to garner support from both <a href="https://www.politico.com/2020-election/candidates-views-on-the-issues/technology/tech-competition-antitrust/">Massachusetts Sen. Elizabeth Warren</a>, a Democrat, and <a href="https://www.nbcnews.com/politics/donald-trump/trump-claims-collusion-between-big-tech-democrats-backs-antitrust-fines-n1015726">Republican President Donald Trump</a>.</p>
<p>However, <a href="https://www.politico.com/2020-election/candidates-views-on-the-issues/technology/tech-competition-antitrust/">advocates</a> and <a href="https://www.weforum.org/agenda/2019/07/these-are-some-of-the-best-quotes-about-technology-monopolies-in-2019/">opponents</a> of breaking up big technology firms are falling prey to some serious misconceptions. I study the effects of digital technologies on lives and livelihoods across 85 countries and lead Tufts Fletcher School’s <a href="https://sites.tufts.edu/digitalplanet/">Digital Planet</a> initiative studying technological innovation around the world. In my opinion, there are three myths worth busting before considering taking on big tech. </p>
<h2>Myth 1: Comparing Standard Oil and Google</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=901&fit=crop&dpr=1 600w, https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=901&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=901&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1132&fit=crop&dpr=1 754w, https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1132&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/284102/original/file-20190715-173370-5ovggf.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1132&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">John D. Rockefeller, founder of Standard Oil.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:John_D_Rockefeller_1872.png">Urbanrenewal/Wikimedia Commons</a></span>
</figcaption>
</figure>
<p>Arguments for and against antitrust action against tech firms rely heavily on the <a href="https://www.nytimes.com/1998/10/19/business/microsoft-trial-precedents-previous-antitrust-cases-leave-room-for-both-sides.html">experiences of earlier cases</a>. The massive <a href="https://theconversation.com/for-tech-giants-a-cautionary-tale-from-19th-century-railroads-on-the-limits-of-competition-91616">19th-century monopoly Standard Oil</a> has, in fact, been referred to as the “<a href="https://www.nytimes.com/2018/02/20/magazine/the-case-against-google.html">Google of its day</a>.” There are also people who are recalling the 1990s <a href="https://www.nytimes.com/2018/05/18/opinion/microsoft-antitrust-case.html">antitrust case against Microsoft’s dominant position</a> in the era of personal computers. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=490&fit=crop&dpr=1 600w, https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=490&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=490&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=615&fit=crop&dpr=1 754w, https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=615&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/284103/original/file-20190715-173360-2qxmqd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=615&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Google co-founders Sergey Brin, left, and Larry Page.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Schmidt-Brin-Page-20080520_(cropped).jpg">Joi Ito/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Those cases from the past may seem similar to today’s situation, but this era is different in one crucial way: the global technology marketplace. Currently, there are two parallel “big tech” clusters. One is in the U.S., dominated by <a href="https://theconversation.com/big-tech-isnt-one-big-monopoly-its-5-companies-all-in-different-businesses-92791">Google, Amazon, Facebook and Apple</a>. The other is based in China, dominated by <a href="https://singularityhub.com/2018/08/17/baidu-alibaba-and-tencent-the-rise-of-chinas-tech-giants/">Baidu, Alibaba, Tencent and Huawei</a>. This global market is subject to different political and policy pressures than regulators faced when dealing with Standard Oil and Microsoft.</p>
<p>Both clusters are attempting to add users to <a href="https://hbr.org/2019/01/which-countries-are-leading-the-data-economy">accumulate reservoirs of data</a>, which will fuel the next stage of competitiveness in a future run by artificial intelligence. The Chinese government has blocked most of the U.S. companies from entering the Chinese market, protecting its “<a href="https://www.scmp.com/tech/china-tech/article/2120913/china-recruits-baidu-alibaba-and-tencent-ai-national-team">AI national team</a>.” The <a href="https://www.bloomberg.com/news/articles/2018-06-27/alibaba-pulls-back-in-u-s-amid-trump-crackdown-on-chinese-investment">U.S. government has done likewise</a>, blacklisting some Chinese outfits for a period while discouraging others.</p>
<p>If the U.S. technology giants are broken up, the result would be a vastly uneven global playing field, pitting fragmented U.S. companies against consolidated state-protected Chinese firms.</p>
<p>Geopolitical factors aren’t limited to the U.S.-China rivalry. The European Union, Russia and India are also heavy users of Silicon Valley technologies, and each is <a href="https://www.ft.com/content/3eb00398-9815-11e9-8cfb-30c211dcd229">exploring its own options</a> for legislation and regulation.</p>
<p>U.S. companies’ size and data accumulation capabilities give the country economic and political influence around the globe. Their power would change if they were broken up – and, in my view, that should be a key consideration in regulators’ decisions.</p>
<h2>Myth 2: Price is right</h2>
<p>There are two main views of antitrust action in these discussions. One focuses on consumer welfare, which has been the prevailing approach federal lawyers have taken <a href="https://www.jstor.org/stable/724991">since the 1960s</a>. The other view suggests that regulators should look at the <a href="https://www.yalelawjournal.org/note/amazons-antitrust-paradox">underlying structure of the market</a> and potential for <a href="https://www.pbwt.com/antitrust-update-blog/a-brief-overview-of-the-new-brandeis-school-of-antitrust-law">powerful players to exploit</a> their positions.</p>
<p>Those two sides seem to agree that price plays a key role. People who argue against breaking up the tech giants point out that Facebook and Google provide services that are <a href="https://slate.com/technology/2019/06/facebook-big-tech-antitrust-breakup-mistake.html">free to the consumer</a>, and that Amazon’s marketplace power drives its products’ costs down. On the other side, though, are those who say that <a href="https://www.yalelawjournal.org/note/amazons-antitrust-paradox">having low or no prices</a> is evidence that these companies are artificially lowering consumer costs to draw users into company-controlled systems that are <a href="https://techcrunch.com/2019/02/04/why-no-one-really-quits-google-or-facebook/">hard to leave</a>.</p>
<p>Both sides are missing the fact that monetary price is a less relevant measure of what users pay in the technology industry than it is in other types of business. Users <a href="https://theconversation.com/how-much-is-your-data-worth-to-tech-companies-lawmakers-want-to-tell-you-but-its-not-that-easy-to-calculate-119716">pay for digital products with their data</a>, rather than just money. Regulators shouldn’t focus only on the monetary costs to the users. Rather, they should ask whether users are being asked for more data than is strictly necessary, whether information is being collected in <a href="https://theconversation.com/7-in-10-smartphone-apps-share-your-data-with-third-party-services-72404">intrusive or abusive ways</a> and whether customers are <a href="https://www.axios.com/mark-warner-josh-hawley-dashboard-tech-data-4ee575b4-1706-4d05-83ce-d62621e28ee1.html">getting good value in exchange for their data</a>.</p>
<h2>Myth 3: Trust-busting is all or nothing</h2>
<p>There aren’t just two ways for this debate to end, with either a breakup of one or more technology giants or simply leaving things as they are for the market to develop further. </p>
<p>My own idea of the best outcome would take a page from the history of antitrust litigation: The company that is sued is not broken up, and yet the very fact that there was a lawsuit leads to progress. That has happened in the past, in the cases against the Bell System, IBM and Microsoft.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=538&fit=crop&dpr=1 600w, https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=538&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=538&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=676&fit=crop&dpr=1 754w, https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=676&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/284111/original/file-20190715-173376-1k7ro27.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=676&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A replica of the first transistor, developed at AT&T’s Bell Laboratories in 1947.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Replica-of-first-transistor.jpg">National Archives</a></span>
</figcaption>
</figure>
<p>In the 1956 federal consent decree against the Bell System, which settled a seven-year legal proceeding against the company, the company wasn’t split up, but Bell was required to <a href="https://economics.yale.edu/sites/default/files/how_antitrust_enforcement.pdf">license all its patents royalty-free</a> to other firms. This meant that some of the most profound technological innovations in history – including the <a href="https://www.computerhistory.org/atchm/who-invented-the-transistor/">transistor</a>, the <a href="https://www.popsci.com/article/science/invention-solar-cell/">solar cell</a> and the <a href="https://www.photonics.com/Articles/A_History_of_the_Laser_1960_-_2019/a42279">laser</a> – became widely available, yielding computers, solar power and other technologies that are crucial to the modern world. When the Bell System was <a href="https://www.cio.com/article/3267826/breaking-up-is-hard-to-do-why-the-bell-system-breakup-isn-t-a-model-for-tech.html">eventually broken up</a> in 1982, it did not do nearly as much to spread <a href="https://si.wsj.net/public/resources/images/BF-AV826_ATT_16U_20171120171814.jpg">innovation and competition</a> as the agreement that kept the Bells together a quarter-century earlier. </p>
<p>The antitrust action against IBM lasted 13 years and didn’t break up the firm. However, as part of its tactics to avoid appearing to be a monopoly, IBM agreed to <a href="https://www.cnet.com/news/ibm-and-microsoft-antitrust-then-and-now/">separate pricing for its hardware and software products</a>, previously sold as an indivisible bundle. This created an opportunity for entrepreneurs Bill Gates and Paul Allen to create a new software-only company, called Microsoft. The surge of software innovation that followed clearly traces its origins to the IBM settlement. </p>
<p>Two decades later, Microsoft was itself the target of an antitrust action. In the resulting settlement, <a href="https://www.theverge.com/2018/9/6/17827042/antitrust-1990s-microsoft-google-aol-monopoly-lawsuits-history">Microsoft agreed to ensure its products were compatible</a> with competitors’ software. That made room in the emerging internet marketplace for web browsers, the predecessors of Apple’s Safari, Mozilla’s Firefox and Google Chrome.</p>
<p>Even Margrethe Vestager, the European Union’s top antitrust official and frequent tech-giant nemesis, has said that “<a href="https://www.nytimes.com/2018/02/20/magazine/the-case-against-google.html">Antitrust prosecutions are part of how technology grows</a>.” But that doesn’t mean they all have to achieve their most extreme ends, of breaking up the companies. </p>
<p>Antitrust rules are complicated enough, and plenty of experts will be called on to give their views on what to do with “big tech.” But technology pervades every aspect of modern life, so each person has a responsibility to weigh in on this issue without misconceptions clouding their judgment. Technology has become a political issue, and in a politically overheated climate, public sentiment may matter even more than the opinions of experts.</p><img src="https://counter.theconversation.com/content/119283/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bhaskar Chakravorti has founded and directs the Institute for Business in the Global Context at Fletcher/Tufts that has received funding from Mastercard, Microsoft, the Gates Foundation, the Rockefeller Foundation and the Onassis Foundation. He is a Non-Resident Senior Fellow at Brookings India and a Senior Advisor on Digital Inclusion at the Mastercard Center for Inclusive Growth.</span></em></p>Advocates and opponents of breaking up Facebook, Google and other technology giants are falling prey to some serious misconceptions.Bhaskar Chakravorti, Dean of Global Business, The Fletcher School, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1175652019-06-12T11:30:10Z2019-06-12T11:30:10ZCompanies’ self-regulation doesn’t have to be bad for the public<figure><img src="https://images.theconversation.com/files/278726/original/file-20190610-52758-189aq1l.jpg?ixlib=rb-1.1.0&rect=0%2C162%2C5184%2C3282&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Managing a shared resource doesn't have to involve fences.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/sheep-new-zealand-421561492">Caroline Ryan</a></span></figcaption></figure><p>If Boeing is allowed to <a href="https://www.washingtonpost.com/investigations/how-the-faa-allows-jetmakers-to-self-certify-that-planes-meet-us-safety-requirements/2019/03/15/96d24d4a-46e6-11e9-90f0-0ccfeec87a61_story.html">certify that a crash-prone aircraft is safe</a>, and Facebook can <a href="https://www.nytimes.com/2019/03/07/opinion/zuckerberg-privacy-facebook.html">violate users’ privacy expectations</a>, should companies and industries ever be <a href="https://thehill.com/blogs/congress-blog/the-administration/436328-corporate-self-regulation-is-failing">allowed to police themselves</a>? 
The debate is <a href="https://www.reuters.com/article/us-tech-antitrust-legal-explainer/explainer-should-big-tech-fear-u-s-antitrust-enforcers-idUSKCN1T62K3">heating up</a> particularly in the U.S. tech sector with growing calls to regulate – or even break up – the likes of <a href="https://www.marketwatch.com/story/amazon-retail-chief-says-scrutiny-is-warranted-but-companys-breakup-is-not-2019-06-05">Google, Apple and Amazon</a>. </p>
<p>It turns out to be possible, at least sometimes, for companies and industries to govern themselves, while still protecting the public interest. Groundbreaking work by <a href="http://www.aei.org/publication/elinor-ostrom-and-the-solution-to-the-tragedy-of-the-commons/">Nobel Prize-winning political economist Elinor Ostrom</a> and her husband Vincent found a solution to a classic economic quandary, in which people – and businesses – self-interestedly enrich themselves as quickly as possible with <a href="https://doi.org/10.1111/ablj.12116">certain resources</a> including <a href="http://bierdoctor.com/papers/Rader_derived_data_abstract_May_2017.pdf">personal data</a>, thinking little about the secondary costs they might be inflicting on others.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=892&fit=crop&dpr=1 600w, https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=892&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=892&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1121&fit=crop&dpr=1 754w, https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1121&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/278729/original/file-20190610-52771-1j02bnf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1121&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Elinor Ostrom in 2009, when she won the Nobel Prize in Economics.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Nobel_Prize_2009-Press_Conference_KVA-30.jpg">Holger Motzkau/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>As the director of the <a href="https://ostromworkshop.indiana.edu/research/internet-cybersecurity/index.html">Ostrom Workshop Program on Cybersecurity and Internet Governance</a>, I have been involved in numerous projects studying how to solve these sorts of problems when they arise, both online and offline. Most recently, my <a href="https://illinoislawreview.org/print/vol-2017-no-2/when-toasters-attack/">work</a> has looked at how to manage the massively interconnected world of sensors, computers and smart devices – what I <a href="https://www.cisco.com/c/dam/en_us/solutions/industries/docs/gov/everything-for-cities.pdf">and others</a> call the “<a href="https://dx.doi.org/10.2139/ssrn.3266188">internet of everything</a>.” </p>
<p>I’ve found that there are ways <a href="https://doi.org/10.1162/002081898550789">companies can become leaders</a> by <a href="https://ssrn.com/abstract=2573787">experimenting with business opportunities</a> and collaborating with peers, while still working with regulators to protect the public, including both in the air and in cyberspace.</p>
<h2>Tragedy revisited</h2>
<p>In a classic economic problem, called “<a href="https://en.wikipedia.org/wiki/Tragedy_of_the_commons">the tragedy of the commons</a>,” a parcel of grassland is made available for a community to graze its livestock. Everyone tries to get the most benefit from it – and as a result, the land is overgrazed. What started as a resource for everyone becomes of little use to anyone. </p>
<p>For many years, economists thought there were only two possible solutions. One was for the government to step in and limit how many people could graze their animals. The other was to split the land up among private owners who had exclusive use of it, and could sustainably manage it for their individual benefit.</p>
<p>The Ostroms, however, found a third way. In some cases, they revealed, <a href="http://www.aei.org/publication/elinor-ostrom-and-the-solution-to-the-tragedy-of-the-commons/">self-organization can work well</a>, especially when the various people and groups involved can <a href="https://www.iucn.org/downloads/policy_matters_19_preface__introductions_and_chapters_1_5.pdf">communicate</a> effectively. They called it “polycentric governance,” because it allows regulation to come from more than just one central authority. Their work can help determine if and when companies can effectively regulate themselves – or whether it’s best for the government to step in.</p>
<h2>A polycentric primer</h2>
<p>The concept can seem complicated, but in practice it is increasingly popular, in federal programs and even as a goal for <a href="https://www.washingtonpost.com/news/the-switch/wp/2014/10/07/internet-operations-chief-snowden-disclosures-make-my-job-easier/">governing the internet</a>. </p>
<p>Scholars such as Elinor Ostrom produced a broad swath of research over decades, looking at <a href="https://books.google.hr/books/about/Polycentricity_and_Local_Public_Economie.html?id=iBZ32c7KLWUC&redir_esc=y">public schools and police department performance</a> in Midwestern U.S. cities, coastal overfishing, forest management in nations like Nepal, and even <a href="https://ir.lawnet.fordham.edu/ulj/vol37/iss3/7">traffic jams</a> in New York City. They identified <a href="https://dx.doi.org/10.2139/ssrn.1304697">commonalities among all these studies</a>, <a href="https://www.nobelprize.org/uploads/2018/06/ostrom_lecture.pdf">including</a> whether the group’s members can help set the rules by which their shared resources are governed, how much control they have over who gets to share it, how disputes are resolved, and how everyone’s use is monitored.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/T6OgRki5SgM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Nobel Prize winner Elinor Ostrom explains her work in a 2010 lecture.</span></figcaption>
</figure>
<p>All of these factors can help predict whether individuals or groups will successfully self-regulate, whether the challenge they’re facing is <a href="https://ssrn.com/abstract=1494833">climate change</a>, <a href="https://digitalcommons.wcl.american.edu/cgi/viewcontent.cgi?referer=&httpsredir=1&article=1888&context=aulr">cybersecurity</a>, or anything else. <a href="http://escotet.org/2010/11/interview-with-nobel-laureate-elinor-ostrom/">Trust is key</a>, as Lin Ostrom said, and an excellent way to build trust is to let <a href="http://dx.doi.org/10.1038/nclimate2490">smaller groups make their own decisions</a>.</p>
<p>Polycentric governance’s embrace of self-regulation involves relying on <a href="https://www.ubs.com/microsites/nobel-perspectives/en/laureates/elinor-ostrom.html">human ingenuity</a> and collaboration skills to solve difficult problems – while focusing on practical measures to address specific challenges.</p>
<p>Self-regulation does have its limits, though – as has been clear in the revelations about how <a href="https://www.businessinsider.com/faa-let-boeing-self-regulate-software-believed-737-max-crashes-2019-3">the Federal Aviation Administration allowed Boeing</a> to <a href="https://www.businessinsider.com/faa-let-boeing-self-regulate-software-believed-737-max-crashes-2019-3">certify the safety</a> <a href="https://arstechnica.com/information-technology/2019/03/boeing-downplayed-737-max-software-risks-self-certified-much-of-planes-safety/">of its own software</a>. Facebook has also been heavily criticized for failing to block an <a href="https://www.cbsnews.com/news/facebooks-biggest-fails-before-cambridge-analytica/">anonymous horde</a> of <a href="https://www.wired.com/story/facebook-passwords-plaintext-change-yours/">users across the globe</a> from <a href="https://theconversation.com/facebooks-social-responsibility-should-include-privacy-protection-94549">manipulating people</a>’s <a href="https://www.nytimes.com/2019/04/25/technology/facebook-regulation-ftc-fine.html">political views</a>.</p>
<p>Polycentric regulation is a departure from the idea of “<a href="https://www.dallasnews.com/opinion/commentary/2012/06/14/jeffrey-weiss-elinor-ostroms-enduring-trust-in-the-commons">keep it simple, stupid</a>” – rather, it is a call for engagement by numerous groups to grapple with the complexities of the real world. </p>
<p>Both Facebook and Boeing now need to convince themselves, their employees, investors, policymakers, users and customers that they can be trusted. Ostrom’s ideas suggest they could begin to do this by engaging with peers and industry groups to set rules and ensure they are enforced.</p>
<h2>Governing the ‘internet of everything’</h2>
<p>Another industry in <a href="https://www.forbes.com/sites/annashedletsky/2018/08/06/why-industrial-iot-is-usually-a-failure-and-how-to-fix-it/#2fe576d042ed">serious need of better regulations</a> is the smart-device business, with tens of billions of connected devices around the world, and little to no <a href="https://www.csmonitor.com/World/Passcode/Passcode-Voices/2016/1026/Opinion-How-to-fix-an-internet-of-broken-things">concern</a> for user security or privacy.</p>
<p>Customers often buy the cheapest smart-home camera or digital sensor, <a href="https://www.schneier.com/books/click_here/">without looking at competitors’</a> security and privacy protections. The results are predictable – hackers have hijacked thousands of internet-connected devices and used them to attack the <a href="https://www.forbes.com/sites/davelewis/2017/10/23/the-ddos-attack-against-dyn-one-year-later/#4765cbe51ae9">physical network of the internet</a>, take control of <a href="https://www.bbc.com/news/technology-30575104">industrial</a> equipment, and spy on private citizens through their smartphones and <a href="https://www.marketwatch.com/story/woman-claims-hacker-used-baby-monitor-to-spy-on-her-in-her-bedroom-2018-06-07">baby monitors</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/278730/original/file-20190610-52789-1oe6wxi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Who else might be watching this view, over the internet?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/closeup-baby-monitor-security-538634722">Saklakova/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Some governments are starting to get involved. The state of California and the European Union are exploring laws that promote “<a href="https://www.natlawreview.com/article/california-law-iot-devised-to-have-reasonable-security-feature">reasonable</a>” security requirements, at least as a baseline. The EU is encouraging companies to band together to establish <a href="https://iapp.org/news/a/will-the-gdpr-incite-sectoral-codes-of-conduct/">industry-wide codes of conduct</a>. </p>
<h2>Getting governance right</h2>
<p>Effective self-governance may seem impossible in the “internet of everything” because of the scale and variety of groups and industries involved, but polycentric governance does provide a useful lens through which to view these problems. Ostrom argued this approach may be <a href="https://dx.doi.org/10.2139/ssrn.1304697">the most flexible and adaptable way</a> to manage rapidly changing industries. It may also help avoid conflicting government regulations that risk stifling innovation in the name of protecting consumers without helping either cause. </p>
<p>But success is not certain. It requires active engagement by all parties, who must share a sense of responsibility to the customers and mutual trust in one another. That’s not easy to build in any community, let alone the <a href="https://www.digitalistmag.com/digital-economy/2018/07/20/digital-transformation-modern-form-of-creative-destruction-06179806">dynamic tech industry</a>.</p>
<p>Government involvement can help build bridges and solidify trust across the private sector, as happened with cybersecurity efforts from the <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2446631">National Institute for Standards and Technology</a>. Some states, like <a href="https://www.techrepublic.com/article/ohio-law-creates-cybersecurity-safe-harbor-for-businesses/">Ohio</a>, are even rewarding firms for using appropriate self-regulation in their cybersecurity decision-making.</p>
<p>Polycentric governance can be flexible, adapting to new technologies more appropriately – and often more quickly – than pure governmental regulation. It also can be more efficient and cost-effective, though it’s not a cure for all regulatory ills. And it’s important to note that regulation can spur innovation as well as protect consumers, especially <a href="https://www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/the-simple-rules-of-disciplined-innovation">when the rules are simple</a> and outcome-focused.</p>
<p>Consider the North American Electric Reliability Council. That organization was originally created as a group of companies that came together voluntarily in an effort to protect against blackouts. NERC standards, however, were eventually made legally enforceable in the aftermath of the <a href="http://www.eenews.net/stories/1059985876/print">Northeast blackout of 2003</a>. They are an example of an organic code of conduct that was voluntarily adopted and subsequently reinforced by government, consistent with professor Ostrom’s ideas. Ideally, it should not require such a crisis to spur this process forward. </p>
<p>Ultimately, what’s needed – and what professor Ostrom and her colleagues and successors have called for – is more experimentation and less theorizing. As the 10-year anniversary of Ostrom’s Nobel Prize approaches, I believe it is time to put her insights to work, offering industries the opportunity to self-regulate where appropriate while leaving the door open for the possibility of government action, including antitrust enforcement, to protect the public and promote <a href="https://ndias.nd.edu/news-publications/ndias-quarterly/the-meaning-of-cyber-peace/">cyber peace</a>.</p>
<p>[ <em><a href="https://theconversation.com/us/newsletters?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=thanksforreading">Thanks for reading! We can send you The Conversation’s stories every day in an informative email. Sign up today.</a></em> ]</p><img src="https://counter.theconversation.com/content/117565/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Scott Shackelford does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A Nobel Prize-winning political economist found a way to promote good governance and protect users without the need for heavy-handed government regulation.Scott Shackelford, Associate Professor of Business Law and Ethics; Director, Ostrom Workshop Program on Cybersecurity and Internet Governance; Cybersecurity Program Chair, IU-Bloomington, Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1101532019-01-31T11:43:33Z2019-01-31T11:43:33ZFacebook at 15: It’s not all bad, but now it must be good<figure><img src="https://images.theconversation.com/files/256152/original/file-20190129-108364-1ljvmw1.jpg?ixlib=rb-1.1.0&rect=343%2C17%2C5535%2C3895&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Doth the CEO protest too much?</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Facebook-Privacy-Scandal-Congress/5122dc257cb64d2198e92691210420d9/54/0">AP Photo/Andrew Harnik</a></span></figcaption></figure><p>It is almost too easy to bash Facebook these days. Nearly <a href="https://www.theverge.com/2018/2/6/16976328/facebook-mark-zuckerberg-pollster-tavis-mcginn-honest-data">a third of Americans</a> feel the country’s <a href="https://www.statista.com/statistics/248074/most-popular-us-social-networking-apps-ranked-by-audience/">most popular social media platform</a> is bad for society. 
As the company approaches its 15th birthday, <a href="https://www.theverge.com/2018/2/6/16976328/facebook-mark-zuckerberg-pollster-tavis-mcginn-honest-data">Americans rate its social benefit</a> as <a href="https://twitter.com/benioff/status/1062578525377425408?lang=en">better than Marlboro cigarettes</a>, but worse than McDonald’s. </p>
<p>Yet as a <a href="https://fletcher.tufts.edu/people/bhaskar-chakravorti">scholar of digital technologies</a> and their effects on society – and even though I am not on Facebook – I worry that public perception has become overly critical of Facebook. It’s true that the company has been behaving like many 15-year-old adolescents, acting <a href="https://www.nbcnews.com/tech/tech-news/facebook-s-2018-timeline-scandals-hearings-security-bugs-n952796">irresponsibly and selfishly</a>, and making <a href="http://fortune.com/2019/01/20/sheryl-sandberg-facebook-five-step-plan/">endless promises</a> to do better, at least until the next mess is uncovered. However, as talk grows of <a href="https://www.nytimes.com/2019/01/18/technology/facebook-ftc-fines.html">fines</a> and <a href="https://investorplace.com/2019/01/facebook-stock-is-immune-to-regulation/">regulations</a>, it’s worth remembering there is such a thing as overregulation, which would respond to the urgency and charged political climate of the current moment but hurt the public interest in the long run.</p>
<p>Official action to rein in Facebook’s power should reflect on the bad and ugly things the company has done and allowed to happen. But the debate shouldn’t forget some things about Facebook that would qualify as “great,” which may have been missed in the avalanche of negative sentiment toward the company and its leaders.</p>
<h2>The bad stuff</h2>
<p>The individual and social harms due to Facebook are many, including contributing to <a href="https://www.ft.com/content/02b6d334-8c2d-11e8-b18d-0181731a0340">concentration in the online advertising market</a>, <a href="https://blocnotesdeleco.banque-france.fr/billet-de-blog/les-monopoles-un-danger-pour-les-etats-unis">with negative impact on productivity and wage growth</a>, <a href="https://doi.org/10.1016/j.chb.2011.08.026">distracting</a> <a href="https://www.sciencedirect.com/science/article/pii/S0747563210000646">students</a> and <a href="http://doi.org/10.1093/aje/kww189">potentially causing users</a> <a href="https://munews.missouri.edu/news-releases/2015/0203-if-facebook-use-causes-envy-depression-could-follow/">mental distress</a> and <a href="https://www.bloomberg.com/news/articles/2019-01-10/facebook-junkies-are-similar-to-drug-addicts-study-finds">giving rise to symptoms akin to substance abuse</a>.</p>
<p>The bottom line is clear: Spending too much time on Facebook may be bad for you. </p>
<h2>Things get ugly</h2>
<p>All technology companies have been experiencing some <a href="https://theconversation.com/us/topics/technology-backlash-47393">heightened skepticism</a>. However, more <a href="https://www.cbinsights.com/research/facebook-fares-very-poorly-in-this-survey">Americans felt negatively toward Facebook</a> than toward Amazon, Google, Microsoft and Apple combined, according to a 2017 poll. Facebook’s place in the public perception has only deteriorated since then. </p>
<p>The <a href="https://www.nbcnews.com/tech/tech-news/facebook-s-2018-timeline-scandals-hearings-security-bugs-n952796">company’s violations of user trust</a> are legion, including <a href="https://www.cnn.com/2018/12/19/tech/facebook-user-data-big-tech-companies/index.html">ignoring its own privacy policies</a>, <a href="https://www.nytimes.com/2018/12/19/technology/facebook-data-sharing.html">sharing data without permission</a>, <a href="https://www.usatoday.com/story/tech/2019/01/25/facebook-duped-kids-into-spending-games-without-parents-permission/2679250002/">tricking children into spending their parents’ money</a>, <a href="https://www.nytimes.com/2018/11/04/us/politics/election-misinformation-facebook.html">allowing disinformation campaigns</a> that affect elections in the U.S. and elsewhere, and – perhaps worst of all – magnifying propaganda that has <a href="https://www.nytimes.com/2018/04/21/world/asia/facebook-sri-lanka-riots.html">sparked violence</a> around the world.</p>
<p>In the U.S., the company’s services have allowed bias and discrimination to take root. In early 2018, the National Fair Housing Alliance and affiliated groups sued Facebook, alleging that its advertising platform let <a href="https://www.nytimes.com/2018/03/27/nyregion/facebook-housing-ads-discrimination-lawsuit.html">landlords and real-estate brokers discriminate</a> against women, disabled veterans and single mothers, among other groups. The company’s own civil-rights audit found it <a href="https://www.cnbc.com/2018/12/18/facebooks-sheryl-sandberg-on-civil-right-abuses.html">contributed to voter suppression</a> and targeted manipulative advertising to impressionable groups. That report came on the heels of two comprehensive reports compiled for the U.S. Senate detailing how <a href="https://comprop.oii.ox.ac.uk/research/ira-political-polarization/">Russian government agents used Facebook</a> and other social media sites to <a href="https://www.newknowledge.com/articles/the-disinformation-report/">influence Americans’ thinking</a>.</p>
<p>The company’s rap sheet is long and growing. Its <a href="https://techcrunch.com/2019/01/20/stung-by-criticism-facebooks-sandberg-outlines-new-plans-to-tackle-misinformation/">repeated assurances that it will fix</a> the problems are now roundly assumed to be empty promises.</p>
<h2>But wait, there is great stuff, too</h2>
<p>With this much going wrong, it is easy to forget that the company has shown great technological and business sophistication in connecting people like never before. Facebook combined innovative <a href="https://medium.com/s/a-brief-history-of-attention/how-likes-went-bad-b094ddd07d4">social-networking ideas</a> from others and <a href="https://moneyinc.com/10-largest-facebook-acquisitions-record/">bought up potential competitors</a> like Instagram and WhatsApp. This itself constitutes an innovation in creating a connectivity platform like no other.</p>
<p>In terms of contribution to the economy, the company is right – if a tad self-serving – to note that it has <a href="https://www.wsj.com/articles/the-facts-about-facebook-11548374613">helped small businesses</a> reach new customers and build relationships with both existing and prospective clients. The value of those connections is unclear – a single “like” could be worth <a href="https://www.businessinsider.com/what-is-a-facebook-like-actually-worth-in-dollars-2013-3">anywhere between nothing and US$214.81</a>, depending on the type of business and what it wants Facebook users to do. An independent study from the U.S. Bureau of Economic Analysis found that from 2005 to 2015, U.S. <a href="https://www.philadelphiafed.org/-/media/research-and-data/publications/working-papers/2017/wp17-37.pdf?la=en">gross domestic product grew one-tenth of 1 percent faster</a> than it would have if Facebook hadn’t existed.</p>
<p>In terms of how connectivity helps advance other innovations, Facebook is a key contributor to <a href="https://thenewstack.io/a-reason-to-not-hate-facebook-open-source-contributions/">leading-edge open-source coding projects</a> in a range of applications, such as machine learning, gaming, 3D printing, home automation, scientific programming and data analysis, among others. The company has also leveraged its huge network of users to help <a href="https://www.fastcompany.com/40546380/facebooks-disaster-maps-helps-rescuers-know-where-theyre-needed-most">authorities</a>, <a href="https://mashable.com/2017/11/29/facebook-community-help-api-fundraising/">communities</a> and <a href="http://fortune.com/2015/11/16/facebook-safety-check/">families</a> respond efficiently to natural and human-caused disasters.</p>
<p>Particular groups of Facebook users may also see distinct benefits from being connected. Elderly people may get a <a href="https://uanews.arizona.edu/story/should-grandma-join-facebook-it-may-give-her-a-cognitive-boost-study-finds">cognitive boost</a>; people who <a href="https://doi.org/10.1089/cyber.2009.0411">seek a self-esteem boost</a> from viewing their own profiles, <a href="https://doi.org/10.1089/cpb.2008.0214">shy people</a>, <a href="https://doi.org/10.1007/s11606-010-1526-3">people with diabetes</a> and <a href="https://doi.org/10.1352/1934-9556-52.6.456">people on the autism spectrum</a> have all felt more support and improved well-being from using the site. </p>
<h2>Can Facebook turn great to good?</h2>
<p>As Facebook turns 15, the company faces a critical set of challenges. U.S. officials will be scrutinizing its activities and seeking ways to curb its power in society. Regulating Facebook itself will <a href="https://www.vox.com/technology/2018/4/12/17224096/regulating-facebook-problems">not be easy</a>, and will generate endless debate. The company will also have to contend with covert online agents <a href="https://thehill.com/policy/cybersecurity/427430-intel-leaders-warn-of-russian-influence-threat-ahead-of-2020-election">seeking to undermine democracy</a> by using Facebook to influence elections in India, Europe, Nigeria and Poland, among other places – not to mention the 2020 U.S. presidential election.</p>
<p>The company’s management will have to take bold steps, not only to defend Facebook’s positive features, but to eliminate – or at least reduce – the harm the company’s products and services do to people and society. Most companies aspire to go from “<a href="https://www.harpercollins.com/9780066620992/good-to-great/">good to great</a>”; Facebook’s challenge at 15 is a bit more complicated: It must convince a skeptical public and regulators champing at the bit that it can mitigate its bad and ugly sides – and go from being great to being a <a href="https://www.newyorker.com/magazine/2018/09/17/can-mark-zuckerberg-fix-facebook-before-it-breaks-democracy">force for good in the world</a>.</p>
<p class="fine-print"><em><span>Bhaskar Chakravorti has founded and directs the Institute for Business in the Global Context at Fletcher/Tufts that has received funding from Mastercard, Microsoft, the Gates Foundation and the Onassis Foundation. He is a Non-Resident Senior Fellow at Brookings India and a Senior Advisor on Digital Inclusion at the Mastercard Center for Inclusive Growth.</span></em></p>Facebook has been acting irresponsibly and selfishly, and promising to do better without actually improving. But that’s not the whole story: The company has some positive qualities, too.Bhaskar Chakravorti, Dean of Global Business, The Fletcher School, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1074282018-12-12T11:42:38Z2018-12-12T11:42:38ZDon’t worry about screen time – focus on how you use technology<figure><img src="https://images.theconversation.com/files/249815/original/file-20181210-76977-1c28o2h.jpg?ixlib=rb-1.1.0&rect=0%2C33%2C5615%2C3699&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Through creative off-label uses of technology, some people have improved close relationships and their health.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/pixel-girl-peeps-out-phone-720066970">KristinaZ/Shutterstock.com</a></span></figcaption></figure><p>Many Americans find themselves bombarded by expert advice to limit their screen time and break their addictions to digital devices – including enforcing and modeling this restraint for the children in their lives. However, <a href="https://scholar.google.com/citations?user=8bP4OqUAAAAJ&hl=en">over 15 years</a> of closely observing people and talking with them about how they use technological tools, I’ve developed a more nuanced view: Whether a technology helps or hurts someone depends not just on the amount of time they spend with it but on how they use it.</p>
<p>I’ve found many people who have found impressively creative ways to tailor the technologies they have to serve their values and personal objectives, improving their relationships and even their health.</p>
<p>In my forthcoming book, “<a href="https://mitpress.mit.edu/books/left-our-own-devices">Left to Our Own Devices</a>,” I introduce readers to people who pushed products beyond their intended purpose, creating their own off-label uses. Some of them turned self-help products, like smart scales and mood apps, into mechanisms for deepening relationships; others used apps like Tinder, designed to spark interpersonal connection, as an emotional pickup – gathering data to feel better about themselves without the hookup. And still others have pieced together different tools and technologies to suit their own needs.</p>
<h2>Looking beyond the rules</h2>
<p>A few years ago, for instance, my colleagues and I <a href="https://doi.org/10.2196/jmir.1371">created an app to help people manage stress</a> as part of a health technology research project. Psychotherapy and other mental health services have traditionally been offered as individual treatments, and so we expected people would use our app on their own, when they were alone. We put a great deal of effort into assuring privacy and instructed people who participated in our research that the app was for their use only.</p>
<p>But many of the participants ended up bringing the app into their conversations with others. One woman used it with her son to process a heated argument they had earlier in the day. She sat down with him and together explored the visuals in the app that represented stages of anger. They followed the app’s cognitive therapy cues for thinking about feelings and reactions – their own and each other’s. She shared it with him not as a flashy distraction, but as a bridge to help each understand the other’s perspectives and feelings.</p>
<p>The app was intended to help her change the way she thought about stress, but she also used it to address the source of her stress – making the app more effective by, in a certain sense, misusing it.</p>
<h2>New turns with familiar devices</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/250033/original/file-20181211-76971-12cj5ow.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Controlling the lights can send a message.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/mobile-phone-womans-hand-night-city-157563695">LDprod/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Another woman I spoke with took smart lights – the ones that can change color at the tap of a button in a smartphone app – far beyond their intended functions of improving decor and energy efficiency. When she changed the color of the lights in the home she shared with her partner from white to red, it was a <a href="https://doi.org/10.1145/3027063.3053141">signal that she was upset</a> and that they needed to talk. The light color became an external symbol of the conflict between them and provided a new way to begin a difficult conversation. </p>
<p>Similarly creative thinking helped strengthen the relationships between patients and a physician I interviewed. She practiced primarily through telemedicine, meeting with patients via a secure medical videoconferencing system. She was aware that physical and emotional distance could weaken a relationship already fraught with sensitivity and an imbalance of power between an expert and a patient.</p>
<p>So she experimented with the view her camera provided of her and her surroundings. First, she showed patients a view of just her face, in front of an unadorned white wall that revealed nothing about her. Then she shifted the camera to show more of her home, which of course revealed more of herself. Patients could now see some of the art that she liked as well as elements of her home, which said something about her habits, values and personality. </p>
<p>This sharing leveled the playing field in some ways. As patients were opening up themselves to her by describing symptoms and the details of their lifestyle, they could see that she was not a lab-coat-clad expert issuing directives from an intimidating medical office – she was a real person living in an ordinary apartment. This step toward reciprocity made it easier for patients to relate to her. She believes this is part of why her patients have expressed feeling close to her and so much trust in her treatment. It was a small adaptation that brought greater rapport and connection to a technology often viewed as a poor replacement for in-person meetings.</p>
<p>With increasing attention to the effects of technologies, we should not only be concerned with their potential harms. As I’ve observed, experimenting with how – not just how much – we use technology might uncover unexpected ways to make life better.</p>
<p>
<section class="inline-content">
<img src="https://images.theconversation.com/files/248895/original/file-20181204-133100-t34yqm.png?w=128&h=128">
<div>
<header>Margaret E. Morris is the author of:</header>
<p><a href="https://mitpress.mit.edu/books/left-our-own-devices">Left to Our Own Devices: Outsmarting Smart Technology to Reclaim Our Relationships, Health, and Focus</a></p>
<footer>MIT Press provides funding as a member of The Conversation US.</footer>
</div>
</section>
</p>
<p class="fine-print"><em><span>Margaret E. Morris does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Whether a technology helps or hurts people depends not on how much time they spend with it, but how they use it.Margaret E. Morris, Affiliate Faculty in Human Centered Design and Engineering, University of WashingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1069892018-11-19T11:38:14Z2018-11-19T11:38:14ZTechnology giants didn’t deserve public trust in the first place<figure><img src="https://images.theconversation.com/files/245866/original/file-20181115-194500-r9y4u2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Should you have trusted this man with so much of your personal data?</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Facebook-Privacy-Scandal-Congress/c3d8748116cc46d6baf5f6a6413ec0d2/3/0">AP Photo/Carolyn Kaster</a></span></figcaption></figure><p>Amazon may have been expecting lots of public attention when it announced where it would establish its new headquarters – but like many technology companies recently, it probably didn’t anticipate how negative the response would be. In Amazon’s chosen territories of New York and Virginia, <a href="https://www.washingtonpost.com/business/2018/11/13/amazons-hq-split-between-nycarlington-sparks-backlash-locals-politicians/">local politicians balked</a> at <a href="https://www.yonkerstribune.com/2018/11/statement-from-senator-michael-gianais-and-new-york-city-council-member-jimmy-van-bramer-regarding-long-island-dicty-amazon-deal">taxpayer-funded enticements</a> promised to the company. 
Journalists across the political spectrum <a href="http://www.startribune.com/minnesota-like-others-was-played-by-amazon-in-the-hq2-sweepstakes/500229032/">panned the deals</a> – and <a href="https://www.westword.com/news/twitter-reaction-to-amazon-hq2-bypassing-denver-bullet-dodged-11000743">social media filled up</a> with the <a href="https://www.businessinsider.com/people-are-furious-about-amazon-hq2-reported-selection-2018-11">voices of New Yorkers and Virginians</a> pledging resistance.</p>
<p>Similarly, revelations that <a href="https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html">Facebook exploited anti-Semitic conspiracy theories</a> to undermine its critics’ legitimacy indicate that instead of changing, Facebook would rather go on the offensive. Even as Amazon and Apple saw their <a href="https://www.nytimes.com/2018/09/04/technology/amazon-stock-price-1-trillion-value.html">stock-market values briefly top US$1 trillion</a>, technology executives were <a href="https://www.theguardian.com/technology/2018/apr/11/mark-zuckerbergs-testimony-to-congress-the-key-moments">dragged before Congress</a>, struggled to take a coherent <a href="https://www.wired.com/story/twitter-dehumanizing-speech-policy/">stance on hate speech</a>, got caught <a href="https://www.nytimes.com/2018/11/01/technology/google-walkout-sexual-harassment.html">covering up sexual misconduct</a> and saw their own <a href="https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html">employees protesting business deals</a>.</p>
<p>In some circles this is being seen as a <a href="https://www.recode.net/2018/4/10/17220060/facebook-trust-major-tech-company">loss of public trust</a> in the technology <a href="https://www.wired.com/story/the-other-tech-bubble/">firms that promised to remake the world</a> – <a href="https://newrepublic.com/article/146924/silicon-valleys-origin-story">socially, environmentally and politically</a> – or at least as frustration with the way these companies have changed the world. But the technology companies need to do much more than regain the public’s trust; they need to <a href="https://www.nytimes.com/2018/11/08/business/sundar-pichai-google-corner-office.html">prove that they deserved it</a> in the first place – which, when placed in the context of the <a href="https://hss.sas.upenn.edu/people/zachary-loeb">history of technology criticism and skepticism</a>, they didn’t.</p>
<h2>Looking away from the problems</h2>
<p>Big technology companies used to frame their projects in vaguely utopian, positive-sounding lingo that obscured politics and public policy, transcended partisanship and – conveniently – avoided scrutiny. Google used to remind its workers “<a href="https://gizmodo.com/google-removes-nearly-all-mentions-of-dont-be-evil-from-1826153393">Don’t be evil</a>.” Facebook worked to “<a href="https://qz.com/1012461/facebook-changes-its-mission-statement-from-ing-its-mission-statement-from-sharing-making-the-world-more-open-and-connected-to-build-community-and-bring-the-world-closer-together/">make the world more open and connected</a>.” Who could object to those ideals?</p>
<p>Scholars warned about the dangers of platforms like these, long before many of their founders were even born. In 1970, social critic and historian of technology Lewis Mumford predicted that the goal of what he termed “computerdom” would be “<a href="https://www.boundary2.org/2018/07/loeb/">to furnish and process an endless quantity of data</a>, in order to expand the role and ensure the domination of the power system.” That same year a <a href="https://www.wired.com/story/silicon-valley-tyranny-of-structurelessness/">seminal essay by feminist thinker Jo Freeman</a> warned about the <a href="https://jofreeman.com/joreen/tyranny.htm">inherent power imbalances</a> that remained in systems that appeared to make everyone equal. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=643&fit=crop&dpr=1 600w, https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=643&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=643&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=808&fit=crop&dpr=1 754w, https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=808&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/245868/original/file-20181115-172710-1jm44a8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=808&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Joseph Weizenbaum warned you.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Joseph_Weizenbaum.jpg">Ulrich Hansen/Wikimedia Commons</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Similarly, in 1976, the computer scientist Joseph Weizenbaum predicted that in the decades ahead people would find themselves in a state of distress as they became <a href="https://librarianshipwreck.wordpress.com/2016/07/27/an-island-of-reason-in-the-cyberstream-on-the-life-and-thought-of-joseph-weizenbaum/">increasingly reliant on opaque technical systems</a>. Countless similar warnings have been issued ever since, including important recent scholarship such as information scholar <a href="https://scholar.google.com/citations?user=2VnIbO4AAAAJ&hl=en&oi=ao">Safiya Noble</a>’s exploration of how <a href="http://www.slate.com/articles/podcasts/if_then/2018/09/if_then_talks_to_professor_safiya_noble_on_her_book_algorithms_of_oppression.html">Google searches replicate racial and gender biases</a> and media scholar <a href="https://scholar.google.com/citations?user=_FlYDU4AAAAJ&hl=en">Siva Vaidhyanathan</a>’s declaration that “<a href="https://logicmag.io/05-the-problem-with-facebook-is-facebook/">the problem with Facebook is Facebook</a>.”</p>
<p>The technology companies are powerful and wealthy, but their days of avoiding scrutiny may be ending. The American public seems to be starting to suspect that the <a href="https://theconversation.com/big-tech-isnt-one-big-monopoly-its-5-companies-all-in-different-businesses-92791">technology giants were unprepared</a>, and perhaps unwilling, to assume responsibility for the tools they unleashed upon the world. </p>
<p>In the aftermath of the 2016 U.S. presidential election, concern remains high that Russian and other foreign governments are using any available social media platform to <a href="https://www.theatlantic.com/politics/archive/2018/10/doj-says-russian-trolls-interfering-midterm-elections/573526/">sow discord and discontent</a> in <a href="https://www.bloomberg.com/features/2018-government-sponsored-cyber-militia-cookbook/">societies around the globe</a>. </p>
<p>Facebook has <a href="https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html">still not solved the problems</a> in <a href="https://www.theverge.com/2018/10/17/17986992/facebook-portal-privacy-claims-ad-targeting">data privacy and transparency</a> that caused the <a href="https://theconversation.com/how-cambridge-analyticas-facebook-targeting-model-really-worked-according-to-the-person-who-built-it-94078">Cambridge Analytica scandal</a>. Twitter is the preferred <a href="https://www.cnn.com/interactive/2017/politics/trump-tweets/">megaphone for President Donald Trump</a> and home to <a href="https://theconversation.com/hate-speech-is-still-easy-to-find-on-social-media-106020">disturbing quantities of violent hate speech</a>. The future of Amazon’s corporate offices is shaping up to be a <a href="https://www.wired.com/story/amazon-hq2-search-backfired/">multi-sided brawl</a> among elected officials and the people they supposedly represent. </p>
<h2>Is it ignorance or naivete?</h2>
<p>Viewing the present situation with the history of critiques of technology in mind, it’s hard not to conclude that the technology companies deserve the crises they are facing. These companies ask people to entrust them with their emails, personal data, online search histories and financial information, to the point that many of these companies proudly tout that they know individuals <a href="http://knowledge.wharton.upenn.edu/article/internet-can-tell-us-really/">better than they know themselves</a>. They promote their latest systems, including “smart speakers” and “smart cameras,” seeking to ensure that <a href="https://www.theguardian.com/technology/2018/nov/10/spy-christmas-smart-home-facebook-portal-google-home-hub-amazon-show-alexa">users’ every waking moment</a> – and sleeping moments too – can be monitored, feeding more data into their money-making algorithms.</p>
<p>Yet seemingly inevitably these companies go on to demonstrate how <a href="https://theconversation.com/silicon-valley-from-hearts-delight-to-toxic-wasteland-86983">unworthy of trust</a> they actually are, leaking data, <a href="https://www.wired.com/story/online-ad-targeting-does-work-as-long-as-its-not-creepy/">sharing personal information</a> and failing to <a href="https://www.nytimes.com/2018/09/28/technology/facebook-hack-data-breach.html">prevent hacking</a>, as they slowly fill the world with a disturbing techno-paranoia worthy of an episode of “<a href="https://www.netflix.com/title/70264888">Black Mirror</a>.” </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/RwUA9JB8iYg?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Selected ‘Black Mirror’ highlights.</span></figcaption>
</figure>
<p>Technology firms’ responses to each new revelation fit a standard pattern: After a scandal emerges, the company involved expresses alarm that anything went wrong, promises to investigate, <a href="https://www.theguardian.com/technology/2018/mar/21/mark-zuckerberg-response-facebook-cambridge-analytica">and pledges to do better in the future</a>. Some time – days, weeks or even months – later, the company reveals that the scandal was a direct result of how the system was designed, and trots out a dismayed executive to express outrage at the destructive uses bad people found for their system, without admitting that the problem is the system itself.</p>
<p>Zuckerberg himself told the U.S. Senate in April 2018 that the Cambridge Analytica scandal had taught him “<a href="https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/">we have a responsibility</a> to not just give people tools, but to make sure that those tools are used for good.” That’s a <a href="https://www.facebook.com/RubesCartoons/photos/a.477074597360/10154718644957361/?type=3&theater">pretty fundamental lesson</a> to have missed out on while creating a multi-billion-dollar company.</p>
<h2>Rebuilding from what’s left</h2>
<p>Using any technology – from a knife to a computer – carries risks, but as technological systems increase in size and complexity <a href="https://www.nature.com/articles/477404a.pdf?origin=ppub">the scale of these risks tends to increase as well</a>. A technology is only useful if people can use it safely, in ways where the benefits outweigh the dangers, and if they can feel confident that they understand, and accept, the potential risks. A couple of years ago, Facebook, Twitter and Google may have appeared to most people as benign communication methods that brought more to society than they took away. But with every new scandal, and bungled response, more and more people are seeing that these companies pose serious dangers to society.</p>
<p>As tempting as it may be to point to the “off” button, there’s not an easy solution. Technology giants have made themselves part of the <a href="http://www.pewinternet.org/2018/03/01/social-media-use-in-2018/">fabric of daily life for hundreds of millions of people</a>. <a href="https://www.theguardian.com/technology/2018/mar/20/facebook-is-it-time-we-all-deleted-our-accounts">Suggesting that people just quit is simple</a>, but fails to recognize how reliant many people have become on these platforms – and how trapped they may feel in an increasingly intolerable situation. </p>
<p>As a result, people <a href="https://www.amazon.com/Store-James-Patterson-ebook/dp/B01MYGCHSG">buy books about how bad Amazon</a> is – by ordering them on Amazon. They conduct Google searches for articles about <a href="https://trends.google.com/trends/explore?q=google%20data%20collection&geo=US">how much information Google knows</a> about each individual user. They tweet about how much they hate Twitter and post on Facebook articles about Facebook’s latest scandal.</p>
<p>The technology companies may find themselves ruling over an increasingly aggravated user base, as their platforms spread the discontent farther and wider than possible in the past. Or they might choose to change themselves dramatically, <a href="https://www.recode.net/2018/5/18/17366868/scott-galloway-google-apple-facebook-amazon-regulation-privacy-power-dictators-kara-swisher-podcast">breaking themselves up</a>, turning some controls over to the <a href="https://www.nytimes.com/2018/03/28/technology/social-media-privacy.html">democratic decisions of their users</a> and taking responsibility for the harm their platforms and products have done to the world. So far, though, it seems the industry hasn’t gone beyond offering half-baked apologies while continuing to go about business as usual. Hopefully that will change. But if the past is any guide, it probably won’t.</p>
<p class="fine-print"><em><span>Zachary Loeb does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Scholars and skeptics warned about technologies like Facebook long before the company’s founder was even born. Technology companies keep asking for more and more data and proving they can’t be trusted.Zachary Loeb, Ph.D. Student in History and Sociology of Science, University of PennsylvaniaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1050072018-11-06T11:42:02Z2018-11-06T11:42:02ZA game plan for technology companies to actually help save the world<figure><img src="https://images.theconversation.com/files/242837/original/file-20181029-76411-1xzi1i9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Working together, people and technology companies can make a lot of progress.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/saving-world-10199140">Pedro Tavares/Shutterstock.com</a></span></figcaption></figure><p>Smartphones, computers and social media platforms have become indispensable parts of modern life, but the technology companies that make them and write their software are under siege. In any given week, <a href="https://www.recode.net/2018/9/28/17915864/facebook-data-breach-mark-zuckerberg-hack-personal-data">Facebook</a> or <a href="https://slate.com/technology/2018/10/google-is-losing-users-trust.html">Google</a> or <a href="https://gizmodo.com/new-documents-show-amazons-face-scanning-tech-for-cops-1830032358">Amazon</a> does something to erode public trust in them. 
Now could be the moment for the industry to make good on Bill Gates’s promise that technology can do good by “<a href="https://www.wired.com/2013/11/bill-gates-wired-essay/">unlocking the innate compassion</a> we have for our fellow human beings” and improving the world – or on Mark Zuckerberg’s dream of building a “<a href="https://www.facebook.com/notes/mark-zuckerberg/building-global-community/10154544292806634">new social infrastructure</a> to create the world we want for generations to come.”</p>
<p>Around the globe, countries and societies are <a href="https://unstats.un.org/sdgs/files/report/2018/TheSustainableDevelopmentGoalsReport2018.pdf">falling behind</a> on reducing social inequalities and meeting goals for economic development and environmental sustainability. The <a href="http://www.ipcc.ch/">Intergovernmental Panel on Climate Change</a> is issuing increasingly dire warnings about the effects climate change will have on human life on Earth – the beginnings of which are already unfolding. </p>
<p>I lead a major research initiative called <a href="https://sites.tufts.edu/digitalplanet/">The Digital Planet</a> at the Fletcher School at Tufts, where we study how technology is changing lives and livelihoods around the world. Here is an outline of how technology giants or nimble startups could help make Gates’s and Zuckerberg’s promises a reality.</p>
<h2>Identify a big hairy problem</h2>
<p>There is a long list of global problems to combat, including hunger, drought, poverty, bad health, polluted water and poor sanitation. One that’s connected to all the others is the recent <a href="http://www.ipcc.ch/report/sr15/">bombshell news</a> that climate change is accelerating: Over the next 20 years, Earth’s atmosphere will reach average temperatures as much as 2.7 degrees Fahrenheit above preindustrial levels. Consequently, extreme weather and natural disasters, food shortages, inundated coastlines and the near-elimination of coral reefs will likely happen even sooner than previously anticipated. </p>
<p>The scope of climate change gives <a href="https://theconversation.com/big-tech-isnt-one-big-monopoly-its-5-companies-all-in-different-businesses-92791">companies like Google, Facebook and Amazon</a> excellent opportunities to find specific approaches that would have meaningful effects.</p>
<h2>Trace the root causes</h2>
<p>There are, of course, many elements driving climate change. Consider the agriculture sector, which <a href="https://www.nature.com/news/one-third-of-our-greenhouse-gas-emissions-come-from-agriculture-1.11708">produces one-third</a> of all greenhouse gas emissions. Farms emit the <a href="https://www.nature.com/news/one-third-of-our-greenhouse-gas-emissions-come-from-agriculture-1.11708">largest share</a> and could benefit from a range of technologies, such as data analytics and artificial intelligence. As a bonus, innovating in agriculture could help <a href="https://unstats.un.org/sdgs/files/report/2018/TheSustainableDevelopmentGoalsReport2018.pdf">feed more people</a>. </p>
<h2>Identify how technology can make a big difference</h2>
<p>Technological tools could help farmers collect and use data to <a href="https://www.wri.org/blog/2013/10/farmer-innovation-improving-africa%E2%80%99s-food-security-through-land-and-water-management">manage their crops more precisely</a> in ways that would reduce greenhouse gas emissions – such as using less fertilizer and plowing and planting fields more efficiently. Specifically, better data on soil and plant health could help farmers know where they need to increase or decrease irrigation or pesticide and fertilizer use. These practices save farmers money and increase farms’ productivity, generating more food with less waste. </p>
<h2>Recognize how you can make money from it</h2>
<p>If companies are to get involved, there needs to be an opportunity to earn money – and the more, the better. </p>
<p>One estimate suggests that making changes in farming and food practices that enhance productivity, promote sustainable methods and reduce waste could produce <a href="http://report.businesscommission.org/uploads/BetterBiz-BetterWorld_170215_012417.pdf">commercial opportunities and new savings worth US$2.3 trillion</a> worldwide each year.</p>
<p><a href="https://sites.tufts.edu/digitalplanet/">Our research team</a>, in work that is ongoing, has estimated that of that $2.3 trillion a year, $250 billion could come from the application of artificial intelligence and other analytics for precision farming alone – $195 billion of which would be in the developing world, with $45.6 billion in South Asia and $13.4 billion in East Africa. Other estimates for the effects of AI and analytics are less specific, but still within the same range – <a href="https://www.mckinsey.com/featured-insights/artificial-intelligence/visualizing-the-uses-and-potential-impact-of-ai-and-other-analytics">between $164 billion and $486 billion</a> annually. There is indeed money to be made by technology companies interested in developing climate-friendly, productivity-improving interventions in agriculture.</p>
<h2>Innovate to overcome the many barriers to change</h2>
<p>Before the commercial value can be unlocked, however, there are many barriers to consider. Many rural areas, even in the developed world, <a href="http://www.pewresearch.org/fact-tank/2017/05/19/digital-gap-between-rural-and-nonrural-america-persists/">don’t have affordable high-speed internet connections</a>, and, particularly in the developing world, farmers are not as technology-savvy as members of other professions. Further, farming practices have been handed down through generations, and the idea of using data to modify such long-held beliefs and methods can be countercultural. </p>
<p>In addition, there are many practical realities: <a href="http://www.fao.org/docrep/005/Y3918E/y3918e10.htm">83 percent of the world’s cultivated land</a> is fed only by rain, with no irrigation systems to make use of better data. Beyond that, in most parts of the world, <a href="https://www.theguardian.com/sustainable-business/2015/feb/02/pioneer-firms-feed-world-agriculture-india-mozambique-profit">seeds and fertilizer are not high-quality</a>, lowering crop efficiency. Further, a lot of <a href="http://www.postharvest.org/home0.aspx">farms’ output is wasted</a> because of lack of refrigeration and slow transportation from fields to consumers.</p>
<p>With all those obstacles, it is understandable that investments in data-driven agriculture <a href="https://www.wsj.com/articles/why-big-data-hasnt-yet-made-a-dent-on-farms-1494813720">dropped 39 percent</a> from 2015 to 2016.</p>
<p>There are groups still working, though. <a href="https://www.microsoft.com/en-us/research/project/farmbeats-iot-agriculture/">FarmBeats</a> is a Microsoft project that combines low-cost sensors in the ground with drones that both create aerial maps and act as wireless data relay points. Nigeria’s <a href="http://zenvus.com/">Zenvus</a> and India’s <a href="http://www.aibono.com/">Aibono</a> analyze soil data. Kenya’s <a href="https://farmdrive.co.ke/">FarmDrive</a> develops credit scores for people without formal bank accounts or standard borrowing histories by using alternative data, like mobile phone and social media activity, together with local agricultural and economic information. Ghana’s <a href="https://farmerline.co/">Farmerline</a> tells farmers about weather forecasts, market information and financial tips. </p>
<p>These are creative efforts to solve deep and complex problems, but clearly there is room for large, well-resourced technology companies to step in and make a difference with big ideas, deep pockets and global support.</p>
<h2>Invest in partnerships</h2>
<p>Technology entrepreneurs will need to develop business models and organizational structures that are better at collaborating with local agricultural communities and businesses, and at navigating personal and political relationships as well as regulations and government programs. Technology will not, on its own, be a silver bullet that unlocks prosperity. </p>
<p>Turning technology companies into agents of widespread global good will not be easy – and the effort can extend to areas beyond agricultural innovation, too. </p>
<p>There has been no shortage of talk about these ideas: <a href="https://techcrunch.com/2018/05/23/50-tech-ceos-come-to-paris-to-talk-about-tech-for-good/">50 CEOs</a> met with French President Emmanuel Macron to discuss socially positive technologies; World Economic Forum events around the world discuss societal benefits of a <a href="https://www.weforum.org/about/the-fourth-industrial-revolution-by-klaus-schwab">Fourth Industrial Revolution</a>; and some companies, such as <a href="https://www.ericsson.com/en/about-us/sustainability-and-corporate-responsibility/sustainable-development-goals">Ericsson</a> and <a href="https://www.sap.com/dmc/exp/2018-01-unglobalgoals/">SAP</a>, are already committed to fulfilling <a href="https://sustainabledevelopment.un.org/?menu=1300">United Nations goals for global sustainability</a>. </p>
<p>We still have a long way to go. There is still a chance for technology companies to move fast and fix things by truly helping save the world – but sea levels are rising, so the time is now.</p>
<p class="fine-print"><em><span>Bhaskar Chakravorti has founded and directs the Institute for Business in the Global Context at Fletcher/Tufts that has received funding from Mastercard, Microsoft, the Gates Foundation and the Onassis Foundation. He is a Non-Resident Senior Fellow at Brookings India and a Senior Advisor on Digital Inclusion at the Mastercard Center for Inclusive Growth.
</span></em></p>Amazon, Facebook and Google have lofty goals for their effects on global society. But people around the world are still waiting for the positive results. Here’s what the tech giants could do.Bhaskar Chakravorti, Dean of Global Business, The Fletcher School, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/869832018-07-11T11:13:22Z2018-07-11T11:13:22ZSilicon Valley, from ‘heart’s delight’ to toxic wasteland<figure><img src="https://images.theconversation.com/files/221590/original/file-20180604-175451-enx1j4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Once lauded for their vision and promise, Silicon Valley giants have made life so hard for locals that residents regularly protest the companies, including their amenities like charter buses to save workers from the region's terrible traffic.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/San-Francisco-Tech-Bus-Protest/e1a36cfdc907429890f81e24bdb1f53a/1/0">AP Photo/Richard Jacobsen</a></span></figcaption></figure><p>There was a time when California’s Santa Clara Valley, bucolic home to orchards and vineyards, was known as “<a href="https://en.wikipedia.org/wiki/Santa_Clara_Valley#The_Valley_of_Heart's_Delight">the valley of heart’s delight</a>.” The same area was later dubbed “Silicon Valley,” shorthand for the high-tech combination of creativity, capital and California cool. However, a <a href="https://www.vanityfair.com/news/2017/11/has-the-silicon-valley-hype-cycle-finally-run-its-course">backlash</a> is now well <a href="https://www.theguardian.com/technology/2017/dec/22/tech-year-in-review-2017">underway</a> – even from the loyal <a href="https://www.cnet.com/news/the-honeymoon-is-over-in-silicon-valley-facebook-google-twitter/">gadget-reviewing press</a>. 
Silicon Valley increasingly conjures something very different: exploitation, excess, and <a href="https://www.sup.org/books/title/?id=1432">elitist detachment</a>. </p>
<p>Today there are <a href="https://ofmpub.epa.gov/apex/cimc/f?p=CIMC:LIST:18425648058001:::35:P35_STREET,P35_BF_ASSESS_IND,P35_BF_ASSESS_PILOT_IND,P35_BF_CLEANUP_IND,P35_BF_RLF_IND,P35_BF_RLF_PILOT_IND,P35_BF_128A_IND,P35_BF_TBA_IND,P35_FF_BRAC_IND,P35_FF_RCRA_IND,P35_FF_SF_IND,P35_RCRA_CURRENT_IND,P35_RCRA_REMEDY_SEL_IND,P35_RCRA_CONSTR_COMPLT_IND,P35_RCRA_REMEDY_COMPLT_IND,P35_RCRA_REMEDY_NYS_IND,P35_SF_NPL_CODE,P35_SF_NPL_CODE_F,P35_SF_NPL_CODE_D,P35_STIMULUS_SF_IND,P35_STIMULUS_BF_IND,P35_BF_MULTIPURPOSE_IND,P35_BF_AWP_IND,P35_FD1,P35_FD2,P35_FD3,P35_FD4,P35_State_code,P35_county_name,P35_BASIC_QUERY:,,,,,,,,,,,,,,,,P,F,,,,,,,,,,California,Santa%20Clara,(SF_NPL_CODE=%27P%27)OR(SF_NPL_CODE_F=%27F%27)">23 active Superfund toxic waste cleanup sites</a> in Santa Clara County, California. <a href="https://blog.valerieaurora.org/2018/01/17/getting-free-of-toxic-tech-culture/">Its culture is equally unhealthy</a>: Think of the <a href="https://www.wired.com/2014/10/the-secret-about-gamergate-is-that-it-cant-stop-progress/">Gamergate misogynist harassment campaigns</a>, the entitled “<a href="https://qz.com/622452/tech-bros-and-their-sense-of-entitlement-will-be-silicon-valleys-undoing/">tech bros</a>” and <a href="https://theconversation.com/what-the-google-gender-manifesto-really-says-about-silicon-valley-82236">rampant sexism and racism</a> in Silicon Valley firms. These same companies demean the online public with <a href="https://theconversation.com/big-data-security-problems-threaten-consumers-privacy-54798">privacy breaches</a> and <a href="https://theconversation.com/how-cambridge-analyticas-facebook-targeting-model-really-worked-according-to-the-person-who-built-it-94078">unauthorized</a> <a href="https://theconversation.com/understanding-facebooks-data-crisis-5-essential-reads-94066">sharing</a> of <a href="https://www.nytimes.com/2018/06/04/technology/facebook-device-partnerships-criticized.html">users’ data</a>. 
Thanks to the companies’ influence, it’s <a href="http://www.businessinsider.com/tech-workers-cant-afford-silicon-valley-housing-prices-2018-2">extremely expensive to live in the area</a>. And traffic is so clogged that there are <a href="https://www.bbc.com/news/technology-42738709">special buses bringing tech-sector workers</a> to and from their jobs. Some critics even perceive <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">threats</a> to <a href="https://theconversation.com/when-will-google-defend-democracy-96838">democracy</a> itself. </p>
<p>In a word, Silicon Valley has become toxic.</p>
<p>Silicon Valley’s rise is well documented, but the backlash against its distinctive culture and unscrupulous corporations hints at an imminent twist in its fate. As <a href="https://scholar.google.com/citations?user=pLxPBeQAAAAJ&hl=en&oi=sra">historians of technology</a> <a href="https://scholar.google.com/citations?user=BQE9RXgAAAAJ&hl=en">and industry</a>, we find it helpful to step back from the breathless champions and critics of Silicon Valley and think about the long term. The rise and fall of another American economic powerhouse – Detroit – can help explain how regional reputations change over time.</p>
<h2>The rise and fall of Detroit</h2>
<p>The city of Detroit became a famous node of industrial capitalism thanks to the pioneers of the automotive age. Men such as Henry Ford, Horace and John Dodge, and William Durant cultivated Detroit’s image as a center of technical novelty in the early 20th century. </p>
<p>The very name “Detroit” soon became a metonym for the industrial might of the American automotive industry and the <a href="https://www.history.com/how-detroit-won-world-war-ii">source of American military power</a>. General Motors President Charles E. Wilson’s remark, “<a href="https://blogs.loc.gov/inside_adams/2016/04/when-a-quote-is-not-exactly-a-quote-general-motors/">For years I thought what was good for our country</a> was good for General Motors, and vice versa,” was an arrogant but accurate account of Detroit’s place at the heart of American prosperity and global leadership.</p>
<p>The public’s view changed after the 1950s. The auto industry’s leading firms slid into bloated bureaucratic rigidity and <a href="http://www.autolife.umd.umich.edu/Race/R_Overview/R_Overview4.htm">lost ground to foreign competitors</a>. By the 1980s, Detroit was the image of blown-out, depopulated <a href="https://www.imdb.com/title/tt0098213/">post-industrialism</a>. </p>
<p>In retrospect – and perhaps as a cautionary tale for Silicon Valley – the moral decline of Detroit’s elite was evident long before its <a href="https://www.strongtowns.org/journal/2018/6/12/the-psychology-of-decline">economic decline</a>. Henry Ford became famous in the pre-war era for the cars and trucks that carried his name, but he was also an <a href="https://www.pbs.org/wgbh/americanexperience/features/henryford-antisemitism/">anti-Semite, proto-fascist and notorious enemy of organized labor</a>. Detroit also was the source of defective and deadly products that Ralph Nader criticized in 1965 as “<a href="https://www.history.com/this-day-in-history/unsafe-at-any-speed-hits-bookstores">unsafe at any speed</a>.” Residents of the region now <a href="https://theconversation.com/detroits-recovery-the-glass-is-half-full-at-most-69752">bear the costs of its amoral industrial past</a>, beset with high unemployment and <a href="https://theconversation.com/piping-as-poison-the-flint-water-crisis-and-americas-toxic-infrastructure-53473">poisonous drinking water</a>.</p>
<h2>A new chapter for Silicon Valley</h2>
<p>If the story of Detroit can be simplified as industrial prowess and national prestige, followed by moral and economic decay, what does that say about Silicon Valley? The term “<a href="http://www.computerhistory.org/atchm/who-named-silicon-valley/">Silicon Valley</a>” first appeared in print in the early 1970s and gained widespread use throughout the decade. It combined both place and activity. The Santa Clara Valley, a relatively small area south of the San Francisco Bay, home to San Jose and a few other small cities, was the base for a computing revolution based on <a href="https://theconversation.com/beyond-silicon-the-search-for-new-semiconductors-55795">silicon chips</a>. Companies and workers flocked to the Bay Area, seeking a <a href="https://theconversation.com/how-silicon-valley-industry-polluted-the-sylvan-california-dream-85810">pleasant climate, beautiful surroundings and affordable land</a>.</p>
<p>By the 1980s, venture capitalists and <a href="https://mitpress.mit.edu/books/making-silicon-valley">companies</a> in the Valley had mastered the silicon arts and were getting filthy, stinking rich. This was when “Silicon Valley” became shorthand for an <a href="https://www.sup.org/books/title/?id=654">industrial cluster</a> where universities, entrepreneurs and capital markets fueled technology-based economic development. <a href="https://archive.org/details/valleyofheartsde00malo">Journalists fawned</a> over successful companies like Intel, Cisco and Google, and analysts filled shelves with books and reports about how other regions could become the “<a href="https://press.princeton.edu/titles/7859.html">next Silicon Valley</a>.” </p>
<p>Many concluded that its culture set it apart. Boosters and publications like Wired magazine celebrated the combination of the <a href="http://www.press.uchicago.edu/ucp/books/book/chicago/F/bo3773600.html">Bay Area hippie legacy</a> with the <a href="https://www.worldcat.org/title/cyberselfish-a-critical-romp-through-the-terribly-libertarian-culture-of-high-tech/oclc/898998860">libertarian individualism</a> embodied by the late Grateful Dead lyricist <a href="https://www.eff.org/cyberspace-independence">John Perry Barlow</a>. The libertarian myth masked some crucial elements of Silicon Valley’s success – especially <a href="https://cup.columbia.edu/book/the-cold-war-and-american-science/9780231522205">public funds</a> dispersed through the U.S. Defense Department and Stanford University.</p>
<p>In retrospect, perhaps that ever-expanding gap between Californian dreams and American realities led to the undoing of Silicon Valley. Its detachment from the lives and concerns of ordinary Americans can be seen today in the <a href="https://www.nytimes.com/2018/05/24/business/elon-musk-tesla-twitter-media.html">unhinged Twitter rants</a> of automaker Elon Musk, the <a href="https://www.nytimes.com/2017/01/11/fashion/peter-thiel-donald-trump-silicon-valley-technology-gawker.html">extreme politics of PayPal co-founder Peter Thiel</a>, and the <a href="https://www.wired.com/2008/03/ff-kurzweil/">fatuous dreams of immortality</a> of Google’s vitamin-popping director of engineering, Ray Kurzweil. Silicon Valley’s moral decline has never been clearer, and it now struggles to survive the toxic mess it has created.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Big technology firms are becoming known for mistreating workers, customers and society as a whole. Is an economic powerhouse about to collapse like Detroit did years ago?Andrew L. Russell, Dean, College of Arts & Sciences; Professor of History, SUNY Polytechnic InstituteLee Vinsel, Assistant Professor of Science and Technology Studies, Virginia TechLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/946332018-04-10T17:39:27Z2018-04-10T17:39:27ZHow you helped create the crisis in private data<figure><img src="https://images.theconversation.com/files/213962/original/file-20180409-114128-1i5qiz9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What role did you play?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/cartoon-hand-pointing-finger-punching-hole-501267073">Composite of Christos Georghiou and sdecoret/Shutterstock.com</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>As Facebook’s <a href="https://www.axios.com/read-mark-zuckerberg-testimony-for-congress-1523288674-4ec25015-b37c-4c9e-b367-fd55f9e227f4.html">Mark Zuckerberg testifies</a> <a href="https://www.npr.org/sections/thetwo-way/2018/04/04/599394175/facebooks-mark-zuckerberg-will-testify-in-congress-on-april-11">before Congress</a>, he’s likely wondering how his company got to the point where he must submit to public questioning. It’s worth pondering how we, the Facebook-using public, got here too.</p>
<p>The scandal in which <a href="https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html">Cambridge Analytica harvested data from millions of Facebook users</a> to craft and target advertising for Donald Trump’s presidential campaign has <a href="https://www.opendemocracy.net/marcus-gilroy-ware/cambridge-analytica-outrage-is-real-story">provoked broad outrage</a>. More helpfully, it has exposed the powerful yet perilous role of data in U.S. society.</p>
<p>Repugnant as its methods were, Cambridge Analytica did not create this crisis on its own. As I argue in my forthcoming book, “<a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674737501">The Known Citizen: A History of Privacy in Modern America</a>,” big corporations (in this case, Facebook) and political interests (in this case, right-wing parties and campaigns) but also ordinary Americans (social media users, and thus likely you and me) all had a hand in it.</p>
<h2>The allure of aggregate data</h2>
<p>Businesses and governments have led the way. <a href="https://cup.columbia.edu/book/creditworthy/9780231168083">As long ago as the 1840s</a>, credit-lending firms understood the profits to be made from customers’ financial reputations. These precursors of <a href="https://theconversation.com/equifax-breach-is-a-reminder-of-societys-larger-cybersecurity-problems-84034">Equifax</a>, Experian and TransUnion eventually became enormous <a href="https://www.nytimes.com/1991/06/22/news/credit-bureaus-draw-fire-for-misuse-of-data.html">clearinghouses of personal data</a>.</p>
<p>For its part, the federal government, from the <a href="https://yalebooks.yale.edu/book/9780300195422/american-census">earliest census in 1790</a> to the creation of New Deal social welfare programs, has long relied on <a href="https://press.princeton.edu/titles/7183.html">aggregate as well as individual data</a> to distribute resources and administer benefits. For example, a person’s individual Social Security payments depend in part on <a href="https://www.ssa.gov/oact/cola/colasummary.html">changes in the overall cost of living</a> across the country.</p>
<p>Police forces and national security analysts, too, gathered fingerprints and other data in the name of <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674010024">social control</a>. Today, they employ some of the same methods as commercial data miners to profile criminals or terrorists, <a href="https://nyupress.org/books/9781479892822/">crafting ever-tighter nets of detection</a>. State-of-the-art public safety tools include access to social media accounts, online photographs, geolocation information and <a href="https://www.washingtonpost.com/world/national-security/surveillance-footage-cellphone-data-led-to-takedown-of-austin-bomber-officials-say/2018/03/21/04a8a91e-2d42-11e8-b0b0-f706877db618_story.html">cell tower data</a>.</p>
<h2>Probing the personal</h2>
<p>The search for better data in the 20th century often meant delving into individuals’ most personal, intimate lives. To that end, marketers, strategists and behavioral researchers conducted <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674027428">increasingly sophisticated surveys, polls</a> and <a href="http://www.orbooks.com/catalog/divining-desire-liza-featherstone/">focus groups</a>. They identified <a href="https://www.nytimes.com/2004/02/15/magazine/the-very-very-personal-is-the-political.html">effective ways</a> to reach specific customers and voters – and often, to <a href="https://yalebooks.yale.edu/book/9780300188011/daily-you">influence their behaviors</a>.</p>
<p>In the middle of the last century, for example, motivational researchers sought psychological knowledge about consumers in the hopes of subconsciously influencing them through <a href="http://www.upenn.edu/pennpress/book/14747.html">subliminal advertising</a>. Those probes into consumers’ personalities and desires foreshadowed <a href="https://theconversation.com/how-cambridge-analyticas-facebook-targeting-model-really-worked-according-to-the-person-who-built-it-94078">Cambridge Analytica’s pitch to commercial and political clients</a> – using data, as its website proudly proclaims, “<a href="https://cambridgeanalytica.org">to change audience behavior</a>.”</p>
<p>Citizens were not just unwitting victims of these schemes. People have regularly, and willingly, revealed details about themselves in the name of security, convenience, health, social connection and self-knowledge. Despite <a href="http://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/">rising public concerns about privacy and data insecurity</a>, large numbers of Americans still find benefits in releasing their data to government and commercial enterprises, whether through <a href="https://gizmodo.com/e-zpass-is-the-best-tracking-device-thats-already-in-y-1308535900">E-ZPasses</a>, <a href="https://www.theguardian.com/world/2018/jan/28/fitness-tracking-app-gives-away-location-of-secret-us-army-bases">Fitbits</a> or <a href="http://www.businessinsider.com/three-ways-social-media-is-tracking-you-2015-5">Instagram</a> posts.</p>
<h2>Revealing ourselves</h2>
<p>It is perhaps particularly appropriate that the Facebook scandal bloomed from a personality test app, “This is your digital life.” For decades, <a href="http://www.simonandschuster.com/books/The-Cult-of-Personality-Testing/Annie-Murphy-Paul/9780743280723">human relations departments and popular magazines</a> have urged Americans to yield private details, and harness the power of aggregate data, to better understand themselves. But in most situations, people weren’t consciously trading privacy for that knowledge. </p>
<p>In the linked and data-hungry internet age, however, those volunteered pieces of information take on lives of their own. Individual responses from <a href="http://www.cbc.ca/news/technology/facebook-cambridge-analytica-friends-api-by-design-1.4583337">270,000 people</a> on this particular test became a gateway to more data, including that belonging to <a href="https://uk.reuters.com/article/uk-facebook-privacy/facebook-says-data-leak-hits-87-million-users-widening-privacy-scandal-idUKKCN1HB2CK">another 87 million of their friends</a>. </p>
<p>Today, data mining corporations, political operatives and others seek data everywhere, hoping to turn that information to their own advantage. As Cambridge Analytica’s actions revealed, those groups will use data for startling purposes – such as targeting very specific groups of voters with <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">highly customized messages</a> – even if it means violating the policies and professed intentions of one of the <a href="https://www.wired.com/story/mark-zuckerberg-talks-to-wired-about-facebooks-privacy-problem/">most powerful corporations on the planet</a>.</p>
<p>The benefits of aggregate data help explain why it has been so difficult to enact rigorous <a href="https://theconversation.com/fragmented-us-privacy-rules-leave-large-data-loopholes-for-facebook-and-others-94606">privacy laws in the U.S.</a> As government and corporate data-gathering efforts swelled over the last century, citizens largely accepted, without much discussion or protest, that their society would be fueled by the collection of personal information. In this sense, we have all – regular individuals, government agencies and corporations like Facebook – collaborated to create the present crisis around private data.</p>
<p>But as Zuckerberg’s summons to Washington suggests, people are beginning to grasp that <a href="https://www.theguardian.com/technology/2018/jan/31/facebook-profit-mark-zuckerberg">Facebook’s enormous profits</a> exploit the value of their information and come at the price of their privacy. By making the risks of this arrangement clear, Cambridge Analytica may have done some good after all.</p>
<p class="fine-print"><em><span>Sarah Igo does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
The current reckoning with data has been a long time coming, a historian of privacy in the US writes.
Sarah Igo, Associate Professor of History; Associate Professor of Political Science; Associate Professor of Sociology; Associate Professor of Law, Vanderbilt University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/94245
2018-04-04T10:43:22Z
2018-04-04T10:43:22Z
Resisting technology, Appalachian style
<figure><img src="https://images.theconversation.com/files/213001/original/file-20180403-189798-1luyzx5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Each has its own merits, even in a technology-centric world.</span> <span class="attribution"><span class="source">The Conversation from Shutterstock images by heinsbergsphotos, jannoon028, Troy Kellogg</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure>
<p>When people hear “Appalachia,” stereotypes and even slurs often immediately jump to mind, words like “backwards,” “ignorant,” “hillbilly” or “yokel.” But Appalachian attitudes about technology’s role in daily life are extremely sophisticated – and turn out to be both insightful and useful in a technology-centric society.</p>
<p>Many Americans tend to view Appalachian life as involving deprivation and deficit. This can be particularly pointed regarding technology: Rural residents are frequently neglected in research on technology use, and where they are included, the data usually focus on the <a href="http://www.pewresearch.org/fact-tank/2017/05/19/digital-gap-between-rural-and-nonrural-america-persists/17-04-20_ruraltechuse-1/">lower rates of ownership and use</a> of smartphones and laptop computers in rural areas. Articles can come across as scholars and reporters <a href="https://doi.org/10.1080/08900520802221946">saying something like</a>, “Poor rural Appalachians – they don’t even own the newest iPhone!”</p>
<p>It’s true that many <a href="https://theconversation.com/reaching-rural-america-with-broadband-internet-service-82488">rural areas aren’t served</a> with the fastest broadband and the most robust cellular coverage in the U.S. But in the wake of the <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">Cambridge Analytica scandal</a> in which the data from an estimated 50 million Facebook users were used to <a href="https://theconversation.com/how-cambridge-analyticas-facebook-targeting-model-really-worked-according-to-the-person-who-built-it-94078">craft and inform online political advertising</a>, it’s worth considering whether people in Appalachia are deprived of the benefits of technology – or if they’re protecting themselves from harmful effects of its misuse.</p>
<h2>Skepticism and caution</h2>
<p>In a recent study, my colleagues and I used focus groups and interviews to explore <a href="http://ijoc.org/index.php/ijoc/article/view/7052">how people use technology in rural Appalachia</a>. These open-ended methods allow participants to discuss their experiences and opinions in their own terms. For instance, most technology surveys don’t ask people why they don’t own the latest phone or computer – they just <a href="http://www.pewresearch.org/fact-tank/2017/05/19/digital-gap-between-rural-and-nonrural-america-persists/">assume people would if they could</a>.</p>
<p>Those studies miss key insights our research was able to identify and explore. When we gave people a chance to tell their own stories about technology, we most often heard about two themes.</p>
<p>The first, which we called “resistance,” appeared in people’s doubts about the concept that more technology is always better. They also carefully considered whether the potential usefulness of new technologies was worth the privacy sacrifices inherently required to use them.</p>
<p>People also described their intentional choices about how much technology to use and for what purposes – as well as intentional choices not to use technology in some situations. We called this theme “navigation.”</p>
<h2>Using humor to express concerns</h2>
<p>In addition, our research identified ways that common Appalachian values of self-deprecating humor, privacy and self-reliance are involved in how people in that region view and use technology.</p>
<p>Humor, for instance, can be a useful tool to resist unwelcome intrusions of technology. The <a href="https://www.worldcat.org/title/laughter-in-appalachia-a-festival-of-southern-mountain-humor/oclc/833697529">best Appalachian humor</a> involves intelligently poking fun at the joke-teller, which is not always well understood by outsiders influenced by demeaning stereotypes.</p>
<p>A woman in one of our groups told the story of being offered what was billed as an “upgrade” from a basic cellphone to a smartphone; her reply was, “No, I don’t want anything smarter than I am.” </p>
<p>That response is properly understood as resistance through humor. As she stated, “I have all that I need.” Many participants in our study expressed doubt that it really could be a lifestyle “upgrade” to have <a href="http://www.businessinsider.com/siri-new-always-on-feature-has-privacy-implications-2015-9">a phone “listening” to their conversations</a> or to have multiple companies <a href="https://theconversation.com/your-mobile-phone-can-give-away-your-location-even-if-you-tell-it-not-to-65443">tracking their location with their apps</a> every minute of every day. </p>
<p>Another woman expressed dismay about another form of corporate monitoring, describing “when you’re shopping for a cutting board on Amazon … and then when I sign on Facebook, every single ad on the sidebar is for a cutting board. … That freaks me out a lot more than other things.”</p>
<p>In a related vein, an adolescent girl expressed dismay about people who excessively document their lives: “I don’t want people messaging me every five minutes. Like, ‘Oh, look at my new selfie!’ You sent me a selfie 10 seconds ago. I don’t need another selfie of you. I see you enough at school.” </p>
<p>Some people, of course, might find benefits in those technologies. But it’s unfair to brand Appalachian skepticism as ignorance – particularly when it questions the corporations who are <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">selling full-time surveillance</a> of the general population for a considerable profit. </p>
<h2>Resistance is not ignorance</h2>
<p>Instead, it would be better to think of this resistance as an integral part of Appalachian culture. One man who was himself a pretty active user of technology noted that not everyone in his community is like him, saying, “Now, there are a lot of people that are my age that have not gone into the digital age … and I respect that, because they don’t want their privacy issues dealt with like that.” </p>
<p>People from New York or Chicago who use technology every day may never have stopped to form an opinion about people who choose to minimize their involvement with modern technology. They may not even know anyone like that. Or they may fall prey to the mainstream narrative that it’s acceptable to gently mock grandparents and <a href="https://www.theguardian.com/commentisfree/2014/nov/25/how-do-you-explain-the-internet-to-your-grandmother">others who do not use email or smartphones</a>. This man expresses understanding and esteem for technology resistance by his neighbors, even as he acknowledges that he is not a resister.</p>
<h2>Being mindful of technology choices</h2>
<p>The participants who used smartphones, tablets or computers reported more than 50 strategies they use to stay safe online, such as refusing suspicious contacts, restricting the information that they posted online or avoiding public Wi-Fi. Far from being passive or ignorant, they made many intentional choices about how to handle technology when they did choose to use it. </p>
<p>Not all of this is Appalachian-specific; I’m not sure there is a culturally specific way to <a href="https://www.lifewire.com/how-to-delete-cookies-2617981">delete browser cookies</a>. </p>
<p>But some of their responses did show evidence of Appalachian attributes, like self-reliance. The most extreme story we heard was probably also the most illuminating. A man reported posing as an FBI agent in order to intimidate online con artists who were trying to scam him. In the focus group, no one mentioned that it was illegal to <a href="https://archives.fbi.gov/archives/philadelphia/press-releases/2013/york-man-charged-with-impersonating-fbi-agent">pose as an FBI agent</a>. Nobody even expressed concern about the man’s safety while he was harassing criminals. In fact, another man said he had done something similar, and a woman asked for more information about how to track the location of con artists!</p>
<p>There are lessons for everyone in these stories of Appalachian resistance to and navigation of modern technology: Be a little more skeptical about whether these giant corporations really have your best interests at heart, and whether your life is really better as a result of <a href="https://theconversation.com/with-teen-mental-health-deteriorating-over-five-years-theres-a-likely-culprit-86996">all that time you spend on your smartphone</a>. Demand more transparent privacy settings and tell your politicians that you want to be able to opt out of this kind of <a href="https://theconversation.com/big-data-security-problems-threaten-consumers-privacy-54798">data sharing</a>. <a href="https://news.usc.edu/135603/arianna-huffington-tells-students-to-turn-off-their-phones-and-get-more-sleep/">Turn your phone off</a> every once in a while. Maybe venture outside your home <a href="https://www.independent.co.uk/life-style/gadgets-and-tech/news/smartphone-separation-anxiety-nomophobia-why-feel-bad-no-phone-personalised-technology-a7896591.html">without a phone</a>. </p>
<p>The next time an app wants to <a href="https://theconversation.com/your-mobile-phone-can-give-away-your-location-even-if-you-tell-it-not-to-65443">track your location full-time</a> or a Facebook quiz asks for permission to access the personal information of <a href="https://www.wired.com/story/cambridge-analytica-50m-facebook-users-data/">you and all your Facebook friends</a>, try to get in touch with your inner Appalachian.</p>
<p class="fine-print"><em><span>This research was supported by the Digital Trust Foundation. </span></em></p>
People in Appalachia are skeptical and cautious around technology – and how they think can be useful and instructive for living in a tech-centric world.
Sherry Hamby, Research Professor of Psychology; Director of the Life Paths Appalachian Research Center, Sewanee: The University of the South
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/94078
2018-03-30T11:03:22Z
2018-03-30T11:03:22Z
How Cambridge Analytica’s Facebook targeting model really worked – according to the person who built it
<figure><img src="https://images.theconversation.com/files/212677/original/file-20180329-189824-1lbooac.jpg?ixlib=rb-1.1.0&rect=7%2C1197%2C4971%2C3002&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How accurately can you be profiled online?</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/laptop-shooting-target-arrows-on-screen-795280663">Andrew Krasovitckii/Shutterstock.com</a></span></figcaption></figure>
<p>The researcher whose work is at the center of the <a href="https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html">Facebook-Cambridge Analytica data analysis and political advertising uproar</a> has revealed that his method worked much like the one <a href="https://medium.com/netflix-techblog/netflix-recommendations-beyond-the-5-stars-part-1-55838468f429">Netflix uses to recommend movies</a>. </p>
<p>In an email to me, Cambridge University scholar Aleksandr Kogan explained how his statistical model processed Facebook data for Cambridge Analytica. The accuracy he claims suggests it works about as well as <a href="https://www.cambridge.org/core/books/hacking-the-electorate/C0D269F47449B042767A51EC512DD82E">established voter-targeting methods</a> based on demographics like race, age and gender.</p>
<p>If confirmed, Kogan’s account would mean the digital modeling Cambridge Analytica used was <a href="https://www.youtube.com/watch?v=APqU_EJ5d3U">hardly the virtual crystal ball</a> <a href="https://techcrunch.com/2018/03/23/facebook-knows-literally-everything-about-you/">a few have claimed</a>. Yet the numbers Kogan provides <a href="https://civichall.org/civicist/will-the-real-psychometric-targeters-please-stand-up/">also show</a> what is – and isn’t – <a href="https://www.washingtonpost.com/news/monkey-cage/wp/2018/03/23/four-and-a-half-reasons-not-to-worry-that-cambridge-analytica-skewed-the-2016-election/">actually possible</a> by <a href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/">combining personal data</a> <a href="https://www.nbcnews.com/politics/politics-news/cambridge-analytica-s-effectiveness-called-question-despite-alleged-facebook-data-n858256">with machine learning</a> for political ends.</p>
<p>Regarding one key public concern, though, Kogan’s numbers suggest that information on users’ personalities or “<a href="https://www.vox.com/science-and-health/2018/3/23/17152564/cambridge-analytica-psychographic-microtargeting-what">psychographics</a>” was just a modest part of how the model targeted citizens. It was not a personality model strictly speaking, but rather one that boiled down demographics, social influences, personality and everything else into a big correlated lump. This soak-up-all-the-correlation-and-call-it-personality approach seems to have created a valuable campaign tool, even if the product being sold wasn’t quite as it was billed.</p>
<h2>The promise of personality targeting</h2>
<p>In the wake of the revelations that Trump campaign consultants Cambridge Analytica used <a href="https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html">data from 50 million Facebook users</a> to target digital political advertising during the 2016 U.S. presidential election, Facebook has <a href="https://www.nasdaq.com/symbol/fb/stock-report">lost billions in stock market value</a>, governments on <a href="https://www.theverge.com/2018/3/19/17141138/facebook-cambridge-analytica-uk-authorities-warrant-data-breach">both sides of the Atlantic</a> have <a href="https://www.pbs.org/newshour/politics/federal-trade-commission-to-investigate-facebook-as-companys-stock-value-sinks">opened investigations</a>, and a nascent <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">social movement</a> is calling on users to <a href="https://twitter.com/search?q=%23deletefacebook">#DeleteFacebook</a>.</p>
<p>But a key question has remained unanswered: Was Cambridge Analytica really able to effectively target campaign messages to citizens based on their personality characteristics – or even their “<a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">inner demons</a>,” as a company whistleblower alleged? </p>
<p>If anyone would know what Cambridge Analytica did with its massive trove of Facebook data, it would be Aleksandr Kogan and Joseph Chancellor. It was <a href="https://www.reuters.com/article/us-facebook-cambridge-analytica/trump-consultants-harvested-data-from-50-million-facebook-users-reports-idUSKCN1GT02Y">their startup Global Science Research</a> that collected profile information from <a href="https://www.wired.com/story/cambridge-analytica-50m-facebook-users-data/">270,000 Facebook users and tens of millions of their friends</a> using a personality test app called “thisisyourdigitallife.”</p>
<p>Part of <a href="https://scholar.google.com/citations?user=igL-0AsAAAAJ&hl=en">my own research</a> focuses on understanding <a href="https://doi.org/10.1177/0002716215570279">machine learning</a> methods, and <a href="https://www.amazon.com/Internet-Trap-Monopolies-Undermines-Democracy/dp/0691159262/">my forthcoming book</a> discusses how digital firms use recommendation models to build audiences. I had a hunch about how Kogan and Chancellor’s model worked.</p>
<p>So I emailed Kogan to ask. Kogan is still a <a href="https://www.bloomberg.com/news/articles/2018-03-20/meet-the-psychologist-at-the-center-of-facebook-s-data-scandal">researcher at Cambridge University</a>; his collaborator <a href="https://www.theguardian.com/news/2018/mar/18/facebook-cambridge-analytica-joseph-chancellor-gsr">Chancellor now works at Facebook</a>. In a remarkable display of academic courtesy, Kogan answered. </p>
<p>His response requires some unpacking, and some background.</p>
<h2>From the Netflix Prize to “psychometrics”</h2>
<p>Back in 2006, when it was still a DVD-by-mail company, Netflix offered a <a href="https://www.netflixprize.com/">reward of $1 million</a> to anyone who developed a better way to make predictions about users’ movie ratings than the company already had. A surprise top competitor was an <a href="https://www.kdnuggets.com/news/2007/n08/3i.html">independent software developer using the pseudonym Simon Funk</a>, whose basic approach was ultimately incorporated into all the top teams’ entries. Funk adapted a technique called “<a href="http://www.aclweb.org/anthology/E06-1013">singular value decomposition</a>,” condensing users’ ratings of movies into a <a href="https://www.youtube.com/watch?v=P5mlg91as1c">series of factors or components</a> – essentially a set of inferred categories, ranked by importance. As Funk <a href="http://sifter.org/simon/journal/20061027.2.html">explained in a blog post</a>,</p>
<blockquote>
<p>“So, for instance, a category might represent action movies, with movies with a lot of action at the top, and slow movies at the bottom, and correspondingly users who like action movies at the top, and those who prefer slow movies at the bottom.”</p>
</blockquote>
<p>Factors are artificial categories, which are not always like the kind of categories humans would come up with. The <a href="http://sifter.org/simon/journal/20061027.2.html">most important factor in Funk’s early Netflix model</a> was defined by users who loved films like “Pearl Harbor” and “The Wedding Planner” while also hating movies like “Lost in Translation” or “Eternal Sunshine of the Spotless Mind.” His model showed how machine learning can find correlations among groups of people, and groups of movies, that humans themselves would never spot.</p>
<p>Funk’s general approach used the 50 or 100 most important factors for both users and movies to make a decent guess at how every user would rate every movie. This method, often called <a href="https://en.wikipedia.org/wiki/Dimensionality_reduction">dimensionality reduction</a> or matrix factorization, was not new. Political science researchers had shown that <a href="https://en.wikipedia.org/wiki/NOMINATE_(scaling_method)">similar techniques using roll-call vote data</a> could predict the votes of members of Congress with 90 percent accuracy. In psychology the “<a href="https://doi.org/10.1037/0003-066X.48.1.26">Big Five</a>” model had also been used to predict behavior by clustering together personality questions that tended to be answered similarly.</p>
<p>Still, Funk’s model was a big advance: It allowed the technique to work well with huge data sets, even those with lots of missing data – like the Netflix dataset, where a typical user rated only a few dozen films out of the thousands in the company’s library. More than a decade after the Netflix Prize contest ended, <a href="https://doi.org/10.1145/1401890.1401944">SVD-based methods</a>, or <a href="https://doi.org/10.1109/ICDM.2008.22">related models for implicit data</a>, are still the tool of choice for many websites to predict what users will read, watch, or buy. </p>
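<p>A stripped-down version of Funk’s approach fits in a few lines of Python. This is only an illustrative sketch with invented ratings, not the actual Netflix Prize code: it learns a small factor vector for each user and each film by stochastic gradient descent over the observed ratings only, which is how this family of models copes with mostly-missing data.</p>

```python
import random

# Invented toy ratings on a 1-5 scale; most (user, film) pairs are missing,
# mirroring the sparse Netflix setting described above.
ratings = {
    ("ann", "pearl_harbor"): 5, ("ann", "wedding_planner"): 4,
    ("ann", "lost_in_translation"): 1,
    ("bob", "pearl_harbor"): 4, ("bob", "eternal_sunshine"): 2,
    ("cam", "lost_in_translation"): 5, ("cam", "eternal_sunshine"): 5,
    ("cam", "pearl_harbor"): 1,
    ("dee", "wedding_planner"): 5, ("dee", "pearl_harbor"): 4,
}
users = sorted({u for u, _ in ratings})
films = sorted({f for _, f in ratings})

K = 2                    # number of latent factors
LR, REG = 0.05, 0.02     # learning rate and regularization strength
random.seed(0)
u_vec = {u: [random.uniform(-0.1, 0.1) for _ in range(K)] for u in users}
f_vec = {f: [random.uniform(-0.1, 0.1) for _ in range(K)] for f in films}

def predict(u, f):
    """Predicted rating is the dot product of the two factor vectors."""
    return sum(a * b for a, b in zip(u_vec[u], f_vec[f]))

# Stochastic gradient descent on the observed ratings only.
for _ in range(2000):
    for (u, f), r in ratings.items():
        err = r - predict(u, f)
        for k in range(K):
            uk, fk = u_vec[u][k], f_vec[f][k]
            u_vec[u][k] += LR * (err * fk - REG * uk)
            f_vec[f][k] += LR * (err * uk - REG * fk)

# Guess a rating for a pair that was never observed during training.
print(round(predict("dee", "lost_in_translation"), 2))
```

<p>The guess for the unseen pair is driven entirely by correlations with users whose observed tastes resemble dee’s – the model is never told anything about the films themselves.</p>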
<p>These models can predict other things, too.</p>
<h2>Facebook knows if you are a Republican</h2>
<p>In 2013, Cambridge University researchers Michal Kosinski, David Stillwell and Thore Graepel published an article on the <a href="https://doi.org/10.1073/pnas.1218772110">predictive power of Facebook data</a>, using information gathered through an online personality test. Their initial analysis was nearly identical to that used on the Netflix Prize, using SVD to categorize both users and things they “liked” into the top 100 factors. </p>
<p>The paper showed that a factor model made with users’ Facebook “likes” alone was <a href="https://doi.org/10.1073/pnas.1218772110">95 percent accurate</a> at distinguishing between black and white respondents, 93 percent accurate at distinguishing men from women, and 88 percent accurate at distinguishing people who identified as gay men from men who identified as straight. It could even correctly distinguish Republicans from Democrats 85 percent of the time. It was also useful, though not as accurate, for <a href="https://doi.org/10.1073/pnas.1218772110">predicting users’ scores</a> on the “Big Five” personality test. </p>
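<p>The mechanics behind such results can be illustrated with a toy example. The sketch below uses entirely invented “likes” data and plain-Python power iteration rather than the researchers’ actual code: it extracts the top factor of a mean-centered likes matrix and shows that users’ scores on that factor separate two groups the model was never told about – which is exactly why such factors can predict traits like party affiliation.</p>

```python
# Invented 0/1 "likes" matrix: rows are six users, columns are four pages.
# Users 0-2 mostly like pages 0-1; users 3-5 mostly like pages 2-3.
likes = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [1, 0, 1, 1],
]
n_users, n_pages = len(likes), len(likes[0])

# Mean-center each column so the top factor captures taste, not raw activity.
means = [sum(row[j] for row in likes) / n_users for j in range(n_pages)]
X = [[likes[i][j] - means[j] for j in range(n_pages)] for i in range(n_users)]

# Power iteration for the leading factor (top right singular vector of X).
v = [1.0, 0.0, 0.0, 0.0]
for _ in range(500):
    Xv = [sum(X[i][j] * v[j] for j in range(n_pages)) for i in range(n_users)]
    w = [sum(X[i][j] * Xv[i] for i in range(n_users)) for j in range(n_pages)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

# Each user's score on the factor: the two groups land on opposite sides
# of zero, even though the groups were never labeled anywhere in the data.
scores = [sum(X[i][j] * v[j] for j in range(n_pages)) for i in range(n_users)]
print([round(s, 2) for s in scores])
```

<p>With real Facebook data the same idea is run with millions of users, thousands of pages and the top 100 factors instead of one.</p>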
<p>There was <a href="https://psmag.com/economics/big-data-big-brother-and-the-like-button-53894">public outcry</a> <a href="https://www.theatlantic.com/technology/archive/2013/03/armed-with-facebook-likes-alone-researchers-can-tell-your-race-gender-and-sexual-orientation/273963/">in response</a>; within weeks Facebook had <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">made users’ likes private</a> by default.</p>
<p>Kogan and Chancellor, also Cambridge University researchers at the time, were starting to use Facebook data for election targeting as part of a collaboration with Cambridge Analytica’s parent firm SCL. Kogan invited Kosinski and Stillwell to join his project, but it <a href="https://www.theguardian.com/education/2018/mar/24/cambridge-analytica-academics-work-upset-university-colleagues">didn’t work out</a>. Kosinski reportedly suspected Kogan and Chancellor might have <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">reverse-engineered the Facebook “likes” model</a> for Cambridge Analytica. Kogan denied this, saying his project “<a href="https://www.theguardian.com/education/2018/mar/24/cambridge-analytica-academics-work-upset-university-colleagues">built all our models</a> using our own data, collected using our own software.” </p>
<h2>What did Kogan and Chancellor actually do?</h2>
<p>As I followed the developments in the story, it became clear Kogan and Chancellor had indeed collected plenty of their own data through the thisisyourdigitallife app. They certainly could have built a predictive SVD model like that featured in Kosinski and Stillwell’s published research.</p>
<p>So I emailed Kogan to ask if that was what he had done. Somewhat to my surprise, he wrote back. </p>
<p>“We didn’t exactly use SVD,” he wrote, noting that SVD can struggle when some users have many more “likes” than others. Instead, Kogan explained, “The technique was something we actually developed ourselves … It’s not something that is in the public domain.” Without going into details, Kogan described their method as “a multi-step <a href="https://www.quora.com/What-is-a-co-occurrence-matrix">co-occurrence</a> approach.” </p>
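<p>Kogan gave no further details, so his exact method remains private. At its simplest, though, a co-occurrence approach starts from counts of how often pairs of pages are liked by the same users. A hypothetical sketch with invented data:</p>

```python
from collections import defaultdict
from itertools import combinations

# Invented user -> pages-liked data; the real model's details were not published.
user_likes = {
    "u1": {"page_a", "page_b", "page_c"},
    "u2": {"page_a", "page_b"},
    "u3": {"page_b", "page_c"},
    "u4": {"page_a", "page_d"},
}

# Count how often each pair of pages is liked by the same user.
cooccur = defaultdict(int)
for likes in user_likes.values():
    for p, q in combinations(sorted(likes), 2):
        cooccur[(p, q)] += 1

print(cooccur[("page_a", "page_b")])  # → 2: pages a and b co-occur for u1 and u2
```

<p>Later steps in a “multi-step” pipeline would turn such counts into similarity scores or low-dimensional factors – which is why, as Kogan confirms below, the end result resembles SVD-style models.</p>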
<p>However, his message went on to confirm that his approach was indeed similar to SVD or other matrix factorization methods, like those used in the Netflix Prize competition and the Kosinski-Stillwell-Graepel Facebook model. Dimensionality reduction of Facebook data was the core of his model. </p>
<h2>How accurate was it?</h2>
<p>Kogan suggested the exact model used doesn’t matter much, though – what matters is the accuracy of its predictions. According to Kogan, the “correlation between predicted and actual scores … was around [30 percent] for all the personality dimensions.” By comparison, a person’s previous Big Five scores are about <a href="https://doi.org/10.1016/j.jrp.2014.06.003">70 to 80 percent accurate</a> in predicting their scores when they retake the test. </p>
<p>Kogan’s accuracy claims cannot be independently verified, of course. And anyone in the midst of such a high-profile scandal might have incentive to understate his or her contribution. In his <a href="https://www.youtube.com/watch?v=APqU_EJ5d3U">appearance on CNN</a>, Kogan explained to an increasingly incredulous Anderson Cooper that, in fact, the models had actually not worked very well. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/APqU_EJ5d3U?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Aleksandr Kogan answers questions on CNN.</span></figcaption>
</figure>
<p>In fact, the accuracy Kogan claims seems a bit low, but plausible. Kosinski, Stillwell and Graepel reported comparable or slightly better results, as have several <a href="https://doi.org/10.1016/j.paid.2017.12.018">other academic studies</a> using digital footprints to predict personality (though some of those studies had more data than just Facebook “likes”). It is surprising that Kogan and Chancellor would go to the trouble of designing their own proprietary model if off-the-shelf solutions seem to be just as accurate.</p>
<p>Importantly, though, the model’s accuracy on personality scores allows comparisons of Kogan’s results with other research. Published models with equivalent accuracy in predicting personality are all much more accurate at guessing demographics and political variables.</p>
<p>For instance, the similar Kosinski-Stillwell-Graepel SVD model was 85 percent accurate in guessing party affiliation, even without using any profile information other than likes. Kogan’s model had similar or better accuracy. Adding even a small amount of information about friends or users’ demographics would likely boost this accuracy above 90 percent. Guesses about gender, race, sexual orientation and other characteristics would probably be more than 90 percent accurate too.</p>
<p>Critically, these guesses would be especially good for the most active Facebook users – the people the model was primarily used to target. Users with less activity to analyze are likely not on Facebook much anyway. </p>
<h2>When psychographics is mostly demographics</h2>
<p>Knowing how the model is built helps explain Cambridge Analytica’s apparently contradictory statements about <a href="https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win">the role</a> – or <a href="https://www.c-span.org/video/?420077-1/google-hosts-post-election-review&start=6905">lack thereof</a> – that personality profiling and psychographics played in its modeling. They’re all technically consistent with what Kogan describes.</p>
<p>A model like Kogan’s would give estimates for every variable available on any group of users. That means it would automatically <a href="https://www.bloomberg.com/news/features/2015-11-12/is-the-republican-party-s-killer-data-app-for-real-">estimate the Big Five personality scores</a> for every voter. But these personality scores are the output of the model, not the input. All the model knows is that certain Facebook likes, and certain users, tend to be grouped together. </p>
<p>With this model, Cambridge Analytica could say that it was identifying people with low openness to experience and high neuroticism. But the same model, with the exact same predictions for every user, could just as accurately claim to be identifying less educated older Republican men. </p>
<p>Kogan’s information also helps clarify the confusion about whether Cambridge Analytica <a href="https://www.youtube.com/watch?v=MepM_YXZdYg">actually deleted its trove</a> of Facebook data, when models built from the data <a href="https://www.channel4.com/news/revealed-cambridge-analytica-data-on-thousands-of-facebook-users-still-not-deleted">seem to still be circulating</a>, and even <a href="https://gizmodo.com/aggregateiq-created-cambridge-analyticas-election-softw-1824026565">being developed further</a>. </p>
<p>The whole point of a dimension reduction model is to mathematically represent the data in simpler form. It’s as if Cambridge Analytica took a very high-resolution photograph, resized it to be smaller, and then deleted the original. The photo still exists – and as long as Cambridge Analytica’s models exist, the data effectively does too.</p>
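<p>That photograph analogy can be made concrete with a deliberately tiny, invented example: once a data matrix has been factored, the original can be discarded and rebuilt from the factors alone.</p>

```python
# A tiny "original photo": a 3x4 matrix that happens to be exactly rank 1,
# so one row-factor vector and one column-factor vector reproduce it perfectly.
row_factors = [1.0, 2.0, 3.0]
col_factors = [2.0, 0.5, 1.0, 4.0]
original = [[r * c for c in col_factors] for r in row_factors]

# "Delete the original", keeping only the factors -- the trained model.
del original

# The data is still effectively there: rebuild any entry from the factors.
reconstructed = [[r * c for c in col_factors] for r in row_factors]
print(reconstructed[2][3])  # → 12.0, recovered without the original matrix
```

<p>Real models are only approximately low-rank, so the rebuilt “photo” is a slightly blurry copy rather than a perfect one – but the useful information survives.</p>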
<p class="fine-print"><em><span>Matthew Hindman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
An email from Aleksandr Kogan sheds light on exactly how much your Facebook data reveals about you, and what data scientists can actually do with that information.
Matthew Hindman, Associate Professor of Media and Public Affairs, George Washington University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/92791
2018-03-23T18:56:55Z
2018-03-23T18:56:55Z
‘Big Tech’ isn’t one big monopoly – it’s 5 companies all in different businesses
<figure><img src="https://images.theconversation.com/files/211745/original/file-20180323-54898-1dnsu0o.png?ixlib=rb-1.1.0&rect=1029%2C0%2C1844%2C1255&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It may seem convenient to think of technology companies as similar, but they're really not.</span> <span class="attribution"><span class="source">The Conversation</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure>
<p>Public <a href="https://www.theatlantic.com/technology/archive/2018/03/facebook-cambridge-analytica/555866/">concern</a> about Facebook’s power in society – and in politics – has skyrocketed in the wake of <a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">revelations</a> that users’ data was analyzed by a U.K.-based marketing firm and used to construct highly targeted political propaganda in advance of the 2016 U.S. presidential election. 
Other technology giants have also sparked concern: <a href="https://www.ft.com/content/9554a8bc-5b12-11e7-b553-e2df1b0c3220">Google</a>, <a href="https://www.cnbc.com/2017/05/08/apple-google-european-commission-spotify.html">Apple</a>, <a href="https://www.cnbc.com/2018/02/05/tech-investor-warns-amazon-against-abusing-its-power-to-influence-users.html">Amazon</a> and <a href="http://knowledge.wharton.upenn.edu/article/microsoft-a-case-of-justice-or-abuse-of-power/">Microsoft</a> have all faced objections from users, the public and even government agencies. </p>
<p>Because all of these companies provide services relating to computers, there is a tendency to lump them together, calling them “<a href="https://www.nytimes.com/2015/09/20/opinion/is-big-tech-too-powerful-ask-google.html">Big Tech</a>” or the “<a href="https://www.nytimes.com/2017/10/11/technology/the-frightful-five-want-to-rule-entertainment-they-are-hitting-limits.html">Frightful Five</a>” or even “<a href="https://qz.com/303947/us-cultural-imperialism-has-a-new-name-gafa/">GAFA</a>” – the acronym for the first four of them, leaving Microsoft out. Conceiving of “<a href="http://www.slate.com/articles/technology/technology/2017/11/how_silicon_valley_became_big_tech.html">big tech</a>” as a single industry makes its threat and influence seem overwhelming. </p>
<p>In the U.S., when an industry gets so large it exerts political pressure on society, people often label the industry as a whole, like “<a href="https://content.time.com/time/magazine/article/0,9171,920328,00.html">Big Oil</a>,” “<a href="https://www.huffingtonpost.com/entry/big-tobacco-is-still-in-the-business-of-deceiving-americans_us_5a202d96e4b0392a4ebbf5f3">Big Tobacco</a>” or “<a href="https://www.thedailybeast.com/big-pharma-is-americas-new-mafia">Big Pharma</a>.” The so-called big tech companies certainly are big: In 2017, they were the top five <a href="https://247wallst.com/investing/2017/10/29/market-cap-of-5-largest-us-companies-up-36-in-most-recent-year/">most valuable public companies</a> in the U.S. But, as a <a href="https://mitpress.mit.edu/books/we-now-disrupt-broadcast">scholar of the media marketplace</a> that many of these firms are beginning to explore, I know that lumping them together hides the fact they’re very separate and distinct – not just as companies, but in terms of their business models and practices.</p>
<p>Understanding these companies in their proper business contexts makes it easier to understand their power in the marketplace and society at large. It also suggests ways to assess, regulate and manage that power to protect competition and <a href="https://theconversation.com/facebook-is-killing-democracy-with-its-personality-profiling-data-93611">even democracy itself</a>. </p>
<h2>Google: Advertising revenue from searches</h2>
<p>Google and Facebook are most frequently discussed together, likely because of their domination of internet advertising. Together, the two companies <a href="https://www.emarketer.com/Article/Google-Facebook-Tighten-Grip-on-US-Digital-Ad-Market/1016494">collected 63 percent</a> of U.S. digital advertising dollars in 2017. Both companies earn most of their revenue from advertising: 97 percent for <a href="https://www.visualcapitalist.com/chart-5-tech-giants-make-billions/">Facebook</a> and 88 percent for <a href="https://www.investopedia.com/articles/investing/020515/business-google.asp">Google’s</a> parent company Alphabet in 2016. But what they offer to advertisers and what users want from them are very different.</p>
<p>Google’s value proposition is helping users find things. Many – even most – of the <a href="http://www.internetlivestats.com/google-search-statistics/">3.5 billion</a> searches Google performs each day aren’t monetized at all. Google only gets paid if a searcher clicks on a paid link; the top three results are often labeled as “Ads,” in addition to several on the right side of a computer user’s search results screen. </p>
<p>Advertisers like Google because they only pay if their <a href="https://adwords.google.com/home/pricing/">ads are clicked</a>. That is a far better deal than what is offered in traditional media advertising, where payment is for how many people are shown an ad, rather than customers’ responses. In addition, Google’s position as a <a href="https://www.thinkwithgoogle.com/consumer-insights/mobile-search-consumer-behavior-data/">leading place</a> where people look for information on products and services means an ad reaches a consumer exactly at the moment they’re looking for a product. This timing is more valuable than just showing ads to people in general – so much so that advertisers paid Google <a href="https://www.statista.com/statistics/266249/advertising-revenue-of-google/">US$79.38 billion</a> in 2016.</p>
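The pay-per-click versus pay-per-impression distinction above can be sketched with a toy calculation; the rates and counts below are illustrative assumptions, not real advertising prices:

```python
# Toy comparison of impression-based (CPM) vs. click-based (CPC) ad pricing.
# All rates and counts are illustrative assumptions, not real figures.

def cpm_cost(impressions: int, rate_per_thousand: float) -> float:
    """Traditional media model: pay per 1,000 people shown the ad."""
    return impressions / 1000 * rate_per_thousand

def cpc_cost(clicks: int, rate_per_click: float) -> float:
    """Search-ad model: pay only when someone actually clicks."""
    return clicks * rate_per_click

impressions = 100_000             # people who see the ad
clicks = int(impressions * 0.02)  # assume 2% of viewers click

print(cpm_cost(impressions, rate_per_thousand=5.00))  # charged for everyone shown
print(cpc_cost(clicks, rate_per_click=1.50))          # charged only for responses
```

The point is not that one model is necessarily cheaper, but that under the click-based model the advertiser pays nothing for the 98 percent of viewers who never respond.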
<h2>Facebook: Ad revenue from attention-grabbing content</h2>
<p>Facebook operates more like a traditional ad-supported media company. It provides interesting content that attracts an audience, and sells their attention to advertisers – just as television, radio and print have done for decades. The key difference between Facebook and these legacy media businesses is where the content comes from: Rather than Facebook paying to create the material that draws users, the users add it themselves for free, posting personal messages and shared links. </p>
<p>Like traditional media, Facebook charges advertisers based on how many people see a message, not on how many take action by clicking. The value Facebook offers over traditional advertising is its ability to <a href="https://www.propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters">target very particular groups</a> with a <a href="https://www.marketplace.org/2010/11/26/tech/marketers-you">customized advertising message</a>. This is precisely the type of <a href="https://www.cnet.com/news/facebook-cambridge-analytica-data-mining-and-trump-what-you-need-to-know/">targeting</a> that happened during the 2016 U.S. presidential election, which generated widespread public criticism.</p>
<h2>Apple: Selling electronic hardware</h2>
<p>In contrast to the advertising businesses of Google and Facebook, Apple remains a hardware technology company, deriving <a href="https://www.visualcapitalist.com/chart-5-tech-giants-make-billions/">84 percent</a> of its 2016 revenue from the iPhone, iPad and Mac computers. The profits on those sales let Apple use <a href="https://www.fastcompany.com/1784824/great-tech-war-2012">very different strategies</a> than the non-hardware companies with which it is often compared. The profit margins on each device are so substantial it <a href="https://www.fastcompany.com/1784824/great-tech-war-2012">doesn’t have to dominate</a> the hardware market the way Google and Facebook control online advertising. Despite the seeming ubiquity of iPhones in some social circles, iPhones <a href="https://www.statista.com/statistics/216459/global-market-share-of-apple-iphone/">rarely top 20 percent</a> of worldwide phone sales, and account for <a href="https://9to5mac.com/2017/08/09/us-iphone-sales-ios-market-share-kantar/">about 30 percent</a> of U.S. sales.</p>
<p>Apple has other elements to its business, too – such as its iTunes music distribution business. But it’s important to keep the relative scale of those elements in mind. Mostly, they are <a href="https://www.thecontenttrap.com/">complementary businesses</a> that Apple uses strategically in support of its primary focus as a hardware company. Taken together, iTunes, its App Stores, iBooks Store, Apple Music, Apple Care, Apple Pay and other even more ancillary sales added up to <a href="http://investor.apple.com/secfiling.cfm?filingID=1628280-16-20309&CIK=320193#A201610-K9242016_HTM_SE78948B641FF55EDB70F7F75DDCB7673">just 11 percent</a> of the company’s revenue in 2016. Even the company’s plan to spend <a href="http://variety.com/2017/digital/news/apple-1-billion-original-tv-shows-movies-budget-1202529421/">$1 billion on original video</a> is hard to understand, except as a support to branding and marketing efforts that boost its hardware sales.</p>
<h2>Microsoft and Amazon: Mixed retail, computing and media</h2>
<p>Much like Apple, Microsoft blends many revenue streams: It sells Surface computers, Azure cloud services, software (like the Microsoft Office Suite), gaming consoles and search engine advertising. The company once stood alone as a <a href="http://eprints.lse.ac.uk/47043/1/CentrePiece_12_1.pdf">poster child</a> for massive technology corporations. Lately, it may draw less attention because competitors like Google’s G Suite have challenged its market share. Also, Microsoft has not aggressively entered social media, a sector now under great scrutiny.</p>
<p>Finally, Amazon also operates in many different business sectors. Primarily, it is a goods retailer: That’s where <a href="https://www.visualcapitalist.com/chart-5-tech-giants-make-billions/">70 percent</a> of its annual revenue came from in 2016. Its Amazon Web Services content hosting and cloud computing business contributed 9 percent, and Amazon’s media businesses provided roughly 18 percent of the company’s $136 billion of annual revenue. That $24 billion of media revenue is nearly three times that of Netflix, but still not Amazon’s core business.</p>
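The revenue split in that paragraph can be sanity-checked with simple arithmetic (figures as cited above, in billions of U.S. dollars; the Netflix comparison uses its roughly $8.8 billion of 2016 revenue, an approximate figure):

```python
# Sanity-check Amazon's 2016 revenue breakdown as cited in the text
# (US$ billions; shares are the percentages quoted above).
total_revenue = 136

retail = 0.70 * total_revenue  # goods retail
aws    = 0.09 * total_revenue  # Amazon Web Services
media  = 0.18 * total_revenue  # media businesses

print(round(media))            # roughly $24 billion of media revenue

netflix_2016 = 8.8             # approximate Netflix 2016 revenue (assumed)
print(round(media / netflix_2016, 1))  # nearly three times Netflix's
```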
<h2>Regulate markets and behavior, not ‘tech’</h2>
<p>It’s not that these companies are so different as to be unrelated or incomparable to each other. They all rely, to varying degrees, on computers and internet-based services that reach customers in ways that never existed before. All five gather data on their users and analyze behavior with algorithms to personalize experiences – something that companies with long histories in sectors such as media, transportation or retail have struggled to match.</p>
<p>But despite simple perception of them all as “<a href="https://www.inc.com/magazine/201605/marli-guzzetta/tech-company-definition.html">tech</a>” companies, their core revenue sources are clearly different. And those distinctions suggest ways people can understand and respond to anxieties about their growing economic and cultural influence.</p>
<p>In fact, what is most concerning is the extent to which these companies aren’t in the same businesses: They’re not competing with each other, or really anyone else.</p>
<p>In prior eras, Americans learned that major industries they first viewed as innovators and economic saviors were more complicated and less magnanimous than initially believed. Big tech today is no different from what came before. In fact, big tech <a href="https://al3x.net/2012/05/08/what-is-and-is-not-a-technology-company.html">isn’t really a thing</a> at all. Assessing these companies based on what they do, rather than mythologizing them, is the first step forward.</p>
<p>
<section class="inline-content">
<img src="https://images.theconversation.com/files/248895/original/file-20181204-133100-t34yqm.png?w=128&h=128">
<div>
<header>Amanda Lotz is the author of:</header>
<p><a href="https://mitpress.mit.edu/books/we-now-disrupt-broadcast">We Now Disrupt This Broadcast:
How Cable Transformed Television and the Internet Revolutionized It All</a></p>
<footer>MIT Press provides funding as a member of The Conversation US.</footer>
</div>
</section>
</p><img src="https://counter.theconversation.com/content/92791/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Amanda Lotz does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>When thinking about regulating them, it’s useful to know Facebook, Amazon, Google, Apple and Microsoft have some similarities. But generally they’re not competing with each other – or anyone else.Amanda Lotz, Fellow, Peabody Media Center; Professor of Media Studies, University of MichiganLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/931182018-03-21T10:43:14Z2018-03-21T10:43:14ZThink Facebook can manipulate you? Look out for virtual reality<figure><img src="https://images.theconversation.com/files/211198/original/file-20180320-31624-13znwph.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What these people are seeing isn't real – but they might think it is.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/APTOPIX-Spain-Wireless-Show-Flagship-Phones/55557e265ea948089fc69dadde97782a/5/0">AP Photo/Francisco Seco</a></span></figcaption></figure><p>As Facebook users around the world are coming to understand, some of their favorite technologies can be used against them. It’s not just the scandal over psychological profiling firm <a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">Cambridge Analytica getting access</a> to data from tens of millions of Facebook profiles. People’s filter bubbles are filled with carefully tailored information – and misinformation – altering their <a href="https://www.onlineprivacyfoundation.org/opf-research/psychographic-targeting/">behavior and thinking, and even their votes</a>.</p>
<p>People, both individually and as a society at large, are wrestling to understand <a href="https://techcrunch.com/2018/03/18/move-fast-and-fake-things/">how their newsfeeds turned against them</a>. They are coming to realize exactly how carefully controlled Facebook feeds are, with highly tailored ads. That set of problems, though, pales in comparison to those posed by the next technological revolution, which is already underway: virtual reality. </p>
<p>On one hand, virtual worlds hold almost limitless potential. VR games can <a href="https://www.tennessean.com/picture-gallery/news/2018/02/23/virtual-reality-games-used-in-drug-rehab-therapy/110761470/">treat drug addiction</a> and maybe help solve the <a href="https://theconversation.com/the-opioid-epidemic-in-6-charts-81601">opioid epidemic</a>. Prison inmates can use VR simulations to <a href="https://news.vice.com/en_us/article/bjym3w/this-prison-is-using-vr-to-teach-inmates-how-to-live-on-the-outside">prepare for life after their release</a>. People are racing to enter these immersive experiences, which have the potential to be more psychologically powerful than any other technology to date: The first modern equipment offering the opportunity <a href="https://www.telegraph.co.uk/technology/ces/12085175/Oculus-Rift-to-go-on-sale-in-March-for-599.html">sold out in 14 minutes</a>.</p>
<p>In these new worlds, every leaf, every stone on the virtual ground and every conversation is carefully constructed. In our research into the emerging definition of ethics in virtual reality, my colleagues and I interviewed the developers and early users of virtual reality to understand <a href="http://hdl.handle.net/1903/20513">what risks are coming and how we can reduce them</a>.</p>
<h2>Intensity is going to level up</h2>
<p>“VR is a very personal, intimate situation. When you wear a VR headset … you really believe it, it’s really immersive,” says one of the developers with whom we spoke. If someone harms you in VR, <a href="https://theconversation.com/sexual-assault-enters-virtual-reality-67971">you’re going to feel it</a>, and if someone manipulates you into believing something, it’s going to stick. </p>
<p>This immersion is what users want: “VR is really about being immersed … As opposed to a TV where I can constantly be distracted,” one user told us. That immersiveness is what gives VR unprecedented power: “really, what VR is trying to do here is duplicate reality where it tricks your mind.”</p>
<p>These tricks can be enjoyable – allowing people to <a href="https://vrsource.com/best-vr-flight-simulators-5901/">fly helicopters</a> or journey back to <a href="https://www.virtualiteach.com/single-post/2017/07/24/Uncover-the-Tomb-of-Tutankhamen-in-VR">ancient Egypt</a>. They can be helpful, offering <a href="https://www.tandfonline.com/doi/abs/10.1586/14737175.8.11.1667">pain management</a> or treatment for <a href="http://www.icdvrat.org/2008/papers/ICDVRAT2008_S01_N05_Rizzo_et_al.pdf">psychological conditions</a>.</p>
<p>But they can also be malicious. Even a common prank that friends play on each other online – logging in and posting as each other – can take on a whole new dimension. One VR user explains, “Someone can put on a VR head unit and go into a virtual world assuming your identity. I think that identity theft, if VR becomes mainstream, will become rampant.”</p>
<h2>Data will be even more personal</h2>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=454&fit=crop&dpr=1 600w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=454&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=454&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=571&fit=crop&dpr=1 754w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=571&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=571&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An image of what the Oculus DK2 sees via its infrared sensors.</span>
<span class="attribution"><a class="source" href="https://forums.oculusvr.com/community/discussion/11385/what-can-the-dk2-ir-camera-see">MaglevNL/reddit</a></span>
</figcaption>
</figure>
<p>VR will be able to collect data on a whole new level. Seemingly innocuous infrared sensors designed to help with motion sickness and alignment can capture near-perfect representations of users’ real-world surroundings. </p>
<p>Further, the data and interactions that give VR the power to treat and diagnose <a href="https://futurism.com/ai-and-vr-could-completely-transform-how-doctors-diagnose-and-treat-mental-disorders/">physical and mental health conditions</a> can be used to hyper-personalize experiences and information to the precise vulnerabilities of individual users.</p>
<p>Combined, the intensity of virtual reality experiences and the even more personal data they collect raise the specter of fake news far more powerful than text articles and memes: immersive, personalized experiences that could thoroughly convince people of entirely alternate realities, tailored to each user’s particular susceptibilities. Such immersive VR advertisements are on the horizon <a href="https://www.wired.com/story/vr-ads-are-almost-here/">as early as this year</a>.</p>
<h2>Building a virtual future</h2>
<p>A person who uses virtual reality is, often willingly, being controlled to far greater extents than were ever possible before. Everything a person sees and hears – and perhaps even feels or smells – is totally created by another person. That surrender brings both promise and peril. Perhaps in carefully constructed virtual worlds, people can solve problems that have eluded us in reality. But these virtual worlds will be built inside a real world that can’t be ignored. </p>
<p>While technologists and users are cleaning up the malicious, manipulative past, they’ll need to go far beyond <a href="https://www.wired.com/story/what-would-healthy-twitter-look-like/">making social media healthier</a>. As carefully as developers are building virtual worlds themselves, society as a whole must intentionally and painstakingly construct the culture in which these technologies exist. </p>
<p>In many cases, developers are the first allies in this fight. Our research found that VR developers were more concerned about their users’ well-being than the users themselves. Yet, one developer admits that “the fact of the matter is … I can count on my fingers the number of experienced developers I’ve actually met.” Even <a href="http://doi.org/10.1145/2580723.2580730">experts have only begun to explore</a> ethics, security and privacy in virtual reality scenarios. </p>
<p>The developers we spoke with expressed a desire for guidelines on where to draw the boundaries, and how to prevent dangerous misuses of their platforms. As an initial step, we <a href="http://hdl.handle.net/1903/20513">invited VR developers and users</a> from nine online communities to work with us to create a set of guidelines for VR ethics. They made suggestions about inclusivity, protecting users from manipulative attackers and limits on data collection. </p>
<p>As the debacle with Facebook and Cambridge Analytica shows, though, people don’t always follow guidelines, or even <a href="https://www.washingtonpost.com/business/economy/facebooks-rules-for-accessing-user-data-lured-more-than-just-cambridge-analytica/2018/03/19/31f6979c-658e-43d6-a71f-afdd8bf1308b_story.html">platforms’ rules and policies</a> – and the effects could be all the worse in this new VR world. But, our initial success reaching agreement on VR guidelines serves as a reminder that people can go beyond reckoning with the technologies others create: We can work together to create beneficial technologies we want.</p><img src="https://counter.theconversation.com/content/93118/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Elissa Redmiles receives research funding from a variety of sources including the National Science Foundation, National Center for Women in Technology, and Facebook.</span></em></p>As the internet-connected world reels from revelations about personalized manipulation based on Facebook data, a scholar of virtual reality warns there’s an even bigger crisis of trust on the horizon.Elissa M. Redmiles, Ph.D. Student in Computer Science, University of MarylandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/898222018-01-09T11:21:13Z2018-01-09T11:21:13ZDoes Apple have an obligation to make the iPhone safer for kids?<figure><img src="https://images.theconversation.com/files/202477/original/file-20180118-158519-wj8m0y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Kids shouldn't be expected to self-regulate the amount of time they spend on the device. And parents are finding it tougher and tougher to impose limits.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/hand-newborn-baby-reaches-mobile-phone-353768999?src=KNUQy6pOR00DxEktkgIa-A-1-4">Vitalinka/Shutterstock.com</a></span></figcaption></figure><p>The average teen spends <a href="https://www.commonsensemedia.org/research/the-common-sense-census-media-use-by-tweens-and-teens">at least six hours a day looking at a screen</a>, with most of it from using a smartphone. </p>
<p>Many parents, naturally, have wondered if so much time spent in front of a screen is safe.</p>
<p>Recent research suggests it’s not. Teens who spend five or more hours a day on electronic devices <a href="http://journals.sagepub.com/doi/full/10.1177/2167702617723376">are 71 percent more likely</a> to have a risk factor for suicide than those who spend less than an hour a day on a device. Digital media use is linked with more depression and less happiness, with <a href="http://online.liebertpub.com/doi/abs/10.1089/cyber.2016.0259?journalCode=cyber">experiments</a>, <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2916158">natural experiments</a> and <a href="https://www.ncbi.nlm.nih.gov/pubmed/28093386">longitudinal</a> <a href="https://www.sciencedirect.com/science/article/pii/S1755296616300862">studies</a> all showing that digital media <a href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0069841">use leads to unhappiness</a> rather than the other way around.</p>
<p>Steve Jobs might have been onto something when he told a surprised reporter in 2010 that <a href="https://www.nytimes.com/2014/09/11/fashion/steve-jobs-apple-was-a-low-tech-parent.html">he didn’t let his kids use iPads</a> and he generally restricted their screen time. </p>
<p>Indeed, there’s an <a href="http://www.businessinsider.com/jony-ive-apple-chief-design-officer-constant-iphone-use-misuse-2017-10?r=UK&IR=T">increasing consensus</a> that the technology companies who have led us into the digital age have a responsibility to build some safeguards. That’s why I helped draft <a href="https://thinkdifferentlyaboutkids.com/">a letter from Apple shareholders</a> spearheaded by Jana Partners and the California State Teachers’ Retirement System that asks the company to take steps to protect their youngest consumers. Not only is it the right thing to do, but it could also improve the company’s bottom line. </p>
<h2>Limitation, not elimination</h2>
<p>According to the research, the problem isn’t teens owning smartphones. In fact, teens who don’t use smartphones at all <a href="https://www.scientificamerican.com/article/the-ldquo-goldilocks-rdquo-level-of-teen-screen-use/">are actually a little less happy</a> than those who use them a limited amount. </p>
<p>It’s only when use goes beyond two hours a day that issues begin to appear, including <a href="http://www.sleep-journal.com/article/S1389-9457(17)30350-7/fulltext">less sleep</a> and a <a href="http://journals.sagepub.com/doi/full/10.1177/2167702617723376">higher risk of suicide-related outcomes</a> such as depression and making suicide plans. </p>
<p>The solution, then, seems easy: Limit the amount of time the device can be used and how it can be used. This works out fairly well for Apple; <a href="https://www.statista.com/statistics/382260/segments-share-revenue-of-apple/">most of their profit is locked in once someone buys an iPhone or iPad</a>, regardless of how much the owner uses it. </p>
<p>The problem is that most teens who are handed a smartphone aren’t going to use it for just an hour or two per day. Research suggests that digital media stimulates <a href="https://www.ncbi.nlm.nih.gov/pubmed/25864599">the same brain chemicals and regions as other addictive products</a>. Although some teens are able to limit their use, a substantial number end up spending the majority of their leisure time with their devices, which – as noted earlier – could lead to mental health issues. </p>
<p>Some have pointed out that <a href="https://www.pcmag.com/roundup/342731/the-best-parental-control-apps-for-your-phone">parents can use third-party apps</a> such as Kidslox or Norton Family Premier to limit time spent on the phone or on social media sites. Although some parents might find these apps helpful, others might be overwhelmed by the setup process or find the download fees too expensive. Clever teens might also find ways around these apps. </p>
<p>But what if Apple were to include the ability to limit screen time in the iPhone’s operating system? </p>
<p>For example, when registering and setting up the phone, Apple could include an option to select the age of the user. If you say the phone is for a 12-year-old, it could give parents the option to restrict the apps used, shut down the phone at night, limit the number of hours it can be used and permit communication with a preapproved list of phone numbers. As the child grows older, these restrictions could be changed or lifted. Making this part of the iOS would seamlessly integrate safety for children and teens into the iPhone – and seamless integration has always been Apple’s calling card.</p>
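The age-based restrictions the paragraph above imagines could be sketched roughly as follows. This is entirely hypothetical – Apple exposes no such API, and every name and value here is invented for illustration:

```python
# Hypothetical sketch of the age-based restriction profile the text
# imagines. Nothing here reflects a real iOS API; all names are invented.
from dataclasses import dataclass, field

@dataclass
class RestrictionProfile:
    user_age: int
    allowed_apps: list = field(default_factory=list)
    nightly_shutdown: tuple = ("21:00", "07:00")  # phone off overnight
    daily_limit_hours: float = 2.0                # limit daily screen time
    approved_contacts: list = field(default_factory=list)

    def loosen_with_age(self) -> None:
        """As the child grows older, restrictions could be changed or lifted."""
        if self.user_age >= 16:
            self.daily_limit_hours = 4.0
            self.nightly_shutdown = ("23:00", "06:00")

profile = RestrictionProfile(
    user_age=12,
    allowed_apps=["Messages", "Maps"],
    approved_contacts=["+1-555-0100"],  # placeholder number
)
```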
<h2>Better phones for happier kids</h2>
<p>This has another benefit for Apple: Parents might be more willing to buy their children smartphones if they were easier to regulate. Outside of buying an old-school flip phone – which is increasingly difficult to find – there’s currently no easy way to give a child a cellphone without opening up the world of unlimited internet access, constant social media and endless evenings spent arguing over putting the phone away at dinner. </p>
<p>As the parent of an 11-year-old, I would be much more comfortable giving my daughter a smartphone if I knew she wouldn’t be bullied on it, see things she shouldn’t see or stare at it for six hours a day.</p>
<p>Social media companies like <a href="http://fortune.com/2017/11/09/sean-parker-facebook-childrens-brains/">Facebook also have something to answer for here</a> – <a href="https://newsroom.fb.com/news/2017/12/hard-questions-is-spending-time-on-social-media-bad-for-us/">and they know it</a>. Given links between advertising revenue and time spent on the site, balancing profit and safety will be a tougher task for them. </p>
<p>But for Apple, it’s arguably a win-win: The safer their product is for kids, the more they could sell. So why not make it safer by offering parents more tools and options?</p><img src="https://counter.theconversation.com/content/89822/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jean Twenge consults for Jana Partners, LLC and has received funding in the past from the National Institutes of Health and the Russell Sage Foundation.</span></em></p>The problem isn’t kids owning smartphones. But when daily use exceeds two hours a day, mental health issues start to crop up.Jean Twenge, Professor of Psychology, San Diego State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/896462018-01-09T11:15:21Z2018-01-09T11:15:21ZYoung doctors struggle to learn robotic surgery – so they are practicing in the shadows<figure><img src="https://images.theconversation.com/files/200962/original/file-20180105-26139-952nox.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Surgeons in Switzerland use the robot da Vinci to aid a hernia operation. Over a third of US hospitals have at least one surgical robot.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Switzerland-Robotic-Surgery/b25d6d98375c408aa9de47c5c82b62e0/17/0">AP Photo/Keystone, Salvatore Di Nolfi</a></span></figcaption></figure><p>Artificial intelligence and robotics spell massive changes to the world of work. These technologies can automate new tasks, and we are making more of them, faster, better and cheaper than ever before. </p>
<p>Surgery was early to the robotics party: Over a third of U.S. hospitals <a href="http://phx.corporate-ir.net/phoenix.zhtml?c=122359&p=irol-irhome">have at least one surgical robot</a>. Such robots have been in widespread use by a growing variety of surgical disciplines, including urology and gynecology, for over a decade. That means the technology has been around for at least two generations of surgeons and surgical staff.</p>
<p>I studied robotic surgery for over two years to understand how surgeons are adapting. I observed hundreds of robotic and “traditional” procedures at five hospitals and interviewed surgeons and surgical trainees at another 13 hospitals around the country. I found that robotic surgery disrupted approved approaches to surgical training. Only a minority of residents found effective alternatives. </p>
<p>Like the surgeons I studied, we’re all going to have to adapt to AI and robotics. Old hands and new recruits will have to learn new ways to do their jobs, whether in construction, lawyering, retail, finance, warfare or childcare – no one is immune. How will we do this? And what will happen when we try?</p>
<h2>A shift in surgery</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/200863/original/file-20180104-26154-1v53p1q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The da Vinci Surgical Robot at a hospital in Pittsburgh.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Robotic-Hysterectomies/3701ec7c21d74497ab84e6ecd710de84/2/0">AP Photo/Keith Srakocic</a></span>
</figcaption>
</figure>
<p>In <a href="http://journals.sagepub.com/doi/full/10.1177/0001839217751692">my new paper</a>, published January 8, I specifically focus on how surgical trainees, known as residents, learned to use the 800-pound gorilla: Intuitive Surgical’s da Vinci surgical system. This is a four-armed robot that holds sticklike surgical instruments, controlled by a surgeon sitting at a console 15 or so feet away from the patient. </p>
<p>Robotic surgery presented a radically different work scenario for residents. In traditional (open) surgery, the senior surgeon literally couldn’t do most of the work without constant hands-in-the-patient cooperation from the resident. So residents could learn by sticking to strong “see one, do one, teach one” norms for surgical training. </p>
<p>This broke down in robotic surgery. Residents were stuck either “sucking” at the bedside – using a laparoscopic tool to remove smoke and fluids from the patient – or sitting in a second trainee console, watching the surgical action and waiting for a chance to operate. </p>
<p>In either case, surgeons didn’t need residents’ help, so they granted residents a lot less practice operating than they did in open procedures. The practice residents did get was lower-quality because surgeons “helicopter taught” – giving frequent and very public feedback to residents at the console and intermittently taking control of the robot away from them. </p>
<p>As one resident said: “If you’re on the robot and [control is] taken away, it’s completely taken away and you’re just left to think about exactly what you did wrong, like a kid sitting in the corner with a dunce cap. Whereas in open surgery, you’re still working.” </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/SoFzKPzYKHE?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Shadow learning</h2>
<p>Very few residents overcame these barriers to effectively learn how to perform this kind of surgery. The rest struggled – yet all were legally and professionally empowered to perform robotic surgeries when they finished their residencies. </p>
<p>Successful learners made progress through three norm-bending practices. Some focused on robotic surgery during medical school at the expense of generalist medical training. Others practiced extensively on simulators and watched recorded surgeries on YouTube, even though norms prized learning in real procedures. Many learned through undersupervised struggle – performing robotic surgical work close to the edge of their capacity with little expert supervision. </p>
<p>Taken together, I call these practices “shadow learning,” because they ran counter to norms and residents engaged in them out of the limelight. None of this was openly discussed, let alone punished or forbidden. </p>
<p>Shadow learning came at a serious cost to successful residents, their peers and their profession. Shadow learners became hyperspecialized in robotic surgery, but most were destined for jobs that required generalist skills. They learned at the expense of their struggling peers, because they got more “console time” when senior surgeons saw they could operate well. The profession has been slow to adapt to all this practically invisible trouble. And these dynamics have restricted the supply of expert robotic surgeons. </p>
<p>As one senior surgeon told me, robotics has had an “opposite effect” on learning. Surgeons from top programs are graduating without sufficient skill with robotic tools, he said. “I mean these guys can’t do it. They haven’t had any experience doing it. They watched it happen. Watching a movie doesn’t make you an actor, you know what I’m saying?”</p>
<h2>The working world</h2>
<p>These insights are relevant for surgery, but can also help us all think more clearly about the implications of AI and robotics for the broader world of work. Businesses are buying <a href="https://www.businesswire.com/news/home/20171207005539/en/Global-Robotics-Market---Expected-Grow-CAGR">robots</a> and <a href="https://www.prnewswire.com/news-releases/artificial-intelligence-market-to-experience-massive-growth-of-629-cagr-by-2022-657227263.html">AI technologies</a> at a breakneck pace, based on the promise of improved productivity and the threat of being left behind.</p>
<p>Early on, journalists, social scientists and politicians focused on how these technologies would destroy or create jobs. These are important issues, but the global conversation has recently turned to a much bigger one: job change. According to one <a href="https://www.mckinsey.com/global-themes/digital-disruption/harnessing-automation-for-a-future-that-works">analysis from McKinsey</a>, 30 percent of the tasks in the average U.S. job could soon be profitably automated.</p>
<p>It’s often costly – in dollars, time and errors – to allow trainees to work with experts. In our quest for productivity, we are deploying many technologies and techniques that make trainee involvement optional. Wherever we do this, shadow learning may become more prevalent, with similar, troubling implications: a shrinking, hyperspecialized minority; a majority that is losing the skill to do the work effectively; and organizations that don’t know how learning is actually happening. </p>
<p>If we’re not careful, we may unwittingly improve our way out of the skill we need to meet the needs of a changing world.</p><img src="https://counter.theconversation.com/content/89646/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Matt Beane does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>There are more robots than ever in the operating room – but that’s led to fewer opportunities for surgical trainees. Now, some new doctors are teaching themselves in secret.Matt Beane, Project Scientist, Incoming Assistant Professor, University of California, Santa BarbaraLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/875662018-01-04T04:33:37Z2018-01-04T04:33:37ZTrust in digital technology will be the internet’s next frontier, for 2018 and beyond<figure><img src="https://images.theconversation.com/files/199508/original/file-20171215-17857-cns8cs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Trust in online systems varies around the world.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/technologies-connect-people-mixed-media-588071525">Sergey Nivens/Shutterstock.com</a></span></figcaption></figure><p>After decades of unbridled enthusiasm – bordering on <a href="https://www.npr.org/sections/health-shots/2017/05/18/527799301/is-internet-addiction-real">addiction</a> – about all things digital, the public may be <a href="https://www.nytimes.com/2017/10/11/insider/tech-column-dread.html">losing trust in technology</a>. <a href="https://www.washingtonpost.com/news/theworldpost/wp/2017/10/09/pierre-omidyar-6-ways-social-media-has-become-a-direct-threat-to-democracy/">Online information isn’t reliable</a>, whether it appears in the form of news, search results or user reviews. 
Social media, in particular, is <a href="https://www.pbs.org/newshour/show/social-media-giants-are-vulnerable-to-foreign-propaganda-what-can-they-do-to-change">vulnerable to manipulation</a> by hackers or foreign powers. Personal data <a href="https://hbr.org/2017/12/what-would-you-pay-to-keep-your-digital-footprint-100-private">isn’t necessarily private</a>. And people are increasingly worried about automation and artificial intelligence <a href="https://www.nytimes.com/2017/11/30/technology/ai-will-transform-the-economy-but-how-much-and-how-soon.html">taking humans’ jobs</a>.</p>
<p>Yet, around the world, people are both increasingly dependent on, and distrustful of, digital technology. They don’t behave as if they mistrust technology. Instead, people are using technological tools more intensively in all aspects of daily life. In recent research on <a href="https://sites.tufts.edu/digitalplanet/executive-summary/">digital trust in 42 countries</a> (a collaboration between Tufts University’s Fletcher School of Law and Diplomacy, where I work, and Mastercard), my colleagues and I found that this paradox is a global phenomenon. </p>
<p>If today’s technology giants don’t do anything to address this unease in an environment of growing dependence, people might start looking for more trustworthy companies and systems to use. Then Silicon Valley’s powerhouses could see their business boom go bust.</p>
<h2>Economic power</h2>
<p>Some of the concerns have to do with how big a role the technology companies and their products play in people’s lives. <a href="http://www.cnn.com/2016/06/30/health/americans-screen-time-nielsen/index.html">U.S. residents already spend 10 hours a day</a> in front of a screen of some kind. One in 5 Americans say they are online “<a href="http://www.pewresearch.org/fact-tank/2015/12/08/one-fifth-of-americans-report-going-online-almost-constantly/">almost constantly</a>.” The tech companies have enormous reach and power. <a href="http://money.cnn.com/2017/06/27/technology/facebook-2-billion-users/index.html">More than 2 billion people</a> use Facebook every month.</p>
<p><a href="http://gs.statcounter.com/search-engine-market-share">Ninety percent of search queries worldwide</a> go through Google. Chinese e-retailer Alibaba organizes the biggest shopping event worldwide every year on Nov. 11, which this year brought in <a href="http://www.businessinsider.com/alibabas-singles-day-bigger-than-black-friday-cyber-monday-combined-2017-11">US$25.3 billion in revenue</a>, more than twice what U.S. retailers sold between Thanksgiving and Cyber Monday last year. </p>
<p>This results in enormous wealth. All six companies in the world <a href="https://www.bloomberg.com/news/articles/2017-11-21/tencent-s-292-billion-rally-ousts-facebook-from-global-top-five">worth more than $500 billion</a> are tech firms. The <a href="https://business.linkedin.com/talent-solutions/blog/employer-brand/2017/revealing-the-25-most-sought-after-employers-globally">top six most sought-after companies to work for</a> are also in tech. Tech <a href="https://www.wsj.com/articles/tech-boom-creates-new-order-for-world-markets-1511260200">stocks are booming</a>, in ways reminiscent of the giddy days of the <a href="http://www.businessinsider.com/heres-why-the-dot-com-bubble-began-and-why-it-popped-2010-12">dot-com bubble</a> of 1997 to 2001. With emerging technologies, including the “<a href="https://www.fool.com/investing/2017/12/13/2-tech-giants-are-teaming-up-for-the-internet-of-t.aspx">internet of things</a>,” <a href="http://www.sciencemag.org/news/2017/12/are-we-going-too-fast-driverless-cars">self-driving cars</a>, <a href="https://www.wired.com/story/future-of-bitcoin-blockchain-2018/">blockchain</a> systems and <a href="https://economictimes.indiatimes.com/jobs/by-2020-artificial-intelligence-will-create-more-jobs-than-it-eliminates-gartner/articleshow/62053363.cms">artificial intelligence</a>, tempting investors and entrepreneurs, the reach and power of the industry is only likely to grow. </p>
<p>This is particularly true because <a href="https://www.cisco.com/c/en/us/solutions/service-provider/vni-network-traffic-forecast/infographic.html">half the world’s population</a> is still not online. But networking giant Cisco projects that <a href="https://www.cisco.com/c/en/us/solutions/service-provider/vni-network-traffic-forecast/infographic.html">58 percent of the world</a> will be online by 2021, and the volume of internet traffic per month per user will grow 150 percent from 2016 to 2021.</p>
<p>All these users will be deciding on how much to trust digital technologies.</p>
<h2>Data, democracy and the day job</h2>
<p>Even now, the reasons for collective unease about technology are piling up. Consumers are learning to be worried about the security of their personal information: News about a data breach involving <a href="https://www.ft.com/content/6943d9ab-c91b-3718-928e-67a802a9c463">57 million</a> Uber accounts follows on top of reports of a breach of <a href="https://www.nytimes.com/2017/10/02/business/equifax-breach.html">the 145.5 million consumer data records</a> on Equifax and every Yahoo account – <a href="http://money.cnn.com/2017/10/03/technology/business/yahoo-breach-3-billion-accounts/index.html">3 billion</a> in all. </p>
<p><a href="https://www.nytimes.com/2017/10/31/us/politics/facebook-twitter-google-hearings-congress.html">Russia was able to meddle</a> with Facebook, Google and Twitter during the 2016 election campaign. That has raised concerns about whether the openness and reach of digital media is a threat to the functioning of democracies.</p>
<p>Another technological threat to society comes from workplace automation. The management consulting firm McKinsey estimates that it could <a href="https://www.mckinsey.com/global-themes/future-of-organizations-and-work/what-the-future-of-work-will-mean-for-jobs-skills-and-wages">displace one-third of the U.S. workforce</a> by 2030, even if a different set of technologies creates new <a href="https://www.mckinsey.com/global-themes/future-of-organizations-and-work/the-digital-future-of-work-is-the-9-to-5-job-going-the-way-of-the-dinosaur">“gig” opportunities</a>.</p>
<p>The challenge for tech companies is that they operate in global markets and the extent to which these concerns affect behaviors online varies significantly around the world. </p>
<h2>Mature markets differ from emerging ones</h2>
<p><a href="https://sites.tufts.edu/digitalplanet/executive-summary/">Our research</a> uncovers some interesting differences in behaviors across geographies. In areas of the world with smaller digital economies and where technology use is still growing rapidly, users tend to exhibit more trusting behaviors online. These users are more likely to stick with a website even if it loads slowly, is hard to use or requires many steps for making an online purchase. This could be because the experience is still novel and there are fewer convenient alternatives either online or offline.</p>
<p>In the mature digital markets of Western Europe, North America, Japan and South Korea, however, people have been using the internet, mobile phones, social media and smartphone apps for many years. Users in those locations are less trusting, prone to switching away from sites that don’t load rapidly or are hard to use, and abandoning online shopping carts if the purchase process is too complex.</p>
<p>Because people in more mature markets have less trust, I would expect tech companies to invest in trust-building in more mature digital markets. For instance, they might speed up and streamline processing of e-commerce transactions and payments, or more clearly label the sources of information presented on social media sites, as the <a href="https://www.scu.edu/ethics/focus-areas/journalism-ethics/programs/the-trust-project/">Trust Project</a> is doing, helping to identify authenticated and reliable news sources.</p>
<p>Consider Facebook’s situation. In response to criticism for allowing fake Russian accounts to distribute fake news on its site, CEO Mark Zuckerberg boldly <a href="https://www.cnbc.com/2017/11/01/facebook-says-costs-will-rise-to-go-after-fake-news.html">declared that</a>, “Protecting our community is more important than maximizing our profits.” However, according to the company’s chief financial officer, Facebook’s 2018 operating expenses could increase by <a href="https://www.cnbc.com/2017/11/01/facebook-says-costs-will-rise-to-go-after-fake-news.html">45 to 60 percent</a> if it were to invest significantly in building trust, such as <a href="https://www.popsci.com/Facebook-hiring-3000-content-monitors">hiring more humans to review posts</a> and <a href="https://thenextweb.com/facebook/2017/08/03/facebook-enlists-ai-in-war-on-fake-news/">developing artificial intelligence systems</a> to help them. Those costs would lower Facebook’s profits.</p>
<p>To strike a balance between profitability and trustworthiness, Facebook will have to set priorities and deploy advanced trust-building technologies (e.g. vetting locally generated news and ads) in only some geographic markets.</p>
<h2>The future of digital distrust</h2>
<p>As the boundaries of the digital world expand, and more people become familiar with internet technologies and systems, their distrust will grow. As a result, companies seeking to enjoy consumer trust will need to invest in becoming more trustworthy more widely around the globe. Those that do will likely see a competitive advantage, winning more loyalty from customers.</p>
<p>This risks creating a new type of digital divide. Even as one global inequality disappears – more people have an opportunity to go online – some countries or regions may have significantly more trustworthy online communities than others. Especially in the less-trustworthy regions, users will need governments to enact strong digital policies to protect people from fake news and fraudulent scams, as well as regulatory oversight to protect consumers’ data privacy and human rights.</p>
<p>All consumers will need to remain on guard against overreach by heavy-handed authorities or autocratic governments, particularly in parts of the world where consumers are new to using technology and, therefore, more trusting. And they’ll need to keep an eye on companies, to make sure they invest in trust-building more evenly around the world, even in less mature markets. Fortunately, digital technology makes watchdogs’ work easier, and also can serve as a megaphone – such as on social media – to issue alerts, warnings or praise.</p><img src="https://counter.theconversation.com/content/87566/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Bhaskar Chakravorti directs the Institute for Business in the Global Context that receives funding from Mastercard, Microsoft and the Gates Foundation. </span></em></p>Around the world, people are both increasingly dependent on, and distrustful of, digital technology. New research suggests ways this conflict could unfold.Bhaskar Chakravorti, Senior Associate Dean, International Business & Finance, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/888272018-01-02T01:16:25Z2018-01-02T01:16:25ZSocial media companies should ditch clickbait, and compete over trustworthiness<figure><img src="https://images.theconversation.com/files/198329/original/file-20171208-27674-1pqhrhp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's time to build trust.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/large-group-people-seen-above-gathered-255266638">Arthimedes/Shutterstock.com</a></span></figcaption></figure><p>Social media websites and online services, created to profit from connecting people and <a href="http://www.wiley.com/WileyCDA/WileyTitle/productCd-0745656196.html">encouraging global conversations</a>, have a deep and troubling dark side. Malicious users have exploited these forums for free speech in ways that <a href="https://www.youtube.com/watch?v=_v9VKE3JLN8">weaken shared norms</a> of civility, trust and openness. This includes not just <a href="http://www.daniellecitron.com/hate-crimes-in-cyberspace/">bullying and shaming of individuals</a>, but also <a href="https://www.theverge.com/2017/12/11/16761016/former-facebook-exec-ripping-apart-society">dealing significant damage to society</a> as a whole.</p>
<p>Americans – and <a href="https://www.nytimes.com/2017/09/20/world/africa/kenya-court-election.html">people around the world</a> – will spend much of 2018 discussing how to handle the problem of Facebook, Twitter, Google and their ilk reaping massive profits while <a href="https://www.washingtonpost.com/news/theworldpost/wp/2017/10/09/pierre-omidyar-6-ways-social-media-has-become-a-direct-threat-to-democracy/">threatening democracy</a> and <a href="https://www.poynter.org/news/poynter-releases-new-study-examining-trust-media">undermining trust in public discourse</a>. As scholars of public accountability and digital media systems, we suggest these companies could find a new way to compete that promotes trust and accuracy, bringing both private profits and public benefits.</p>
<h2>Social media’s original sin</h2>
<p>Many problems have arisen because of how social media companies began, and how their power has grown in society. Like most Silicon Valley startups, Google, Facebook and Twitter were incubated in a <a href="https://www.inc.com/bruce-gibney/silicon-valleys-libertarian-problem.html">libertarian, free market environment</a>. These conditions reward people and companies who best provide powerful and convenient ways for people around the world to connect. Yet, they are engineered for the benefit of their private stockholders, not their public stakeholders.</p>
<p>From Google’s search algorithm to Facebook’s news feed algorithm, the processes that shape our online experience are not only complex and secretive, they are intentionally opaque. That is, in fact, their primary business model. </p>
<p>Think about it: If everyone knew <a href="https://moz.com/google-algorithm-change">exactly how Google’s algorithms worked</a>, then unscrupulous websites could hack their way to the top of search listings, rather than earning a high ranking by improving their products and services – or by paying Google for their ads to appear alongside search results. Similarly, if <a href="https://adespresso.com/blog/top-updates-facebook-monthly-need-know-now/">Facebook revealed the method</a> by which it selects items to appear on users’ news feeds, brands and content providers would no longer need to pay the company <a href="http://www.adweek.com/digital/facebook-raked-in-9-16-billion-in-ad-revenue-in-the-second-quarter-of-2017/">tens of billions of dollars per year</a> to reach their own customers on the platform. </p>
<p>These companies have become <a href="http://www.pewinternet.org/fact-sheet/social-media/">massive</a>, with <a href="https://techcrunch.com/2017/06/27/facebook-2-billion-users/">billions of users</a> spending <a href="http://www.adweek.com/digital/mediakix-time-spent-social-media-infographic/">hours a day</a> on their systems. Business is <a href="https://www.facebook.com/zuck/posts/10104146268321841">booming</a> – but their lack of transparency and accountability is increasingly understood as a <a href="https://www.seattletimes.com/opinion/profit-vs-access-on-facebook-our-digital-town-square/">threat to civil society</a>.</p>
<h2>Polluting the public sphere</h2>
<p>The erosion of civility, trust and respect for truth in U.S. society is what economists call a “<a href="http://economics.fundamentalfinance.com/negative-externality.php">negative externality</a>.” That is a cost of a product or service that is paid by society at large, rather than the company that supplies it or the customer who buys it. A common example is <a href="https://publicecon.wikispaces.com/Negative+Externalities+and+the+Environment">industrial pollution</a>, when manufacturing companies don’t pay the costs of health and environmental problems that their plants’ pollution causes.</p>
<p>Social media companies earn extraordinary profits by collecting personal information and <a href="https://gigaom.com/report/the-revolution-will-be-targeted-rtb-and-the-future-of-programmatic-advertising/">selling ads targeted with algorithms</a>. This has allowed the rise of new kinds of social pollution: <a href="http://www.politifact.com/truth-o-meter/article/2016/dec/13/2016-lie-year-fake-news/">fake news</a>, purposely divisive messages distributed by <a href="https://www.washingtonpost.com/world/europe/pro-putin-politics-bots-are-flooding-russian-twitter-oxford-based-studysays/2017/06/20/19c35d6e-5474-11e7-840b-512026319da7_story.html">fake identities</a> – even the creation of <a href="https://www.thedailybeast.com/exclusive-russia-used-facebook-events-to-organize-anti-immigrant-rallies-on-us-soil">real-world political events</a> based on these false and anti-social messages.</p>
<p>Just as society expects oil companies to <a href="http://latimesblogs.latimes.com/greenspace/2010/05/gulf-oil-spill-bp-accepts-responsibility-for-oil-cleanup.html">take moral and legal responsibility</a> for environmental pollution if they spill in the oceans and aquifers, we believe social media companies must help fix and fight the social pollution that their platforms have enabled.</p>
<h2>Accountability and transparency are key</h2>
<p>In our view, social media companies need to move beyond their “free market” foundations. Like any other set of institutions essential to our social infrastructures and economies, they should develop methods to provide the transparency and public accountability necessary to address the social ills their platforms have enabled. </p>
<p>Transparency is the best way to drive out hate speech and fake news. Without it, customers won’t have confidence in the quality of the information they receive, or the goodwill of information providers. Social media companies need to be <a href="https://www.nytimes.com/2017/12/13/technology/tech-companies-social-responsibility.html">more responsive to the needs of society as a whole</a>, and accept responsibility for monitoring the integrity of their own platforms. They must be held publicly accountable for their platforms’ capacity to be used in ways that undermine our civil society and political institutions.</p>
<p>This is not a simple proposition, nor a change that social media platforms can make with a mere flip of a switch. But in our view, it’s necessary.</p>
<p>Facebook has begun this process, allowing users to <a href="https://newsroom.fb.com/news/2017/10/update-on-our-advertising-transparency-and-authenticity-efforts/">identify all ads a particular page buys</a> across the site, regardless of how the ads were micro-targeted. Twitter has <a href="https://blog.twitter.com/official/en_us/topics/product/2017/New-Transparency-For-Ads-on-Twitter.html">taken similar steps</a>. But these are only initial efforts in what should be a much longer process.</p>
<h2>A new business opportunity</h2>
<p>If these companies don’t change, the very societies that provide their economic lifeblood will diminish and fail. Reform is in their own interest – and everyone else’s too.</p>
<p>In the short term, social media companies should make their algorithms, and the data they analyze, more transparent to the public. One possibility could be developing centralized websites where people can <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">check the sources of content and funding for advertisements</a>. A promising approach, currently under way at some companies, involves <a href="https://newsroom.fb.com/news/2017/10/update-on-our-advertising-transparency-and-authenticity-efforts/">adapting algorithms to reduce the prominence of fake news</a> posts in users’ feeds. Yet, despite their benefits, these initiatives don’t fix the contradiction at the heart of social media: Their obscurity-based business models conflict directly with their increasingly central role as platforms for public discourse and the democratic process.</p>
<p>In the longer term, major internet companies will need to rethink their strategies entirely. This means developing business models that privilege transparency over obscurity, accessibility over secrecy, and accountability over accounting. Though this is a tall order, we propose that social media enterprises could pivot and expand into the business of verification and certification.</p>
<p>What does this mean, exactly? Facebook, Twitter, Google and others like them should start competing to provide the most accurate news instead of the most click-worthy, and the most trustworthy sources rather than the most sensational. Once these companies understand that maintaining a healthy public sphere pays better than aiding social pollution, they – and society as a whole – can begin to clean up the mess we’ve made together.</p><img src="https://counter.theconversation.com/content/88827/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Aram Sinnreich is affiliated with the Internet Society, Washington DC chapter.</span></em></p><p class="fine-print"><em><span>Barbara Romzek does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Social media companies arose from libertarian, free-market origins but must embrace social benefits and democracy to survive.Barbara Romzek, Professor of Public Administration and Policy, American University School of Public AffairsAram Sinnreich, Associate Professor of Communication Studies, American University School of CommunicationLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/865972017-12-04T13:54:44Z2017-12-04T13:54:44ZTaking a second look at the learn-to-code craze<figure><img src="https://images.theconversation.com/files/195068/original/file-20171116-15412-kukk7v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Are computers in the classroom more helpful to students – or the companies that sell the machines?</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Schools-Too-Much-Testing/04b897dcb320476699eec2eac3db7992/13/0">AP Photo/Sue Ogrocki</a></span></figcaption></figure><p>Over the past several years, the idea that computer programming – or “coding” – is the key to the future for both children and adults alike has become received wisdom in the United States. 
The aim of making <a href="https://obamawhitehouse.archives.gov/the-press-office/2016/01/30/fact-sheet-president-obama-announces-computer-science-all-initiative-0">computer science</a> a “<a href="https://obamawhitehouse.archives.gov/blog/2016/01/30/computer-science-all">new basic</a>” skill for all Americans has driven the formation of dozens of <a href="http://girlswhocode.com">nonprofit</a> <a href="http://code.org">organizations</a>, <a href="http://flatironschool.com/">coding</a> <a href="https://www.codecademy.com/">schools</a> and <a href="https://www.congress.gov/bill/115th-congress/house-bill/3316">policy programs</a>.</p>
<p>As this year’s annual <a href="https://csedweek.org/">Computer Science Education Week</a> begins, it is worth taking a closer look at this recent coding craze. The Obama administration’s “<a href="https://obamawhitehouse.archives.gov/blog/2016/01/30/computer-science-all">Computer Science For All</a>” initiative and the <a href="https://www.recode.net/2017/9/25/16276904/president-donald-trump-ivanka-tech-stem-computer-science-coding-education-amazon-google">Trump administration’s effort</a> are both based on the idea that computer programming is not only a fun and exciting activity, but a necessary skill for the jobs of the future.</p>
<p>However, the history of such education initiatives in the U.S. shows that their primary beneficiaries aren’t necessarily students or workers, but rather the <a href="https://blogs.microsoft.com/on-the-issues/2016/01/30/microsoft-supports-white-house-initiative-to-expand-access-to-computer-science/">influential tech companies</a> that <a href="https://www.recode.net/2017/9/26/16364662/amazon-facebook-google-tech-300-million-donald-trump-ivanka-computer-science">promote the programs</a> in the first place. The current campaign to teach American kids to code may be the latest example of <a href="http://technet.org/membership/members">tech companies</a> using concerns about <a href="http://www.newschools.org/">education</a> to achieve their own goals. This raises some important questions about who stands to gain the most from the recent computer science push. </p>
<h2>Old rhetoric about a ‘new economy’</h2>
<p>One of the earliest corporate efforts to get computers into schools was Apple’s <a href="http://hackeducation.com/2015/02/25/kids-cant-wait-apple">“Kids Can’t Wait” program</a> in 1982. Apple co-founder Steve Jobs <a href="http://americanhistory.si.edu/comphist/sj1.html#kids">personally lobbied</a> Congress to pass the <a href="https://www.congress.gov/bill/97th-congress/house-bill/5573">Computer Equipment Contribution Act</a>, which would have allowed companies that donated computers to schools, libraries and museums to deduct the equipment’s value from their corporate income tax bills. While his efforts in Washington failed, he succeeded in his home state of California, where companies could claim a <a href="https://www.ftb.ca.gov/Archive/Law/legis/1981_FedTax.pdf">tax credit for 25 percent</a> of the value of computer donations.</p>
<p>The bill was clearly a corporate tax break, but it was framed in terms of educational gaps: According to a <a href="https://digitalcommons.law.ggu.edu/cgi/viewcontent.cgi?httpsredir=1&article=1472&context=caldocs_assembly">California legislative analysis</a>, the bill’s supporters felt that “computer literacy for children is becoming a necessity in today’s world” and that the bill would help in “placing needed ‘hardware’ in schools unable to afford computers in any other way.”</p>
<p>Kids Can’t Wait took advantage of Reagan-era concerns that Americans were “<a href="https://press.princeton.edu/titles/10208.html">falling behind</a>” global competitors in the “new economy.” In 1983, a U.S. Department of Education report titled “<a href="https://www2.ed.gov/pubs/NatAtRisk/index.html">A Nation at Risk</a>” warned that the country’s “once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world.” The report’s authors blamed the American education system for turning out graduates who were underprepared for a fast-changing, technology-infused workplace. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=396&fit=crop&dpr=1 600w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=396&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=396&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=498&fit=crop&dpr=1 754w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=498&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/195720/original/file-20171121-6061-1siqhkb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=498&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Federal officials, including then House Speaker Newt Gingrich, launched an effort to get classrooms online in 1995.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-DC-USA-APHS256304-House-Speaker-New-/651639f3821f422998d24c11d7050d1d/171/0">AP Photo/Dennis Cook</a></span>
</figcaption>
</figure>
<p>Over the past 30 years, the same rhetoric has appeared again and again. In 1998, Bill Clinton <a href="http://www.presidency.ucsb.edu/ws/?pid=58384">proclaimed</a> that “access to new technology means … access to the new economy.” In 2016, U.S. Chief Technology Officer Megan Smith described the Obama administration’s coding initiative as an “<a href="https://www.theatlantic.com/education/archive/2016/02/obamas-push-for-computer-science-education/459276/">ambitious, all-hands-on-deck effort</a> to get every student in America an early start with the skills they’ll need to be part of the new economy.”</p>
<p>While technology is often framed as the solution for success in a globalized labor market, the evidence is less clear. In his 2001 book “<a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">Oversold and Underused: Computers in the Classroom</a>,” education researcher Larry Cuban warned that technology on its own would not solve “education’s age-old problems,” such as <a href="https://www.theatlantic.com/business/archive/2016/08/property-taxes-and-unequal-schools/497333/">inequitable funding</a>, <a href="https://www.washingtonpost.com/local/education/crumbling-school-facilities-causing-anxiety-for-parents/2015/05/12/ca83a91a-f800-11e4-a13c-193b1241d51a_story.html">inadequate facilities</a> and <a href="https://news.vice.com/story/american-educators-teach-longer-for-less-pay-than-their-foreign-peers">overworked teachers</a>.</p>
<p>Cuban found that some educational technology initiatives from the 1990s did help students get access to computers and learn basic skills. But that didn’t necessarily <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">translate into higher-wage jobs</a> when those students entered the workforce. However, the equipment and software needed to teach them brought large windfalls for tech companies – in 1995 the industry was <a href="http://www.nytimes.com/1995/09/11/business/apple-holds-school-market-despite-decline.html">worth US$4 billion</a>.</p>
<h2>Under pressure</h2>
<p>If computers in schools didn’t work as promised two decades ago, then what’s behind the current coding push? Cuban <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674011090">points out</a> that few school boards and administrators can resist pressure from business leaders, public officials and <a href="http://news.gallup.com/poll/184637/parents-students-computer-science-education-school.aspx">parents</a>. Organizations like the <a href="http://www.csforall.org/">CS For All Consortium</a>, for example, have a large membership of education companies that are taking advantage of funding from <a href="https://cardenas.house.gov/media-center/press-releases/c-rdenas-416d65726963612043616e20436f646520">state legislatures</a>.</p>
<p>A huge boost comes from the tech giants, too. Amazon, Facebook, Google, Microsoft and others are collectively <a href="http://blogs.edweek.org/edweek/DigitalEducation/2017/10/300_million_computer_science_pledge.html">contributing $300 million</a> to the Trump administration’s new federal initiative – no doubt seeing, as The New York Times observed, the potential to “<a href="https://www.nytimes.com/2017/09/26/technology/computer-science-stem-education.html">market their own devices and software</a> in schools as coding classes spread.” </p>
<p>This isn’t always the best deal for students. In 2013, the Los Angeles Unified School District planned to give Apple iPads to every student in every school – at a cost of <a href="https://www.wired.com/2015/05/los-angeles-edtech/">$1.3 billion</a>. The program was a fiasco: The iPads had technical problems and incomplete software that made them <a href="https://gizmodo.com/the-la-school-systems-1-3-billion-ipad-fiasco-comes-to-1733569377">essentially useless</a>. The fallout included <a href="http://www.govtech.com/education/What-Went-Wrong-with-LA-Unifieds-iPad-Program.html">investigations by the FBI and the U.S. Securities and Exchange Commission</a>, and a legal settlement in which Apple and its partners <a href="http://www.latimes.com/local/lanow/la-me-ln-la-unified-ipad-settlement-20150925-story.html">repaid the school district $6.4 million</a>.</p>
<p>However, tech companies are framing their efforts in more noble terms. In June 2017, Microsoft president Brad Smith compared the efforts of tech industry nonprofit <a href="https://code.org/">Code.org</a> to previous efforts to improve science and technology training in the United States. Recalling the <a href="https://www.gpo.gov/fdsys/pkg/STATUTE-72/pdf/STATUTE-72-Pg1580.pdf">focus on scientific research</a> that drove the <a href="https://www.history.com/topics/space-race">Space Race</a>, Smith <a href="https://www.nytimes.com/2017/06/27/technology/education-partovi-computer-science-coding-apple-microsoft.html">said</a>, “We think computer science is to the 21st century what physics was to the 20th century.” </p>
<p>Indeed, tech companies are having a very hard time <a href="https://www.bostonglobe.com/business/2016/02/19/the-war-for-tech-talent-escalates/ejUSbuPCjPLCMRYlRZIKoJ/story.html">hiring and retaining software engineers</a>. With new concerns about <a href="https://www.wired.com/2017/04/trumps-executive-order-wont-give-tech-clarity-h-1b-visas/">restrictions on visas</a> for skilled immigrant workers, the industry could definitely benefit from a workforce trained with public dollars. </p>
<p>For some tech companies, this is an explicit goal. In 2016, Oracle and Micron Technology helped write a state <a href="https://legislature.idaho.gov/wp-content/uploads/sessioninfo/2016/legislation/H0379.pdf">education bill</a> in Idaho which read, “It is essential that efforts to increase computer science instruction, kindergarten through career, be driven by the needs of industry and be developed in partnership with industry.” While two lawmakers <a href="http://www.spokesman.com/blogs/boise/2016/feb/02/house-backs-launching-computer-science-initiative-idaho-schools-though-2-members-object/">objected to the corporate influence</a> on the bill, it passed with an overwhelming majority.</p>
<h2>History repeating?</h2>
<p>Some critics argue that the goal of the coding push is to massively increase the number of programmers on the market, <a href="https://www.theguardian.com/technology/2017/sep/21/coding-education-teaching-silicon-valley-wages">depressing wages</a> and bolstering tech companies’ profit margins. Though there is no concrete evidence to support this claim, the fact remains that <a href="http://www.epi.org/files/2013/bp359-guestworkers-high-skill-labor-market-analysis.pdf">only half of college students</a> who majored in science, technology, engineering or math-related subjects get jobs in their field after graduation. That certainly casts doubt on the idea that there is a “<a href="https://www.technologyreview.com/s/608707/the-myth-of-the-skills-gap/">skills gap</a>” between workers’ abilities and employers’ needs. Concerns about these disparities have helped <a href="https://www.theatlantic.com/education/archive/2016/02/obamas-push-for-computer-science-education/459276/">justify investment</a> in tech education over the <a href="https://www.youtube.com/watch?v=w2zU9g3WU5M">past 20 years</a>. </p>
<p>As millions of dollars flow to technology companies in the name of education, they often bypass other major needs of U.S. schools. Technology in the classroom can’t solve the problems that <a href="https://www.npr.org/sections/ed/2017/05/22/529534031/president-trumps-budget-proposal-calls-for-deep-cuts-to-education">budget cuts</a>, <a href="https://www.theatlantic.com/education/archive/2015/07/too-many-kids/397451/">large class sizes</a> and <a href="https://www.washingtonpost.com/news/answer-sheet/wp/2016/08/16/think-teachers-arent-paid-enough-its-worse-than-you-think/">low teacher salaries</a> create. Worse still, new research is finding that <a href="https://press.princeton.edu/titles/11029.html">contemporary tech-driven educational reforms</a> may end up intensifying the problems they were trying to fix. </p>
<p>Who will benefit most from this new computer science push? History tells us that it may not be students.</p>
<p><em>Editor’s notes: This is an updated version of an article originally published Dec. 4, 2017. It was updated Dec. 8, 2017, to correct the year Larry Cuban’s book was first published.</em></p>
<p class="fine-print"><em><span>Kate M. Miltner does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Past efforts to teach American students computer skills haven’t always helped workers get better-paying jobs. But spending on hardware and software for schools has certainly enriched tech companies.Kate M. Miltner, Ph.D. Candidate in Communication, USC Annenberg School for Communication and JournalismLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/879722017-12-01T00:39:23Z2017-12-01T00:39:23ZWhy Silicon Valley wants you to text and drive<figure><img src="https://images.theconversation.com/files/196794/original/file-20171128-28866-sv5qh1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Tech companies want to reduce conflict between texting and driving.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/texting-while-driving-car-irresponsible-man-719930443">Tero Vesalainen/Shutterstock.com</a></span></figcaption></figure><p>As self-driving cars come closer to being common on American roads, much of the rhetoric promoting them has to do with safety. <a href="https://www.nhtsa.gov/press-releases/nhtsa-data-shows-traffic-deaths-77-percent-2015">About 40,000 people die</a> on U.S. roads every year, and driver errors are linked to <a href="https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812115">more than 90 percent of crashes</a>. But many of the biggest advocates of autonomous vehicles aren’t car companies looking to improve the safety of their existing products. 
Huge backing for self-driving technologies is coming from Silicon Valley giants like <a href="https://www.google.com/selfdrivingcar/">Google</a> and <a href="https://www.engadget.com/2017/11/22/apple-research-self-driving-cars/">Apple</a>.</p>
<p>Those of us who have studied the relationship between technology and society tend to look more carefully at the motivations behind any technological push. In this case, it’s clear that in addition to addressing safety concerns, Silicon Valley firms have a strong incentive to create a new venue for increasing the use of their digital devices. Every minute people spend on their mobile phones provides data – and often money – to tech companies.</p>
<p>At present, digital devices and driving are in conflict: There are serious, often fatal, consequences when <a href="https://www.nhtsa.gov/risky-driving/distracted-driving">drivers use smartphones to talk or to text</a>. Regulators and safety advocates seek to resolve that conflict by banning phone use while driving – as has happened in <a href="http://www.ncsl.org/research/transportation/cellular-phones-use-and-texting-while-driving-laws.aspx">virtually every state</a>. But the tech companies are taking a different approach. The obvious answer for Silicon Valley is to create an automobile in which continuous cellphone use no longer poses a threat to anyone.</p>
<h2>Not a new idea</h2>
<p>The idea of a car so capable that no driver is needed isn’t new. As far back as the 1950s, the Saturday Evening Post ran an illustration imagining a family playing a board game (in a convertible!) as the car drives itself down the road. When self-driving cars actually take to the streets in large numbers, today’s families likely won’t be playing Scrabble – though Words With Friends and other mobile games are a near certainty. Every passenger is likely to be using a mobile device.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=607&fit=crop&dpr=1 600w, https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=607&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=607&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=763&fit=crop&dpr=1 754w, https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=763&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/195880/original/file-20171122-6027-1syyjk2.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=763&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A self-driving car depicted in the 1950s.</span>
<span class="attribution"><span class="source">Saturday Evening Post</span></span>
</figcaption>
</figure>
<p>In recent years, the amount of time adults spend on their mobile devices (beyond actual phone calls) has grown rapidly. At the moment, it’s <a href="https://hackernoon.com/how-much-time-do-people-spend-on-their-mobile-phones-in-2017-e5f90a0b10a6">around four hours a day</a> for the average adult in the U.S. However, that rapid growth is likely to slow down as people run out of free time to spend on their devices.</p>
<p>Unless, of course, there’s a new block of time that suddenly opens up. The average American now spends <a href="http://www.newsroom.aaa.com/2016/09/americans-spend-average-17600-minutes-driving-year/">about 48 minutes in a car every day</a>, a sizable opportunity for increased cellphone use.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Uj-rK8V-rik?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Chris Urmson, then director of Google’s self-driving car program, discusses Google’s efforts to advance autonomous vehicles.</span></figcaption>
</figure>
<p>Chris Urmson, former director of Google’s self-driving car project, made this interest clear in a 2016 talk, saying that autonomous vehicles offer the “exciting” possibility of creating “<a href="http://www.youtube.com/watch?v=Uj-rK8V-rik">another room for you</a>” where, among other activities, you can watch videos. The investment analysts at Morgan Stanley have talked about autonomous cars becoming a “<a href="http://www.investors.com/news/technology/apple-alphabet-could-gain-if-self-driving-cars-are-4th-video-screen/">fourth screen</a>” in Americans’ lives (in addition to the home TV, personal computer and mobile phones or tablets). Perhaps the most explicit declaration of this interest came from Jia Yueting, co-founder of the budding Chinese automaker LeEco, when he said, “We see the car in the future as <a href="http://www.reuters.com/article/us-autoshow-beijing-china-leeco-insight-idUSKCN0XL11X">an extension of the internet</a>, another entry point for us to sell web-based content and services.”</p>
<p>So as the public conversation around autonomous cars highlights the safety advantages, don’t forget the tech industry’s powerful desire for more profits, which goes well beyond simply saving us from ourselves.</p>
<p class="fine-print"><em><span>Jack Barkenbus does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Why do tech companies care so much about self-driving cars? If drivers no longer need to pay attention to the road, they can use their mobile devices even more.Jack Barkenbus, Visiting Scholar, Vanderbilt Institute for Energy & Environment, Vanderbilt UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/874262017-11-26T23:45:42Z2017-11-26T23:45:42ZFighting online abuse shouldn’t be up to the victims<figure><img src="https://images.theconversation.com/files/195275/original/file-20171117-11477-1xknfj5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/cyberbully-coming-out-computer-internet-bully-702224215">Leremy/Shutterstock.com</a></span></figcaption></figure><p>The fight against online abuse has put increasing pressure on social media corporations to take responsibility for the content that appears on their platforms. As a result, Twitter, Facebook and other sites have created buttons for <a href="https://techcrunch.com/2017/03/01/twitter-adds-more-anti-abuse-measures-focused-on-banning-accounts-silencing-bullying/">reporting harassment</a> and blocking unwanted contact – and they <a href="https://qz.com/651001/getting-banned-from-facebook-can-have-unexpected-and-professionally-devastating-consequences/">occasionally ban</a> particularly egregious offenders. Facebook’s latest effort has the company asking its users in Australia to <a href="https://www.usatoday.com/story/tech/news/2017/11/08/facebook-tests-fighting-revenge-porn-asking-users-file-nude-photos-first/843364001/">send in nude photographs</a> of themselves.</p>
<p>This, Facebook says, would help <a href="https://www.theguardian.com/technology/2017/nov/07/facebook-revenge-porn-nude-photos">build a database of images</a> the company could analyze to teach its computer systems how to detect when a person posts a nude image. The goal, the company says, is to <a href="https://newsroom.fb.com/news/h/non-consensual-intimate-image-pilot-the-facts/">prevent people from posting</a> nude photos of others online without consent – a practice sometimes called “<a href="https://www.theguardian.com/technology/2017/nov/07/facebook-revenge-porn-nude-photos">revenge porn</a>.” The Australian government has joined Facebook for the pilot project, which according to USA Today is also <a href="https://www.usatoday.com/story/tech/news/2017/11/08/facebook-tests-fighting-revenge-porn-asking-users-file-nude-photos-first/843364001/">available in the U.S., U.K. and Canada</a>.</p>
<p>Several fundamental problems with this idea are readily apparent. The most obvious is that <a href="https://theconversation.com/after-the-nsa-hack-cybersecurity-in-an-even-more-vulnerable-world-64090">putting anything online</a> – especially nude photos of oneself – in any format <a href="https://theconversation.com/what-are-software-vulnerabilities-and-why-are-there-so-many-of-them-77930">risks exposure to hackers</a>. Then it could be redistributed in any number of ways, including formats and forums not yet invented. </p>
<p>Though Facebook says its aim is to help control online abuse, anyone who submits photos to this effort will surrender any control they might have of the images they send. Facebook’s request for nude images unfairly puts the burden of work and risk on women in the name of protecting themselves from harassment and abuse. There are other layers of protection that could keep women safer, while helping police online communications.</p>
<h2>Online abuse</h2>
<p>Posting nude photographs online without their subjects’ consent is perhaps the clearest example of the role of gender in online abuse: Abuse involving anything remotely sexual is <a href="http://doi.org/10.1177/1461444816688457">far more likely to target women</a> than men.</p>
<p>A 2016 survey found that <a href="https://datasociety.net/blog/2016/12/13/nonconsensual-image-sharing/">at least 4 percent of Americans</a> online had experienced someone sharing “sensitive images” without consent or threatening to do so. That number <a href="https://datasociety.net/blog/2016/12/13/nonconsensual-image-sharing/">climbed to 10 percent</a> among women under age 30. </p>
<p>Multiple studies, <a href="http://doi.org/10.1177/1461444816688457">including my own</a>, have documented that women face great risks of online harassment, especially when they challenge the political status quo. When I interviewed 109 women bloggers in the U.S., U.K., Germany and Switzerland, 80 said they were harassed – online or offline – due to their blogging. The harassment ranged from insults, sexually charged comments and trolling responses to rape threats, death threats, stalking, doxing, plagiarism and identity theft. Of those 80, 73.8 percent identified their writing as feminist; of the 29 who had not been harassed, only 44.8 percent did.</p>
<p>Online platforms do give women, ethnic and cultural minorities and others typically muted in public discourse opportunities to speak out. But that visibility also makes them more vulnerable, continuing a pattern of women being attacked for speaking publicly that’s <a href="http://adanewmedia.org/2015/11/issue8-lane/">as old as ancient Greece</a>.</p>
<h2>The bigger picture</h2>
<p>Currently <a href="http://www.internetworldstats.com/stats.htm">just over half of the global population is online</a>. Without action, the problems of today’s internet will only expand as more people join the online community.</p>
<p>Social media companies are key to the solutions, but laws and law enforcement can assist, as can guidelines for software development and cultural changes.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/195707/original/file-20171121-6013-3j2mpq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In 2015, Shon Handrahan of Layton, Utah, pleaded guilty in a case that spurred state lawmakers to revise Utah laws against revenge porn.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Revenge-Porn-/64e3c3079ae74ae192aba51cac7f84ec/14/0">AP Photo/Davis County Sheriff’s Office</a></span>
</figcaption>
</figure>
<p>At present, laws about online abuse are mostly anchored in state or national laws: <a href="http://www.bbc.com/news/uk-37601431">The U.K.</a> <a href="https://techcrunch.com/2017/10/02/germanys-social-media-hate-speech-law-is-now-in-effect/">and Germany</a> have relatively new regulations about online harassment. In the U.S., a California “revenge porn” law helped prosecute a woman who <a href="https://www.nbcnews.com/news/us-news/former-playmate-dani-mathers-gets-probation-graffiti-cleanup-body-shaming-n764371">secretly took a nude picture of another woman</a> and posted it online. Yet in Ohio a law that currently <a href="http://www.governor.ohio.gov/Media-Room/Press-Releases/ArticleId/416/kasich-signs-four-bills-5-17-16">strengthens the legal understanding</a> of online abuse <a href="http://radio.wosu.org/post/political-commentators-sue-ohio-over-online-harassment-ban#stream/0">has been contested</a> on the grounds that it would curtail free speech.</p>
<p>To be effective, laws would have to <a href="http://genderpolicyreport.umn.edu/net-neutrality-too-neutral-on-online-abuse/">go beyond state and national borders</a> because of the international nature of the internet.</p>
<h2>Help with enforcement</h2>
<p>Laws are only as good as their enforcement. Studies that look at how police react to victims’ reports of online abuse indicate that <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674368293">officers aren’t aware of the nature of the problem</a> or trained to investigate it. For example, in Germany a meager 2 percent of cyberstalking cases brought to the police <a href="http://www.newsweek.com/2014/08/22/how-law-standing-cyberstalking-264251.html">led to convictions</a>. This leaves women unprotected. In my study, of the <a href="http://doi.org/10.1177/1461444816688457">80 women who had been harassed</a>, nine went to the police. Four women said police helped or took their cases seriously; five said police did not respond or help.</p>
<p>Software companies could help too, but they are not required to protect their users, even though that is mandatory in other industries that provide basic elements of societal interaction, like <a href="https://www.compliancealliance.com/laws-regulations/federal-bank-regulations">banking</a>, <a href="https://icsw.nhtsa.gov/cars/rules/import/FMVSS/">car manufacturing</a> and <a href="https://www.faa.gov/other_visit/aviation_industry/airline_operators/airline_safety/">airline travel</a>. Even the most basic standards of cybersecurity defenses are inconsistent and incomplete: Organizations as diverse as the <a href="http://www.businessinsider.com/cheating-affair-website-ashley-madison-hacked-user-data-leaked-2015-7">Ashley Madison</a> dating site, the <a href="https://www.nytimes.com/2017/05/23/business/target-security-breach-settlement.html">Target</a> retail chain and even the U.S. <a href="https://www.nytimes.com/2017/11/12/us/nsa-shadow-brokers.html">National Security Agency</a> have been breached. </p>
<h2>Broader social action</h2>
<p>Perhaps the most important element in addressing online harassment is treating it as though it is happening in the “real world.” Abuse is abuse. Online spaces are created, shaped and used by real humans, with real bodies and real feelings.</p>
<p>Harassment in online spaces is just as real and harmful as when it happens on the street, in schools and in workplaces. </p>
<p><a href="https://doi.org/10.1177%2F1461444816688457">Targets of online abuse suffer</a> emotionally, psychologically, <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674368293">economically and professionally</a>. An <a href="https://www.amnestyusa.org/press-releases/amnesty-reveals-alarming-impact-of-online-abuse-against-women/">Amnesty International study</a> of women across eight countries found 55 percent said they had suffered from stress, anxiety or panic attacks after experiencing online abuse. </p>
<p>Facebook’s attempt to battle the problem of online abuse by putting the burden on users suggests the company may feel relatively helpless to act on its own. Governments and society as a whole must step up to figure out how to better protect members of communities, both online and offline, from harassment and abuse.</p>
<p class="fine-print"><em><span>Stine Eckert has received funding in the past from grants from the World Health Organization and the U.S. Agency for International Development. She is Chair of the Feminist Scholarship Division (FSD) of the International Communication Association (ICA) and a member of the Association for Education in Journalism and Mass Communication (AEJMC).</span></em></p>Companies and governments should do more to prevent ‘revenge porn’ without asking potential victims to send their nude photos to Facebook.Stine Eckert, Assistant Professor of Communication, Wayne State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/858102017-11-16T01:41:16Z2017-11-16T01:41:16ZHow Silicon Valley industry polluted the sylvan California dream<figure><img src="https://images.theconversation.com/files/192293/original/file-20171027-2402-15ejnas.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Aerial view of San Jose, California, 2016.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/gordon-s/29670306746/in/photolist-ZHvaXX-Uo5oeM-RcXnY3-UjXT4J-N7tLMF-T9Tnno-Xea8Ym-McS5uS-Ui2ybJ-qEMBub-PfD9EU-e9RRfi-VWgfbi-QiHUXk-S4wGvz-LzFHTp-S64S7H-VWge8X-ABtoak-qg77S5-URsuhd-SrcUo8-eUPCUc-AePQJj-qzF7PW-Vy2pDG-pjpyPc-BE9Ed4-Rvoc4U-szHCZC-QBgQpX-Hg3Lgy-PtWFnc-Gjc4CG-PJMPp1-Liz43E-TTfx1R-ML7E2u-Sht5zW-eTpurL-TgKPJq-S64THD-8hxSP2-8hBsFA-Jy7ccp-TLXcao-pjb9gm-hskJKv-ACsLw6-rqdGgk">Gordon-Shukwit</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>On Labor Day 1956, a caravan of moving trucks wound their way into Santa Clara County, just south of San Francisco, carrying the possessions of 600 families and equipment for the missile and space labs of the Lockheed Corporation. One month later, Lockheed’s Sunnyvale campus opened for business. 
Many of the arriving families were relocating to Sunnyvale from the company’s facility in Burbank, in Southern California.</p>
<p>The draw included good jobs in the emerging businesses of electronics research and development, as well as manufacturing of semiconductors and other electronic components for machinery and computers. Affordable housing, a pastoral landscape and a pleasant environment proved very attractive for newcomers. Local boosters, corporate executives and new residents alike <a href="https://books.google.com/books?id=KwvEBAAAQBAJ&pg=PT54&dq=Margaret+O%27Mara+environmental+contradictions&hl=en&sa=X&ved=0ahUKEwjDxbPal7LXAhUS3YMKHbuvBiwQ6AEIKDAA#v=onepage&q&f=false">envisioned a modern future</a> in stark contrast with the declining dirty urban industrial model of the Northeast and Midwest. </p>
<p>This type of industrial work and manufacturing didn’t need smokestacks, large warehouses, or other markers of the industrial age. The Santa Clara Valley’s promise for leading Northern California into a bright economic future quickly brought the area the nickname “Silicon Valley.” But in the book I am writing, I note that if this convergence of natural surroundings, suburban homes and high-tech industrialization represented a facet of the California dream, it also betrayed it.</p>
<h2>A bright illusion of the future</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1160&fit=crop&dpr=1 600w, https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1160&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1160&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1458&fit=crop&dpr=1 754w, https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1458&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/191217/original/file-20171020-13995-1qaicjp.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1458&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A promising advertisement for homes in San Jose.</span>
<span class="attribution"><span class="source">San Jose Mercury, January 18, 1956</span></span>
</figcaption>
</figure>
<p>In addition to jobs in electronics and aerospace, the emerging suburbs of Silicon Valley promised newcomers a countryside experience. David Beers, whose father worked at the Sunnyvale Lockheed campus, <a href="https://books.google.com/books?id=bVMuLOrHoU8C&pg=PT50&dq=Beers+%22all-year+garden%22&hl=en&sa=X&ved=0ahUKEwjz7KDatP_WAhWB64MKHRy-BGgQ6AEIKDAA#v=onepage&q=Beers%20%22all-year%20garden%22&f=false">remembered</a> the chamber of commerce brochures claiming an “all-year garden” and “the most beautiful valleys in the world.” Such advertisements were common, assuring home buyers “good living,” the “calm of the country” and “a beautiful walnut and cherry orchard” that “the builder is leaving … for your enjoyment.” The white-collar workers of high tech could make their homes in what appeared to be the countryside.</p>
<p>Workplaces, too, were different, with manufacturing happening in places that didn’t look like the old industries of the East. The Stanford Industrial Park, founded in the early 1950s, had <a href="https://www.cityofpaloalto.org/civicax/filebank/documents/58349">strict building guidelines</a> that made it look more like a suburban area than a manufacturing center. Crucially, 60 percent of each lot had to be preserved as open green space, and no smokestacks were allowed. “Everyone thought of smokestacks,” <a href="https://purl.stanford.edu/dv559gn8984">recalled Alf Brandin</a>, Stanford’s business manager in the 1940s and 1950s. “These new people who came out from the East and settled here thought, ‘Don’t change it. We just left all the smoke and all that junk. Don’t change this.’”</p>
<p>The overall feeling was of much more than just a good job and a nice place to live: a new world was opening, based on computing. Promising young engineers could come west, buy a home and work in the future of the nation’s industry. “There’s a sense of being pioneers here,” Mark Leslie, founder of Synapse Computers, <a href="https://www.inc.com/magazine/19820901/3259.html">told a reporter</a> in 1982. “I view myself as the kind of guy who would have been living in Detroit in 1910. The future depends on high technology, and we are spearheading it.”</p>
<p>Recent college graduates and white-collar workers flocked to the valley to work at companies like Fairchild, Intel, Hewlett-Packard, International Business Machines and Lockheed. The county’s population <a href="http://www.bayareacensus.ca.gov/bayarea70.htm">more than quadrupled</a> in 30 years, from 290,547 in 1950 to 1,265,200 in 1980. But the clean, gleaming future they imagined was already being tarnished.</p>
<p><iframe id="GAWiv" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/GAWiv/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<h2>Fairchild contamination</h2>
<p>Semiconductor manufacturing involves very carefully connecting microscopic electrical components to each other on large plates of silicon. Pieces of dust can block sensitive circuits, and the smallest scratches can render everything useless. So to clean the silicon wafers and the parts joined to them, manufacturers used <a href="https://books.google.com/books?id=-L0ODAAAQBAJ&pg=PA185&dq=semiconductor+chemical+solvents+cleaning+TCE&hl=en&sa=X&ved=0ahUKEwjxra6JuP_WAhXE1IMKHUqAD6cQ6AEINzAD#v=onepage&q=semiconductor%20chemical%20solvents%20cleaning%20TCE&f=false">harsh chemical solvents</a> like <a href="https://www.epa.gov/assessing-and-managing-chemicals-under-tsca/risk-management-trichloroethylene-tce">1,1,1 trichloroethane</a>, <a href="https://www.atsdr.cdc.gov/mmg/mmg.asp?id=291&tid=53">xylene</a> and <a href="https://toxtown.nlm.nih.gov/text_version/chemicals.php?id=77">methanol</a>. These chemicals were stored on-site in containers designed to safely hold them.</p>
<p>But in December 1981, construction workers discovered a leaking chemical solvents tank at Fairchild Semiconductor’s southern San José facility. A cancer-causing chemical, TCE, had found its way into <a href="https://www.inc.com/magazine/19820901/3259.html">nearby drinking-water wells</a>. The water company promptly shut off pumping water from those wells. A month later, the San Jose Mercury broke the story of the chemical leak. TCE accumulated in wells at nearly 20 times the permissible limit established by the Environmental Protection Agency. Over the course of two years, <a href="https://nepis.epa.gov/Exe/ZyNET.exe/9100976C.txt?ZyActionD=ZyDocument&Client=EPA&Index=1986%20Thru%201990&Docs=&Query=&Time=&EndTime=&SearchMethod=1&TocRestrict=n&Toc=&TocEntry=&QField=&QFieldYear=&QFieldMonth=&QFieldDay=&UseQField=&IntQFieldOp=0&ExtQFieldOp=0&XmlQuery=&File=D%3A%5CZYFILES%5CINDEX%20DATA%5C86THRU90%5CTXT%5C00000020%5C9100976C.txt&User=ANONYMOUS&Password=anonymous&SortMethod=h%7C-&MaximumDocuments=1&FuzzyDegree=0&ImageQuality=r75g8/r75g8/x150y150g16/i425&Display=hpfr&DefSeekPage=x&SearchBack=ZyActionL&Back=ZyActionS&BackDesc=Results%20page&MaximumPages=1&ZyEntry=4">more than 60,000 gallons</a> of toxic chemicals had leaked from the tank, spreading underground more than half a mile into the surrounding neighborhood of Los Paseos.</p>
<h2>Neighbors speak up</h2>
<p>For the residents of the Los Paseos neighborhood, just across the street from Fairchild, the news of the chemical leak suddenly explained the stories of birth defects among their neighbors. <a href="https://www.inc.com/magazine/19820901/3259.html">Lorraine Ross</a>, whose daughter had her first open-heart surgery at nine months old, couldn’t help but wonder if the four birth defects, two miscarriages and one stillbirth of Los Paseos in the past two years were <a href="http://www.nytimes.com/1982/05/20/us/leaking-chemicals-in-california-s-silicon-valley-alarm-neighbors.html">connected to water contamination</a>. She organized others in the neighborhood to ask questions, eventually partnering with a young lawyer, Ted Smith, who founded a new advocacy organization, the <a href="http://svtc.org/">Silicon Valley Toxics Coalition</a>, to speak for affected neighborhoods and help draft new county and city ordinances governing the storage, transportation and disposal of chemicals and gases in Santa Clara County.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=359&fit=crop&dpr=1 600w, https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=359&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=359&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=451&fit=crop&dpr=1 754w, https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=451&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/190905/original/file-20171018-32348-9u7c5d.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=451&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Silicon Valley Toxics Coalition flyer.</span>
<span class="attribution"><a class="source" href="http://www.oac.cdlib.org/findaid/ark:/13030/kt2b69r7hf/">Folder 3, Box 11, Silicon Valley Toxics Coalition Papers, San Jose State University</a></span>
</figcaption>
</figure>
<p>News of the Fairchild leak captured the attention of the San Francisco Bay Area. The presence of these chemicals and synthetics was a revelation. “There was no doubt in my mind that this was a clean industry,” <a href="http://www.nytimes.com/1982/05/20/us/leaking-chemicals-in-california-s-silicon-valley-alarm-neighbors.html">remarked</a> San José Mayor Janet Gray Hayes. Lorraine Ross echoed this sentiment, telling a reporter that “we thought we were living with a clean industry.” But it wasn’t true.</p>
<h2>Widespread pollution</h2>
<p>Fairchild wasn’t alone in leaking pollution into the vibrant environment and thriving communities around its industrial sites. By 1992, one study found that <a href="https://nyupress.org/books/9780814767092/">57 private and 47 public drinking wells</a> were contaminated. Santa Clara County authorities determined that 65 of the 79 companies they investigated had contaminated the soil beneath their facilities. Several companies were forced to pay millions of dollars to clean up polluted sites and to install new monitoring equipment to prevent leaks from occurring again. Fairchild Semiconductor and other companies in the Los Paseos area found to have contaminated the water agreed to pay a multi-million-dollar settlement to 530 residents in southern San José.</p>
<p>The U.S. Environmental Protection Agency eventually <a href="http://dissertation.jasonheppler.org/visualizations/companies/">determined 29 polluted sites were eligible for Superfund</a> cleanup money over the course of the 1980s – 24 of which resulted from high-tech industries. Under <a href="https://www.epa.gov/superfund">Superfund</a>, polluted sites that particularly threaten wildlife or human health become eligible for federal funding to help clean up hazardous and contaminated sites. By the end of the 1980s, Santa Clara County had <a href="https://qz.com/1017181/silicon-valley-pollution-there-are-more-superfund-sites-in-santa-clara-than-any-other-us-county/">more Superfund sites</a> than any other county in the United States. <a href="https://www.epa.gov/superfund/search-superfund-sites-where-you-live">Twenty-three of the sites</a> remain in remediation today.</p>
<p>By accident and by neglect, the promise of clean industrialization proved elusive. Thousands of people migrated to the Santa Clara Valley hoping to take part in the remarkable convergence of affordable housing and new jobs. And while smokestacks were absent from electronics manufacturing, the presence of highly toxic chemicals – trichloroethane and chlorinated solvents – shattered the illusion behind the tech industry’s green image. The industry permanently altered the land and human bodies.</p><img src="https://counter.theconversation.com/content/85810/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jason A. Heppler does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Silicon Valley brought together natural surroundings, suburban homes and futuristic high-tech work. But industrial pollution betrayed the California dream.Jason A. Heppler, Digital Engagement Librarian and Assistant Professor of History, University of Nebraska OmahaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/869092017-11-14T02:42:44Z2017-11-14T02:42:44ZHow social media fires people’s passions – and builds extremist divisions<figure><img src="https://images.theconversation.com/files/194018/original/file-20171109-13337-wt1fzf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Passionate feelings can lead to extreme divisions.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/quarrel-between-woman-man-screaming-each-645677146">pathdoc/Shutterstock.com</a></span></figcaption></figure><p>The people of the United States continue to learn how polarized and divided the nation has become. In one study released in late October by the Pew Research Center, Americans were found to have <a href="http://www.pewresearch.org/fact-tank/2017/10/23/in-polarized-era-fewer-americans-hold-a-mix-of-conservative-and-liberal-views">become increasingly partisan</a> in their views. On issues as diverse as health care, immigration, race and sexuality, Americans today hold more extreme and more divergent views than they did a decade ago. The reason for this dramatic shift is a device owned by <a href="http://techlatino.org/2017/01/pew-u-s-smartphone-ownership-broadband-penetration-reached-record-levels-in-2016/">more than three out of every four Americans</a>. </p>
<figure><img src="http://assets.pewresearch.org/wp-content/uploads/sites/12/2014/06/polarization505px_30fps.gif"><figcaption><span class="caption">Americans’ political beliefs have become increasingly polarized. <a href="http://www.pewresearch.org/fact-tank/2014/06/12/7-things-to-know-about-polarization-in-america/">Pew Research Center</a></span></figcaption></figure>
<p>As social media has emerged over the last two decades, I have been studying how <a href="https://doi.org/10.1177/0276146708325382">it changes innovation</a>, and researching the effects of internet communications on consumer opinions and <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2259683">marketing</a>. I developed <a href="https://en.wikipedia.org/wiki/Netnography">netnography</a>, one of the most widely used qualitative research techniques for understanding how people behave on social media. And I have used that method to better understand a variety of challenging problems that face not only businesses but governments and society at large.</p>
<p>What I have found has shaken up some of the most firmly held ideas that marketers had about consumers – such as how <a href="https://doi.org/10.1016/S0263-2373(99)00004-3">internet interest groups</a> can drive online purchasing and the power of stories, utopian messages and moral lessons to <a href="https://doi.org/10.1509/jmkg.67.3.19.18657">connect buyers with brands</a> and each other. In one of my latest studies, my co-authors and I debunk the idea that technology might <a href="https://www.gsb.stanford.edu/insights/how-digital-age-rewrites-rule-book-consumer-behavior">make consumers more rational</a> and price-conscious. Instead, we found that smartphones and web applications were increasing people’s passions while also <a href="https://doi.org/10.1093/jcr/ucw061">driving them to polarizing extremes</a>. </p>
<h2>How social media divides people</h2>
<p>When people express themselves through social media, they communicate collectively. Rachel Ashman, Tony Patterson and I studied sharing of images of food in an intensive three-year ethnographic and netnographic study of a variety of online and physical sites. We collected and analyzed thousands of pictures, conducted 17 personal interviews and set up a dedicated research webpage where dozens of people shared their “food porn” stories. </p>
<p>Our results indicate that people share images of food for a number of reasons, including the desire to nurture others with photos of home-cooked food, to express belonging to certain interest groups like vegans or paleos, or to compete over who could make, say, the most decadent dessert. But this sharing can turn competitive, pushing participants to one-up each other with images of food that look less and less like what regular people eat every day. </p>
<p>Here is how it works. Many people start by sharing food images only with people they know well. But once they broaden out to a wider group on social media, several unexpected and startling things begin to happen. First, they find sites where they can feel comfortable expressing their opinions to a like-minded “audience.” </p>
<p>This audience creates a community-type feeling, expressing respect and belonging for certain kinds of messages and outrage or contempt for others. Communications innovators in social media communities often also create new language forms, such as the frustrated guys in men’s-rights-oriented social media forums on Reddit bringing new life to the 19th-century word “<a href="https://qz.com/1092037/the-alt-right-is-creating-its-own-dialect-heres-a-complete-guide/">hypergamy</a>,” or young people creating sophisticated emoji codes in their <a href="https://www.wired.com/2016/08/how-teens-use-social-media/">relationship texting</a>. </p>
<p>Through language and example, community members educate one another. They reinforce each others’ thinking and communication. Members of social media communities direct raw emotions into particular interests. For example, a general fear about job security might become channeled through the feedback loops on Facebook into an <a href="http://www.pe.com/2017/09/15/immigration-talk-was-often-heated-but-social-media-experiment-proves-we-can-talk-to-one-another/">interest in immigrant jobs</a> and immigration policy.</p>
<p>Those feedback loops have even more sensational effects. People use social media to communicate their need for things like money, attention, security and prestige. But once those people become a part of a social media platform, our research reveals how they start to look for wider audiences. Those audiences show their interest and approval by liking, sharing and commenting. And those mechanisms drive future social media behavior.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=750&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=750&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=750&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=942&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=942&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193619/original/file-20171107-1041-54fii7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=942&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A monstrous example of ‘food porn.’</span>
<span class="attribution"><a class="source" href="http://www.bitypic.com/media/1634833723450822346_1243351468">Priyan Shailesh Parab</a></span>
</figcaption>
</figure>
<p>In our study of food image sharing, we wondered why the most popular food porn images depicted massive hamburgers that were impossible to eat, dripping with bacon grease, gummy worms and sparklers. Or a super pizza piled with tacos, macaroni and cheese and fried chicken. The answer was that the algorithms that drive participation and attention-getting in social media, along with the addictive “gamification” features such as likes and shares, invariably favored the odd and unusual. When someone wanted to reach beyond his or her immediate social network, one of the most effective routes to mass appeal turned out to be going to extremes. </p>
<p>Taking an existing norm in the community (massive burgers, say) and expanding upon it almost guaranteed a poster a few hundred likes, a dozen supportive comments and 15 minutes of social media glory. As each user tried to top the outrageous image posted just before, the extremes of food porn ratcheted toward ever more sensational towering burgers and cakes. What was once extreme began to seem normal. And the ends separated farther from the few who remained in the middle.</p>
<h2>The extreme state of the world</h2>
<p>In our research, we suggested that the exact same mechanisms are at work in general society. As the <a href="http://www.pewresearch.org/fact-tank/2017/10/23/in-polarized-era-fewer-americans-hold-a-mix-of-conservative-and-liberal-views/">Pew research</a> revealed, American beliefs have become more partisan and more extreme. Religious beliefs are more fundamentalist. Political figures around the world are more polarized. Language is more crude. </p>
<p>Although the divided state of Americans is a bellwether for some of these unwelcome developments, the phenomenon seems to be global. A recent <a href="http://mashable.com/2017/10/24/facebook-social-media-rohingya-muslim-myanmar-fake-news/">Mashable article</a> blamed social media for fueling the horrific ethnic cleansing of the <a href="https://theconversation.com/the-history-of-the-persecution-of-myanmars-rohingya-84040">Rohingya Muslims in Myanmar</a>, a country where Facebook viewed on mobile devices has become for many people the sole source of news. Hate speech on social media has been a major and growing problem in Europe and <a href="http://www.worldpolicy.org/blog/2015/04/21/addressing-hate-speech-african-digital-media">Africa</a> for several years now. Around the world, social media is feeding strong partisan talk with attention. Moderation and a balanced approach to ideas and discourse seem to be fading away.</p>
<p>The fault for these developments lies, at least in part, in people’s consumption of technology. Even without foreign interference, our research demonstrates that social media is built for polarization and extremes. The basic engagement mechanisms of popular social media sites like Facebook drive people to think and communicate in ever more extreme ways.</p>
<p>As people experience how these technological and social changes play out online, they will have to figure out how to adapt and change their behaviors – or risk becoming increasingly divided and driven to extremes.</p><img src="https://counter.theconversation.com/content/86909/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Robert Kozinets does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The way people use social media – and the algorithms inside those systems – increases passions, and drives people to polarizing extremes.Robert Kozinets, Hufschmid Chair of Strategic Public Relations, USC Annenberg School for Communication and JournalismLicensed as Creative Commons – attribution, no derivatives.