tag:theconversation.com,2011:/us/topics/moores-law-56929/articles
Moore's Law – The Conversation
2024-02-13T12:45:09Z

tag:theconversation.com,2011:article/222958
2024-02-13T12:45:09Z
2024-02-13T12:45:09Z

China’s chip industry is gaining momentum – it could alter the global economic and security landscape

<figure><img src="https://images.theconversation.com/files/574634/original/file-20240209-20-qhpgx6.jpg?ixlib=rb-1.1.0&rect=7%2C0%2C4977%2C3337&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/cropped-image-engineer-showing-computer-microchip-151125485">Dragon Images/Shutterstock</a></span></figcaption></figure><p>China’s national champions for computer chip – or semiconductor – design and manufacturing, HiSilicon and Semiconductor Manufacturing International Corporation (SMIC), are making waves in Washington. </p>
<p>SMIC was long considered a laggard. Despite being the recipient of billions of dollars from the Chinese government since its founding in 2000, it remained far from the technological frontier. But that perception — and the self-assurance it gave the US — is changing. </p>
<p>In August 2023, Huawei launched its high-end Huawei Mate 60 smartphone. According to the Center for Strategic and International Studies (an American think tank based in Washington DC), the launch <a href="https://www.ft.com/content/327414d2-fe13-438e-9767-333cdb94c7e1">“surprised the US”</a>, as the chip powering it showed that HiSilicon’s semiconductor design and SMIC’s manufacturing capabilities were catching up at an alarming pace, a sign of growing Chinese self-sufficiency.</p>
<p>More recent news that Huawei and SMIC are scheming to mass-produce so-called 5-nanometre processor chips in <a href="https://www.ft.com/content/b5e0dba3-689f-4d0e-88f6-673ff4452977">new Shanghai production facilities</a> has only stoked further fears about leaps in their next-generation prowess. These chips remain a generation behind the current cutting-edge ones, but they show that China’s move to create more advanced chips is well on track, despite US export controls.</p>
<p>The US has long maintained a clear lead in chip design, and has ensured that the manufacturing of cutting-edge chips was supplied by close allies. But it now faces formidable competition from China, whose technological advance carries profound economic, geopolitical and security implications.</p>
<h2>Semiconductors are a big business</h2>
<p>For decades, chipmakers have sought to make ever more compact products. Smaller transistors result in lower energy consumption and faster processing speeds, so massively improve the performance of a microchip. </p>
<p>Moore’s Law — the expectation that the number of transistors on a microchip doubles every two years — has remained valid in chips designed in the Netherlands and the US, and manufactured in Korea and Taiwan. Chinese technology has therefore remained years behind. While the world’s frontier has moved to 3-nanometre chips, Huawei’s <a href="https://thediplomat.com/2023/09/what-does-huaweis-homemade-chip-really-mean-for-chinas-semiconductor-industry/">homemade chip</a> is at 7 nanometres. </p>
<p>Maintaining this distance has been important for economic and security reasons. Semiconductors are the backbone of the modern economy. They are critical to telecommunications, defence and artificial intelligence.</p>
<p>The US push for <a href="https://eastasiaforum.org/2021/05/19/geopolitics-and-the-push-for-made-in-the-usa-semiconductors/">“made in the USA”</a> semiconductors has to do with this systemic importance. Chip shortages <a href="https://www.cnbc.com/2023/07/28/how-the-world-went-from-a-semiconductor-shortage-to-a-major-glut.html">wreak havoc</a> on global production since they power so many of the products that define contemporary life. </p>
<p>Today’s military prowess even directly relies on chips. In fact, according to the <a href="https://www.csis.org/analysis/semiconductors-and-national-defense-what-are-stakes">Center for Strategic and International Studies</a>, “all major US defence systems and platforms rely on semiconductors.” </p>
<p>The prospect of relying on Chinese-made chips — and the backdoors, Trojan horses and loss of control over supply that this would entail — is unacceptable to Washington and its allies.</p>
<h2>Stifling China’s chip industry</h2>
<p>Since the 1980s, the US has helped establish and maintain a distribution of chip manufacturing that is dominated by South Korea and Taiwan. But the US has recently sought to safeguard its technological supremacy and independence by bolstering its <a href="https://www.cnbc.com/2023/10/17/how-the-chips-act-is-aiming-to-restore-a-us-lead-in-semiconductors.html">own manufacturing ability</a>.</p>
<p>Through large-scale <a href="https://www.whitehouse.gov/briefing-room/statements-releases/2022/08/09/fact-sheet-chips-and-science-act-will-lower-costs-create-jobs-strengthen-supply-chains-and-counter-china/">industrial policy</a>, billions of dollars are being poured into US chip manufacturing facilities, including a multi-billion dollar <a href="https://www.theguardian.com/business/2023/aug/28/phoenix-microchip-plant-biden-union-tsmc">plant in Arizona</a>. </p>
<figure class="align-center ">
<img alt="A large factory under construction on a clear, sunny day." src="https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=402&fit=crop&dpr=1 600w, https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=402&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=402&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=505&fit=crop&dpr=1 754w, https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=505&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/574637/original/file-20240209-16-wo3zz4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=505&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">TSMC, the world’s largest chipmaker, building an advanced semiconductor factory in the US state of Arizona.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/phoenix-arizona-march-08-2023-ongoing-2272665185">Around the World Photos/Shutterstock</a></span>
</figcaption>
</figure>
<p>The second major tack is exclusion. The Committee on Foreign Investment in the United States has subjected <a href="https://theconversation.com/whats-at-stake-in-trumps-war-on-huawei-control-of-the-global-computer-chip-industry-124079">numerous investment and acquisition deals</a> to review, ultimately even blocking some in the name of US national security. This includes the high-profile case of <a href="https://www.economist.com/business/2018/03/08/cfius-intervenes-in-broadcoms-attempt-to-buy-qualcomm">Broadcom’s attempt to buy Qualcomm</a> in 2018 due to its China links.</p>
<p>In 2023, the US government issued an <a href="https://sanctionsnews.bakermckenzie.com/us-government-issues-executive-order-restricting-us-outbound-investment-in-advanced-technologies-involving-countries-of-concern-china/">executive order</a> inhibiting the export of advanced semiconductor manufacturing equipment and technologies to China. By imposing stringent export controls, the US aims to impede China’s access to critical components. </p>
<p>The hypothesis has been that HiSilicon and SMIC would continue to stumble as they attempt self-sufficiency at the frontier. The US government has called on its friends to adopt a unified stance around restricting chip exports to China. Notably, ASML, the leading Dutch maker of chipmaking equipment, has <a href="https://www.theguardian.com/technology/2024/jan/02/asml-halts-hi-tech-chip-making-exports-to-china-reportedly-after-us-request#:%7E:text=1%20month%20old-,ASML%20halts%20hi%2Dtech%20chip%2Dmaking%20exports%20to,China%20reportedly%20after%20US%20request&text=A%20Dutch%20manufacturer%20has%20cancelled,government%2C%20it%20has%20been%20reported.">halted shipments</a> of some of its hi-tech chipmaking machines to China on account of US policy. </p>
<p>Washington has also <a href="https://economictimes.indiatimes.com/tech/technology/china-quietly-recruits-overseas-chip-talent-as-us-tightens-curbs/articleshow/103004607.cms?from=mdr">limited talent flows</a> to the Chinese semiconductor industry. The regulations to limit the movements of talent are motivated by the observation that even “godfathers” of semiconductor manufacturing in Japan, Korea and Taiwan <a href="https://eastasiaforum.org/2022/09/28/washington-shores-up-friends-in-the-semiconductor-industry/">went on to work</a> for Chinese chipmakers — taking their know-how and connections with them. </p>
<p>This, and the <a href="https://www.reddit.com/r/taiwan/comments/154x9vt/tsmc_delays_us_chip_fab_opening_says_us_talent_is/">recurring headlines</a> about the need for more semiconductor talent in the US, has fuelled the clampdown on the outflow of American talent. </p>
<p>Finally, the US government has explicitly targeted China’s national champion firms: Huawei and SMIC. It banned the sale and import of equipment from <a href="https://asia.nikkei.com/Politics/International-relations/US-China-tensions/After-Huawei-5G-chip-debut-U.S.-lawmakers-call-for-tighter-export-controls#:%7E:text=After%20the%20U.S.%20government%20put,SMIC%20has%20also%20been%20blacklisted.">Huawei in 2019</a> and has <a href="https://www.aljazeera.com/economy/2023/9/15/us-republicans-demand-full-sanctions-charges-against-chinas-huawei-smic">imposed sanctions on SMIC</a> since 2020. </p>
<h2>What’s at stake?</h2>
<p>The <a href="https://ig.ft.com/sites/business-book-award/books/2022/winner/chip-war-by-chris-miller/">“chip war”</a> is about economic and security dominance. Beijing’s ascent to the technological frontier would mean an economic boom for China and bust for the US. And it would have profound security implications.</p>
<p>Economically, China’s emergence as a major semiconductor player could disrupt existing supply chains, reshape the division of labour and distribution of human capital in the global electronics industry. From a security perspective, China’s rise poses a heightened risk of vulnerabilities in Chinese-made chips being exploited to compromise critical infrastructure or conduct cyber espionage. </p>
<p>Chinese self-sufficiency in semiconductor design and manufacturing would also undermine Taiwan’s “silicon shield”. Taiwan’s status as the <a href="https://theconversation.com/the-microchip-industry-would-implode-if-china-invaded-taiwan-and-it-would-affect-everyone-206335">leading manufacturer</a> of semiconductors has so far deterred China from using force to attack the island.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-microchip-industry-would-implode-if-china-invaded-taiwan-and-it-would-affect-everyone-206335">The microchip industry would implode if China invaded Taiwan, and it would affect everyone</a>
</strong>
</em>
</p>
<hr>
<p>China is advancing its semiconductor capabilities. The economic, geopolitical and security implications will be profound and far-reaching. Given the stakes that both superpowers face, what we can be sure about is that Washington will not easily acquiesce, nor will Beijing give up.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

China is making chip progress despite US efforts to contain its industry.
Robyn Klingler-Vidra, Associate Dean, Global Engagement | Associate Professor in Entrepreneurship and Sustainability, King's College London
Steven Hai, Affiliate Fellow, King’s Institute for Artificial Intelligence, King’s College London
Licensed as Creative Commons – attribution, no derivatives.

tag:theconversation.com,2011:article/220044
2023-12-18T16:17:12Z
2023-12-18T16:17:12Z

A new supercomputer aims to closely mimic the human brain — it could help unlock the secrets of the mind and advance AI

<figure><img src="https://images.theconversation.com/files/566252/original/file-20231218-15-hajmbj.jpg?ixlib=rb-1.1.0&rect=19%2C9%2C6470%2C3940&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/businessman-touching-digital-human-brain-cell-582507070">Sdecoret / Shutterstock</a></span></figcaption></figure><p>A supercomputer scheduled to go online in April 2024 will rival the estimated rate of operations in the human brain, <a href="https://www.westernsydney.edu.au/newscentre/news_centre/more_news_stories/world_first_supercomputer_capable_of_brain-scale_simulation_being_built_at_western_sydney_university">according to researchers in Australia</a>. The machine, called DeepSouth, is capable of performing 228 trillion operations per second. </p>
<p>It’s the world’s first supercomputer capable of simulating networks of neurons and synapses (key biological structures that make up our nervous system) at the scale of the human brain.</p>
<p>DeepSouth belongs to an approach <a href="https://www.nature.com/articles/s43588-021-00184-y">known as neuromorphic computing</a>, which aims to mimic the biological processes of the human brain. It will be run from the International Centre for Neuromorphic Systems at Western Sydney University.</p>
<p>Our brain is the most amazing computing machine we know. By distributing its
computing power to billions of small units (neurons) that interact through trillions of connections (synapses), the brain can rival the most powerful supercomputers in the world, while requiring only about the same power as a fridge light bulb.</p>
<p>Supercomputers, meanwhile, generally take up lots of space and need large amounts of electrical power to run. The world’s most powerful supercomputer, the <a href="https://www.hpe.com/uk/en/compute/hpc/cray/oak-ridge-national-laboratory.html">Hewlett Packard Enterprise Frontier</a>, can perform just over one quintillion operations per second. It covers 680 square metres (7,300 sq ft) and requires 22.7 megawatts (MW) to run. </p>
<p>Our brains can perform the same number of operations per second with just 20 watts of power, while weighing just 1.3kg-1.4kg. Among other things, neuromorphic computing aims to unlock the secrets of this amazing efficiency.</p>
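<p>A rough back-of-the-envelope calculation makes this efficiency gap concrete (a sketch using the figures above, and taking the article's claim that the operation rates are comparable at face value):</p>

```python
# Energy efficiency in operations per second per watt, using the figures above:
# Frontier performs ~1 quintillion (1e18) ops/s at 22.7 MW; the brain manages
# a comparable rate (per the article's comparison) at roughly 20 W.
frontier_ops_per_watt = 1e18 / 22.7e6   # ops/s per watt for Frontier
brain_ops_per_watt = 1e18 / 20          # ops/s per watt for the brain

ratio = brain_ops_per_watt / frontier_ops_per_watt
print(f"The brain is roughly {ratio:,.0f} times more energy-efficient")
```

<p>On these numbers the brain comes out roughly a million times more energy-efficient, which is the kind of gap neuromorphic computing hopes to close.</p>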
<h2>Transistors at the limits</h2>
<p>On June 30 1945, the mathematician and physicist <a href="https://www.ias.edu/von-neumann">John von Neumann</a> described the design of a new machine, the <a href="https://ieeexplore.ieee.org/document/194089">Electronic Discrete Variable Automatic Computer (Edvac)</a>. This effectively defined the modern electronic computer as we know it. </p>
<p>My smartphone, the laptop I am using to write this article and the most powerful supercomputer in the world all share the same fundamental structure introduced by von Neumann almost 80 years ago. <a href="https://www.sciencedirect.com/topics/computer-science/von-neumann-architecture">These all have distinct processing and memory units</a>, where data and instructions are stored in the memory and computed by a processor.</p>
<p>For decades, the number of transistors on a microchip doubled approximately every two years, <a href="https://ieeexplore.ieee.org/abstract/document/591665">an observation known as Moore’s Law</a>. This allowed us to have smaller and cheaper computers. </p>
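<p>The doubling described by Moore's Law compounds quickly, as a minimal sketch shows (the starting point, the roughly 2,300 transistors of Intel's 4004 chip in 1971, is illustrative and not taken from the article):</p>

```python
def transistors(start_count: float, start_year: int, year: int,
                period_years: float = 2.0) -> float:
    """Project a transistor count assuming a doubling every `period_years`."""
    doublings = (year - start_year) / period_years
    return start_count * 2 ** doublings

# Ten doublings over 20 years multiply the count by 2**10 = 1,024.
print(transistors(2_300, 1971, 1991))  # 2355200.0
```

<p>Twenty years of doubling turns a few thousand transistors into a few million, which is why the law's eventual end matters so much.</p>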
<p>However, transistor sizes are now approaching the atomic scale. At these tiny sizes, excessive heat generation is a problem, as is a phenomenon called quantum tunnelling, which interferes with the functioning of the transistors. <a href="https://qz.com/852770/theres-a-limit-to-how-small-we-can-make-transistors-but-the-solution-is-photonic-chips#:%7E:text=They're%20made%20of%20silicon,we%20can%20make%20a%20transistor.">This is slowing down</a> and will eventually halt transistor miniaturisation.</p>
<p>To overcome this issue, scientists are exploring new approaches to
computing, starting from the powerful computer we all have hidden in our heads, the human brain. Our brains do not work according to John von Neumann’s model of the computer. They don’t have separate computing and memory areas. </p>
<p>They instead work by connecting billions of nerve cells that communicate information in the form of electrical impulses. Information can be passed from <a href="https://qbi.uq.edu.au/brain-basics/brain/brain-physiology/action-potentials-and-synapses">one neuron to the next through a junction called a synapse</a>. The organisation of neurons and synapses in the brain is flexible, scalable and efficient. </p>
<p>So in the brain – and unlike in a computer – memory and computation are governed by the same neurons and synapses. Since the late 1980s, scientists have been studying this model with the intention of importing it to computing.</p>
<figure class="align-center ">
<img alt="Microchip." src="https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/566265/original/file-20231218-25-yjbwxy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The continuing miniaturisation of transistors on microchips is limited by the laws of physics.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/close-presentation-new-generation-microchip-gloved-691548583">Gorodenkoff / Shutterstock</a></span>
</figcaption>
</figure>
<h2>Imitation of life</h2>
<p>Neuromorphic computers are based on intricate networks of simple, elementary processors (which act like the brain’s neurons and synapses). The main advantage of this is that these machines <a href="https://www.electronicsworld.co.uk/advances-in-parallel-processing-with-neuromorphic-analogue-chip-implementations/34337/">are inherently “parallel”</a>. </p>
<p>This means that, <a href="https://www.pnas.org/doi/full/10.1073/pnas.95.3.933">as with neurons and synapses</a>, virtually all the processors in a computer can potentially be operating simultaneously, communicating in tandem.</p>
<p>In addition, because the computations performed by individual neurons and synapses are very simple compared with traditional computers, the energy consumption is orders of magnitude smaller. Although neurons are sometimes thought of as processing units, and synapses as memory units, they contribute to both processing and storage. In other words, data is already located where the computation requires it.</p>
<p>This speeds up the brain’s computing in general because there is no separation between memory and processor, which in classical (von Neumann) machines causes a slowdown. But it also avoids the need to perform a specific task of accessing data from a main memory component, as happens in conventional computing systems and consumes a considerable amount of energy. </p>
<p>The principles we have just described are the main inspiration for DeepSouth. This is not the only neuromorphic system currently active. It is worth mentioning the <a href="https://www.humanbrainproject.eu">Human Brain Project (HBP)</a>, funded under an <a href="https://ec.europa.eu/futurium/en/content/fet-flagships.html">EU initiative</a>. The HBP was operational from 2013 to 2023, and led to BrainScaleS, a machine located in Heidelberg, in Germany, that emulates the way that neurons and synapses work. </p>
<p><a href="https://www.humanbrainproject.eu/en/science-development/focus-areas/neuromorphic-computing/hardware/">BrainScaleS</a> can simulate the way that neurons “spike”, the way that an electrical impulse travels along a neuron in our brains. This would make BrainScaleS an ideal candidate to investigate the mechanics of cognitive processes and, in future, mechanisms underlying serious neurological and neurodegenerative diseases.</p>
<p>Because they are engineered to mimic actual brains, neuromorphic computers could be the beginning of a turning point. Offering sustainable and affordable computing power and allowing researchers to evaluate models of neurological systems, they are an ideal platform for a range of applications. They have the potential to both advance our understanding of the brain and offer new approaches to artificial intelligence.</p>
<p class="fine-print"><em><span>Domenico Vicinanza does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

Neuromorphic computers aim to one day replicate the amazing efficiency of the brain.
Domenico Vicinanza, Associate Professor of Intelligent Systems and Data Science, Anglia Ruskin University
Licensed as Creative Commons – attribution, no derivatives.

tag:theconversation.com,2011:article/146659
2020-10-05T12:11:28Z
2020-10-05T12:11:28Z

Neuronlike circuits bring brainlike computers a step closer

<figure><img src="https://images.theconversation.com/files/360941/original/file-20200930-16-1uqk1tg.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C5991%2C4491&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Brainlike computer chips promise powerful computers that use little energy.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/digital-brain-royalty-free-image/517254617?adppopup=true">D3Damon/E+ via Getty Images</a></span></figcaption></figure><p><em>The <a href="https://theconversation.com/us/topics/research-brief-83231">Research Brief</a> is a short take about interesting academic work.</em></p>
<h2>The big idea</h2>
<p>For the first time, my colleagues and <a href="https://scholar.google.com/citations?user=dAFE2L8AAAAJ&hl=en">I</a> have built a single electronic device that is <a href="https://doi.org/10.1038/s41586-020-2735-5">capable of copying the functions of neuron cells</a> in a brain. We then connected 20 of them together to perform a complicated calculation. This work shows that it is scientifically possible to make an advanced computer that does not rely on transistors to calculate and that uses much less electrical power than today’s data centers.</p>
<p>Our research, which I began in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that performs most of the known functions of neurons in a brain? If so, can we use it as a building block to build useful computers? </p>
<p>Neurons are very finely tuned, and so are electronic elements that emulate them. I co-authored a <a href="https://doi.org/10.1038/nmat3510">research paper</a> in 2013 that laid out in principle what needed to be done. It took my colleague <a href="https://orcid.org/0000-0002-6772-7250">Suhas Kumar</a> and others five years of careful exploration to get exactly the right material composition and structure to produce the necessary property predicted from theory. </p>
<p>Kumar then went a major step further and built a circuit with 20 of these elements connected to one another through a network of devices that can be programmed to have particular capacitances, or abilities to store electric charge. He then mapped a mathematical problem to the capacitances in the network, which allowed him to use the device to find the solution to a small version of a problem that is important in a wide range of modern analytics.</p>
<p>The simple example we used was to look at the possible mutations that have occurred in a family of viruses by comparing pieces of their genetic information.</p>
<h2>Why it matters</h2>
<p>The performance of computers is <a href="https://theconversation.com/with-silicon-pushed-to-its-limits-what-will-power-the-next-electronics-revolution-46287">rapidly reaching a limit</a> because the size of the smallest transistor in integrated circuits is now approaching 20 atoms wide. Any smaller and the physical principles that determine transistor behavior no longer apply. There is a high-stakes competition to see if someone can build a much better transistor, a method for stacking transistors or some other device that can perform the tasks that currently require thousands of transistors. </p>
<p>This quest is important because people have become used to the exponential improvement of computing capacity and efficiency of the past 40 years, and many business models and our economy have been built on this expectation. Engineers and computer scientists have now constructed machines that <a href="https://www.statista.com/statistics/638621/worldwide-data-center-storage-used-by-big-data/">collect enormous amounts of data</a>, which is the ore from which the most valuable commodity, information, is refined. The volume of that data is almost doubling every year, which is outstripping the capability of today’s computers to analyze it. </p>
<h2>What other research is being done in this field</h2>
<p>The fundamental theory of neuron function was first proposed by <a href="https://dx.doi.org/10.1113%2Fjphysiol.2012.230458">Alan Hodgkin and Andrew Huxley</a> about 70 years ago, and it is still in use today. It is very complex and difficult to simulate on a computer, and only recently has it been <a href="https://doi.org/10.1088/0957-4484/24/38/383001">reanalyzed and cast in the mathematics of modern nonlinear dynamics theory</a> by <a href="https://www.eurekalert.org/pub_releases/2020-02/s-loc022520.php">Leon Chua</a>. </p>
<p>I was inspired by this work and have spent much of the past 10 years learning the necessary math and figuring out how to build a real electronic device that works as the theory predicts. </p>
<p>There are numerous research teams around the world taking <a href="https://cacm.acm.org/magazines/2020/8/246356-neuromorphic-chips-take-shape/fulltext">different approaches</a> to building brainlike, or neuromorphic, computer chips.</p>
<h2>What’s next</h2>
<p>The technological challenge now is to scale up our proof-of-principle demonstration to something that can compete against today’s digital behemoths.</p>
<p class="fine-print"><em><span>R. Stanley Williams was previously employed by Hewlett Packard Enterprise and presently owns stock in the company. He has received research funding from Texas A&M University. He is a member of the IEEE.</span></em></p>

Artificial brains are far in the future, but computer chips that work like brains could keep computers advancing when today’s silicon transistor chips reach their limit.
R. Stanley Williams, Professor of Electrical and Computer Engineering, Texas A&M University
Licensed as Creative Commons – attribution, no derivatives.

tag:theconversation.com,2011:article/120706
2019-07-24T16:37:47Z
2019-07-24T16:37:47Z

Neven’s Law: why it might be too soon for a Moore’s Law for quantum computers

<figure><img src="https://images.theconversation.com/files/285614/original/file-20190724-110149-1ipk2c9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A dilution refrigerator used to test quantum processor prototypes.</span> <span class="attribution"><span class="source">Agnese Abrusci</span>, <span class="license">Author provided</span></span></figcaption></figure><p>A new disruptive technology is on the horizon and it promises to take computing power to unprecedented and unimaginable heights. And to predict the speed of progress of this new “<a href="https://theconversation.com/explainer-quantum-computation-and-communication-technology-7892">quantum computing</a>” technology, the director of Google’s Quantum AI Labs, <a href="https://ai.google/research/people/HartmutNeven">Hartmut Neven</a>, has <a href="https://www.quantamagazine.org/does-nevens-law-describe-quantum-computings-rise-20190618/">proposed a new rule</a> similar to Moore’s Law, which has measured the progress of computers for more than 50 years.</p>
<p>But can we trust “Neven’s Law” as a true representation of what is happening in quantum computing and, most importantly, what is to come in the future? Or is it simply too early on in the race to come up with this type of judgement?</p>
<p>Unlike conventional computers that store data as electrical signals that can have one of two states (1 or 0), <a href="https://theconversation.com/how-we-created-the-first-ever-blueprint-for-a-real-quantum-computer-72290">quantum computers</a> can use many physical systems to store data, such as electrons and photons. These can be engineered to encode information in multiple states, which enables them to do calculations exponentially faster than traditional computers.</p>
<p>Quantum computing is still in its infancy, and no one has yet built a quantum computer that can outperform conventional supercomputers. But, despite some <a href="https://theconversation.com/hype-and-cash-are-muddying-public-understanding-of-quantum-computing-82647">scepticism</a>, there is widespread excitement about how fast progress is <a href="https://theconversation.com/ibm-launches-commercial-quantum-computing-were-not-ready-for-what-comes-next-110331">now being made</a>. As such, it would be helpful to have an idea of what we can expect from quantum computers in years to come. </p>
<p><a href="https://theconversation.com/moores-law-is-50-years-old-but-will-it-continue-44511">Moore’s Law</a> describes the way that the processing power of traditional digital computers has tended to double roughly every two years, creating what we call exponential growth. Named after Intel co-founder, Gordon Moore, the law more accurately describes the rate of increase in the number of transistors that can be integrated into a silicon microchip.</p>
<p>But quantum computers are designed in a very different way, around the laws of <a href="https://theconversation.com/explainer-quantum-physics-570">quantum physics</a>, so Moore’s Law does not apply. This is where Neven’s Law comes in. It states that quantum computing power is experiencing “doubly exponential growth relative to conventional computing”.</p>
<p>Exponential growth means something grows by powers of two: 2¹ (2), 2² (4), 2³ (8), 2⁴ (16) and so on. Doubly exponential growth means something grows by powers of powers of two: 2² (4), 2⁴ (16), 2⁸ (256), 2¹⁶ (65,536) and so on. To put this into perspective, if traditional computers had seen doubly exponential growth under Moore’s Law (instead of singly exponential), we would have had today’s laptops and smartphones by 1975.</p>
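The difference between the two growth regimes is easy to check with a few lines of code (a purely illustrative calculation, not a model of any real processor):

```python
# Compare singly exponential growth (powers of two, as in Moore's Law)
# with doubly exponential growth (powers of powers of two, as in Neven's Law).
for n in range(1, 6):
    singly = 2 ** n           # 2, 4, 8, 16, 32
    doubly = 2 ** (2 ** n)    # 4, 16, 256, 65,536, 4,294,967,296
    print(f"step {n}: singly = {singly}, doubly = {doubly:,}")
```

After just five steps the doubly exponential sequence has passed four billion, while the singly exponential one has reached 32.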
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/285615/original/file-20190724-110162-ze17t7.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Quantum computers may be developing much faster than conventional ones.</span>
<span class="attribution"><span class="source">Agnese Abrusci</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>This enormously fast pace should soon lead, Neven hopes, to the so-called quantum advantage. This is a much-anticipated milestone where a relatively small quantum processor overtakes the most powerful conventional supercomputers.</p>
<p>The reason for this doubly exponential growth is based on an in-house observation. <a href="https://www.quantamagazine.org/does-nevens-law-describe-quantum-computings-rise-20190618">According to an interview with Neven,</a> Google scientists are getting better at decreasing the error rate of their quantum computer prototypes. This allows them to build more complex and more powerful systems with every iteration.</p>
<p>Neven maintains that this progress itself is exponential, much like Moore’s Law. But a quantum processor is inherently and exponentially better than a classical one of equal size. This is because it exploits a quantum effect called <a href="https://theconversation.com/einstein-vs-quantum-mechanics-and-why-hed-be-a-convert-today-27641">entanglement</a> that allows different computational tasks to be done at the same time, producing exponential speed ups.</p>
<p>So, simplistically, if quantum processors are developing at an exponential rate and they are exponentially faster than classical processors, quantum systems are developing at a doubly exponential rate in relation to their classical counterparts.</p>
<h2>A note of caution</h2>
<p>While this sounds exciting, we need to exercise some caution. For starters, Neven’s conclusion seems to be based on a handful of prototypes and progress measured over a relatively short timeframe (a year or less). So few data points could easily be made to fit many other patterns of extrapolated growth.</p>
<p>There is also a practical issue that, as quantum processors become increasingly complex and powerful, technical problems that are minor now could become much more important. For example, the presence of even modest electrical noise in a quantum system could lead to computational errors that become more and more frequent as the processor complexity grows.</p>
<p>This issue could be solved by implementing <a href="https://iopscience.iop.org/article/10.1088/0034-4885/76/7/076001/meta">error correction protocols</a>, but this would effectively mean adding lots of backup hardware to the processor that is otherwise redundant. So the computer would have to become much more complex without gaining much extra power, if any. This kind of problem could affect Neven’s prediction, but at the moment it’s just too soon to call.</p>
<p>Despite being just an empirical observation and not a fundamental law of nature, Moore’s Law foresaw the progress of conventional computing with remarkable accuracy for about 50 years. In some sense, it was more than just a prediction, as it <a href="https://www.cnet.com/news/moores-law-is-the-reason-why-your-iphone-is-so-thin-and-cheap/">stimulated the microchip industry</a> to adopt a consistent roadmap, develop regular milestones, assess investment volumes and evaluate prospective revenues. </p>
<p>If Neven’s observation proves to be as prophetic and self-fulfilling as Moore’s Law, it will certainly have ramifications well beyond the mere prediction of quantum computing performance. For one thing, at this stage, nobody knows whether quantum computers will become widely commercialised or remain the toys of specialised users. But if Neven’s Law holds true, it won’t be long until we find out.</p>
<p class="fine-print"><em><span>Alessandro Rossi receives funding from the UKRI Industrial Strategy Challenge Fund through the Measurement Fellowship Scheme at the National Physical Laboratory. He also holds a Chancellor's Fellowship at the University of Strathclyde.</span></em></p><p class="fine-print"><em><span>Fernando Gonzalez-Zalba receives funding from the European Union’s Horizon 2020 Research and Innovation Programme
under grant agreement No 688539, the Royal Society Short Industry Fellowship Programme and the Winton Programme for the Physics of Sustainability. He is a Research Fellow at Hughes Hall, Cambridge, and a Senior Research Scientist at Hitachi Cambridge Laboratory.</span></em></p>The head of Google’s Quantum AI Labs, Hartmut Neven, claims the current speed of development means a quantum computing breakthrough is near.Alessandro Rossi, Chancellor's Fellow, Department of Physics, University of Strathclyde M. Fernando Gonzalez-Zalba, Research Fellow, University of CambridgeLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1060602018-11-28T04:07:25Z2018-11-28T04:07:25ZComputing faces an energy crunch unless new technologies are found<figure><img src="https://images.theconversation.com/files/247429/original/file-20181127-130884-1qm1olz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The tools on our smartphones are enabled by a huge network of mobile phone towers, Wi-Fi networks and server farms.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>There’s little doubt the information technology revolution has improved our lives. But unless we find a new form of electronic technology that uses less energy, computing will become limited by an “energy crunch” within decades.</p>
<p>Even the most common events in our daily life – making a phone call, sending a text message or checking an email – use computing power. Some tasks, such as watching videos, require a lot of processing, and so consume a lot of energy.</p>
<p>Because of the energy required to power the massive, factory-sized data centres and networks that connect the internet, computing already <a href="https://www.sciencedirect.com/science/article/pii/S2214629618301051">consumes 5% of global electricity</a>. And that electricity load is doubling every decade.</p>
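Those two figures — a 5% share and a doubling every decade — are enough for a back-of-envelope projection of the crunch (it naively assumes total electricity generation stays flat):

```python
# Naive extrapolation of computing's share of global electricity,
# starting at 5% and doubling every decade, with generation held constant.
share, year = 0.05, 2018
while share < 1.0:
    print(f"{year}: {share:.0%}")
    share, year = share * 2, year + 10
```

On these deliberately crude assumptions the share passes one half within about four decades.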
<p>Fortunately, there are new areas of physics that offer promise for massively reduced energy use.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/bitcoins-high-energy-consumption-is-a-concern-but-it-may-be-a-price-worth-paying-106282">Bitcoin's high energy consumption is a concern – but it may be a price worth paying</a>
</strong>
</em>
</p>
<hr>
<h2>The end of Moore’s Law</h2>
<p>Humans have an insatiable demand for computing power.</p>
<p>Smartphones, for example, have become one of the most important devices of our lives. We use them to access weather forecasts, plot the best route through traffic, and watch the latest season of our favourite series. </p>
<p>And we expect our smartphones to become even more powerful in the future. We want them to translate language in real time, transport us to new locations via virtual reality, and connect us to the “Internet of Things”.</p>
<p>The computing required to make these features a reality doesn’t actually happen in our phones. Rather it’s enabled by a huge network of mobile phone towers, Wi-Fi networks and massive, factory-sized data centres known as “server farms”.</p>
<p>For the past five decades, our increasing need for computing was largely satisfied by incremental improvements in conventional, silicon-based computing technology: ever-smaller, ever-faster, ever-more efficient chips. We refer to this constant shrinking of silicon components as “Moore’s Law”.</p>
<p>Moore’s law is named after Intel co-founder Gordon Moore, who observed <a href="https://www.investopedia.com/terms/m/mooreslaw.asp">that</a>:</p>
<blockquote>
<p>the number of transistors on a chip doubles every year while the costs are halved. </p>
</blockquote>
<p>But as we hit the limits of basic physics and economics, Moore’s law is winding down. We could see the end of efficiency gains using current, silicon-based technology as soon as 2020. </p>
<p>Our growing demand for computing capacity must be met with gains in computing efficiency, otherwise the information revolution will be throttled by its own power hunger. </p>
<p>Achieving this sustainably means finding a new technology that uses less energy in computation. This is referred to as a “beyond CMOS” solution, in that it requires a radical shift from the silicon-based CMOS (complementary metal–oxide–semiconductor) technology that has been the backbone of computing for the last five decades. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/moores-law-is-50-years-old-but-will-it-continue-44511">Moore's Law is 50 years old but will it continue?</a>
</strong>
</em>
</p>
<hr>
<h2>Why does computing consume energy at all?</h2>
<p>Processing of information takes energy. When using an electronic device to watch TV, listen to music, model the weather or any other task that requires information to be processed, there are millions and millions of binary calculations going on in the background. There are zeros and ones being flipped, added, multiplied and divided at incredible speeds. </p>
<p>The fact that a microprocessor can perform these calculations billions of times a second is exactly why computers have revolutionised our lives. </p>
<p>But information processing doesn’t come for free. Physics tells us that every time we perform an operation – for example, adding two numbers together – we must pay an energy cost. </p>
<p>And the cost of doing calculations isn’t the only energy cost of running a computer. In fact, anyone who has ever used a laptop balanced on their legs will attest that most of the energy gets converted to heat. This heat comes from the resistance that electricity meets when it flows through a material. </p>
<p>It is this wasted energy due to electrical resistance that researchers are hoping to minimise. </p>
<h2>Recent advances point to solutions</h2>
<p>Running a computer will always consume some energy, but we are a long way (several orders of magnitude) away from computers that are as efficient as the laws of physics allow. Several recent advances give us hope for entirely new solutions to this problem via new materials and new concepts.</p>
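How large is that gap? The theoretical floor for an irreversible bit operation is the Landauer limit, kT ln 2. A rough comparison is sketched below; the one-picojoule-per-operation figure for present-day hardware is an order-of-magnitude assumption chosen for illustration, not a measured value:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300                     # room temperature, K

landauer = k_B * T * math.log(2)   # minimum energy to erase one bit, ~2.9e-21 J
assumed_energy_per_op = 1e-12      # assumed ~1 pJ per operation for today's chips

print(f"Landauer limit at 300 K: {landauer:.2e} J")
print(f"Gap to assumed hardware cost: {assumed_energy_per_op / landauer:.0e}x")
```

On that assumption, today's electronics sit some eight to nine orders of magnitude above the fundamental limit — the "long way" mentioned above.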
<h3>Very thin materials</h3>
<p>One recent step forward in physics and materials science is being able to build and control materials that are only one or a few atoms thick. When a material forms such a thin layer, and the movement of electrons is confined to this sheet, it is possible for electricity to flow without resistance.</p>
<p>There are a range of different materials that show this property (or might show it). Our research at the ARC Centre for Future Low-Energy Electronics Technologies (<a href="http://www.fleet.org.au/">FLEET</a>) is focused on studying these materials. </p>
<h3>The study of shapes</h3>
<p>There is also an exciting conceptual leap that helps us understand this property of electricity flow without resistance. </p>
<p>This idea comes from a branch of mathematics called “topology”. Topology tells us how to compare shapes: what makes them the same and what makes them different. </p>
<p>Imagine a coffee cup made from soft clay. You could slowly squish and squeeze this shape until it looks like a donut. The hole in the handle of the cup becomes the hole in the donut, and the rest of the cup gets squished to form part of the donut. </p>
<p>Topology tells us that donuts and coffee cups are equivalent because we can deform one into the other without cutting it, poking holes in it, or joining pieces together.</p>
<p>It turns out that the strange rules that govern how electricity flows in thin layers can be understood in terms of topology. This insight was the focus of the <a href="https://www.nobelprize.org/prizes/physics/2016/press-release/">2016 Nobel Prize</a>, and it’s driving an enormous amount of current research in physics and engineering.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/LV4uFHqdio4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/physicists-explore-exotic-states-of-matter-inspired-by-nobel-winning-research-66543">Physicists explore exotic states of matter inspired by Nobel-winning research</a>
</strong>
</em>
</p>
<hr>
<p>We want to take advantage of these new materials and insights to develop the next generation of low-energy electronics devices, which will be based on topological science to allow electricity to flow with minimal resistance. </p>
<p>This work creates the possibility of a sustainable continuation of the IT revolution – without the huge energy cost.</p>
<p class="fine-print"><em><span>Jared Cole receives funding for his research from the Australian Research Council. </span></em></p><p class="fine-print"><em><span>Daisy Wang does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The energy required to power the massive, factory-sized data centres that computers rely on already consumes 5% of global electricity. And that energy load is doubling every decade.Daisy Wang, Postdoctoral Fellow, UNSW School of Physics, UNSW SydneyJared Cole, Professor of Physics, RMIT UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1000652018-07-17T14:09:43Z2018-07-17T14:09:43ZHappy 50th birthday Intel, you look a lot like the next Kodak<figure><img src="https://images.theconversation.com/files/228014/original/file-20180717-44088-1jd61wp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Chipped china?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/thedailyexposition/24646411287/in/photolist-DxVipD-5r7ZNB-KViyiE-6qg5GA-KgHu5S-KE19j9-dHGje7-9Ggp8b-7oie2b-iZdjeg-JLpgep-amZMq-9wQ6MM-eN14aW-iZe7r7-eegsDQ-aBDMuC-9ckAnv-6qg67s-8eHhPe-9D7M4y-bpwqhm-ygGej-6qbUwV-otaixG-5z3CoG-7yVudg-qKUSfG-4Wk6Kd-e81y6d-5yYjT4-6qbVmD-dKci48-oy6xAs-9GgyxA-4MoVB5-dKhN7N-7xbpVv-auVkQQ-e7UTJe-jTXHK-7FUuuR-5vooDy-e7UT9D-bjDbDH-27YYsG9-86DXrQ-bjDauR-yBe6U-5wuyjs">The Daily Exposition</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><blockquote>
<p>I am easily a foot taller than Andy Grove. But whenever I was with him, I felt that he was the giant.</p>
</blockquote>
<p>That’s what the bestselling Harvard business professor, Clayton Christensen, <a href="https://hbr.org/2016/03/clayton-christensen-what-ill-miss-about-andy-grove">wrote</a> about the former Intel chief executive when he passed away in 2016. Christensen, who <a href="http://www.claytonchristensen.com/books/the-innovators-dilemma/">coined the term</a> “disruptive technology”, said he would most miss Grove’s ability to understand how a complex organisation works, and to wield it to Intel’s advantage. </p>
<p>It allowed Grove, who started at the company the day it was incorporated on July 18, 1968, to famously re-orient the business in the 1980s. Intel shifted <a href="https://anthonysmoak.com/2016/03/27/andy-grove-and-intels-move-from-memory-to-microprocessors/">away from</a> memory chips for mainframe computers towards the microprocessor – the engine that spurs into motion when you turn on your computer. </p>
<p>Propelled by a deal with IBM to put Intel processors into all its personal computers, the company came to provide Silicon Valley with one of its most essential technologies. Intel Inside and the accompanying jingle became one of the most memorable advertising slogans of the modern era. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/aEDpqFHTSVM?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Even after five decades of dominance, no other company in the world can produce a better and faster microprocessor. Intel is at the pinnacle of an industry that manages to engineer miracles like no other. We tend to perceive innovation as something uncertain, particularly where it’s so reliant on scientists to drive it forward. Yet Intel has been anything but unpredictable. It has released successive advances in processor engineering like clockwork. </p>
<p>In 1965, future co-founder Gordon Moore <a href="https://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html">made a bold prediction</a> about the exponential growth of computing power. He predicted that the number of microchip transistors etched into a fixed area of a computer microprocessor would double every two years – and so, therefore, would computing power. Intel has since delivered on this improbable promise, immortalising “Moore’s law”. </p>
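The scale of that promise becomes vivid if you compound it from Intel's first microprocessor, the 4004 of 1971, which had roughly 2,300 transistors (an idealised projection; real product lines deviate from the trend year to year):

```python
# Idealised Moore's-Law projection from the Intel 4004 (1971, ~2,300 transistors).
start_year, start_count = 1971, 2300
for year in (1971, 1981, 1991, 2001, 2011, 2019):
    doublings = (year - start_year) // 2   # one doubling every two years
    print(f"{year}: ~{start_count * 2 ** doublings:,} transistors")
```

By the late 2010s the projection reaches tens of billions of transistors — the same ballpark as the largest chips actually shipped.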
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=785&fit=crop&dpr=1 600w, https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=785&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=785&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=986&fit=crop&dpr=1 754w, https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=986&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/228011/original/file-20180717-44079-1ss2z7v.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=986&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Intel’s Andy Grove, Robert Noyce and Gordon Moore, 1978.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/intelfreepress/8267616249">Intel Free Press</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>It’s difficult for anyone to fathom the effects of exponential growth. But it is why a single iPhone today <a href="https://www.zmescience.com/research/technology/smartphone-power-compared-to-apollo-432/">possesses</a> many times more computing power than the entire spacecraft for the NASA Apollo moon mission of 1969. Without Moore’s law, there would be no Google, no Facebook, no Uber, no Airbnb. Silicon Valley would be like any other valley.</p>
<h2>The big miss</h2>
<p>And yet, the iPhone is also what Intel missed. Immediately after <a href="https://www.cultofmac.com/431760/today-in-apple-history-steve-jobs-announces-intel-powered-macs/">the company won</a> Apple’s Mac business in 2005, Steve Jobs <a href="https://www.theatlantic.com/technology/archive/2013/05/paul-otellinis-intel-can-the-company-that-built-the-future-survive-it/275825/">came asking</a> for another chip for his smartphone. Intel certainly wanted to dominate this emerging sector but the price Jobs was offering was below its forecasted cost and it misjudged the size of the iPhone market. The company passed. </p>
<p>Apple had <a href="https://appleinsider.com/articles/15/01/19/how-intel-lost-the-mobile-chip-business-to-apples-ax-arm-application-processors">no choice but</a> to build its own chipsets by licensing technologies from <a href="https://www.arm.com">ARM</a>, a British-based company controlled by Japanese interests. If Apple and its iPhone had been the only competitors, Intel might have been able to gradually adapt. But Google came in soon after with Android, a free operating system that Samsung, Huawei and HTC all adopted. Qualcomm, Nvidia, and Texas Instruments, all licensed by ARM, became the phone makers’ go-to suppliers for energy-efficient, low-cost computing devices. </p>
<p>These American rivals are not trying to beat Intel. Qualcomm specialises in mobile phones and Nvidia specialises in graphics in video games. They all outsource production to third parties in Asia. But an Intel microprocessor sells for around US$100 while ARM-based chips sell for around US$10, and often less than a dollar. That is why ARM-based designs are now found in more than 95% of the world’s smartphones. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=693&fit=crop&dpr=1 600w, https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=693&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=693&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=870&fit=crop&dpr=1 754w, https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=870&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/228028/original/file-20180717-44088-1l0qwr3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=870&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Harvard’s Clayton Christensen.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Clayton_Christensen_World_Economic_Forum_2013.jpg#/media/File:Clayton_Christensen_World_Economic_Forum_2013.jpg">Wikimedia</a></span>
</figcaption>
</figure>
<p>In other words, Intel failed to compete in smartphones against rivals with far fewer resources. It’s a great irony when you reflect that Grove once invited Christensen to the Intel HQ in Santa Clara, California, to explain his theory on disruption. Grove later <a href="https://www.bostonglobe.com/business/2015/10/24/clay-christensen-explains-defends-disruptive-innovation/fmYOKIJXOSPPMquj8HQM1O/story.html">credited</a> the meeting as the main driver for Intel’s decision to launch the Celeron chip in 1998, a cheap product aimed at low-end PCs, which within a year captured 35% of the market. </p>
<h2>The new goldrush</h2>
<p>Now the big question is whether Intel is repeating its previous mistake with iPhones – this time in driverless cars. Last March it <a href="https://www.nytimes.com/2017/03/13/business/dealbook/intel-mobileye-autonomous-cars-israel.html?ref=business">purchased</a> Mobileye, an Israeli company that makes digital vision technology, for US$15.3 billion. It was a big bet in a sector that has huge potential: as autonomous driving takes off, vehicles are becoming computers on wheels. They will require more and more microchips and Intel hopes to dominate. </p>
<p>Except for one glitch. Everything Intel has done in the last 50 years is geared towards general purpose, high-end chipsets. Its integrated model – where the company designs and manufactures its processors – means it absorbs an enormous amount of fixed cost, in research and design as well as manufacturing. </p>
<p>The <a href="https://newsroom.intel.com/editorials/krzanich-ai-day/">only way</a> to offset these burdens is to sell a high volume of devices at high margins. The result is that the company is obsessed with technological progress, but has a rigid business model which limits what it can and cannot do. There’s a monster inside Intel with a ferocious appetite. </p>
<p>But what if autonomous driving doesn’t actually require the computing power Intel is counting on? This is the competing vision of Huawei. When I recently visited Shenzhen, executives from the Chinese telecom giant explained to me that much of the city’s infrastructure will be digitalised and that Huawei will saturate it with <a href="https://theconversation.com/what-is-5g-the-next-generation-of-wireless-explained-96165">a 5G network</a>. This will drastically reduce any speed and latency problems for computers. </p>
<p>This means the computing inside cars can be mostly offloaded to the city’s infrastructure. It is a radical vision, but clearly a viable alternative. The implication is that a BMW or Toyota doesn’t need that many high-end chipsets after all. It’s smartphones all over again. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=632&fit=crop&dpr=1 600w, https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=632&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=632&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=795&fit=crop&dpr=1 754w, https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=795&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/228001/original/file-20180717-44082-1ye7e0r.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=795&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The future once.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/launceston-australiafebruary-2-2012-old-kodak-483403420?src=gO4387mUsXCGBcoqiimWhw-1-37">Steve Lovegrove</a></span>
</figcaption>
</figure>
<p>Christensen’s insight was that successful companies do not die of complacency. <a href="https://www.forbes.com/sites/chunkamui/2012/01/18/how-kodak-failed/">Kodak</a>, <a href="https://insights.som.yale.edu/insights/what-was-polaroid-thinking">Polaroid</a>, <a href="https://www.forbes.com/sites/gregsatell/2014/09/05/a-look-back-at-why-blockbuster-really-failed-and-why-it-didnt-have-to/#1c3b6cfb1d64">Blockbuster</a> and <a href="https://www.forbes.com/2001/01/19/0915malone.html#53dbd1631f37">DEC</a> all understood the shifting landscape. </p>
<p>But in each case, their business model and the demands of existing shareholders formed an intractable nexus that even the most courageous executives found impossible to navigate. Grove once said, “only the paranoid survive”. Maybe he was right.</p>
<p class="fine-print"><em><span>Howard Yu does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Silicon Valley’s chip supplier de choix scored a massive own goal with smartphones. If it has got driverless cars wrong too, it could be goodnight Santa Clara.Howard Yu, Professor of Management and Innovation, International Institute for Management Development (IMD)Licensed as Creative Commons – attribution, no derivatives.