<h1>Nanophotonics – The Conversation</h1>
<h1>Seeing the invisible: tiny crystal films could make night vision an everyday reality</h1>
<p class="fine-print"><em>Published 2021-06-16</em></p>
<figure><img src="https://images.theconversation.com/files/406613/original/file-20210616-3738-1hg0363.png?ixlib=rb-1.1.0&rect=8%2C2%2C902%2C444&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Artist's impression of the view through future night-vision glasses.</span> <span class="attribution"><span class="source">Lei Xu / NTU</span>, <span class="license">Author provided</span></span></figcaption></figure><p>It’s a familiar vision to anyone who has watched a lot of action movies or played Call of Duty: a ghostly green image that makes invisible objects visible. Since the development of the first night-vision devices in the mid-1960s, the technology has captured the popular imagination.</p>
<p>Night vision goggles, infrared cameras and other similar devices detect infrared light that is either reflected from objects or emitted by them in the form of heat. Today these devices are widely used not only by the military, but also by law enforcement and emergency services, the security and surveillance industries, wildlife hunters, and camping enthusiasts.</p>
<p>But current technology is not without its problems. Commercial infrared cameras block visible light, disrupting normal vision. The gear is bulky and heavy, and requires low temperatures — and, in some cases, even cryogenic cooling — to work.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/406606/original/file-20210616-15-1skecj7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Rocio Camacho Morales in the optics lab.</span>
<span class="attribution"><span class="source">Jamie Kidston / ANU</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>We have proposed a new technology that uses ultra-thin layers of nanocrystals to make infrared light visible, addressing many of the longstanding problems with current devices. Our research is published in <a href="https://doi.org/10.1117/1.AP.3.3.036002">Advanced Photonics</a>. </p>
<p>Our eventual goal is to produce a light, film-like layer that can sit on glasses or other lenses, powered by a tiny built-in laser, allowing people to see in the dark.</p>
<h2>Conventional infrared detection</h2>
<p>Commercial infrared cameras convert infrared light to an electric signal, which is then shown on a display screen. They require low temperatures because infrared light carries so little energy that detectors must be cooled to suppress thermal noise. This makes conventional infrared detectors bulky and heavy – some security personnel have reported chronic neck injury due to <a href="https://doi.org/10.3357/AMHP.4027.2015">regular use of night vision goggles</a>.</p>
<p>Another drawback of the current technology is that it blocks the transmission of visible light, thereby disrupting normal vision. In some cases, infrared images can instead be sent to a separate display monitor, leaving normal vision intact. However, this solution is not practical when users are on the move.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/looking-at-the-universe-through-very-different-eyes-86068">Looking at the universe through very different 'eyes'</a>
</strong>
</em>
</p>
<hr>
<h2>All-optical alternatives</h2>
<p>There are also some all-optical <a href="https://doi.org/10.1063/1.1651902">alternatives</a>, which do not involve electrical signals. Instead, they directly convert infrared light into visible light. The visible light can then be captured by the eye or a camera.</p>
<p>These technologies work by combining incoming infrared light with a strong light source – a laser beam – inside a material known as a “nonlinear crystal”. The crystal then emits light in the visible spectrum.</p>
<p>However, nonlinear crystals are bulky and expensive, and can only detect light in a narrow band of infrared frequencies.</p>
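The conversion step described above is known as sum-frequency generation: the crystal emits photons whose frequency is the sum of the infrared and laser frequencies. A minimal sketch of the arithmetic, using illustrative wavelengths rather than values from any specific device:

```python
# Sum-frequency generation: f_out = f_ir + f_pump,
# or equivalently 1/lambda_out = 1/lambda_ir + 1/lambda_pump.
# The wavelengths below are illustrative assumptions.

def sum_frequency_wavelength(lambda_ir_nm, lambda_pump_nm):
    """Return the output wavelength (nm) when an infrared photon
    and a pump-laser photon combine in a nonlinear material."""
    return 1.0 / (1.0 / lambda_ir_nm + 1.0 / lambda_pump_nm)

# Example: 1,550 nm infrared light mixed with an 800 nm pump laser
out = sum_frequency_wavelength(1550, 800)
print(round(out))  # ~528 nm: green, visible light
```

This is why up-converted night-vision images appear green: adding the two frequencies pushes the output into the visible band.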
<h2>Metasurfaces provide the solution</h2>
<p>Our work advances this all-optical approach. Instead of a nonlinear crystal, we set out to use carefully designed layers of nanocrystals called “metasurfaces”. Metasurfaces are ultra-thin and ultra-light, and can be engineered to manipulate the colour, or frequency, of the light that passes through them.</p>
<p>This makes metasurfaces an attractive platform to convert infrared photons to the visible. Importantly, transparent metasurfaces could enable infrared imaging and allow for normal vision at the same time.</p>
<p>Our group set out to demonstrate infrared imaging with metasurfaces. We designed a metasurface composed of hundreds of incredibly tiny crystal antennas made of the semiconductor gallium arsenide. </p>
<p>This metasurface was designed to resonantly amplify light at certain infrared frequencies, as well as at the frequency of the laser and of the visible light output. We then fabricated the metasurface and transferred it onto transparent glass, forming a layer of nanocrystals on the glass surface.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=416&fit=crop&dpr=1 600w, https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=416&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=416&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=523&fit=crop&dpr=1 754w, https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=523&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/406071/original/file-20210614-23-1cjuozk.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=523&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A scanning electron microscope image shows the nanocrystal structures of the metasurface used to make infrared light visible.</span>
<span class="attribution"><span class="source">Mohsen Rahmani/ NTU</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>To test our metasurface, we illuminated it with infrared images of a target and saw that the infrared images were converted to visible green images. We tested this with various positions of the target, and also with no target at all — so we could see the green emission of the metasurface itself. In the images obtained, the dark stripes correspond to the infrared target, surrounded by the green visible emission.</p>
<p>Even though different parts of each infrared image were up-converted by separate, independent nanocrystals within the metasurface, the images were reproduced faithfully in visible light.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=367&fit=crop&dpr=1 600w, https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=367&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=367&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=461&fit=crop&dpr=1 754w, https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=461&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/406072/original/file-20210614-25-1ngrt1e.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=461&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">These pairs of images show the shape of the infrared target at left and the visible-light view through the metasurface at right.</span>
<span class="attribution"><span class="source">Rocio Camacho Morales</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>While our experiment is only a proof of concept, this technology can in principle do things that are not possible with conventional systems, such as offering a wider viewing angle and multi-colour infrared imaging.</p>
<h2>The future of metasurfaces in novel technologies</h2>
<p>The demand for detecting infrared light, invisible to human eyes, is constantly growing, due to a wide variety of applications beyond night vision. The technology could be used in the agricultural industry to help monitor and maintain food quality control, and in remote sensing techniques such as LIDAR – a technology that is helping to map natural and man-made environments.</p>
<p>In a wider context, the use of metasurfaces to detect, generate and manipulate light is booming. Harnessing the power of metasurfaces will bring us closer to technologies such as real-time holographic displays, artificial vision for autonomous systems, and ultra-fast light-based wifi. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/small-and-bright-what-nanophotonics-means-for-you-58747">Small and bright: what nanophotonics means for you</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Rocio Camacho Morales would like to acknowledge the support of the ARC Centre of Excellence for Transformative Meta-Optical Systems (TMOS) and the Consejo Nacional de Ciencia y Tecnología (CONACYT).</span></em></p>
<p class="fine-print"><em>New ‘nanocrystal metasurfaces’ can convert infrared light into the visible spectrum. Rocio Camacho Morales, Postdoctoral fellow, ARC Centre of Excellence for Transformative Meta-Optical Systems (TMOS), Australian National University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>How did insects get their colours? Crystal-covered beetle discovery sheds light</h1>
<p class="fine-print"><em>Published 2020-04-16</em></p>
<figure><img src="https://images.theconversation.com/files/328419/original/file-20200416-192698-1282qkf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Artist's impression of the weevil.</span> <span class="attribution"><span class="source">University College Cork</span></span></figcaption></figure><p>The natural world is full of colour, and few groups of animals are as colourful as insects. From the dramatic black and yellow stripes of wasps and striking spots of ladybirds to the dazzling metallic sheen of jewel beetles, insects show a kaleidoscopic array of hues, patterns and optical effects.</p>
<p>But exactly why insects are so colourful isn’t always clear. How and when did insects evolve colours, and have their roles always been the same? We recently discovered some spectacularly preserved blue-green colours in the scales of 13,000-year-old fossilised weevil beetles. Our find, <a href="https://royalsocietypublishing.org/doi/10.1098/rsbl.2020.0063">published in Biology Letters</a>, sheds light on the evolution of the most complex colour-producing structures known in insects: 3D biophotonic crystals.</p>
<p>Until now, we had only ever found one example of such preserved crystals in a fossil. Our new specimen supports the idea that 3D colour-producing structures may have evolved as a means of camouflage rather than to attract attention. But more importantly, the discovery indicates that these fossils may be much more common than we previously thought. This opens up greater potential for us to learn far more about the evolution of these “structural colours”, and the biophotonic crystals that produce them.</p>
<p>These futuristic-sounding structures are part of a family of materials that often have a regular, self-repeating architecture at a nanoscopic level. <a href="https://www.nature.com/articles/nature01941">Such structures</a> are often able to scatter specific wavelengths of light, producing so-called structural colours that have particular optical properties. We encounter these every day: the rainbow sheen on a DVD, the swirling colours of a soap bubble, and the fire-like flash in a crystal of labradorite or opal.</p>
<p>The structural colours produced by biological nanostructures are the brightest and most intense in nature. Classic examples include the dazzling blue flash of a <a href="https://royalsocietypublishing.org/doi/abs/10.1098/rspb.2002.2019?casa_token=cG8wI56dg9UAAAAA:_HCj8AAEShnB4i_Syn7AL3dxRaZ6J0Rr2koCfMxR2-RTFCyic3TPpR9IjM9eVfrFwn8LbZio_cKudg"><em>Morpho</em> butterfly’s wing</a> and the golden mirror-like reflection from a <a href="https://royalsocietypublishing.org/doi/full/10.1098/rsif.2017.0129"><em>Chrysina</em> beetle</a>, both produced by microscopic layers in the insects’ tissues. </p>
<p>These structures produce flashy colours that you can only see from a narrow range of viewing angles and that change depending on viewing angle (a phenomenon known as iridescence). These optical effects, and the associated vivid colours, happen to be very useful for startling predators and attracting mates.</p>
<p>But there are also <a href="https://www.nature.com/articles/nature01941">3D biophotonic crystal structures</a> that can manipulate light in all directions. In insects, these are found only in the scales of weevils, longhorn beetles, butterflies and moths, where they can form intricate arrays of chitin (the material that makes up the bulk of the exoskeleton of insects) and air.</p>
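As a rough rule of thumb, the colour that a periodic chitin-and-air structure like this reflects follows a Bragg-like condition: the peak wavelength is about twice the lattice spacing multiplied by the structure's effective refractive index. A sketch with illustrative numbers (the spacing and fill fraction are assumptions, not measurements from these fossils; chitin's refractive index of about 1.56 is a textbook value):

```python
# Bragg-like estimate of the wavelength reflected by a periodic
# nanostructure: lambda ~ 2 * n_eff * d, where d is the lattice
# spacing and n_eff the effective refractive index.
# The spacing and fill fraction below are illustrative assumptions.

def reflected_wavelength_nm(spacing_nm, n_chitin=1.56, n_air=1.0, fill=0.5):
    """Estimate the peak reflected wavelength for a chitin/air lattice."""
    n_eff = fill * n_chitin + (1 - fill) * n_air  # simple volume average
    return 2 * n_eff * spacing_nm

print(round(reflected_wavelength_nm(200)))  # ~512 nm: green light
```

A spacing of a couple of hundred nanometres lands the reflection in the green, which is consistent with the leaf-matching colours discussed below.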
<p>Studying where these structures have appeared throughout evolutionary history could help us understand why they appeared in the first place. The problem is that the fossil record of 3D biophotonic crystals is virtually non-existent. There is only one known example, <a href="https://royalsocietypublishing.org/doi/10.1098/rsif.2014.0736">a 735,000-year-old fossilised beetle</a> found by one of us (Maria) in 2014 in rock made from layers of sediment deposited by a glacier in Canada. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=235&fit=crop&dpr=1 600w, https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=235&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=235&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=295&fit=crop&dpr=1 754w, https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=295&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/328300/original/file-20200416-140735-1n50keu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=295&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The crystals were identified with a microscope and electron microscope.</span>
<span class="attribution"><span class="source">University College Cork</span></span>
</figcaption>
</figure>
<p>Our new discovery provides another fossilised example of 3D biophotonic crystals from a different type of location, suggesting their preservation is probably more widespread than previously thought. The specimens are 13,000-year-old weevils found by our colleague <a href="https://theconversation.com/profiles/scott-armstrong-elias-118184">Scott Elias</a> (formerly of Royal Holloway University), in sediments from the ancient lake of Lobsigensee in Switzerland. </p>
<p>The newly-discovered insects appear rather underwhelming, preserved as small brown fragments of wing cases. But at high magnification, the scales’ colours are astonishing: vivid greens, blues and hints of yellow. We examined the scales using powerful electron microscopes, which confirmed the presence of ordered nanostructured arrays. Preservation of this level of tissue nanostructure is mind-boggling, even for us hardened professionals.</p>
<p>With our microscope studies, we had good evidence that the structures were 3D biophotonic crystals, but to prove it required structural diagnoses and optical modelling. This was done by our colleague Vinod Saranathan, who examined the scales using X-ray analysis at the Argonne particle accelerator near Chicago. Saranathan’s work confirmed that the fossil scales contain a single diamond photonic crystal nanostructure. And so the brilliant green, yellow and blue colours are indeed fossilised structural colours.</p>
<p>However, the colours from the individual microscopic crystals appear to mix at a visible level, suppressing iridescence and producing an overall greenish colour. The result is a matt rather than a shiny colour, unlike that of most insects with 3D nanostructures. This suggests the weevil’s crystals evolved as a form of camouflage, matching it to its leafy background habitat. </p>
<h2>Where are the other fossils?</h2>
<p>But if the fossilisation of 3D biophotonic crystals is more common than we thought, why haven’t we found more specimens? Maria’s <a href="https://pubs.geoscienceworld.org/gsa/geology/article-abstract/41/4/487/131196/The-fossil-record-of-insect-color-illuminated-by?redirectedFrom=fulltext">previous research</a> confirmed the crystals should survive the rigours of decay and burial during fossilisation. Instead, the poor fossil record of these structures probably reflects the fact that the scales likely fall off after death. </p>
<p>What’s more, scales bearing structural colours are usually less than 100 microns across, effectively invisible to the naked eye. So it’s likely that many other examples of fossilised 3D crystals have actually been overlooked, due to the small size of insect scales.</p>
<p>What now? Clearly we need to search deeper in time for more examples. Good targets include fossils from the Cenozoic Era (from 66 million years ago to today) that preserve <a href="https://royalsocietypublishing.org/doi/10.1098/rspb.2011.1677">other types of structural colour</a>, and insects hosted in amber, which can preserve scales with <a href="https://advances.sciencemag.org/content/4/4/e1700988">evidence of colour</a>. </p>
<p>Most useful of all would be studies of the earliest weevils, from the Late Jurassic and Early Cretaceous periods (163-100 million years ago). These would allow us to test whether the evolution of 3D biophotonic crystals was linked with the proliferation of flowering plants that took place at this time. Close examination of the insect fossil record will likely reveal many more examples, helping us understand the environmental and ecological factors driving the evolution of these incredibly complex tissue structures and their functions.</p>
<p class="fine-print"><em><span>Maria McNamara receives funding from the European Research Council via Starting Grant ERC-2014-StG-637691-ANICOLEVO.</span></em></p><p class="fine-print"><em><span>Luke McDonald is supported by European Research Council Starter Grant ERC-2014-StG-637691-ANICOLEVO awarded to Maria McNamara.</span></em></p>
<p class="fine-print"><em>Researchers realised a dull-looking 13,000-year-old weevil was actually covered in brilliant green, blue and yellow nanoscopic crystals. Maria McNamara, Senior Lecturer in Geology, University College Cork; Luke McDonald, Postdoctoral Researcher, School of Biological, Earth and Environmental Sciences, University College Cork. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>The future of electronics is light</h1>
<p class="fine-print"><em>Published 2016-11-29</em></p>
<figure><img src="https://images.theconversation.com/files/147248/original/image-20161123-19717-hrc3xx.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A basic design of a light-based chip.</span> <span class="attribution"><span class="source">Arnab Hazari</span>, <span class="license">Author provided</span></span></figcaption></figure><p>For the past four decades, the electronics industry has been driven by what is called “<a href="http://www.mooreslaw.org/">Moore’s Law</a>,” which is not a law so much as an observation. Effectively, it suggests that electronic devices double in speed and capability about every two years. And indeed, every year tech companies come up with new, faster, smarter and better gadgets.</p>
<p>Specifically, Moore’s Law, as articulated by Intel cofounder Gordon Moore, is that “The number of transistors incorporated in a chip will <a href="https://www-ssl.intel.com/content/www/us/en/history/museum-gordon-moore-law.html">approximately double every 24 months</a>.” Transistors, tiny electrical switches, are the fundamental unit that drives all the electronic gadgets we can think of. As they get smaller, they also <a href="http://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html">get faster and consume less electricity</a> to operate.</p>
<p>In the technology world, one of the biggest questions of the 21st century is: How small can we make transistors? If there is a limit to how tiny they can get, we might reach a point at which we can no longer continue to make smaller, more powerful, more efficient devices. It’s an industry with <a href="https://www.statista.com/statistics/272115/revenue-growth-ce-industry/">more than US$200 billion</a> in annual revenue in the U.S. alone. Might it stop growing?</p>
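Moore's observation is simply compound doubling, which is easy to state as a formula. A toy illustration (the starting count of one billion transistors is an arbitrary assumption):

```python
# Moore's Law as a compounding rule: the transistor count doubles
# roughly every 24 months.

def transistor_count(start_count, years_elapsed, doubling_period_years=2):
    """Project a transistor count forward under regular doubling."""
    return start_count * 2 ** (years_elapsed / doubling_period_years)

# Example: starting from 1 billion transistors, project 10 years out.
# Ten years is five doubling periods, so growth is 2^5 = 32x.
print(transistor_count(1e9, 10))
```

The same exponential logic is why even a modest limit on transistor size would end the trend: each doubling requires packing twice as many devices into the same area.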
<h2>Getting close to the limit</h2>
<p>At present, companies like Intel are mass-producing transistors <a href="https://www-ssl.intel.com/content/www/us/en/silicon-innovations/intel-14nm-technology.html">14 nanometers across</a> – only about seven times the width of a <a href="https://dx.doi.org/10.1016/0022-2836(81)90099-1">DNA molecule</a>, which is roughly 2 nanometers. They’re made of silicon, the <a href="http://hyperphysics.phy-astr.gsu.edu/hbase/Tables/elabund.html">second-most abundant element</a> in the Earth’s crust. Silicon’s atomic size is <a href="http://www.extremetech.com/computing/97469-is-14nm-the-end-of-the-road-for-silicon-lithography">about 0.2 nanometers</a>.</p>
<p>Today’s transistors are about 70 silicon atoms wide, so the possibility of making them even smaller is itself shrinking. We’re getting very close to the limit of how small we can make a transistor.</p>
<p>At present, transistors use electrical signals – electrons moving from one place to another – to communicate. But if we could use light, made up of photons, instead of electricity, we could make transistors even faster. My work, on finding ways to integrate light-based processing with existing chips, is part of that nascent effort.</p>
<h2>Putting light inside a chip</h2>
<p>A <a href="https://reibot.org/2011/09/06/a-beginners-guide-to-the-mosfet/">transistor has three parts</a>; think of them as parts of a digital camera. First, information comes into the lens, analogous to a transistor’s source. Then it travels through a channel from the image sensor to the wires inside the camera. And lastly, the information is stored on the camera’s memory card, which is called a transistor’s “drain” – where the information ultimately ends up.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=550&fit=crop&dpr=1 600w, https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=550&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=550&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=691&fit=crop&dpr=1 754w, https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=691&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/147791/original/image-20161128-22729-tc0olq.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=691&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Light waves can have different frequencies.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:VisibleEmrWavelengths.svg">maxhurtz</a></span>
</figcaption>
</figure>
<p>Right now, all of that happens by moving electrons around. To substitute light as the medium, we actually need to move photons instead. Subatomic particles like electrons and photons travel in a wave motion, vibrating up and down even as they move in one direction. The length of each wave depends on what it’s traveling through. </p>
<p>In silicon, the most efficient wavelength for photons is <a href="http://www.its.bldrdoc.gov/fs-1037/dir-040/_5927.htm">1.3 micrometers</a>. This is very small – a human hair is <a href="http://www.nano.gov/nanotech-101/what/nano-size">around 100 micrometers across</a>. But <a href="http://homepages.rpi.edu/%7Esawyes/Models_review.pdf">electrons in silicon</a> are even smaller – with wavelengths <a href="http://hyperphysics.phy-astr.gsu.edu/hbase/quantum/debrog2.html">50 to 1,000 times shorter</a> than photons.</p>
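This size mismatch follows from the de Broglie relation, wavelength = h/p: an electron's momentum gives it a far shorter wavelength than a photon of comparable energy. A rough sketch of the comparison (the 1 eV electron energy is an illustrative assumption):

```python
import math

# de Broglie wavelength of an electron: lambda = h / sqrt(2 * m * E)
H = 6.626e-34    # Planck's constant, J*s
M_E = 9.109e-31  # electron rest mass, kg
EV = 1.602e-19   # joules per electron-volt

def electron_wavelength_nm(kinetic_energy_ev):
    """Non-relativistic de Broglie wavelength, in nanometres."""
    p = math.sqrt(2 * M_E * kinetic_energy_ev * EV)  # momentum, kg*m/s
    return H / p * 1e9

photon_nm = 1300.0                         # 1.3 micrometres, from the article
electron_nm = electron_wavelength_nm(1.0)  # a 1 eV electron (illustrative)
print(round(photon_nm / electron_nm))      # photon wavelength ~1,000x longer
```

A 1 eV electron has a wavelength of about 1.2 nanometres, roughly a thousandth of the photon's, which is the upper end of the 50-to-1,000-times range quoted above.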
<p>This means the equipment to handle photons needs to be bigger than the electron-handling devices we have today. So it might seem like it would force us to build larger transistors, rather than smaller ones.</p>
<p>However, for two reasons, we could keep chips the same size and deliver more processing power, shrink chips while providing the same power, or potentially do both. First, a <a href="http://www.nature.com/lsa/focus/circuits/index.html">photonic chip</a> needs only a few light sources, generating photons that can then be directed around the chip with very small lenses and mirrors.</p>
<p>And second, light is much faster than electrons. On average photons can travel about <a href="http://education.jlab.org/qa/electron_01.html">20 times faster</a> than electrons in a chip. That means computers that are 20 times faster, a speed increase that would take about 15 years to achieve with current technology.</p>
<p>Scientists have demonstrated <a href="http://www.nature.com/lsa/focus/circuits/index.html">progress toward photonic chips</a> in recent years. A key challenge is making sure the new light-based chips can work with all the existing electronic chips. If we’re able to figure out how to do it – or even to use light-based transistors to enhance electronic ones – we could see significant performance improvement.</p>
<h2>When can I get a light-based laptop or smartphone?</h2>
<p>We still have some way to go before the first consumer device reaches the market, and progress takes time. The transistor’s predecessor, the vacuum tube, arrived in 1907; tubes were <a href="http://www.edisontechcenter.org/VacuumTubes.html">typically between one and six inches tall</a> (about 100 mm on average). The transistor itself – the device that’s now just 14 nanometers across – was invented in 1947, and it was <a href="https://en.wikipedia.org/wiki/History_of_the_transistor#The_first_transistor">40 micrometers long</a> (about 3,000 times longer than the current one). And in 1971 the first commercial microprocessor (the powerhouse of any electronic gadget) was <a href="https://en.wikipedia.org/wiki/Intel_4004">1,000 times bigger</a> than today’s when it was released.</p>
<p>The vast research effort, and the resulting evolution, seen in the electronics industry is only beginning in photonics. As a result, current electronics can perform tasks that are far more complex than the best current photonic devices. But as research proceeds, light-based devices will catch up to, and ultimately surpass, the speed of their electronic counterparts. However long it takes to get there, the future of photonics is bright.</p>
<p class="fine-print"><em><span>Arnab Hazari's research group receives funding from the National Science Foundation, under the MRSEC program.</span></em></p>
<p class="fine-print"><em>As electronic transistors get tinier, they approach a point at which they won’t be able to get smaller. How can we keep shrinking our devices, and making them more powerful at the same time? Light. Arnab Hazari, Ph.D. student in Electrical Engineering, University of Michigan. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Small and bright: what nanophotonics means for you</h1>
<p class="fine-print"><em>Published 2016-05-03</em></p>
<figure><img src="https://images.theconversation.com/files/120943/original/image-20160503-19535-16aqe0o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Nanophotonics uses photons to do amazing things.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The year 2015 was UNESCO’s <a href="http://www.light2015.org/Home.html">International Year of Light and Light-based Technologies</a>. It was a celebration of past milestones in optics and photonics, and a look forward to the field’s future.</p>
<p>We celebrated 1,000 years of <a href="http://www.light2015.org/Home/ScienceStories/1000-Years-of-Arabic-Optics.html">Arabic optics</a>, 150 years since <a href="https://en.wikipedia.org/wiki/Maxwell%27s_equations">James Clerk Maxwell’s electrodynamics</a>, 100 years since Albert Einstein’s <a href="https://theconversation.com/au/topics/general-relativity">general relativity</a> and 50 years since the invention of optical fibres. This year we celebrate 100 years since the birth of <a href="https://www.nyu.edu/pages/linguistics/courses/v610003/shan.html">Claude Shannon</a>, the founder of information theory. </p>
<p>Optics began with the development of lenses by the ancient Egyptians and Mesopotamians, followed by theories on light and vision developed by ancient Greek philosophers. </p>
<p>The basic principles of optics are familiar: we wear glasses that rely on refraction to bend light in ways that magnify and sharpen images, use microscopes to see into microscopic worlds and telescopes to look to the stars.</p>
<p>We are probably less familiar with photonics. <a href="https://theconversation.com/au/topics/photonics">Photonics</a> deals with the generation, detection and manipulation of photons, the building blocks of light. The field sprang from the invention of the laser and <a href="https://theconversation.com/au/topics/fiber-optics">fibre optics</a> in the 1960s. </p>
<p>Optical fibres are silica glass wires the size of a human hair that transmit vast amounts of laser-generated information, forming the backbone of today’s internet. </p>
<p>The smartphone also exemplifies the importance of photonics: we use lasers to machine the casing; optics are used in the lithography that manufactures the microelectronic circuits; and the display and the network that connects the phones are both photonics based. </p>
<p>The next milestone will be when photonics is integrated into the smartphone itself.</p>
<h2>Dawn of nanophotonics</h2>
<p>The 21st century will be the century of photonics and nanotechnology – nanophotonics – which deals with the behaviour of light on the nanometre scale, and with the interaction of nanometre-scale objects with light.</p>
<p>The nanoscale is usually taken to mean 1–100 nanometres; a nanometre is a billionth of a metre. In photonics, we deal with light waves whose wavelength is around a micron (a thousand nanometres). </p>
<p>However, these light waves interact with matter on the nanometre scale, and so the structures that manipulate the light must be engineered at that scale too.</p>
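To get a feel for these scales, here is a back-of-the-envelope calculation. The numbers (telecom wavelength, silicon's refractive index) are standard illustrative values, not figures from the article:

```python
# Scale check: telecom light has a free-space wavelength of about 1550 nm,
# but inside a silicon waveguide (refractive index ~3.48) it is compressed
# to roughly 445 nm -- so the guiding structures really must be shaped
# with nanometre precision.
C = 299_792_458  # speed of light in vacuum, m/s

wavelength_vacuum = 1550e-9            # m, standard telecom wavelength
n_silicon = 3.48                       # approximate refractive index of Si at 1550 nm
frequency = C / wavelength_vacuum      # about 193 THz
wavelength_in_si = wavelength_vacuum / n_silicon  # about 445 nm

print(f"optical frequency: {frequency / 1e12:.0f} THz")
print(f"wavelength in silicon: {wavelength_in_si * 1e9:.0f} nm")
```

A wavelength of a micron in free space thus shrinks to a few hundred nanometres on the chip, which is exactly the regime where nanofabricated structures take over.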
<p>At the University of Sydney we have been creating a new optical processing technology based on nanophotonics. This research is being undertaken by the <a href="http://www.cudos.org.au/">CUDOS ARC Centre of Excellence</a>, which is headquartered in the School of Physics and the <a href="http://sydney.edu.au/nano/hub/index.shtml">Sydney Nanoscience Hub</a> at the University of Sydney with nodes at ANU, RMIT University, Macquarie University, Monash University, Swinburne University and UTS.</p>
<p>At CUDOS we want to take the next step in the evolution of this technology. We want to build a truly photonic chip that will essentially put the entire optical network on to a chip the size of your thumbnail. </p>
<p>By doing this, we can leverage the massive semiconductor industry to harness the processing power of light on a length scale that can be mass produced and integrated into smart devices.</p>
<p>Fortunately silicon – which is the basis of microelectronics – is compatible with photonics. Most silicon chips today, such as the one in your computer and smartphone, use electrons to transmit information and perform computations. The trick has been getting these chips to work with light as well as electrons. </p>
<p>We now can build photonic circuits into the same silicon, although we are not talking about replacing the transistors in conventional chips with optical transistors. Photonics complements and interfaces with electronics.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/53FbwBYrPXI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">How nanophotonics can combine with microprocessors.</span></figcaption>
</figure>
<p>Photonic chips, or photonic integrated circuits (PICs), represent a new paradigm in information processing. Over the past decade, CUDOS and other researchers around the world have created PICs for a range of applications spanning communications, computing, defence and security, medicine and sensing. </p>
<p>In communication systems, photonic chips can increase the capacity of our communications networks. In data centres, they are reducing energy consumption – which matters, because every Google search today consumes roughly the energy required to boil a cup of water. </p>
<p>In defence, photonic chips can enhance the radar technology that helps protect our assets and personnel. And in health, we can reduce the scale and complexity of the medical devices used to diagnose disease.</p>
<p>Another benefit is in “switching”, which is central to all communications networks. At the new Sydney Nanoscience Hub, we are building nanoscale switching technologies that can switch at the speed of light, thousands of times faster than current switching technology.</p>
<p>We are using state-of-the-art lithography, such as the tools in the Nanoscience Hub’s clean room, to fabricate nanoscale circuits and structures. Lithography literally means printing, but in this context we are printing circuits on silicon wafers with nanometre-scale features. </p>
<h2>Bright future</h2>
<p>So what’s next? We need to transform PICs into active devices that sense and interact, analyse, respond to and manipulate their environment. </p>
<p>We are already building photonic spectroscopy techniques into the same silicon chip that performs electronic processing in your smartphone. This could enable your smartphone to perform tasks such as medical diagnosis – analysing blood or saliva, for example – or sensing pollutants in the environment. </p>
<p>But photonics is not well suited to some of these tasks.</p>
<p>So we need a way to manipulate the microscopic world: mechanical actuation at the nanoscale. Yet we would really prefer a chip with no moving parts. </p>
<p>Our approach is to use sound waves generated on the chip itself. These are not the traditional sound waves that we hear, or use in ultrasound, but ultrahigh-frequency sound waves. We refer to them as “phonons” – particles of sound, just as photons are particles of light. </p>
<p>Specifically, we are talking about hypersound: phonons with frequencies from 100 megahertz to tens of gigahertz. We are building a completely new chip whose photonic circuit also guides these hypersound phonons. </p>
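A quick calculation shows why these frequencies are the interesting ones. Acoustic waves in solids travel at only a few kilometres per second (the 2,500 m/s figure below is an assumed, representative value for the soft glasses used in such chips), so gigahertz phonons have wavelengths comparable to the guided optical wavelength – which is what lets light and hypersound interact on the same circuit:

```python
# Acoustic wavelength = sound velocity / frequency.
# At ~10 GHz the phonon wavelength drops to a few hundred nanometres,
# comparable to the wavelength of guided light on the chip.
v_sound = 2500.0  # m/s, assumed acoustic velocity in the chip material

for f in (100e6, 1e9, 10e9):  # 100 MHz to 10 GHz, the hypersound range
    wavelength = v_sound / f
    print(f"{f / 1e9:g} GHz phonon -> wavelength {wavelength * 1e9:.0f} nm")
```

At 100 MHz the acoustic wavelength is tens of microns; by 10 GHz it is a few hundred nanometres, matching the optical scale discussed above.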
<p>Harnessing hypersound on a chip enables the manipulation of microscale biological and chemical samples: we can mix, sort and select them, and even create a centrifuge on a chip. This is a laboratory-on-a-chip that can be integrated into the smartphone.</p>
<p>This opens up yet another avenue for information processing. The speed of sound is about 100,000 times slower than the speed of light, so by coupling information from a light wave into hypersound we can store that information briefly on the chip.</p>
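That factor of roughly 100,000 is what makes on-chip storage plausible. A minimal sketch of the arithmetic, again using an assumed acoustic velocity of 2,500 m/s:

```python
# The same 10 ns of information that stretches over metres as light
# fits into tens of micrometres as hypersound, so it can be held
# on a millimetre-scale chip while it is processed.
C = 299_792_458      # speed of light in vacuum, m/s
v_sound = 2500.0     # m/s, assumed on-chip acoustic velocity
pulse = 10e-9        # 10 ns worth of data

light_extent = C * pulse        # spatial extent as light: ~3 m
sound_extent = v_sound * pulse  # spatial extent as sound: ~25 um

print(f"as light: {light_extent:.2f} m")
print(f"as sound: {sound_extent * 1e6:.0f} um")
print(f"compression factor: {light_extent / sound_extent:,.0f}x")
```

The compression factor comes out near 120,000 with these numbers – the same order as the "about 100,000 times slower" figure quoted above.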
<p>The phonon frequencies coincide with the radio frequencies used in next-generation mobile communications and radar, which allows us to process these microwave signals via the interaction between optical and acoustic waves.</p>
<p>Australia has always punched well above its weight in photonics research and commercialisation. We now have the nanoscience and nanotechnology infrastructure and capacity to take the next big step: bringing photonics onto the chip, where it will transform our lives.</p>
<p class="fine-print"><em><span>Ben Eggleton receives funding from the Australian Research Council, the NSW Department of Trade and Investment and the US Air Force Office of Research. He is on the Council of the Australian Optical Society and the Board of Governors of the IEEE Photonics Society.</span></em></p>
<p><em>Nanophotonics deals with photons at the nanometre scale, and it’s set to transform everything from internet speeds to turning your smartphone into a portable science lab.</em></p>
<p>Benjamin J. Eggleton, Professor; ARC Laureate Fellow; Director, ARC Centre of Excellence for Ultrahigh bandwidth Devices for Optical Systems, University of Sydney. Licensed as Creative Commons – attribution, no derivatives.</p>