tag:theconversation.com,2011:/au/topics/imaging-technologies-44353/articles
Imaging technologies – The Conversation
2018-05-18T10:41:43Z
tag:theconversation.com,2011:article/94282
2018-05-18T10:41:43Z
2018-05-18T10:41:43Z
75 years of instant photos, thanks to inventor Edwin Land’s Polaroid camera
<figure><img src="https://images.theconversation.com/files/219447/original/file-20180517-26274-1f6mmvc.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2618%2C2070&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Edwin Land, on the left, invented and commercialized a number of technologies, most of which centered on light.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-OH-USA-APHS150797-Polaroid-Land-Camera/155ca24494f748d3aae778e1db3f8755/2/0">AP Photo</a></span></figcaption></figure><p>It probably happens every minute of the day: A little girl demands to see the photo her parent has just taken of her. Today, thanks to smartphones and other digital cameras, we can see snapshots immediately, whether we want to or not. But in 1943 when <a href="https://www.acs.org/content/acs/en/education/whatischemistry/landmarks/land-instant-photography.html">3-year-old Jennifer Land</a> asked to see the family vacation photo that her dad had just taken, the <a href="https://www.library.hbs.edu/hc/polaroid/instant-photography/the-idea-of-instant-photography/">technology didn’t exist</a>. So her dad, <a href="https://www2.rowland.harvard.edu/book/export/html/16141">Edwin Land, went to work inventing it</a>.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroid camera faces the viewer" src="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=884&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=884&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=884&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1111&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1111&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1111&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The original Polaroid camera freed users from needing to trek to a darkroom to develop their images.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/cNomGxIq6MI">Lindsay Moe/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Three years later, after plenty of scientific development, Land and his Polaroid Corp. realized the miracle of nearly instant imaging. The film exposure and processing hardware are contained within the camera; there’s no muss or fuss for the photographer, who just points and shoots, then watches the image materialize on the print as it spools out of the camera. Land demonstrated his new technology publicly for the first time on <a href="https://mobile.twitter.com/OpticaWorldwide/status/1098613395765501955">Feb. 21, 1947, at a meeting</a> of the Optical Society of America.</p>
<p>Land is probably best known for the “instant photo” – the spiritual progenitor of today’s <a href="http://www.dailymail.co.uk/sciencetech/article-3619679/What-vain-bunch-really-24-billion-selfies-uploaded-Google-year.html">ubiquitous selfie</a>. His Polaroid camera was first released commercially in 1948 at retail locations and prices aimed at the postwar middle class. But this is just one of a host of technological breakthroughs Land invented and commercialized, most of which centered on light and how it interacts with materials. The technology used to show a 3D movie and the goggles we wear in the theater were made possible by Land and his colleagues. The camera aboard the U-2 spy plane, as featured in the movie “<a href="https://www.imdb.com/title/tt3682448/">Bridge of Spies</a>,” was a Land product, as were even some aspects of the plane’s mechanics. He also worked on theoretical problems, drawing on a deep understanding of both chemistry and physics.</p>
<p><a href="https://scholar.google.com/citations?user=8hzH2SoAAAAJ&hl=en&oi=ao">I’m a vision scientist</a> who has touched many of the fields in which Land made great advances, through my own work on new imaging methods, image processing techniques and human color vision. As the 2018 recipient of the <a href="https://www.osa.org/en-us/awards_and_grants/awards/award_description/edwinland/">Edwin H. Land Medal</a>, awarded by the Optical Society of America and the <a href="https://www.optica.org//en-us/about/newsroom/news_releases/2018/the_optical_society_and_society_for_imaging_scienc/">Society for Imaging Science and Technology</a>, I build directly on Land’s innovations, which made modern imaging possible.</p>
<h2>Controlling light’s properties</h2>
<p>Edwin Land had his first optics breakthrough as a young man, when he figured out a convenient and affordable method to control one of the fundamental properties of light: polarization.</p>
<p>You can think of light as waves propagating from a source. Most light sources produce a mixture of waves with different physical properties, such as wavelength and amplitude of vibration. Light is polarized if its waves vibrate in a consistent orientation perpendicular to the direction the wave is traveling.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="diagram of only vertical lightwaves passing through filter" src="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=280&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=280&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=280&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=352&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=352&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=352&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A polarizing filter can block all the light waves that don’t match its orientation.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/ko/image-vector/polarization-light-waves-421267105">Fouad A. Saad/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Given the right material to pass through, light waves may be rotated into another plane, slowed down or blocked. Modern 3D goggles work because one eye receives light waves vibrating along the horizontal plane while the other eye receives light vibrating along the vertical plane. </p>
<p>Before Land, researchers built components to control polarization from rock crystals, which were assigned almost magical names and properties, though they merely decreased the velocity or amplitude of light waves traveling at specific orientations. Land created “polarizers” by growing small crystals and embedding them in plastic sheets, altering the light passing through depending on its orientation relative to the rows of crystals. His inexpensive polarizer made it possible to reliably and practically filter light so that only waves with a particular orientation would pass through.</p>
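<p>The behaviour of such an ideal polarizer is captured by Malus’s law: the transmitted intensity is I = I₀ cos²θ, where θ is the angle between the incoming light’s polarization and the filter’s axis. A minimal numpy sketch (the textbook relationship, not anything from Land’s own work) illustrates it:</p>

```python
import numpy as np

# Malus's law: an ideal polarizer transmits I = I0 * cos^2(theta), where
# theta is the angle between the incoming polarization and the filter axis.
def transmitted_intensity(i0, theta_rad):
    return i0 * np.cos(theta_rad) ** 2

# Sweep the filter from aligned (0 deg) to crossed (90 deg).
for angle_deg in (0.0, 45.0, 90.0):
    i = transmitted_intensity(1.0, np.radians(angle_deg))
    print(f"{angle_deg:5.1f} deg -> transmitted fraction {i:.3f}")
```

<p>At 0° the filter passes everything, at 45° half the intensity, and at 90° (“crossed” polarizers) essentially nothing – the principle behind sunglasses built from stacked polarizing sheets.</p>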
<p>Land founded the Polaroid Corp. in 1937 to commercialize his new technology. His sheet polarizers found applications ranging from the identification of chemical compounds to adjustable sunglasses. Polarizing filters became standard in photography to reduce glare. Today the principles of polarized light are used in most computer and cellphone screens to enhance contrast, decrease glare and even turn on or off individual pixels.</p>
<p><a href="https://doi.org/10.1167/iovs.03-0124">Polarizing filters help researchers visualize structures</a> that might not be seen otherwise – from astronomical features to biological structures. In my own field of vision science, polarization imaging localizes classes of chemicals, such as <a href="https://doi.org/10.1364/JOSAA.24.001468">protein molecules leaking from blood vessels</a> in diseased eyes. Polarization is also combined with high-resolution imaging techniques to detect <a href="https://doi.org/10.1038/s41598-017-03529-8">cellular damage</a> beneath the reflective retinal surface. </p>
<h2>A new way to get the data out</h2>
<p>Before the days of high-speed digital data capture, affordable high-resolution displays or videotape, Polaroid photography was the method of choice for obtaining output in many scientific labs. Experiments and medical tests needed graphical or pictorial output for interpretation, often from an analog oscilloscope that plotted a voltage or current change over time. The oscilloscope was fast enough to capture key features of the data – but recording the output for later analysis was a challenge before Land’s instant camera came along.</p>
<p>A common example in vision science is the recording of eye movements. A research study reported in 1960 plotted light reflected from an observer’s moving eye on an oscilloscope screen, which was photographed with a <a href="https://doi.org/10.1364/JOSA.50.000245">mounted Polaroid camera</a> – not unlike the consumer Polaroid camera a family might pull out at a birthday party. For decades, research labs and medical facilities used <a href="https://www.ebay.com/p/Tektronix-C-5c-Oscilloscope-Camera-for-Polaroid-Film-B054450/1437576020">setups consisting of a Polaroid camera and a mounting rig</a> to collect electrical signals displayed on oscilloscope screens. The format sizes were less than dazzling compared to modern digital resolutions, but they were revolutionary at the time.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=599&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=599&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=599&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=753&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=753&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=753&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Land’s inventions led to the widespread use of polarized light to characterize tissues and objects, as in this pseudo-color image of a diabetic patient’s retina that unmasks irregular structures caused by edema.</span>
<span class="attribution"><span class="source">Ann Elsner</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>When I founded my retinal imaging laboratory in 1987, there was no inexpensive way to produce shareable output of our <a href="https://doi.org/10.1016/0042-6989(95)00100-E">novel images</a>. After a few years of struggling to obtain high-quality output for conferences and publications, the Polaroid Corp. came to our rescue by donating a printer, allowing our scientific contributions to reach an audience beyond our lab.</p>
<h2>Eyes are not cameras</h2>
<p>Land’s contributions go beyond patenting over 500 innovations and inventing products that millions purchased. His understanding of the interaction of light and matter promoted novel ways of characterizing chemicals with polarized light. And he provided insights into the workings of the human visual system that had seemed to defy the laws of physics, coming up with what he called the <a href="https://pdfs.semanticscholar.org/8b2a/d82ce40117417fa36ba16941ce022f2185f3.pdf">Retinex theory</a> of color vision to explain how people perceive a broad range of color <a href="https://doi.org/10.1364/JOSAA.3.000916">without the expected wavelengths</a> being present in the room.</p>
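<p>Retinex-style processing lives on in image-processing algorithms. The sketch below is a minimal single-scale retinex – my own simplified illustration in the spirit of the theory, not Land’s published formulation: subtracting a log-blurred copy of the image discounts slowly varying illumination while preserving the sharper variations that come from surface reflectance.</p>

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur, numpy only (zero padding at the borders)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

def single_scale_retinex(img, sigma=5.0):
    """log(image) - log(blurred image): discounts slowly varying illumination."""
    img = img + 1e-6                     # avoid log(0)
    return np.log(img) - np.log(gaussian_blur(img, sigma))

# A uniform white surface lit by a left-to-right illumination ramp:
illumination = np.tile(np.linspace(0.2, 1.0, 64), (64, 1))
observed = np.ones((64, 64)) * illumination
flattened = single_scale_retinex(observed)
```

<p>Run on this scene – uniform reflectance under a lighting gradient – the output is nearly flat away from the borders: the illumination ramp has been discounted, loosely analogous to what human observers do when they judge surface color.</p>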
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroids clipped to a string agains brick wall" src="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Quick prints can be shared and displayed.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/hillaryandanna/760585681">Hillary Hartley</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Despite his brilliance, Land’s Polaroid Corp. eventually hit hard times in the decades after his death in 1991. Heavily invested in its film sales, Polaroid wasn’t prepared as all tiers of the imaging market went digital, with everyone from consumer photographers to high-end medical and optical imagers abandoning film and processing.</p>
<p>But rather than sink with the film market, Polaroid reinvented itself with new products that could help output the new world of digital images. And in a case of history repeating itself, <a href="https://us.polaroid.com/collections/instant-cameras">Polaroid</a> and other manufacturers of instant cameras are enjoying renewed popularity with younger generations who had no exposure to the original versions. Just like little Jennifer Land, plenty of people today still want a tangible version of their pictures, right now.</p>
<p><em>This is an updated version of an article originally published on May 18, 2018. It corrects the year Jennifer Land inspired her father’s invention.</em></p>
<p class="fine-print"><em><span>Ann Elsner receives funding from NIDILRR and NIH. She owns shares in Aeon Imaging, LLC.</span></em></p>
Whether at a family gathering or in a research lab, getting access to images immediately was a game-changer. And Land’s innovations went far beyond the instant photo.
Ann Elsner, Professor of Optometry, Indiana University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/93628
2018-03-22T10:36:04Z
2018-03-22T10:36:04Z
The dinosaur that got away: how we diagnosed a 200-million-year-old infected predator bite
<figure><img src="https://images.theconversation.com/files/211201/original/file-20180320-31602-nc28x0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Reconstruction of the bite wound affecting the shoulder of our herbivorous dinosaur.</span> <span class="attribution"><span class="source">Zongda Zhang/Lida Xing</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><blockquote>
<p>Nature, red in tooth and claw. </p>
</blockquote>
<p>When Tennyson published his poem <a href="http://www.online-literature.com/tennyson/718/">In Memoriam</a>, little did he know that this phrase from it would become so intimately associated with the process of Darwinian natural selection. Five little words which evoke the harsh evolutionary realities of competition for food, resources and life itself between predator and prey, the hunter and the hunted. </p>
<p>Now my colleagues and I, led by Lida Xing from the China University of Geosciences (Beijing), have <a href="http://www.nature.com/articles/s41598-018-23451-x">published evidence</a> of one lucky animal that got away – in this case, a herbivorous dinosaur from China. Our work highlights how the use of X-ray tomography – a rapidly developing technique in digital imaging – is revolutionising the study of the fossil record.</p>
<p>Our dinosaur is <em>Lufengosaurus huenei</em>, a Lower Jurassic sauropod that lived 200-170m years ago in what is now Yunnan Province, China. <a href="http://www.prehistoric-wildlife.com/species/l/lufengosaurus.html"><em>Lufengosaurus</em></a> was a herbivore, around six metres in length and weighing a little under two tonnes. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=342&fit=crop&dpr=1 600w, https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=342&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=342&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=430&fit=crop&dpr=1 754w, https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=430&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/211351/original/file-20180321-165568-1tywk1n.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=430&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Map showing the location of the dinosaur fossil discovery.</span>
<span class="attribution"><span class="source">Lida Xing</span></span>
</figcaption>
</figure>
<p>When the dinosaur was excavated in 1997, researchers noted a pathological abnormality on one of the animal’s right ribs. Viewed from the side, a concave section of missing bone cuts almost halfway through the rib. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1138&fit=crop&dpr=1 600w, https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1138&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1138&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1430&fit=crop&dpr=1 754w, https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1430&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/211346/original/file-20180321-165568-m5eei6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1430&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The pathological rib of Lufengosaurus, showing the removal of a large area of bone.</span>
<span class="attribution"><span class="source">Lida Xing</span></span>
</figcaption>
</figure>
<p>The traditional approach to studying bone pathology is what is termed “morphoscopic evaluation”. This usually involves low-powered magnification of the bone, which only images the external surface of the fossil. In the case of our rib, the lesion penetrated deep into the bone, so a diagnosis required seeing the internal structure. </p>
<p>Now, 20 years after its initial discovery, we have used <a href="https://www.microphotonics.com/how-does-a-microct-scanner-work/">X-ray micro-computed tomography</a>, or micro-CT for short, to image the deep structures of our dinosaur.</p>
<h2>Seeing inside fossils</h2>
<p>Tomography (from the Greek <em>tomos</em> to slice, and <em>graphos</em> to write) is a non-invasive technique that has significant diagnostic advantages over conventional methods, allowing high-resolution slices and 3D images to be built up of internal structures without damaging the fossil.</p>
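<p>The classic reconstruction recipe for parallel-beam tomography is filtered back-projection: each projection is sharpened with a ramp filter in the frequency domain, then smeared back across the image along the angle at which it was acquired. The numpy sketch below is a toy illustration of that idea, with nearest-neighbour rotation – not the software used in our study:</p>

```python
import numpy as np

def rotate_nn(img, theta):
    """Rotate a square image about its centre by theta radians (nearest neighbour)."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    # For each output pixel, find the source coordinate in the unrotated image.
    x = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta) + c
    y = -(xs - c) * np.sin(theta) + (ys - c) * np.cos(theta) + c
    out = img[np.clip(np.round(y).astype(int), 0, n - 1),
              np.clip(np.round(x).astype(int), 0, n - 1)]
    out[(x < 0) | (x > n - 1) | (y < 0) | (y > n - 1)] = 0.0  # outside the frame
    return out

def radon(img, thetas):
    """Parallel-beam sinogram: for each angle, sum the rotated image along columns."""
    return np.array([rotate_nn(img, t).sum(axis=0) for t in thetas])

def filtered_back_projection(sino, thetas, n):
    """Ramp-filter each projection in Fourier space, then smear it back."""
    ramp = np.abs(np.fft.fftfreq(n))     # |f| filter compensates the 1/r blurring
    recon = np.zeros((n, n))
    for proj, t in zip(sino, thetas):
        filtered = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
        # Smear the filtered projection across the image at the inverse angle.
        recon += rotate_nn(np.tile(filtered, (n, 1)), -t)
    return recon * np.pi / (2 * len(thetas))
```

<p>Reconstructing a simple disc phantom from 90 such projections recovers its shape well; real micro-CT software layers cone-beam geometry, proper interpolation and artefact corrections on top of this core idea.</p>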
<p>Following micro-CT scanning, we reconstructed the cellular structure of the rib. In cross-section, there was clear evidence of both destructive changes and new bone formation which could not be observed from the outside. The pattern of these bone-destroying and bone-forming processes tells us that the disease process was both chronic (long-term) and active at the time of the animal’s death.</p>
<p>We diagnosed a process called osteomyelitis, which in this case had produced an abscess inside the bone. Osteomyelitis is a severe infection originating in the bone marrow, usually resulting from the introduction of pyogenic (pus-producing) bacteria into the bone. Pathogens enter the bone via the bloodstream, or through open wounds or fractures.</p>
<p>This is only the second case of osteomyelitis to be found in a sauropod dinosaur in the fossil record. The only other case comes from a <a href="https://www.researchgate.net/publication/308797156_The_first_evidence_of_osteomyelitis_in_a_sauropod_dinosaur">giant titanosaur from Argentina</a> who had a bacterial infection of the spine. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=610&fit=crop&dpr=1 600w, https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=610&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=610&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=766&fit=crop&dpr=1 754w, https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=766&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/211352/original/file-20180321-165577-1dduk9t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=766&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Micro-computed tomography allowed us to produce surface renderings of the fossil in 3D (top row) and 2D X-ray slices through the rib (bottom row). These show areas of cellular reorganisation, bone destruction and bone formation indicative of osteomyelitis.</span>
<span class="attribution"><span class="source">Patrick Randolph-Quinney, UCLan</span></span>
</figcaption>
</figure>
<h2>Tooth and claw</h2>
<p>In this <em>Lufengosaurus</em> we also have the earliest recorded case of a bony abscess caused by osteomyelitis in the fossil record. </p>
<p>Given the shape of the lesion, and its position on the ribcage, we think that the infection may have been caused by a puncture wound from a bite. The teardrop shape suggests that the damage was produced by a tooth or claw, and is in keeping with evidence for predator bite trauma found elsewhere in the dinosaur fossil record.</p>
<p>The bacterial infection would have had a big impact on the life of the Yunnan dinosaur. Osteomyelitis is known to produce fever, fatigue, nausea and discomfort, and may send tracts of bacteria into the brain, accelerating death. We know that the dinosaur survived for some time with this infection, but this may have made it vulnerable to other diseases or unable to fend for itself in the long term.</p>
<p>What is exciting is that this case gives us evidence of interaction between a large plant-eating dinosaur (a sauropod) and one of the aggressive predators living at that time. We don’t just have evidence of disease but of behaviour between animals – between predator and prey at this deep period in prehistory. </p>
<p>We do not know which species of predator caused the bite, but the wound from the failed attack is a smoking gun. It is possible that <a href="http://www.prehistoric-wildlife.com/species/s/sinosaurus.html"><em>Sinosaurus</em></a>, a well-known predator found in Jurassic Yunnan, would have been able to attack <em>Lufengosaurus</em>.</p>
<h2>Virtual palaeontology</h2>
<p>This discovery was only made possible by the application of X-ray tomography (micro-CT). The first commercially available micro-CT scanner appeared in 1994, but it is only in the last decade that it has begun to be used in palaeontology, partly because of the cost of the equipment. Tomography is increasingly allowing us to understand processes such as trauma and infection in the fossil record at the cellular level. </p>
<p>This technology has opened up the fossil record, allowing palaeontologists to image and analyse the deep structure of fossils. This has enabled spectacular discoveries such as the <a href="https://www.sajs.co.za/article/view/3566">earliest hominin cancer</a> and the <a href="https://www.sajs.co.za/article/view/3562">earliest tumour</a>, the <a href="https://www.nature.com/articles/s41467-018-03296-8.pdf">flight pattern of Archaeopteryx</a>, and the <a href="https://www.sciencedirect.com/science/article/pii/S2095927318300331">reconstruction of an early bird trapped in amber</a>. It has also allowed us to <a href="https://www.sajs.co.za/article/view/3580">correct historical cases of pathological misdiagnosis</a> in fossils. </p>
<p>The resulting scans can be shared across the world, visualised and studied without the need to access the fossils directly. They can also be <a href="http://johnhawks.net/weblog/topics/metascience/open-access/benefits-data-brouwers-q-and-a-2015.html">3D printed</a>, both in their actual size or at any other scale that we require. </p>
<p>Who knows what spectacular discoveries await us using this technology, but it is clear that the future of <a href="https://www.theguardian.com/science/2016/mar/30/getting-under-a-fossils-skin-how-ct-scans-have-changed-palaeontology-dinosaur-lizard">palaeontological research is virtual</a>.</p>
<p class="fine-print"><em><span>Patrick Randolph-Quinney does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
New research uses pathology in dinosaur bones to look at predator-prey interactions in the fossil record.
Patrick Randolph-Quinney, Reader/Associate Professor in Biological and Forensic Anthropology, University of Central Lancashire
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/90258
2018-01-22T12:25:46Z
2018-01-22T12:25:46Z
The next generation of cameras might see behind walls
<figure><img src="https://images.theconversation.com/files/202620/original/file-20180119-110121-1wggumj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>You might be really pleased with the camera technology in your latest smartphone, which can recognise your face and take slow-mo video in ultra-high definition. But these technological feats are just the start of a larger revolution that is underway.</p>
<p>The latest camera research is shifting away from increasing the number of mega-pixels towards fusing camera data with computational processing. By that, we don’t mean the Photoshop style of processing where effects and filters are added to a picture, but rather a radical new approach where the incoming data may not actually look like an image at all. It only becomes an image after a series of computational steps that often involve complex mathematics and modelling of how light travels through the scene or the camera.</p>
<p>This additional layer of computational processing magically frees us from the chains of conventional imaging techniques. One day we may not even need cameras in the conventional sense any more. Instead we will use light detectors that, only a few years ago, we would never have considered useful for imaging. And they will be able to do incredible things, like see through fog, inside the human body and even behind walls.</p>
<h2>Single pixel cameras</h2>
<p>One extreme example is the <a href="https://dx.doi.org/10.1098%252Frsta.2016.0233">single pixel camera</a>, which relies on a beautifully simple principle. Typical cameras use lots of pixels (tiny sensor elements) to capture a scene that is likely illuminated by a single light source. But you can also do things the other way around, capturing information from many light sources with a single pixel. </p>
<p>To do this you need a controlled light source, for example a simple data projector that illuminates the scene one spot at a time or with a series of different patterns. For each illumination spot or pattern, you then measure the amount of light reflected and add everything together to create the final image. </p>
<p>Clearly the disadvantage of taking a photo in this way is that you have to send out lots of illumination spots or patterns in order to produce one image (which would take just one snapshot with a regular camera). But this form of imaging would allow you to create otherwise impossible cameras, for example ones that work at wavelengths of light beyond the visible spectrum, where good detectors <a href="https://www.nature.com/articles/ncomms12010">cannot be made into cameras</a>.</p>
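The pattern-and-measure idea above can be sketched in a few lines of code. This is a toy simulation, not a real instrument: the 8×8 scene, pattern count and names are all illustrative, and it uses Hadamard patterns only because their ±1 structure makes reconstruction a simple inverse transform (a real projector cannot emit negative light, so each ± pattern would be split into two non-negative ones).

```python
# Toy single-pixel camera: illuminate a scene with a series of patterns,
# record ONE brightness value per pattern, then reconstruct the image.
import numpy as np

def hadamard(n):
    # Sylvester construction of an n x n Hadamard matrix (n a power of two).
    h = np.array([[1.0]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

side = 8
n = side * side
scene = np.zeros((side, side))
scene[2:6, 3:5] = 1.0                      # a simple bright rectangle to image

H = hadamard(n)                            # each row is one illumination pattern
measurements = H @ scene.ravel()           # one number per pattern: total reflected light
reconstruction = (H.T @ measurements) / n  # inverse transform recovers the scene
reconstruction = reconstruction.reshape(side, side)
```

Note that producing this one 8×8 image took 64 separate illuminations, which is exactly the trade-off the paragraph above describes.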
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-amazing-camera-that-can-see-around-corners-51948">The amazing camera that can see around corners</a>
</strong>
</em>
</p>
<hr>
<p>These cameras could be used to take photos through <a href="https://www.osapublishing.org/oe/abstract.cfm?uri=oe-23-11-14424">fog or thick falling snow</a>. Or they could <a href="http://advances.sciencemag.org/content/3/4/e1601782">mimic the eyes of some animals</a> and automatically increase an image’s resolution (the amount of detail it captures) depending on what’s in the scene.</p>
<p>It is even possible to capture images from light particles that have <a href="https://www.nature.com/articles/nature13586">never even interacted</a> with the object we want to photograph. This would take advantage of the idea of “quantum entanglement”, that two particles can be connected in a way that means whatever happens to one happens to the other, even if they are a long distance apart. This has intriguing possibilities for looking at objects whose properties might change when lit up, such as the eye. For example, does a retina look the same when in darkness as in light?</p>
<h2>Multi-sensor imaging</h2>
<p>Single-pixel imaging is just one of the simplest innovations in upcoming camera technology and relies, on the face of it, on the traditional concept of what forms a picture. But we are currently witnessing a surge of interest in systems that use lots of information, of which traditional techniques collect only a small part.</p>
<p>This is where we could use multi-sensor approaches that involve many different detectors pointed at the same scene. <a href="https://www.nasa.gov/mission_pages/hubble/multimedia/index.html">The Hubble telescope</a> was a pioneering example of this, producing pictures made from combinations of many different images taken at different wavelengths. But now you can buy commercial versions of this kind of technology, such as the <a href="https://www.lytro.com">Lytro camera</a> that collects information about light intensity and direction on the same sensor, to produce images that can be refocused after the image has been taken.</p>
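"Refocusing after the image has been taken" sounds like magic, but a simplified version of the idea, known as shift-and-sum refocusing, is easy to demonstrate. The sketch below is a toy illustration under our own assumptions (a handful of offset views of a single bright point), not Lytro's actual processing pipeline: each sub-aperture view is shifted in proportion to its camera offset and the results are averaged, so objects whose parallax matches the chosen shift line up and come out sharp.

```python
# Toy "refocus after capture" by shift-and-sum over several offset views.
import numpy as np

def refocus(views, offsets, alpha):
    """Shift each sub-aperture view by alpha * its (dy, dx) offset, then average.

    alpha selects the virtual focal plane: objects whose parallax matches
    alpha align across views and appear sharp; everything else blurs.
    """
    out = np.zeros_like(views[0], dtype=float)
    for img, (dy, dx) in zip(views, offsets):
        out += np.roll(img, (round(alpha * dy), round(alpha * dx)), axis=(0, 1))
    return out / len(views)

# Toy scene: one bright point whose apparent position shifts one pixel per
# unit of horizontal camera offset (its parallax is 1).
offsets = [(0, 0), (0, 1), (0, 2)]
views = []
for _, dx in offsets:
    img = np.zeros((5, 5))
    img[2, 3 - dx] = 1.0
    views.append(img)

focused = refocus(views, offsets, alpha=1.0)    # parallax matched: point aligns
defocused = refocus(views, offsets, alpha=0.0)  # no shift: point smeared out
```

Changing `alpha` after the views are recorded moves the focal plane, which is precisely the "refocus later" capability described above.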
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Light L16.</span>
<span class="attribution"><span class="source">Light</span></span>
</figcaption>
</figure>
<p>The next-generation camera will probably look something like the <a href="https://light.co/camera">Light L16 camera</a>, which features ground-breaking technology based on more than ten different sensors. Their data are combined using a computer to provide a 50-megapixel, re-focusable and re-zoomable, professional-quality image. The camera itself looks like a very exciting Picasso interpretation of a crazy cell-phone camera.</p>
<p>Yet these are just the first steps towards a new generation of cameras that will change the way in which we think of and take images. Researchers are also working hard on the problem of seeing through fog, <a href="https://www.nature.com/articles/ncomms1747">seeing behind walls</a>, and even imaging deep inside the <a href="https://www.nature.com/articles/nphoton.2014.107">human body and brain</a>. All of these techniques rely on combining images with models that explain how light travels through or around different substances.</p>
<p>Another interesting approach that is gaining ground relies on artificial intelligence to “learn” to <a href="https://www.osapublishing.org/optica/abstract.cfm?uri=optica-4-9-1117">recognise objects from the data</a>. These techniques are inspired by learning processes in the human brain and are likely to play a major role in <a href="https://arxiv.org/abs/1709.07244">future imaging systems</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cDbGFT5rM0I?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Single photon and quantum imaging technologies are also maturing to the point that they can take pictures with incredibly low light levels and videos with incredibly fast speeds reaching a trillion frames per second. This is enough even to capture images <a href="https://www.nature.com/articles/ncomms7021">of light itself</a> travelling across a scene.</p>
<p>Some of these applications might require a little time to fully develop but we now know that the underlying physics should allow us to solve these and other problems through a clever combination of new technology and computational ingenuity.</p><img src="https://counter.theconversation.com/content/90258/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Daniele Faccio receives funding from EPSRC, QuantIC - The Quantum Hub for Imaging, The Leverhulme Trust, DSTL.</span></em></p><p class="fine-print"><em><span>Stephen McLaughlin receives funding from EPSRC for a variety of research grants that analyse data requiring the computational imaging methods described in the article.</span></em></p>
Single-pixel cameras, multi-sensor imaging and quantum technologies will change the way we take photos.
Daniele Faccio, Professor of Quantum Technologies, University of Glasgow
Stephen McLaughlin, Head of School of Engineering and Physical Sciences, Heriot-Watt University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/85229
2017-10-05T00:27:19Z
2017-10-05T00:27:19Z
Chilled proteins and 3-D images: The cryo-electron microscopy technology that just won a Nobel Prize
<figure><img src="https://images.theconversation.com/files/188879/original/file-20171004-31791-6zhlqy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Cryo-electron microscopy resolution continues to improve.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/nihgov/24030250059">Veronica Falconieri, Sriram Subramaniam, National Cancer Institute, National Institutes of Health</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>Many people may never have heard of cryo-electron microscopy before the announcement that Jacques Dubochet, Joachim Frank and Richard Henderson had won the <a href="https://www.nobelprize.org/nobel_prizes/chemistry/laureates/2017/press.html">2017 Nobel Prize in chemistry</a> for their work developing this technology. So what is it, and why is it worthy of this honor?</p>
<p>Cryo-electron microscopy – or cryo-EM – is an imaging technology that allows scientists to obtain pictures of the biological “machines” that work inside our cells. Most amazingly, it can reconstruct individual snapshots into movie-like scenes that show how protein components of these biological machines move and interact with each other.</p>
<p>It’s like the difference between having a list of all of the individual parts of an engine versus being able to see the engine fully assembled and running. The parts list can tell you a lot, but there’s no replacement for seeing what you’re studying in action.</p>
<p>What’s revolutionary about cryo-EM is not only that it lets scientists actually see and understand how important biological machines work, but that it allows us to study a vast array of important proteins that can’t be seen using any other structural biology technique.</p>
<p>Advances in both imaging technology and computing have really <a href="https://directorsblog.nih.gov/2016/01/14/got-it-down-cold-cryo-electron-microscopy-named-method-of-the-year/">pushed cryo-EM forward</a> over the last decade or so. Researchers are now able to generate atomic, or near-atomic, resolution 3-D models of challenging molecules – things like receptors that are therapeutic drug targets, molecular motors that deliver cargo to different parts of the cell and emerging viruses that lead to human disease.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=504&fit=crop&dpr=1 600w, https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=504&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=504&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=633&fit=crop&dpr=1 754w, https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=633&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/188869/original/file-20171004-13096-b4vg6a.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=633&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Cryo-EM structure of the enzyme beta-galactosidase.</span>
<span class="attribution"><span class="source">EMDB-2984</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Frozen world of cryo-EM</h2>
<p>To obtain an image using cryo-EM, researchers take proteins that have been biochemically purified from cells and instantaneously freeze the sample on a cryo-EM grid at -180 degrees Celsius. The goal of this process is to trap many copies of a single protein or a protein complex in a thin layer of vitrified ice.</p>
<p>This ice is transparent to the microscope’s electron beam and allows the proteins to retain their natural shape and organization. If the sample is frozen too slowly, then ice crystals form, ruining the structure of the molecules being studied and disrupting the electrons traveling through the sample.</p>
<p>A major advantage of this technique is that it saves time and work. Cryo-EM’s ability to look at proteins in a near-native state is in stark contrast with X-ray crystallography, the longstanding gold standard for obtaining high-resolution biomolecular images. The older technique requires the formation of ordered crystals, where the proteins must first self-organize together in repeating patterns – at best a tricky challenge, at worst an impossible one for certain molecules. With cryo-EM, there’s no need to coax biological molecules into ordered arrays.</p>
<p>Once the cryo-EM sample has been frozen, a focused beam of electrons reveals the shape of these very small, nanometer-sized proteins. (A nanometer is about one million times smaller than the tip of a needle.) Each image contains all the information required to determine the 3-D structure. But these raw images are extremely “noisy” and hard to see, so large numbers of images for each sample must be collected using the microscope.</p>
<p>Specialized computer analysis then combines hundreds of thousands of individual, 2-D snapshots from different angles into a composite that can be viewed in 3-D. Many 3-D structures determined by cryo-EM are now at a high enough resolution that researchers can visualize individual atoms in the structures.</p>
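Why averaging huge numbers of "noisy" snapshots works can be shown with a toy simulation (the numbers below are illustrative, not from a real microscope, and real cryo-EM processing must also align particles and sort them by orientation first): random noise averages towards zero roughly as one over the square root of the number of images, while the underlying signal stays put.

```python
# Toy cryo-EM-style averaging: each raw image is dominated by noise, but
# averaging many aligned copies makes the underlying signal emerge.
import numpy as np

rng = np.random.default_rng(0)
signal = np.zeros((16, 16))
signal[6:10, 6:10] = 1.0                  # the "protein" we want to see

n_images = 10_000
# Noise std of 5 dwarfs the signal amplitude of 1 in every single image.
noisy = signal + rng.normal(0, 5.0, size=(n_images, 16, 16))
average = noisy.mean(axis=0)              # residual noise shrinks ~ 1/sqrt(N)

single_error = np.abs(noisy[0] - signal).mean()   # a raw image: unrecognisable
avg_error = np.abs(average - signal).mean()       # the average: close to the signal
```

With 10,000 images the residual noise is about 100 times smaller than in any single snapshot, which is why collecting hundreds of thousands of particle images pays off.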
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=378&fit=crop&dpr=1 600w, https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=378&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=378&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=476&fit=crop&dpr=1 754w, https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=476&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/188863/original/file-20171004-31791-1qpxueu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=476&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Noisy raw images are on the left. Individual complexes are computationally boxed out to create galleries of particles, based on millions of images (in the middle). These individual particles are then aligned and averaged to generate 2-D images based on thousands of individual particles trapped in the vitrified ice in similar orientations (on the right).</span>
<span class="attribution"><span class="source">Melanie Ohi</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Software can also sort out proteins that are in different stages of a biological process – thus helping piece together how a biological machine moves, changes and functions. Then researchers can further test how the machine’s organization leads to specific functions in a living cell.</p>
<p>For example, several years ago researchers here at the University of Michigan Life Sciences Institute obtained the <a href="https://doi.org/10.1038/nature13423">first 3-D snapshots of the “assembly line”</a> within microorganisms that naturally produces a broad class of compounds <a href="https://youtu.be/IUw3fvpinSs">known as polyketides</a>, which includes antibiotics and other drugs. This information then gives investigators a solid blueprint for figuring out how they might redesign the microbial assembly line to produce new drugs.</p>
<p>One of the challenges of working with cryo-EM is that it requires massive amounts of computational power and data storage. One cryo-EM movie of 30 to 60 frames requires up to 8 gigabytes of storage – it would take only a couple of these movies to fill up most smartphones. About a thousand of these movies are collected in a day, requiring somewhere between one and eight terabytes of storage (or about 8,000 smartphones’ worth). A full data set for one 3-D structure can require 4,000 movies.</p>
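The storage figures above can be sanity-checked with quick back-of-envelope arithmetic, taking the numbers from the text and using the 8-gigabyte upper bound per movie.

```python
# Back-of-envelope check of the cryo-EM storage figures quoted in the text.
gb_per_movie = 8             # upper bound per 30-60 frame movie
movies_per_day = 1_000       # roughly what one microscope collects daily
movies_per_structure = 4_000 # a full data set for one 3-D structure

daily_tb = gb_per_movie * movies_per_day / 1_000            # TB collected per day
structure_tb = gb_per_movie * movies_per_structure / 1_000  # TB per 3-D structure
print(daily_tb, structure_tb)  # 8.0 32.0
```

So a single structure can demand tens of terabytes at the high end, before any of the processing described below even begins.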
<p>These data sets often take hundreds of thousands of hours of computer processor time to piece together. Researchers rely on supercomputers that use many processors working in parallel. The most advanced cryo-electron microscopes and accompanying computing tools require significant investments on the parts of universities, which is why they’re still relatively rare. For example, buying just the microscope and specialized camera needed to collect these cryo-EM images costs well over US$5 million.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/188876/original/file-20171004-30164-zcgbuw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Resolution of the technology has radically improved in the last few years, from mostly showing shapeless blobs to now being able to visualize proteins at atomic resolution.</span>
<span class="attribution"><a class="source" href="https://www.nobelprize.org/nobel_prizes/chemistry/laureates/2017/press.html">© Martin Högbom/The Royal Swedish Academy of Sciences</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Putting the technology to work</h2>
<p>While the technology and technique are interesting, the true power of the tool is its ability to help us answer important biological questions.</p>
<p>In the Cianfrocco lab, for example, we’re using cryo-EM to look at the dynamic process of how molecular “motors” move along microtubular “tracks” inside cells. We want to figure out the basics of how cells know what to move, when. While much work has gone into identifying the necessary building blocks for moving cargo around the cell, the molecular details remain unknown.</p>
<p>It’s these motors that move things around the cell that aren’t working correctly in Parkinson’s, Huntington’s and Charcot-Marie-Tooth disease. Learning more about how they’re malfunctioning will be critical for developing new therapeutics for these neurodegenerative diseases. Viruses also hijack these motor proteins during infection, which means understanding more about how they walk will help researchers design effective treatments against viruses such as HIV and rabies.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/188878/original/file-20171004-22112-1hkav2y.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Melanie Ohi with grad student Amanda Erwin outside the cryo-electron microscope.</span>
<span class="attribution"><span class="source">Lesia Thompson Photography for the University of Michigan</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>And in the Ohi lab, we’re generating detailed 3-D structures of the molecular machines bacteria use to spread disease. Biological pathogens have evolved numerous ways to infect their hosts, including toxins that alter cellular functions and complex secretion systems that inject DNA and proteins into host cells. Using 3-D snapshots of these machines, we’re hoping to find new ways to target the processes bacteria use to cause disease, such as the bacteria <em>Helicobacter pylori</em>’s ability to trigger chronic inflammation, which is a major risk factor for stomach cancer. </p>
<p>One of the larger questions for the field is how to make the specialized and expensive resources required for cryo-EM more broadly available to researchers across the country and around the world. To this end, the Cianfrocco lab is developing two separate cloud computing resources with the goal of streamlining cryo-EM data processing to allow structural biologists to focus on the biology, not the hardware. By removing the computing bottlenecks, these tools have the potential to continue the growth of cryo-EM into a mainstream technique for scientists worldwide.</p><img src="https://counter.theconversation.com/content/85229/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Michael Cianfrocco has consulted on cryo-EM for pharmaceutical and biotechnology firms and is also a science advisor to Single Particle LLC.</span></em></p><p class="fine-print"><em><span>Melanie Ohi does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
The 2017 Nobel Prize in chemistry goes to three scientists who revolutionized biochemistry by inventing a technology that can image the molecules of life without destroying them.
Melanie Ohi, Research Associate Professor, U-M Life Sciences Institute and Associate Professor of Cell and Developmental Biology, U-M Medical School, University of Michigan
Michael Cianfrocco, Research Assistant Professor at U-M Life Sciences Institute and Assistant Professor of Biological Chemistry, U-M Medical School, University of Michigan
Licensed as Creative Commons – attribution, no derivatives.