<h1>Imaging – The Conversation</h1>
<h1>Lung cancer: Predicting which patients are at high risk of recurrence to improve outcomes</h1><figure><img src="https://images.theconversation.com/files/575433/original/file-20240205-29-abkjt8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Chemotherapy is used to treat all lung cancer patients. Yet many would not need such invasive treatment if diagnosis of the risk of recurrence were more refined. A new technology could change all that.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>Lung cancer <a href="https://cancer.ca/en/research/cancer-statistics/cancer-statistics-at-a-glance">is responsible for more deaths than breast, colon and prostate cancer combined</a>. </p>
<p>With advancements in lung cancer screening, it is expected that more patients will be diagnosed at earlier stages, enabling them to undergo surgery, the primary treatment modality for early-stage patients.</p>
<p>However, a significant proportion of patients will have a recurrence of their cancer after resection (surgery to remove the tumour). Unfortunately, current clinical guidelines cannot help predict which patients are at risk. Better knowledge of who is at risk has significant implications for the selection of systemic therapy, such as chemotherapy, for early-stage lung cancer patients after surgery. </p>
<p>To find solutions to this problem, our research group at McGill University launched a project in collaboration with Université Laval. <a href="https://www.nature.com/articles/s41586-022-05672-3#MOESM1">Preliminary results were published in <em>Nature</em></a>. In our work we discovered that the use of a new imaging technology, along with artificial intelligence, could improve outcomes for cancer patients.</p>
<h2>Too much or too little intervention</h2>
<p>This clinical dilemma has important implications for the choice of treatment, such as chemotherapy. For example, lung cancer patients who are cured by surgery could be spared the toxicity of chemotherapy. Patients at risk of their cancer recurring could benefit from additional therapeutic interventions.</p>
<p>The challenge of predicting recurrence for patients with early-stage lung cancer has important implications for the 31,000 Canadians who are diagnosed with this terrible disease every year.</p>
<h2>Imaging mass cytometry</h2>
<p>To address this clinical problem, we used <a href="https://www.mcgill.ca/gci/facilities/single-cell-imaging-and-mass-cytometry-analysis-platform-scimap">imaging mass cytometry</a> (IMC), a new technology that allows for a comprehensive characterization of the tumour microenvironment. </p>
<p>The tumour microenvironment is a complex ecosystem composed of interactions between tumour cells, immune cells, and various structural cells. IMC can be used to visualize up to 50 markers at the cell surface, significantly more than was previously possible. </p>
<p>This technology makes it possible to identify different types of cells and determine their spatial organization, i.e. how they interact. IMC produces images that can be analyzed to determine the frequency of cell subpopulations, their activation states, the other cell types with which they interact and their organization in cellular communities. </p>
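<p>To give a concrete sense of what such an analysis involves, here is a minimal sketch of one common approach: describing each cell by the cell-type composition of its nearest neighbours, then clustering those profiles into recurring “communities.” It is illustrative only – the coordinates, labels and parameter choices below are random stand-ins, not the pipeline used in our study.</p>
<pre><code># Sketch of one common spatial analysis: describe each cell by the
# cell-type mix of its nearest neighbours, then cluster those profiles
# into "cellular communities". Coordinates, labels and parameters are
# random stand-ins, not the pipeline used in our study.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
xy = rng.random((5000, 2))                  # stand-in cell centroids
cell_type = rng.integers(0, 8, size=5000)   # stand-in phenotype labels
n_types = 8

# For each cell, count the phenotypes among its 10 nearest neighbours.
_, idx = NearestNeighbors(n_neighbors=10).fit(xy).kneighbors(xy)
profiles = np.zeros((len(xy), n_types))
for t in range(n_types):
    profiles[:, t] = (cell_type[idx] == t).sum(axis=1)

communities = KMeans(n_clusters=6, n_init=10).fit_predict(profiles)
</code></pre>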
<p>The results of our study, published in <em>Nature</em>, reveal that various cell types can interact in cellular communities, and that communities composed of B cells were strongly associated with prolonged survival in lung cancer patients. Our study highlights that beyond cellular frequency, cellular interactions and spatial organization also correlate strongly with important clinical outcomes such as survival.</p>
<h2>Using artificial intelligence to make better predictions</h2>
<p>Based on our initial results, we hypothesized that important spatial features embedded within IMC images, such as cellular interactions, could be important in predicting clinical outcomes. </p>
<p>Our dataset of 416 patients and over 1.6 million cells provided sufficient power to make predictions using artificial intelligence. We sought to predict which patients with early-stage lung cancer would have a recurrence of their cancer after surgery. </p>
<p>Using 1 mm<sup>2</sup> tumour samples – material readily available from surgical resections or biopsies – we applied artificial intelligence algorithms to the IMC images to make our predictions. Our algorithm was able to predict with 95 per cent accuracy which patients would experience a cancer recurrence by using the spatial information contained within the images. </p>
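<p>As a rough illustration of this kind of prediction task, the sketch below trains a generic classifier on one spatial feature vector per patient. The random data, feature count and choice of a random forest are placeholders for illustration; they are not the algorithm used in our study.</p>
<pre><code># Sketch of the prediction task: one feature vector per patient
# (cell-type frequencies, neighbour interaction counts, and so on) and
# a binary recurrence label. Random stand-ins are used here, so the
# printed accuracy will sit at chance level; the classifier choice is
# illustrative, not the algorithm used in our study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_features = 416, 120           # cohort size from the study; feature count hypothetical
X = rng.random((n_patients, n_features))    # stand-in spatial features
y = rng.integers(0, 2, size=n_patients)     # stand-in recurrence labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
</code></pre>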
<h2>Six markers can make all the difference</h2>
<p>One of the challenges in applying our results in a clinical setting is that IMC is not readily available. Clinical pathologists typically use less complex technologies such as immunofluorescence, which are often limited to three or fewer markers. </p>
<figure class="align-center ">
<img alt="image obtained using immunofluorescence" src="https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/573537/original/file-20240205-17-2oj4zm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Immunofluorescence image of a tumour treated with immunotherapy. This technology is often limited to the use of three or fewer markers at a time.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>To address this challenge, we sought to identify the minimum number of markers needed to make meaningful predictions about recurrence in lung cancer patients after surgery. By using six markers, we obtained an accuracy rate of 93 per cent, a result that is close to the 95 per cent accuracy rate obtained by using 35 markers. </p>
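<p>One simple way to search for such a reduced panel is greedy forward selection: repeatedly add the marker that improves accuracy the most. The sketch below uses hypothetical marker names and a toy scoring function; it illustrates the idea rather than our actual procedure.</p>
<pre><code># Sketch: greedy forward selection of a small marker panel. Marker
# names and the toy score() are hypothetical; in practice score() would
# be the cross-validated accuracy of the model restricted to the panel.
markers = ["CD20", "CD3", "CD8", "CD68", "FOXP3", "panCK", "CD31", "Ki67"]

def score(panel):
    informative = {"CD20", "CD3", "CD8", "CD68", "FOXP3", "panCK"}
    return len(set(panel) & informative)    # toy objective

panel = []
for _ in range(6):                          # grow the panel to six markers
    best = max((m for m in markers if m not in panel),
               key=lambda m: score(panel + [m]))
    panel.append(best)
print(panel)
</code></pre>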
<p>These results suggest that by harnessing the power of artificial intelligence with existing technologies available in hospitals, we may be able to improve the post-surgical clinical management of patients with early-stage lung cancer. Our ultimate goal is to increase cure rates for those at high risk of cancer recurrence, while minimizing toxicity for those who can be cured by surgery.</p>
<p class="fine-print"><em><span>Mark Sorin has received funding from the Fonds de recherche du Québec and Vanier Canada Graduate Scholarships.</span></em></p><p class="fine-print"><em><span>Logan Walsh has received funding from McGill University's Interdisciplinary Infection and Immunity Initiative, the Brain Tumour Funders' Collaborative, the Canadian Institutes of Health Research (CIHR; PJT-162137), the Canada Foundation for Innovation and holds the Rosalind Goodman Research Chair in Lung Cancer.</span></em></p>Treatment for lung cancer patients is the same for everyone, regardless of the risk of recurrence. The use of a new technology could refine diagnosis.Mark Sorin, Étudiant au MD-PhD, chercheur en cancer du poumon, McGill UniversityLogan Walsh, Assistant Professor, Department of Human Genetics, McGill UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/2147842023-10-05T12:33:59Z2023-10-05T12:33:59ZHow a disgruntled scientist looking to prove his food wasn’t fresh discovered radioactive tracers and won a Nobel Prize 80 years ago<figure><img src="https://images.theconversation.com/files/551579/original/file-20231002-27-bnczk3.jpg?ixlib=rb-1.1.0&rect=392%2C8%2C5059%2C3473&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">George De Hevesy working in his lab at Stockholm University in 1944. </span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/hungarian-radiochemist-george-de-hevesy-at-work-in-his-news-photo/870101654?adppopup=true">Keystone Features/Hulton Archive via Getty Images</a></span></figcaption></figure><p>Each October, the Nobel Prizes celebrate a handful of groundbreaking scientific achievements. And while many of the awarded discoveries revolutionize the field of science, some originate in unconventional places. For <a href="https://www.nobelprize.org/prizes/chemistry/1943/hevesy/biographical/">George de Hevesy</a>, the 1943 Nobel Laureate in chemistry who discovered radioactive tracers, that place was a boarding house cafeteria in Manchester, U.K., in 1911. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A black and white headshot of a young man with a mustache wearing a suit." src="https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=818&fit=crop&dpr=1 600w, https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=818&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=818&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1028&fit=crop&dpr=1 754w, https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1028&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/551573/original/file-20231002-29-bnczk3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1028&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Hungarian chemist George de Hevesy.</span>
<span class="attribution"><a class="source" href="https://upload.wikimedia.org/wikipedia/commons/b/b4/George_de_Hevesy.jpg">Magnus Manske</a></span>
</figcaption>
</figure>
<p>De Hevesy had a sneaking suspicion that the staff of the boarding house cafeteria where he ate every day was reusing leftovers from the dinner plates – each day’s soup seemed to contain all of the prior day’s ingredients. So he came up with a plan to test his theory. </p>
<p>At the time, de Hevesy was working with radioactive material. He <a href="https://tech.snmjournals.org/content/jnmt/24/4/291.full.pdf">sprinkled a small amount</a> of radioactive material in his leftover meat. A few days later, he took an electroscope with him to the kitchen and <a href="https://tech.snmjournals.org/content/jnmt/24/4/291.full.pdf">measured the radioactivity</a> in the prepared food. </p>
<p>His landlady, who was to blame for the recycled food, exclaimed “this is magic” when de Hevesy showed her his results, but really, it was just the first successful radioactive tracer experiment. </p>
<p><a href="https://scholar.google.com/citations?user=vlmJRrsAAAAJ&hl=en">We are</a> a team <a href="https://www.chemistry.msu.edu/faculty-research/faculty-members/liddick-sean.aspx">of chemists</a> and physicists <a href="https://scholar.google.com/citations?user=MkkjF8YAAAAJ&hl=en">who work</a> at the <a href="https://frib.msu.edu">Facility for Rare Isotope Beams</a>, located at Michigan State University. De Hevesy’s early research in the field has revolutionized the way that modern scientists like us use radioactive material, and it has led to a variety of scientific and medical advances.</p>
<h2>The nuisance of lead</h2>
<p>A year before conducting his recycled ingredients experiment, Hungary-born de Hevesy had <a href="https://orau.org/health-physics-museum/articles/four-tales-george-de-hevesy.html">traveled to the U.K.</a> to start work with nuclear scientist <a href="https://www.nobelprize.org/prizes/chemistry/1908/rutherford/facts/">Ernest Rutherford</a>, who’d won a Nobel Prize just two years prior.</p>
<p>Rutherford was at the time <a href="https://doi.org/10.1021/ed040p36">working with a radioactive substance</a> called radium D, a valuable byproduct of radium because of <a href="https://www.britannica.com/science/half-life-radioactivity">its long half-life</a> (22 years). However, Rutherford couldn’t use his radium D sample, as it had large amounts of lead mixed in. </p>
<p>When de Hevesy arrived, Rutherford asked him <a href="https://tech.snmjournals.org/content/jnmt/24/4/291.full.pdf">to separate the radium D</a> from the nuisance lead. The nuisance lead was made up of a combination of stable isotopes of lead (Pb). Each isotope had the same number of protons (82 for lead), but a different number of neutrons.</p>
<p>De Hevesy worked on separating the radium D from the natural lead using chemical separation techniques for almost two years, <a href="https://www.nobelprize.org/prizes/chemistry/1943/hevesy/lecture/">with no success</a>. The reason for his failure was that, unknown to anyone at the time, radium D was actually a different form of lead – namely the radioactive isotope, or radioisotope Pb-210. </p>
<p>Nevertheless, de Hevesy’s failure led to an even bigger discovery. The creative scientist figured out that if he could not separate radium D from natural lead, he could use it as a tracer of lead.</p>
<p><a href="https://theconversation.com/hunting-for-rare-isotopes-the-mysterious-radioactive-atomic-nuclei-that-will-be-in-tomorrows-technology-86177">Radioactive isotopes</a>, like Pb-210, are unstable isotopes, which means that over time they will transform into a different element. During this transformation, called radioactive decay, they typically release particles or light, which can be <a href="https://www.britannica.com/science/radioactivity">detected as radioactivity</a>. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/TJgc28csgV0?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Through radioactivity, an unstable isotope can turn from one element to another.</span></figcaption>
</figure>
<p>This radioactivity acts as a signature indicating the presence of the radioactive isotope. This critical property of radioisotopes allows them to be used as tracers.</p>
<h2>Radium D as a tracer</h2>
<p><a href="https://www.iaea.org/topics/radiotracers">A tracer</a> is a substance that stands out in a crowd of similar material because it has unique qualities that make it easy to track. </p>
<p>For example, if you have a group of kindergartners going on a field trip and one of them is wearing a smartwatch, you can tell if the group went to the playground by tracking the GPS signal on the smartwatch. In de Hevesy’s case, the kindergartners were the lead atoms, the smartwatch was radium D, and the GPS signal was the emitted radioactivity. </p>
<p>In the 1910s, the <a href="https://doi.org/10.1007/PL00000541">Vienna Institute of Radium Research</a> had a <a href="https://doi.org/10.1098/rsnr.2013.0070">larger collection of radium</a> and its byproducts than any other institution. To continue his experiments with radium D, de Hevesy moved to Vienna in 1912. </p>
<p>He collaborated with Fritz Paneth, who had also attempted the impossible task of separating radium D from lead without success. The two scientists “spiked” samples of different chemical compounds with small amounts of a radioactive tracer. This way they could study chemical processes by tracking the movement of the radioactivity <a href="https://www.nobelprize.org/uploads/2018/06/hevesy-lecture.pdf">across different chemical reactions</a>.</p>
<p>De Hevesy continued his work studying chemical processes using different isotopic markers for many years. He was even the first to introduce nonradioactive tracers. One nonradioactive tracer he studied was a heavier isotope of hydrogen, <a href="https://www.iaea.org/newscenter/news/what-is-deuterium">called deuterium</a>. Deuterium is 10,000 times less abundant than common hydrogen, but is roughly twice as heavy, which makes it easier to separate the two.</p>
<p>De Hevesy and his co-author used deuterium to track water in their bodies. In their investigations, they took turns ingesting samples and measuring the deuterium in their urine to study <a href="https://doi.org/10.1038/134879a0">the elimination of water</a> from the human body. </p>
<p>De Hevesy was awarded the <a href="https://www.nobelprize.org/prizes/chemistry/1943/summary/">1943 Nobel Prize in chemistry</a> “for his work on the use of isotopes as tracers in the study of chemical processes.” </p>
<h2>Radioactive tracers today</h2>
<p>More than a century after de Hevesy’s experiments, many fields now routinely use radioactive tracers, from medicine to materials science and biology. </p>
<p>These tracers can monitor the progression of disease in <a href="https://doi.org/10.3390/ijms23095023">medical procedures</a>, the uptake of nutrients in <a href="https://doi.org/10.2976/1.2921207">plant biology</a>, the age and flow of <a href="https://doi.org/10.5194/hess-24-249-2020">water in aquifers</a> and the <a href="https://doi.org/10.1016/j.apradiso.2021.110076">measurement of wear and corrosion of materials</a>, among other applications. Radioisotopes allow researchers to follow the paths of nutrients and drugs in living systems without invasively cutting the tissue.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Four brain scans, two in contrasted colors with the background shown as white and the brain as gray, two with the background shown as black and the brain shown either as gray or orange." src="https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=453&fit=crop&dpr=1 600w, https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=453&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=453&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=570&fit=crop&dpr=1 754w, https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=570&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/551730/original/file-20231003-15-397yxg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=570&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Radioactive tracers, seen in the top left photo as a white spot and indicated by an arrow in the top right, are often used today in brain scans.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/positron-emission-tomography-ct-scan-uses-a-royalty-free-image/1463929233?phrase=brain+scan+radioactive+tracer&adppopup=true">mr. suphachai praserdumrongchai/iStock via Getty Images</a></span>
</figcaption>
</figure>
<p>In modern research, scientists focus on producing new isotopes and on developing procedures to use radioactive tracers more efficiently. The <a href="https://frib.msu.edu/">Facility for Rare Isotope Beams</a>, or FRIB, where the three of us work, has a program dedicated to the production and harvesting of unique radioisotopes. These radioisotopes are then used in medical and other applications. </p>
<p><a href="https://theconversation.com/powerful-linear-accelerator-begins-smashing-atoms-2-scientists-on-the-team-explain-how-it-could-reveal-rare-forms-of-matter-185754">FRIB produces radioactive beams</a> for its basic science program. In the production process, a large number of unused isotopes are collected in a tank of water, where they can be later <a href="https://doi.org/10.1039/D0NJ04411C">isolated and studied</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two scientists, a woman wearing a white shirt and a man wearing a dark blue shirt, squat on the concrete ground in a laboartory with lots of machinery and shelves, and a green lit ceiling." src="https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/552099/original/file-20231004-26-tls88s.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Scientists Greg Severin and Katharina Domnanich at the Facility for Rare Isotope Beams.</span>
<span class="attribution"><span class="source">Facility for Rare Isotope Beams.</span></span>
</figcaption>
</figure>
<p>One recent study involved the <a href="https://doi.org/10.1039/D0NJ04411C">isolation of the radioisotope Zn-62</a> from the irradiated water. This was a challenging task considering there were 100 quadrillion times more water molecules than Zn-62 atoms. Zn-62 is an important radioactive tracer utilized to follow the metabolism of zinc in plants and in nuclear medicine.</p>
<p>Eighty years ago, de Hevesy managed to take a dead-end separation project and turn it into a discovery that created a new scientific field. Radioactive tracers have already changed human lives in so many ways. Nevertheless, scientists are continuing to develop new radioactive tracers and find innovative ways to use them.</p>
<p class="fine-print"><em><span>Artemis Spyrou receives funding from the National Science Foundation and the Department of Energy.</span></em></p><p class="fine-print"><em><span>Sean Liddick receives funding from the Department of Energy and the National Nuclear Security Administration. He is affiliated with the Facility for Rare Isotope Beams.</span></em></p><p class="fine-print"><em><span>Katharina Domnanich does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Some Nobel Prize-winning ideas originate in strange places, but still go on to revolutionize the scientific field. George de Hevesy’s research on radioactive tracers is one such example.Artemis Spyrou, Professor of Nuclear Physics, Michigan State UniversityKatharina Domnanich, Assistant Professor of Chemistry, Michigan State UniversitySean Liddick, Associate Professor of Chemistry, Michigan State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1924632022-10-19T15:16:56Z2022-10-19T15:16:56Z3-D techniques shed light on what makes a bird’s lungs so efficient<figure><img src="https://images.theconversation.com/files/490042/original/file-20221017-25-7od4j8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A Cinnamon-Chested Bee-Eater is released after being ringed at the National Museum of Kenya.</span> <span class="attribution"><span class="source">Luke Dray/Getty Images</span></span></figcaption></figure><p>Birds are profoundly important animals. As predators, <a href="https://theconversation.com/biodiversity-depends-on-pollinators-a-first-estimate-of-how-many-plants-rely-on-animals-166908">pollinators</a>, seed dispersers, scavengers and <a href="https://theconversation.com/mistletoes-locust-bean-trees-and-birds-work-together-in-nigerias-forest-ecology-177264">ecosystem bioengineers</a>, the world’s 11,000 species of birds play critical roles in the food chain and therefore the existence of animal life. </p>
<p>They have also shaped the advancement of human societies <a href="https://theconversation.com/how-birds-are-used-to-reveal-the-future-130844">culturally</a>, philosophically, artistically, economically and scientifically. Birds feature prominently in the history of painting, poetry, commerce and <a href="https://theconversation.com/birdsong-has-inspired-humans-for-centuries-is-it-music-79000">music</a>. </p>
<p>Since they can easily escape from unsuitable habitats, birds are important “sentinel” animals: the number and diversity of species indicates environmental health. BirdLife International’s <a href="https://www.birdlife.org/state-of-the-worlds-birds/">State of the World’s Birds Report for 2022</a> says that about half of all bird species are decreasing and more than one in eight of them are at risk of extinction.</p>
<p>Knowledge of bird biology and of birds’ place in ecosystems contributes to devising conservation efforts. Biology explains why animals behave the way they do and what threatens their survival.</p>
<p>One aspect of bird biology that has long interested scientists is their lungs, which are structurally very complex and functionally efficient. The lungs are what allow birds to fly. Flying uses a huge amount of energy and some birds fly nonstop over <a href="https://theconversation.com/as-far-as-the-moon-and-back-twice-heres-a-look-at-the-most-extraordinary-journeys-migrating-birds-make-168904">very long distances</a> or at very <a href="https://theconversation.com/migratory-birds-found-to-be-flying-much-higher-than-expected-new-research-167582">high altitudes</a> where there is little oxygen.</p>
<p>Even after extensive study, questions about the bioengineering of the avian respiratory system have persisted. They relate to how the airways and blood vessels are shaped, arranged and connected, and how air flows around the lung. </p>
<p>To explore these aspects of the avian lung, my colleagues and I have used a variety of techniques. Three-dimensional (3-D) serial section computer reconstruction is one of them. </p>
<p>Using this technique <a href="https://wap.hillpublisher.com/UpFile/202105/20210518180525.pdf">showed us</a> that the tiny structures (air- and blood capillaries) between which oxygen is exchanged are not the shape they were long thought to be. Because they are so small and so tightly entangled with each other, it wasn’t possible to see their shapes and connections clearly until we used 3-D reconstruction. We were then able to see what makes the bird lung so efficient at taking up the oxygen needed to release energy – key to survival.</p>
<h2>The approach</h2>
<p>For hundreds of years, scientists could only study biological structures in two dimensions – sections of tissue were placed under a transmission microscope. In the late 1970s, the South African-born Nobel prize winner <a href="https://www.nobelprize.org/prizes/medicine/2002/brenner/facts/">Sydney Brenner</a> was the first to apply computing to reconstructing a series of sections. More recently, 3-D reconstruction methodologies have revolutionised various fields of biology.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=419&fit=crop&dpr=1 600w, https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=419&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=419&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=527&fit=crop&dpr=1 754w, https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=527&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/490026/original/file-20221017-12-g3xky2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=527&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Plate 1: airways, arteries and veins of the fowl lung.</span>
<span class="attribution"><span class="source">John Ndegwa Maina</span></span>
</figcaption>
</figure>
<p>3-D reconstruction <a href="https://wap.hillpublisher.com/UpFile/202105/20210518180525.pdf">showed us</a> that the airways and blood vessels track each other and supply specific parts of the bird’s lung. The various branches of the airway system do not interconnect and neither do the branches of the blood system. We were able to get a much clearer view of the shapes and connections of the air capillaries and blood capillaries in the lung. The compact entwining of the capillaries increases respiratory surface area while minimising the thickness of the blood-gas barrier. </p>
<p>The design of the bird’s lungs forms a highly efficient gas exchange system with a large functional reserve. The lungs are ventilated continuously and in one direction (from back to front) with “fresh” air by the coordinated actions of the very large air sacs. During every respiratory cycle, the air in the lung is replaced with “clean” air. This maintains a high partial pressure of oxygen, which drives oxygen into the blood circulating across the lung. It gives birds their flying power. </p>
<p>Our 3-D serial section reconstruction supplied new details and underscored the value of the technique for investigating complex biological structures.</p>
<h2>3-D reconstruction</h2>
<p>3-D reconstruction entails preparing a spatial model of a structure from 2-D images. Because it takes time, a lot of materials and specialised skills, it’s not often used in biological studies. </p>
<p>We used the method on a chicken lung because this is the model animal for study of the biology of birds. </p>
<p>We cut 2,689 serial sections of a chicken lung at a thickness of 8 micrometres (each micrometre is one millionth of a metre). We stained and mounted them onto glass slides, photographed the sections and aligned the images for reconstruction using open-source software. </p>
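<p>For readers curious about the alignment step, the sketch below shows one standard way to register consecutive section photographs – rigid shifts estimated by phase correlation – before stacking them into a volume. It is a generic illustration rather than the specific open-source pipeline we used.</p>
<pre><code># Sketch: rigid alignment of consecutive serial-section photographs by
# phase correlation, then stacking into a 3-D volume. A generic
# illustration, not the specific open-source pipeline used in the study.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def align_stack(sections):
    """sections: list of 2-D grayscale arrays, one per 8-micrometre slice."""
    aligned = [sections[0]]
    for img in sections[1:]:
        # Estimate the (row, col) shift that best aligns img to the
        # previous aligned section, then apply it.
        offset, _, _ = phase_cross_correlation(aligned[-1], img)
        aligned.append(nd_shift(img, offset))
    return np.stack(aligned)   # shape: (n_sections, height, width)
</code></pre>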
<p>There are other modern 3-D reconstruction methods that are faster, cheaper and easier to use. But 3-D histological serial section reconstruction (building up a picture from thin slices of tissue) remains a very important technique. The reconstructions have better contrast and signal-to-noise ratio (there’s less unwanted information). Also, dyes and markers can be used to enhance identification of structures. </p>
<h2>Bird lung capillaries</h2>
<p>The process showed us that the extremely small terminal respiratory units of the bird lung – long called “air capillaries” – are not capillaries at all: rather, they are rotund structures, interconnected by very narrow passages.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=464&fit=crop&dpr=1 600w, https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=464&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=464&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=583&fit=crop&dpr=1 754w, https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=583&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/490032/original/file-20221017-21-zpl43s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=583&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Plate II. Air capillaries (top) and blood capillaries (bottom)</span>
<span class="attribution"><span class="source">John N Maina</span></span>
</figcaption>
</figure>
<p>Moreover, the “blood capillaries” are not “true” capillaries like those found in most other tissues and organs, which are much longer than they are wide. They comprise clearly separate parts that are about as long as they are wide and interconnect in 3-D. The air- and the blood capillaries of the bird lung intertwine very tightly in a “honeycomb” arrangement. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=358&fit=crop&dpr=1 600w, https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=358&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=358&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=449&fit=crop&dpr=1 754w, https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=449&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/490037/original/file-20221017-2267-67rk2p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=449&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Plate IV. Comparing images.</span>
<span class="attribution"><span class="source">John N Maina</span></span>
</figcaption>
</figure>
<p>Knowing the shape and size of these units provides information about the gas exchange efficiency of the bird’s lung, which is a flow-through system. </p>
<h2>More to come</h2>
<p>As more efficient ways of applying 3-D reconstruction technology are developed, 3-D imaging and animation will become a vital means of research in a biologist’s toolbox. It will be possible to fully conceptualise the forms of structural components and hence allow better understanding of how they work.</p>
<p>Vital insights into the biology of animals, including birds, will allow us to formulate more effective measures that will ensure their conservation in the face of challenges from global warming and environmental pollution.</p>
<p class="fine-print"><em><span>John Maina receives funding from the National Research Foundation of South Africa </span></em></p>An understanding of bird biology is the starting point for conservation efforts. 3-D reconstructions of biological structures greatly add to this understanding.John Maina, Professor of Comparative Respiratory Morphology, University of JohannesburgLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1673272022-02-23T12:35:05Z2022-02-23T12:35:05ZHow to capture satellite images in your backyard – and contribute to a snapshot of the climate crisis<figure><img src="https://images.theconversation.com/files/442250/original/file-20220124-21-opz0yx.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2488%2C1998&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A DIY satellite ground station in London, UK.</span> <span class="attribution"><span class="source">Dyer & Engelmann</span>, <span class="license">Author provided</span></span></figcaption></figure><p>Documentaries about the climate crisis are often illustrated with spectacular satellite images of forest fires, hurricanes and flooded landscapes. People around the world weather these conditions with little control over how their experiences are recorded and represented. Our project, <a href="https://open-weather.community/">open-weather</a>, offers the tools and knowledge to change that.</p>
<p>On the first day of <a href="https://theconversation.com/uk/topics/cop26-80762">COP26</a> (the latest UN climate change conference in Glasgow) our network of 29 volunteers captured a collective image of Earth by tuning into transmissions from three orbiting National Oceanic and Atmospheric Administration (NOAA) satellites. We did this using DIY satellite ground stations made up of radio antennae plugged into laptops.</p>
<p>Each member of the group recorded a satellite image as well as what they could feel and observe of the weather on the ground. Across 14 countries and six continents, the network recorded a total of 38 images which, when stitched together onto a map, produced a <a href="https://cop26-nowcast.open-weather.community/">snapshot of the planet</a> on October 31 2021. </p>
<p>This snapshot included a cyclonic weather system curling around the UK, dust clouds sweeping the Indian subcontinent, and the glaciers of the Patagonian Andes, which have been shown by geographer <a href="http://www.antarcticglaciers.org/glaciers-and-climate/glacier-recession/shrinking-patagonian-glaciers/">Bethan Davies</a> to be rapidly receding and thinning in response to global warming.</p>
<h2>How to take your own satellite images</h2>
<p>Receiving images from the public data broadcast of NOAA satellites is something anyone can learn how to do. All you need is a basic V-shaped antenna, a device called a software-defined radio (SDR) dongle, and one of many free software programmes, like <a href="https://cubicsdr.com/">CubicSDR</a>. The antenna and dongle together cost around £50 (US$66).</p>
<p>Now you’re ready to launch your DIY satellite ground station. First, use a <a href="https://www.n2yo.com/">free online tool</a> to track satellite orbits overhead, then find somewhere outdoors with a clear view of the sky. Connect the antenna to your laptop using the dongle and tune it to a specific frequency using the software. Position the antenna so that the tip of the V points north, and the arms of the V are parallel to the ground as a NOAA satellite passes overhead.</p>
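<p>If you prefer to compute passes yourself rather than use a website, the sketch below does so with the open-source skyfield library. The observer coordinates are placeholders, and the Celestrak group URL and satellite name are assumptions that may need adjusting.</p>
<pre><code># Sketch: predicting upcoming NOAA passes with the open-source skyfield
# library. The observer location is a placeholder; the Celestrak group
# URL and satellite name are assumptions that may need adjusting.
from datetime import timedelta
from skyfield.api import load, wgs84

url = "https://celestrak.org/NORAD/elements/gp.php?GROUP=noaa&FORMAT=tle"
sats = {sat.name: sat for sat in load.tle_file(url)}
noaa19 = sats["NOAA 19"]                    # APT downlink near 137.100 MHz

observer = wgs84.latlon(51.5072, -0.1276)   # placeholder: central London
ts = load.timescale()
t0 = ts.now()
t1 = ts.from_datetime(t0.utc_datetime() + timedelta(days=1))

times, events = noaa19.find_events(observer, t0, t1, altitude_degrees=20.0)
for t, event in zip(times, events):
    print(t.utc_strftime("%Y-%m-%d %H:%M"), ("rise", "culminate", "set")[event])
</code></pre>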
<figure class="align-center ">
<img alt="A person wearing a hat lies on a blanket in a vast field on a sunny day with a laptop and antenna." src="https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=293&fit=crop&dpr=1 600w, https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=293&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=293&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=369&fit=crop&dpr=1 754w, https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=369&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/431347/original/file-20211110-21-1aw0xt2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=369&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Preparing the ground station.</span>
<span class="attribution"><span class="source">Natasha Honey</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Your antenna captures the satellite’s unique radio transmission and sends it to your laptop, where the software transforms the signal into a sound. The sound can be decoded into two images received by the satellite as it passed over you. The first is composed mostly of visible light reflecting off the surface of the Earth; the second is made of infrared radiation – invisible electromagnetic waves emitted by the land, sea and clouds. The way you position your antenna and even your body are recorded in the image as signal and noise. This means each image is unique to the person and place that created it. </p>
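<p>For the technically curious, the sketch below outlines the decoding step: NOAA’s APT transmission is an amplitude-modulated 2,400 Hz subcarrier carrying two image lines per second, at 2,080 pixels per line. The file name is a placeholder, and a real decoder would also synchronise on the line markers and calibrate brightness.</p>
<pre><code># Sketch: turning a recorded NOAA APT pass into a raw image. The signal
# is an AM 2,400 Hz subcarrier carrying two lines per second, 2,080
# pixels per line. "noaa_pass.wav" is a placeholder; a real decoder
# also syncs on line markers and calibrates brightness.
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert, resample

rate, audio = wavfile.read("noaa_pass.wav")      # mono recording of one pass
envelope = np.abs(hilbert(audio.astype(float)))  # AM demodulation

pixels_per_second = 2 * 2080                     # 2 lines/s, 2,080 px/line
n_pixels = int(len(envelope) / rate * pixels_per_second)
samples = resample(envelope, n_pixels)

n_lines = len(samples) // 4160
image = samples[: n_lines * 4160].reshape(n_lines, 4160)
image = 255 * (image - image.min()) / (image.max() - image.min())
</code></pre>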
<p>Open-weather was founded in April 2020 out of a desire to open up this practice to non-specialists. We published a series of <a href="https://publiclab.org/wiki/open-weather">how-to guides</a> and <a href="https://publiclab.org/notes/sashae/06-21-2021/diy-satellite-ground-station-workshop">hosted workshops</a> in different countries. We also <a href="https://cop26-nowcast.open-weather.community/">created artworks</a> in collaboration with design studio <a href="https://rectangle.design/">Rectangle</a>, and commissioned by The Photographers’ Gallery in London. As a result, a network of amateur satellite image decoders has begun to form around the world.</p>
<p>Here’s what they captured while world leaders were gathered in Glasgow for COP26.</p>
<figure class="align-center ">
<img alt="An over-the-shoulder view of someone editing satellite images on a laptop." src="https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/447771/original/file-20220222-13-1eep2x1.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Look what the satellite picked up.</span>
<span class="attribution"><span class="source">Sasha Engelmann</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>The climate crisis in a snapshot</h2>
<p>For their part in the project, cartographer and marine technician Joaquín Ezcurra and journalist Aimée Juhazs travelled to Parque Nacional Ciervo de los Pantanos in Argentina – a wetland at risk from climate change.</p>
<p>It was “a day of unexpected low temperatures” after the arrival of the cold <em>sudestada</em> wind, Ezcurra and Juhazs wrote in their field notes. They added that “communities living in the delta of the Paraná River in Argentina are suffering dearly from both low levels of water, and increasing numbers of fires during the winter dry season”.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two black-and-white satellite images side by side showing a watery landscape." src="https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=393&fit=crop&dpr=1 600w, https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=393&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=393&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=493&fit=crop&dpr=1 754w, https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=493&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/447808/original/file-20220222-23-refm11.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=493&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Climate change threatens to dry out some of the world’s wetlands.</span>
<span class="attribution"><span class="source">Joaquín Ezcurra & Aimée Juhazs</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Ankit Sharma, a mechanical engineering student in Mumbai, India, submitted a trio of images covering the vast region from the Persian Gulf to the Himalayas. During the second satellite pass, he noted: “My laptop had a layer of dust on it … heavy pollution was felt”.</p>
<figure class="align-center ">
<img alt="Two black-and-white satellite images side by side showing land and clouds." src="https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=493&fit=crop&dpr=1 600w, https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=493&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=493&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=620&fit=crop&dpr=1 754w, https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=620&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/448020/original/file-20220223-19-1f8ln60.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=620&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Mumbai from above.</span>
<span class="attribution"><span class="source">Ankit Sharma</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>“The pattern of cloud [reflects] the beauty of the nature”, wrote radio amateur Yoshi Matsuoka in Atsugi Kanagawa, Japan.</p>
<p>He noted, too, that the region had had “extreme torrential rain”. Many contributors wrote about their experiences of exceptional rainfall.</p>
<p>“Weather systems are getting tougher and tougher to predict”, and so too is knowing “what to plant, where to plant, and when to plant”, wrote Natasha Honey, a farmer in New South Wales, Australia.</p>
<figure class="align-right ">
<img alt="A person holds an antenna aloft with one hand on a laptop on the roof of a car." src="https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1234&fit=crop&dpr=1 600w, https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1234&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1234&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1551&fit=crop&dpr=1 754w, https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1551&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/431350/original/file-20211110-13-1gz41vm.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1551&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Tshimbalanga’s ground station in Kinshasa.</span>
<span class="attribution"><span class="source">Cedrick Tshimbalanga</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>In Glasgow, not far from the COP26 conference venue, artist and curator Alison Scott commented: “Climate change is felt … in a lack of public transport resilience; in bike lanes being opened (and closed) … in the corporate hijacking of COP26 and the city’s unpreparedness for its scale; in the erosion of rogue-landlord-ed sandstone tenement buildings in need of retro-fitting. It is felt in the history of the place.”</p>
<p>“The sun dominates”, wrote artist Cédrick Tshimbalanga in Kinshasa, Democratic Republic of Congo. Before, “the rainy season was alive and rain was abundant, and during the dry season, it was much colder”.</p>
<p>Zack Wettstein, a doctor in Seattle, Washington, received a satellite transmission during a “cold, dry, autumn morning, with no wind in sight … in stark contrast to the weather of the past week, when we were struck with an atmospheric river of rain from a bomb cyclone off the Pacific”.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two black-and-white satellite images side by side showing a stretch of coastline occluded by cloud." src="https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=392&fit=crop&dpr=1 600w, https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=392&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=392&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=493&fit=crop&dpr=1 754w, https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=493&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/447810/original/file-20220222-15-16agcrb.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=493&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Post-storm calm reigns in Seattle, US.</span>
<span class="attribution"><span class="source">Zack Wettstein</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>He added: “As a physician working in the emergency department, I see patients affected by these hazards wrought by climate change … with injuries, illness and exacerbations of their underlying disease”.</p>
<p>We received a surprise contribution from Barfrost in Kirkenes, Norway, who imaged the cartographic North Pole and noted that “southern insects [are surviving] the winter”.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two black-and-white satellite images side by side showing the outline of coasts in the Arctic." src="https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=385&fit=crop&dpr=1 600w, https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=385&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=385&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=484&fit=crop&dpr=1 754w, https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=484&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/447775/original/file-20220222-23-1xhgew6.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=484&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Arctic Circle – a region at the fore of Earth’s changing climate.</span>
<span class="attribution"><span class="source">Barfrost</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>These satellite images and field notes demonstrate that the climate crisis feels different depending on who you are and where you live. In some places, dry seasons are expanding. Elsewhere, it’s clouds of dust, increasingly volatile storms, or health effects triggered by the air that we breathe.</p>
<p>As politicians fail to respond to the climate emergency, a growing community of Earth-watchers has practical and political potential. Together, we might learn to be collectively responsible for, and accountable to, the environments we are changing.</p>
<p><em>For more images, field notes and how-to guides, visit <a href="http://cop26-nowcast.open-weather.community">our website</a>.</em></p>
<p class="fine-print"><em><span>Sasha Engelmann does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p><p class="fine-print"><em><span>Sophie Dyer is a project manager at Amnesty International.</span></em></p>With an antenna, a laptop and some software, you can take a picture of Earth from space.Sasha Engelmann, Lecturer in GeoHumanities, Royal Holloway University of LondonSophie Dyer, Researcher in Human Rights, Harvard UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1741542022-02-02T13:07:22Z2022-02-02T13:07:22ZNew AI technique identifies dead cells under the microscope 100 times faster than people can – potentially accelerating research on neurodegenerative diseases like Alzheimer’s<figure><img src="https://images.theconversation.com/files/443241/original/file-20220128-19-25rkui.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2129%2C1408&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Eliminating human guesswork can make for faster and more accurate research.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/nerve-cells-illustration-royalty-free-illustration/973901864">KTSDESIGN/Science Photo Library via Getty Images</a></span></figcaption></figure><p>Understanding when and why a cell dies is fundamental to the study of human development, disease and aging. For <a href="https://doi.org/10.1080/17460441.2019.1623784">neurodegenerative diseases</a> such as Lou Gehrig’s disease, Alzheimer’s and Parkinson’s, identifying dead and dying neurons is critical to developing and testing new treatments. But identifying dead cells can be tricky and has been a constant problem throughout <a href="https://scholar.google.com/citations?hl=en&user=cQdBoWUAAAAJ&view_op=list_works&alert_preview_top_rm=2&sortby=pubdate">my career as a neuroscientist</a>.</p>
<p>Until now, scientists have had to manually mark which cells look alive and which look dead under the microscope. Dead cells have a <a href="https://doi.org/10.1038/cdd.2009.44">characteristic balled-up appearance</a> that is relatively easy to recognize once you know what to look for. My research team and I have employed a veritable army of undergraduate interns paid by the hour to scan through thousands of images and keep a tally of when each neuron in a sample appears to have died. Unfortunately, doing this by hand is a slow, expensive and sometimes error-prone process.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Time lapse of dying neuron over 10 minutes under a microscope" src="https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=95&fit=crop&dpr=1 600w, https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=95&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=95&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=120&fit=crop&dpr=1 754w, https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=120&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/443555/original/file-20220131-139881-ixpc7n.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=120&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">This is a time lapse of what a dying neuron looks like under a microscope. Imagine sifting through hundreds of thousands of images like these by hand!</span>
<span class="attribution"><a class="source" href="https://doi.org/10.1126/sciadv.abf8142">Jeremy Linsley</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<p>Making matters even more difficult, scientists recently began using <a href="https://doi.org/10.1038/nature02998">automated microscopes</a> to continually capture images of cells as they change over time. While automated microscopes make it easier to take photos, they also create a massive number of images to manually sort through. It became clear to us that manual curation was neither accurate nor efficient. Furthermore, most <a href="https://doi.org/10.1038/s41467-021-25549-9">imaging techniques</a> can detect only the late stages of cell death, sometimes days after a cell has already begun to decompose. This makes it difficult to distinguish what actually contributed to the cell’s death from factors merely involved in its decay. </p>
<p>My colleagues and I have been trying for some time to automate the curation process. Our initial attempts could not handle the wide range of cell and microscope types we use in our research, nor rival the accuracy of our interns. But a <a href="https://doi.org/10.1126/sciadv.abf8142">new artificial intelligence technology</a> my research team developed can identify dead cells with both superhuman accuracy and speed. This advance could potentially turbocharge all kinds of biomedical research, especially on neurodegenerative disease.</p>
<h2>AI to the rescue</h2>
<p>Artificial intelligence has recently taken the field of microscopy by storm. A form of AI called <a href="https://doi.org/10.1038/nature21056">convolutional neural networks, or CNNs</a>, has been of particular interest because it can analyze images as accurately as humans can.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/YRhxdVk_sIs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Convolutional neural networks are able to filter for specific patterns in images through multiple layers of processing.</span></figcaption>
</figure>
<p>Convolutional neural networks can be trained to recognize and discover complex patterns in images. As with human vision, giving CNNs many example images and pointing out what features to pay attention to can teach the computer to recognize patterns of interest.</p>
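<p>For readers curious about what that training looks like in practice, here is a minimal sketch in Python/PyTorch of a small CNN learning from example cell images with human-provided alive/dead labels. It is a generic toy model, not the network from our study, and the crop size, layer sizes and learning rate are illustrative assumptions.</p>
<pre><code>import torch
import torch.nn as nn

class TinyCellCNN(nn.Module):
    """Toy two-class (alive/dead) image classifier."""
    def __init__(self):
        super().__init__()
        # Two convolution/pooling stages learn local image patterns.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # A small linear head maps those patterns to two class scores.
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # assumes 64x64 crops

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TinyCellCNN()
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a batch of labeled examples (0 = alive, 1 = dead);
# real training repeats this over many thousands of curated images.
images = torch.randn(8, 1, 64, 64)   # stand-in for microscope crops
labels = torch.randint(0, 2, (8,))   # stand-in for human annotations
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
</code></pre>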
<p>These patterns could include <a href="https://doi.org/10.1016/j.cell.2018.03.040">biological phenomena</a> difficult to see by eye. For example, one research group was able to train CNNs to identify <a href="https://doi.org/10.1038/nature21056">skin cancer</a> more accurately than trained dermatologists. Even more recently, colleagues of mine were able to train CNNs to <a href="https://doi.org/10.1016/j.cell.2018.03.040">identify complex biological signatures</a> such as cell type in microscopy images.</p>
<p>Building on this work, we developed a new technology called <a href="https://doi.org/10.1126/sciadv.abf8142">biomarker-optimized CNNs, or BO-CNNs</a>, to identify cells that have died. First, we needed to teach the BO-CNN to distinguish between clearly dead and clearly alive cells. So we prepared a petri dish with mouse neurons that were engineered to produce a nontoxic protein called a <a href="https://doi.org/10.1038/s41467-021-25549-9">genetically encoded death indicator, or GEDI</a>, that colored living cells green and dead cells yellow. The BO-CNN could easily learn that green meant “alive” and yellow meant “dead.” But it was also learning other features distinguishing living and dead cells that aren’t so obvious to the human eye.</p>
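<p>In effect, the GEDI signal lets the computer label its own training data. The sketch below illustrates that idea in a deliberately simplified form – a cell crop is called dead when its yellow emission outweighs its green. The channel handling and the threshold are illustrative assumptions, not the pipeline from our paper.</p>
<pre><code>import numpy as np

def gedi_label(green_channel, yellow_channel, ratio_threshold=1.0):
    """Label one cell crop: 0 = alive, 1 = dead (illustrative threshold)."""
    green = float(np.mean(green_channel))
    yellow = float(np.mean(yellow_channel))
    # The death indicator shifts a cell's emission from green to yellow,
    # so the yellow-to-green intensity ratio separates the two states.
    ratio = yellow / max(green, 1e-6)
    return 1 if ratio > ratio_threshold else 0
</code></pre>
<p>Because a label derived this way depends on the ratio of two signals rather than on absolute brightness, it is relatively robust to variation in illumination from image to image.</p>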
<p>After the BO-CNN learned how to identify the characteristics that distinguished the green cells from the yellow, we showed it neurons that weren’t distinguished by color. The BO-CNN was able to correctly label live and dead cells significantly faster and more accurately than people trained to do the same thing. The model could even look at images of cell types it had not seen before, taken with different types of microscopes, and still correctly identify dead cells.</p>
<p>One obvious question still remained, however – why was our model so effective at finding dead cells?</p>
<p>Researchers often treat the decisions CNNs make as <a href="https://doi.org/10.1038/s42256-019-0048-x">black boxes</a>, with the strategy the computer uses to solve a visual task considered less important than how well it performs. However, because there must be some patterns in the cell structure the model focuses on to make its decisions, identifying these patterns could help scientists better define what cell death looks like and understand why it occurs. </p>
<p>To figure out what these patterns were, we used additional <a href="https://doi.org/10.1007/s11263-019-01228-7">computational tools</a> to create visual representations of the BO-CNN’s decisions. We found that our BO-CNN model detects cell death in part by focusing on changing fluorescence patterns in the nucleus of the cell. This is a feature that human curators were previously unaware of, and it may be the reason previous AI models were less accurate than the BO-CNN.</p>
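<p>Tools of this kind are typically gradient-based. As a simple stand-in for the methods we used, the sketch below computes a basic saliency map for a PyTorch classifier like the toy model above: it asks which pixels the model’s “dead” score is most sensitive to.</p>
<pre><code>import torch

def saliency_map(model, image):
    """image: tensor of shape (1, 1, H, W); returns an (H, W) heat map."""
    model.eval()
    image = image.detach().clone().requires_grad_(True)
    score = model(image)[0, 1]   # the model's logit for the "dead" class
    score.backward()             # gradients flow back to the input pixels
    # Pixels with large absolute gradients are the ones the decision is
    # most sensitive to - e.g. fluorescence patterns in the nucleus.
    return image.grad.abs().squeeze()
</code></pre>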
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Microscopy images showing rat neurons before and after treatment with glutamate; the neurons are colored green when alive and yellow when dead" src="https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=354&fit=crop&dpr=1 600w, https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=354&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=354&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=445&fit=crop&dpr=1 754w, https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=445&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/443244/original/file-20220128-14047-1wva32o.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=445&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">These images show alive neurons colored green and dead neurons colored yellow. To induce death, neurons were treated with an excess of the neurotransmitter glutamate, overstimulating them to the point of irreversible damage.</span>
<span class="attribution"><a class="source" href="https://doi.org/10.1126/sciadv.abf8142">Jeremy Linsley</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<h2>Harnessing the power of AI</h2>
<p>I believe our approach represents a major advance in harnessing artificial intelligence to study complex biology, and this proof of concept could be broadly applied beyond detecting cell death in microscopic imaging. <a href="https://github.com/finkbeiner-lab/GEDI-ORDER">Our software is open source</a> and available to the public.</p>
<p>Live-cell microscopy is extremely rich in information that researchers can have difficulty interpreting. But with the use of technologies like BO-CNNs, researchers can now use signals from cells themselves to train AI to recognize and interpret signals in other cells. By taking human guesswork out of the process, BO-CNNs increase the reproducibility and speed of research and can help researchers discover new phenomena in images that they would otherwise struggle to recognize.</p>
<p>With the power of AI, my research team is currently working to extend our BO-CNN technology toward predicting the future – identifying damaged cells before they even start to die. We believe this could be a game-changer for neurodegenerative disease research, helping pinpoint new ways to prevent neuronal death and eventually leading to more effective treatments.</p>
<p class="fine-print"><em><span>Jeremy Linsley was supported by the National Institutes of Health (U54 NS191046, R37 NS101996, RF1 AG058476, RF1 AG056151, RF1 AG058447, P01 AG054407, U01 MH115747), the National Library of Medicine (R01 LM013617), the Koret Foundation, the Taube/Koret Center for Neurodegenerative Research, the National Center for Research Resources (RR18928), the Target ALS Foundation, the Amyotrophic Lateral Sclerosis Association Neuro Collaborative, Mike Frumkin, and the Department of Defense (W81XWH-13-ALSRP-TIA).</span></em></p>Understanding when and how neurons die is an important part of research on neurodegenerative diseases like Lou Gehrig’s, Alzheimer’s and Parkinson’s diseases.Jeremy Linsley, Scientific Program Leader at Gladstone Institutes, University of California, San FranciscoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1681592021-10-04T11:40:54Z2021-10-04T11:40:54ZBepiColombo’s first close-up pictures of Mercury’s surface hint at answers to the planet’s secrets<figure><img src="https://images.theconversation.com/files/423350/original/file-20210927-15-1qyik7a.jpg?ixlib=rb-1.1.0&rect=5%2C5%2C3830%2C2151&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Artist's impression of BepiColombo during a swing-by of Mercury</span> <span class="attribution"><span class="source">ESA/ATG medialab</span></span></figcaption></figure><p>The <a href="https://www.esa.int/Science_Exploration/Space_Science/BepiColombo/Mercury_ahead!">BepiColombo spacecraft</a> – a joint project by the European and Japanese space agencies – swung by its destination planet Mercury in the early hours of October 2 2021. Passing within just 200km of the surface of Mercury, it sent back some <a href="https://www.esa.int/Science_Exploration/Space_Science/BepiColombo/BepiColombo_s_first_views_of_Mercury">spectacular pictures</a>.</p>
<p>For those of us who have worked for a decade or more on this mission, there could hardly be a better way to celebrate what would have been the 101st birthday of the mission’s namesake, Italian mathematician and engineer Giuseppe Colombo. His groundbreaking work in this area earned him the title of the <a href="https://www.esa.int/About_Us/ESA_history/Giuseppe_Bepi_Colombo_Grandfather_of_the_fly-by">grandfather</a> of the planetary fly-by technique, now more often termed a “swing-by”.</p>
<p>BepiColombo’s cruise from Earth began in <a href="https://theconversation.com/europe-blasts-off-to-mercury-heres-the-rocket-science-104641">October 2018</a>, and its journey is far from over. It will travel twice around the sun in the time it takes Mercury to orbit the sun three times – about 264 days, since Mercury completes one orbit roughly every 88 days. This will allow it to rendezvous with the planet for another swing-by on June 23 2022. </p>
<p>After a total of six Mercury swing-bys, the cumulative effect of the planet’s gravity will reduce the spacecraft’s velocity to the point where it can fall into orbit around Mercury towards the end of 2025.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="The BepiColombo spacecraft showing where the external cameras are mounted" src="https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=215&fit=crop&dpr=1 600w, https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=215&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=215&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=270&fit=crop&dpr=1 754w, https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=270&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/422196/original/file-20210920-22-14erckt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=270&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Left: the location of the three MCAMs on the Mercury Transfer Module, seen in an exploded view of the spacecraft stack. Right: artist’s impression of the stacked spacecraft.</span>
<span class="attribution"><span class="source">Left: Micro-Cameras & Space Exploration SA. Right: spacecraft: ESA/ATG medialab; Mercury: Nasa/JPL</span></span>
</figcaption>
</figure>
<p>BepiColombo is actually composed of two <a href="https://sci.esa.int/web/bepicolombo">connected</a> spacecraft and a propulsion unit. During its cruise through interplanetary space, the European orbiter (called the “<a href="https://www.cosmos.esa.int/web/bepicolombo/mpo">Mercury Planetary Orbiter</a>” or MPO) is attached on one side to the interplanetary propulsion unit (or “<a href="https://www.esa.int/Science_Exploration/Space_Science/BepiColombo/Mercury_Transfer_Module">Mercury Transfer Module</a>”). On the other, it carries a Japanese orbiter named Mio (or “<a href="https://www.isas.jaxa.jp/en/missions/spacecraft/current/mmo.html">Mercury Magnetospheric Orbiter</a>”), plus a sunshield to prevent Mio from overheating. </p>
<p>This stacked configuration obstructs the openings through which sophisticated visible, infrared and X-ray cameras inside MPO – capable of imaging and analysing Mercury’s surface in great detail – will operate once MPO finally becomes free-flying. In fact, most of BepiColombo’s science instruments will be wholly or partly inoperative until each orbiter is set free, around December 2025.</p>
<h2>Adding the cameras</h2>
<p>Until a relatively late stage in mission planning, it was accepted that BepiColombo would be “flying blind” during its whole cruise from Earth, including during swing-bys – meaning no images would be available until orbit around Mercury had been achieved. </p>
<p>But the level of public interest aroused in 2015 by images of <a href="https://theconversation.com/rosetta-scientists-unveil-the-source-of-ice-and-dust-jets-on-comet-67p-48122">comet 67P</a> from the Rosetta mission led BepiColombo engineers Kelly Geelen and James Windsor to propose that low-cost lightweight cameras should be added to the spacecraft. </p>
<p>By the end of 2016, it was agreed that three small monitoring cameras – each only 6.5cm in length – would be mounted onto the craft. These would snap planetary pictures during swing-bys. </p>
<p>It was decided to place these cameras on the Mercury Transfer Module, where they would also be able to monitor the deployment of the solar panels that provide the spacecraft with power, the magnetometer boom used for measuring magnetic fields, and the communication antennae.</p>
<figure class="align-center ">
<img alt="One of the monitoring cameras as used on BepiColombo" src="https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=263&fit=crop&dpr=1 600w, https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=263&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=263&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=330&fit=crop&dpr=1 754w, https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=330&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/422419/original/file-20210921-17-fjmmav.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=330&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Extremely small and light monitoring cameras carry out a range of functions on a spacecraft such as BepiColombo.</span>
<span class="attribution"><span class="source">Micro-Cameras & Space Exploration SA.</span></span>
</figcaption>
</figure>
<h2>What Bepi saw</h2>
<p>During BepiColombo’s first Mercury swing-by, the fields of view of monitoring cameras two and three tracked across the planet. Camera three showed us part of the southern hemisphere, beginning with a view of sunrise over <a href="https://messenger.jhuapl.edu/Explore/Science-Images-Database/gallery-image-238.html">Astrolabe Rupes</a> – a striking feature named after a French Antarctic exploration ship. </p>
<p>Astrolabe Rupes is a 250km long “<a href="http://lroc.sese.asu.edu/posts/374">lobate scarp</a>” – a long, curved structure marking where one part of the planet’s crust has been pushed over nearby terrain, due to the whole planet contracting as it slowly cooled. </p>
<p>There are some much smaller <a href="https://theconversation.com/the-moon-is-still-geologically-active-study-suggests-116768">equivalent features</a> on the Moon, but Mercury is the only nearby celestial body where lobate scarps are known to occur on such a large scale.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A geological feature of Mercury" src="https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/424297/original/file-20211002-25-1lyhkp0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Astrolabe Rupes catches the light of the rising sun, captured at a range of 1183km. MPO’s transmitting antenna is brightly lit in the foreground, contributing to a ghosting effect in the middle of the image.</span>
<span class="attribution"><span class="source">ESA/BepiColombo/MTM, CC BY-SA 3.0 IGO</span></span>
</figcaption>
</figure>
<p>Four minutes later, the perspective had changed enough to reveal a wider area, including the lava-flooded, 251km-wide <a href="https://www.esa.int/ESA_Multimedia/Images/2021/10/A_taste_of_Mercury_geology_annotated">Haydn crater</a> and <a href="https://theconversation.com/mysterious-red-spots-on-mercury-get-names-but-what-are-they-95114">Pampu Facula</a>, one of many bright spots likely formed by explosive volcanic eruptions. Both of these features attest to Mercury’s long volcanic history, at its most active more than three billion years ago but probably persisting until around one billion years ago. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A wider-angle view of Mercury's surface" src="https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/424298/original/file-20211002-25-1obbu0t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Astrolabe Rupes is still visible in this image taken at 2687km, allowing a wider area of the planet’s surface to be seen.</span>
<span class="attribution"><a class="source" href="https://www.esa.int/Science_Exploration/Space_Science/BepiColombo/BepiColombo_s_first_views_of_Mercury">ESA/BepiColombo/MTM, CC BY-SA 3.0 IGO</a></span>
</figcaption>
</figure>
<p>Meanwhile, camera two focused on Mercury’s northern hemisphere, including the region surrounding <a href="https://www.nasa.gov/mission_pages/messenger/multimedia/messenger_orbit_image20111123_1.html">Calvino crater</a>: an important location for deciphering what lies in the layers of Mercury’s crust. </p>
<p>It also showed Lermontov crater: a region which appears bright because it is host to both <a href="https://news.ncsu.edu/2016/08/byrne-mercury/">volcanic deposits</a> and “hollows”, where a currently unknown <a href="https://www.smithsonianmag.com/smart-news/mercurys-messy-surface-hints-planet-was-once-habitable-180974501/">volatile ingredient</a> of the crust is being lost to space via a mysterious process.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Mercury's North hemisphere" src="https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/424299/original/file-20211002-46781-ub8qnv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">At 2418km, Mercury’s northern hemisphere is towards the lower left, and a brightly sunlit magnetometer boom is in the foreground.</span>
<span class="attribution"><span class="source">ESA/BepiColombo/MTM, CC BY-SA 3.0 IGO</span></span>
</figcaption>
</figure>
<p>Nasa’s <a href="https://solarsystem.nasa.gov/missions/messenger/in-depth/">MESSENGER</a> mission orbited Mercury between 2011 and 2015, revealing a <a href="https://theconversation.com/the-more-we-learn-about-mercury-the-weirder-it-seems-55972">perplexing planet</a>. We are still struggling to understand its composition, origin and history.</p>
<p>Why Mercury has features such as explosive <a href="https://mobile.arc.nasa.gov/public/iexplore/missions/pages/yss/may.html">volcanoes</a> and strange, unique <a href="https://www.planetary.org/articles/02171332-what-are-mercurys-hollows">hollows</a> on its surface is just one of the problems we hope further study will solve. Once in orbit, BepiColombo’s advanced payload of scientific instruments will help us understand more about how Mercury formed and what it’s made of.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-more-we-learn-about-mercury-the-weirder-it-seems-55972">The more we learn about Mercury, the weirder it seems</a>
</strong>
</em>
</p>
<hr>
<p>In the meantime, these extraordinary swing-by pictures at least remind us that we have a healthy spacecraft heading to an exciting destination.</p>
<p class="fine-print"><em><span>David Rothery is Professor of Planetary Geosciences at the Open University. He is co-leader of the European Space Agency's Mercury Surface and Composition Working Group, and a Co-Investigator on MIXS (Mercury Imaging X-ray Spectrometer) that is now on its way to Mercury on board the European Space Agency's Mercury orbiter BepiColombo. He has received funding from the UK Space Agency and the Science & Technology Facilities Council for work related to Mercury and BepiColombo, and from the European Commission under its Horizon 2020 programme for work on planetary geological mapping (776276 Planmap). He is author of Planet Mercury - from Pale Pink Dot to Dynamic World (Springer, 2015), Moons: A Very Short Introduction (Oxford University Press, 2015) and Planets: A Very Short Introduction (Oxford University Press, 2010). He is Educator on the Open University's free learning Badged Open Course (BOC) on Moons and its equivalent FutureLearn Moons MOOC, and chair of the Open University's level 2 course on Planetary Science and the Search for Life.</span></em></p>What did Mercury look like as BepiColombo swung by?David Rothery, Professor of Planetary Geosciences, The Open UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1265012020-02-20T19:00:23Z2020-02-20T19:00:23ZBrain temperature is difficult to measure. Here’s how a new infrared technique can help<figure><img src="https://images.theconversation.com/files/315417/original/file-20200214-11005-1iydf8y.jpg?ixlib=rb-1.1.0&rect=44%2C0%2C5000%2C3330&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock/Popartic</span></span></figcaption></figure><p>Brain temperature is implicated in many common conditions including stroke, multiple sclerosis, epilepsy, traumatic brain injury, and headaches.</p>
<p>Changes in brain temperature can indicate there is a disease developing, but researchers have struggled to measure it. The use of conventional thermometers is very invasive, and remote measuring techniques are blunt and often inaccurate. </p>
<p>But a new technique that combines infrared light with temperature-sensitive nanoparticles could be the solution.</p>
<h2>Understanding brain temperature</h2>
<p>Temperature is tightly regulated in living beings, so sudden changes usually indicate that something is amiss. <a href="http://www.frontiersin.org/journal/10.3389/fnins.2014.00307/abstract">The brain is no exception to this.</a> Brain temperature depends on neural activity, and will vary if blood flow is disrupted (as occurs, for instance, in stroke). </p>
<p>Brain temperature is not only relevant for diagnosing conditions, it can also be harnessed for therapeutic uses. Heat can kill cells, which may be useful <a href="https://theconversation.com/destroying-tumors-with-gold-nanoparticles-98422">in treating tumours</a>. Manipulating brain temperature can also activate or suppress neural activity, which may be used to alleviate the symptoms of some neurological disorders, such as Parkinson’s.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/ozzy-osbourne-has-a-type-of-parkinsons-disease-called-parkin-a-neurologist-explains-130367">Ozzy Osbourne has a type of Parkinson's disease called Parkin: A neurologist explains</a>
</strong>
</em>
</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=408&fit=crop&dpr=1 600w, https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=408&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=408&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=513&fit=crop&dpr=1 754w, https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=513&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/315444/original/file-20200214-10980-57cux5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=513&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The brain is very complex and well-protected, which makes it difficult for researchers to measure its temperature.</span>
<span class="attribution"><span class="source">Shutterstock/Nata-Lia</span></span>
</figcaption>
</figure>
<h2>Traditional methods lacking</h2>
<p>Researchers have struggled to detect neurological disorders based on changes in brain temperature. This is because it is difficult to measure brain temperature accurately with current technology. </p>
<p>The brain is not only extremely complex, it is also very delicate and well-protected. To make matters more complicated, brain temperature changes associated with significant variations in neural activity are usually small (below 1°C) and may occur very rapidly over a small area.</p>
<p>Conventional thermometers are not a great option for sensing brain temperature. They require contact with the object they are measuring – so in this case, they need to be inserted into the brain itself. This very invasive procedure requires drilling a hole in the skull and can damage and scar the brain permanently.</p>
<p>The reward for such a risk is very limited. These thermometers can only measure the temperature at a single spot, making them useless to understand how it changes across different brain regions.</p>
<p>There are options for remote temperature sensing, but they also fail at mapping brain temperature effectively. They can <a href="https://www.tandfonline.com/doi/full/10.3109/02656736.2013.832411">only record</a> surface temperature, or are not sensitive and fast enough.</p>
<h2>Measuring temperature without entering the brain</h2>
<p>To measure brain temperature accurately, we need a very sensitive technique that can measure small temperature changes remotely, in real time, and with good spatial resolution. This is where near-infrared fluorescence comes in.</p>
<p>Fluorescence is a common technique for high-resolution, real-time imaging of cells. Researchers use contrast agents (dyes or nanoparticles) that emit visible light when illuminated. Some of these contrast agents change their fluorescence depending on the local temperature, acting as tiny local thermometers.</p>
<p>But fluorescent thermometers that emit visible light are not very useful for measuring temperature below the skin surface – they would only work if our skin were transparent. </p>
<p>The skin, however, <em>is</em> <a href="https://www.nature.com/articles/nnano.2009.326">quite transparent</a> to near-infrared light, and the same is true for <a href="https://www.nature.com/articles/s41551-016-0010">fat, muscle and bone</a>.</p>
<p>Visible light is just the part of the <a href="https://theconversation.com/explainer-what-is-the-electromagnetic-spectrum-8046">electromagnetic spectrum</a> that our eyes can see. Invisible near-infrared light has a slightly longer wavelength, which our eyes – and conventional fluorescence imaging cameras – cannot detect.</p>
<p>Using near-infrared fluorescent contrast agents, researchers have been able to <a href="https://www.nature.com/articles/nphoton.2014.166">see blood vessels in the brain</a> through the skull in live mice – even tiny vessels no thicker than a few microns (one thousandth of a millimeter).</p>
<p>Some near-infrared nanoparticles are <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/adom.201600508">highly sensitive to changes in temperature</a>. Combining them with wide-field fluorescence imaging, it is possible to measure brain temperature through the scalp and skull – no drill holes or inserted thermometers required.</p>
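<p>Conceptually, the readout is a calibration problem: a temperature-dependent property of the nanoparticles’ emission – for instance, the ratio of intensities in two bands – is first measured at known temperatures, and the fitted curve is then inverted to turn each camera pixel into a temperature estimate. The sketch below illustrates the idea with invented calibration numbers; it is not the calibration from our experiments.</p>
<pre><code>import numpy as np

# Calibration: band-intensity ratios recorded at known temperatures
# (all numbers invented for illustration).
calib_temps = np.array([30.0, 33.0, 36.0, 39.0, 42.0])   # degrees C
calib_ratios = np.array([0.82, 0.88, 0.94, 1.01, 1.07])  # I_band1 / I_band2

# Fit a straight line, ratio = a*T + b, then invert it.
a, b = np.polyfit(calib_temps, calib_ratios, 1)

def temperature_from_ratio(ratio):
    """Convert a measured emission ratio into an estimated temperature."""
    return (ratio - b) / a

# Every camera pixel yields its own ratio, so the conversion produces a
# temperature map of the imaged region, not a single-point reading.
ratio_frame = np.full((4, 4), 0.97)   # stand-in for one measured frame
print(temperature_from_ratio(ratio_frame))
</code></pre>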
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=320&fit=crop&dpr=1 600w, https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=320&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=320&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=402&fit=crop&dpr=1 754w, https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=402&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/302779/original/file-20191120-502-1jt46pu.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=402&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Using a new technique harnessing near-infrared light, researchers were able to monitor real-time brain temperature in mice.</span>
<span class="attribution"><span class="source">Adapted from Advanced Functional Materials, 2018.</span></span>
</figcaption>
</figure>
<p>My research used this technique to see, in real time, <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/adfm.201806088">how brain temperature drops in live mice after drug administration</a>. Near-infrared fluorescence thermometry can help us understand how brain temperature and neurological diseases are related – eventually leading to the application of temperature-based diagnosis in humans. </p>
<p>For this technique to become fully useful, the delivery of the temperature-sensitive contrast agents still needs improvement. Having them reach the brain and stay there for as long as required – without altering the function of the brain – is still a major challenge. To avoid invasive brain injections (as we used in our work), the next step is developing an efficient method to get the contrast agents across the <a href="https://theconversation.com/explainer-what-is-the-blood-brain-barrier-and-how-can-we-overcome-it-75454">blood-brain barrier</a>.</p>
<p class="fine-print"><em><span>Blanca del Rosal Rabes does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A new tool for seeing hotspots in the brain could help doctors detect neurological disorders.Blanca del Rosal Rabes, Postdoctoral Research Fellow, Swinburne University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1275232020-01-17T03:19:39Z2020-01-17T03:19:39ZAustralian sea lions are declining. Using drones to check their health can help us understand why<figure><img src="https://images.theconversation.com/files/303412/original/file-20191125-74562-ijrn48.jpg?ixlib=rb-1.1.0&rect=0%2C24%2C5423%2C3607&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Australian sea lions (Neophoca cinerea) are one of the rarest pinnipeds in the world and they are declining.</span> <span class="attribution"><a class="source" href="http://www.jarrodhodgson.com.au">Jarrod Hodgson</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>Australian sea lions are in trouble. Their population has never recovered from the impact of the commercial sealing that occurred mainly in the 19th century.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/304174/original/file-20191128-176629-syyjf5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Low-lying rock islands and outcrops make important breeding sites for Australian sea lions but many are threatened by sea-level rise.</span>
<span class="attribution"><span class="source">J. Hodgson</span></span>
</figcaption>
</figure>
<p>Currently, the Australian sea lion is a threatened species (listed as <a href="https://www.iucnredlist.org/species/14549/4443172">endangered by the International Union for Conservation of Nature or IUCN</a>) with the population estimated at 10,000–12,000. More than 80% of these animals live in the coastal waters of South Australia, where their numbers are estimated to have fallen by more than half over the past 40 years. </p>
<p>The sea lions’ survival is threatened by many factors, including bycatch in commercial fisheries, entanglement in marine debris and impacts related to climate change. </p>
<p>With time running out, the sea lions’ survival depends on informed management. One important step is to establish a low-risk way of quickly assessing the health of the current population. The results could help us identify how to stop the population declining. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australias-other-reef-is-worth-more-than-10-billion-a-year-but-have-you-heard-of-it-45600">Australia's 'other' reef is worth more than $10 billion a year - but have you heard of it?</a>
</strong>
</em>
</p>
<hr>
<h2>Technological insight</h2>
<p>One common way to get a quick idea of an animal’s health is to assess its body using a measure equivalent to the <a href="https://en.wikipedia.org/wiki/Body_mass_index">body mass index</a> (BMI) for humans, which is calculated from a person’s mass divided by the square of their height. But using a tape measure and scales to obtain the size and mass of Australian sea lions is time consuming, costly and involves risky anaesthesia of endangered animals.</p>
<p>With our colleagues Dirk Holman and <a href="http://www.antarctica.gov.au/science/meet-our-scientists/dr-aleks-terauds">Aleks Terauds</a>, we recently developed a technique to non-invasively estimate the body condition of Australian sea lions by using a drone to collect high-resolution photos of sedated sea lions. We then used the photos to digitally reconstruct a 3D model of each animal to estimate its length, width and overall volume – and compared these to physical measurements. </p>
<p>The technique, recently published in <a href="https://doi.org/10.1016/j.biocon.2019.108402"><em>Biological Conservation</em></a>, worked better than expected.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=240&fit=crop&dpr=1 600w, https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=240&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=240&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=302&fit=crop&dpr=1 754w, https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=302&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/303405/original/file-20191125-74599-16xcgmq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=302&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Drone-captured photographs were processed to create 2D mosaics of images and 3D models. These were used to measure area and volume, both of which approximated animal mass.</span>
<span class="attribution"><span class="source">J. Hodgson</span></span>
</figcaption>
</figure>
<p>The measurements were accurate, and we found a strong correlation between the mass of an individual and the area and volume measurements derived from the drone pictures. These are the key ingredients needed to assess sea lion condition without handling animals. </p>
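<p>The statistical core of the approach is deliberately simple: fit a regression between drone-derived volume and mass for animals that were physically weighed, then use volume as a stand-in for mass in the field. The sketch below, in Python with invented numbers, illustrates the workflow rather than reproducing our data.</p>
<pre><code>import numpy as np

# Invented example data: photogrammetric volumes and weighed masses.
volumes = np.array([0.051, 0.063, 0.074, 0.088, 0.102])  # cubic metres
masses = np.array([58.0, 71.0, 83.0, 99.0, 115.0])       # kilograms

# Fit mass = a*volume + b and check the strength of the relationship.
a, b = np.polyfit(volumes, masses, 1)
r = np.corrcoef(volumes, masses)[0, 1]
print(f"mass = {a:.0f}*volume + {b:.1f}  (r = {r:.3f})")

def estimate_mass(volume):
    """Estimate mass (kg) for an unhandled animal from drone-derived volume."""
    return a * volume + b

def condition_index(mass_kg, length_m):
    """A BMI-like body condition score: mass over length squared."""
    return mass_kg / length_m ** 2
</code></pre>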
<h2>Conserving an iconic species</h2>
<p>While simple body condition measurements have limitations, they are useful for conservation because they provide rapid health insights across a species’ range. </p>
<p>Australian sea lions breed at around 80 known sites spanning more than 3,000 km of southern Australian coastline within the <a href="https://theconversation.com/australias-other-reef-is-worth-more-than-10-billion-a-year-but-have-you-heard-of-it-45600">Great Southern Reef</a>. </p>
<p>Our technique can be used to study free-ranging animals at colonies across this range, from Kangaroo Island in South Australia to the Houtman Abrolhos Islands in Western Australia, and test for differences in condition. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=339&fit=crop&dpr=1 600w, https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=339&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=339&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=427&fit=crop&dpr=1 754w, https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=427&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/309600/original/file-20200113-103990-1364qeo.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=427&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">3D models of animals measured in the study.</span>
<span class="attribution"><span class="source">J. Hodgson</span></span>
</figcaption>
</figure>
<p>This can give us valuable information about how individual health and colony trends in abundance are related. For example, if a colony is in decline and its members are in poor condition, it could be that factors such as food availability and disease are driving the decline.</p>
<p>However, if there is no difference in the condition of animals from declining and recovering colonies, then declines may be due to direct human impacts such as bycatch in commercial fisheries and entanglement in marine debris. We could then target the most likely threats identified using this technique to better understand their impact and how to protect the sea lions against them.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=372&fit=crop&dpr=1 600w, https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=372&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=372&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=467&fit=crop&dpr=1 754w, https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=467&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/303404/original/file-20191125-74599-kf9j1s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=467&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">These two adult male Australian sea lions differed by just 11 cm in length but more than 130 kg in mass.</span>
<span class="attribution"><span class="source">J. Hodgson</span></span>
</figcaption>
</figure>
<p>This technique could be used to complete a population-wide survey of Australian sea lion condition and help ensure the species’ survival. It would build on past mitigation measures which include successfully <a href="https://www.afma.gov.au/sites/default/files/uploads/2014/03/Australian-Sea-Lion-Management-Strategy-2015-v2.0-FINAL.pdf">reducing by-catch from gillnet fishing along the sea floor</a>. </p>
<p>It will also complement current initiatives, including a trial to <a href="http://www.doi.org/10.1007/s00436-015-4481-4">control a parasite</a> that may improve <a href="https://sydney.edu.au/news-opinion/news/2019/07/22/saving-our-sea-lions.html">pup survival</a>.</p>
<p>Australian sea lions are an icon of Australia’s Great Southern Reef. As an important top-order predator in these coastal waters, they are indicators of ocean health. Understanding and mitigating the causes of their decline will not only help the species recover, but it will also help to ensure the unique coastal ecosystems on which Australian sea lions depend remain intact and functional.</p>
<p class="fine-print"><em><span>Jarrod Hodgson received funding from the Department for Environment and Water, Government of South Australia, for this research. He has also received funding from the Australian Government's Research Training Program.</span></em></p><p class="fine-print"><em><span>Lian Pin Koh receives funding from Australian Research Council. He currently works for Conservation International, USA.</span></em></p><p class="fine-print"><em><span>Simon Goldsworthy received funding from the Department for Environment and Water, Government of South Australia, for this research. </span></em></p>Australia’s only sea lion species is endangered and continues to decline. A new non-invasive monitoring technique could help to identify the causes and better inform conservation strategies.Jarrod Hodgson, PhD Candidate, University of AdelaideLian Pin Koh, Professor, University of AdelaideSimon Goldsworthy, Principal Scientist, Ecosystem Effects of Fishing & Aquaculture, South Australian Research and Development Institute, and Affiliate Professor, University of AdelaideLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/942822018-05-18T10:41:43Z2018-05-18T10:41:43Z75 years of instant photos, thanks to inventor Edwin Land’s Polaroid camera<figure><img src="https://images.theconversation.com/files/219447/original/file-20180517-26274-1f6mmvc.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2618%2C2070&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Edwin Land, on the left, invented and commercialized a number of technologies, most of which centered on light.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-OH-USA-APHS150797-Polaroid-Land-Camera/155ca24494f748d3aae778e1db3f8755/2/0">AP Photo</a></span></figcaption></figure><p>It probably happens every minute of the day: A little girl demands to see the photo her parent has just taken of her. Today, thanks to smartphones and other digital cameras, we can see snapshots immediately, whether we want to or not. But in 1943 when <a href="https://www.acs.org/content/acs/en/education/whatischemistry/landmarks/land-instant-photography.html">3-year-old Jennifer Land</a> asked to see the family vacation photo that her dad had just taken, the <a href="https://www.library.hbs.edu/hc/polaroid/instant-photography/the-idea-of-instant-photography/">technology didn’t exist</a>. So her dad, <a href="https://www2.rowland.harvard.edu/book/export/html/16141">Edwin Land, went to work inventing it</a>.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroid camera faces the viewer" src="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=884&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=884&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=884&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1111&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1111&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1111&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The original Polaroid camera freed users from needing to trek to a darkroom to develop their images.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/cNomGxIq6MI">Lindsay Moe/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Three years later, after plenty of scientific development, Land and his Polaroid Corp. realized the miracle of nearly instant imaging. The film exposure and processing hardware are contained within the camera; there’s no muss or fuss for the photographer, who just points and shoots and then watches the image materialize on the print as it spools out of the camera. Land demonstrated his new technology publicly for the first time on <a href="https://mobile.twitter.com/OpticaWorldwide/status/1098613395765501955">Feb. 21, 1947, at a meeting</a> of the Optical Society of America.</p>
<p>Land is probably best known for the “instant photo” – or the spiritual progenitor of today’s <a href="http://www.dailymail.co.uk/sciencetech/article-3619679/What-vain-bunch-really-24-billion-selfies-uploaded-Google-year.html">ubiquitous selfie</a>. His Polaroid camera was first released commercially in 1948 at retail locations and prices aimed at the postwar middle class. But this is just one of a host of technological breakthroughs Land invented and commercialized, most of which centered around light and how it interacts with materials. The technology used to show a 3D movie and the goggles we wear in the theater were made possible by Land and his colleagues. The camera aboard the U-2 spy plane, as featured in the movie “<a href="https://www.imdb.com/title/tt3682448/">Bridge of Spies</a>,” was a Land product, as were even some aspects of the plane’s mechanics. He also worked on theoretical problems, drawing on a deep understanding of both chemistry and physics.</p>
<p><a href="https://scholar.google.com/citations?user=8hzH2SoAAAAJ&hl=en&oi=ao">I’m a vision scientist</a> who has touched many of the fields in which Land made great advances, through my own work on new imaging methods, image processing techniques and human color vision. As the 2018 recipient of the <a href="https://www.osa.org/en-us/awards_and_grants/awards/award_description/edwinland/">Edwin H. Land Medal</a>, awarded by the Optical Society of America and the <a href="https://www.optica.org//en-us/about/newsroom/news_releases/2018/the_optical_society_and_society_for_imaging_scienc/">Society for Imaging Science and Technology</a>, my own work relies on Land’s technological innovations that made modern imaging possible.</p>
<h2>Controlling light’s properties</h2>
<p>Edwin Land had his first optics breakthrough as a young man, when he figured out a convenient and affordable method to control one of the fundamental properties of light: polarization.</p>
<p>You can think of light as waves propagating from a source. Most light sources produce a mixture of waves with all different physical properties, such as wavelength and amplitude of vibration. Light is considered polarized if its waves vibrate in a consistent orientation within the plane perpendicular to the direction the wave is traveling.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="diagram of only vertical lightwaves passing through filter" src="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=280&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=280&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=280&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=352&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=352&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=352&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A polarizing filter can block all the light waves that don’t match its orientation.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/ko/image-vector/polarization-light-waves-421267105">Fouad A. Saad/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Given the right material to pass through, light waves may be rotated into another plane, slowed down or blocked. Modern 3D goggles work because one eye receives light waves vibrating along the horizontal plane while the other eye receives light vibrating along the vertical plane. </p>
<p>Before Land, researchers built components to control polarization from rock crystals, which were assigned almost magical names and properties, though they merely decreased the velocity or amplitude of light waves traveling at specific orientations. Land created “polarizers” by growing small crystals and embedding them in plastic sheets, altering the light passing through depending on its orientation in relation to the rows of crystals. His inexpensive polarizer made it possible to reliably and practically filter light so only waves with a particular orientation would pass through.</p>
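<p>The filtering behaviour of an ideal polarizer follows Malus’s law, which the article doesn’t spell out: transmitted intensity falls with the square of the cosine of the angle between the light’s polarization and the filter’s axis. A minimal Python sketch:</p>

```python
import numpy as np

def transmitted_intensity(i0, theta_deg):
    """Malus's law: fraction of polarized light an ideal polarizer
    passes when its axis is rotated theta degrees away from the
    light's plane of polarization."""
    theta = np.radians(theta_deg)
    return i0 * np.cos(theta) ** 2

# Parallel filters pass everything; crossed filters block everything.
for angle in (0, 45, 90):
    print(f"{angle:>2} deg -> {transmitted_intensity(1.0, angle):.2f}")
```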
<p>Land founded the Polaroid Corp. in 1937 to commercialize his new technology. His sheet polarizers found applications ranging from the identification of chemical compounds to adjustable sunglasses. Polarizing filters became standard in photography to reduce glare. Today the principles of polarized light are used in most computer and cellphone screens to enhance contrast, decrease glare and even turn on or off individual pixels.</p>
<p><a href="https://doi.org/10.1167/iovs.03-0124">Polarizing filters help researchers visualize structures</a> that might not be seen otherwise – from astronomical features to biological structures. In my own field of vision science, polarization imaging localizes classes of chemicals, such as <a href="https://doi.org/10.1364/JOSAA.24.001468">protein molecules leaking from blood vessels</a> in diseased eyes. Polarization is also combined with high-resolution imaging techniques to detect <a href="https://doi.org/10.1038/s41598-017-03529-8">cellular damage</a> beneath the reflective retinal surface. </p>
<h2>A new way to get the data out</h2>
<p>Before the days of high-speed digital data capture, affordable high-resolution displays and videotape, Polaroid photography was the method of choice for obtaining output in many scientific labs. Experiments or medical tests needed graphical or pictorial output for interpretation, often from an analog oscilloscope which plotted out a voltage or current change over time. The oscilloscope was fast enough to capture key features of the data – but recording the output for later analysis was a challenge before Land’s instant camera came along.</p>
<p>A common example in vision science is the recording of eye movements. A research study reported in 1960 plotted light reflected from an observer’s moving eye on an oscilloscope screen, which was photographed with a <a href="https://doi.org/10.1364/JOSA.50.000245">mounted Polaroid camera</a> – not unlike the consumer Polaroid camera a family might pull out at a birthday party. For decades, research labs and medical facilities used <a href="https://www.ebay.com/p/Tektronix-C-5c-Oscilloscope-Camera-for-Polaroid-Film-B054450/1437576020">setups consisting of a Polaroid camera and a mounting rig</a> to collect electrical signals displayed on oscilloscope screens. The format sizes were less than dazzling compared to modern digital resolutions, but they were revolutionary at the time.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=599&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=599&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=599&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=753&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=753&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=753&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Land’s inventions led to the widespread use of polarized light to characterize tissues and objects, as in this pseudo-color image of a diabetic patient’s retina that unmasks irregular structures caused by edema.</span>
<span class="attribution"><span class="source">Ann Elsner</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>When I founded my retinal imaging laboratory in 1987, there was no inexpensive method to provide shareable output of our <a href="https://doi.org/10.1016/0042-6989(95)00100-E">novel images</a>. After a few years of struggling to obtain high-quality output for conferences and publications, the Polaroid Corp. came to our rescue with the donation of a printer, allowing our scientific contributions to reach an audience beyond our lab.</p>
<h2>Eyes are not cameras</h2>
<p>Land’s contributions go beyond patenting over 500 innovations and inventing products that millions purchased. His understanding of the interaction of light and matter promoted novel ways of characterizing chemicals with polarized light. And he provided insights into the workings of the human visual system that had seemed to defy the laws of physics, coming up with what he called the <a href="https://pdfs.semanticscholar.org/8b2a/d82ce40117417fa36ba16941ce022f2185f3.pdf">Retinex theory</a> of color vision to explain how people perceive a broad range of color <a href="https://doi.org/10.1364/JOSAA.3.000916">without the expected wavelengths</a> being present in the room.</p>
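<p>Retinex has since inspired a family of practical image-processing algorithms. Below is a minimal single-scale Retinex sketch in Python, an illustration of the general idea rather than Land’s original formulation: it estimates the illumination with a broad Gaussian blur and removes it in log space, so the output tracks surface reflectance rather than the light falling on the scene.</p>

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=80):
    """Estimate illumination with a broad Gaussian blur, then remove
    it in log space; the result tracks surface reflectance rather
    than the light source."""
    img = image.astype(np.float64) + 1.0           # avoid log(0)
    illumination = gaussian_filter(img, sigma=sigma)
    return np.log(img) - np.log(illumination)

# A flat grey card under a strong left-to-right lighting gradient
# comes out nearly uniform after processing.
scene = np.tile(np.linspace(0.2, 1.0, 256), (256, 1)) * 128.0
corrected = single_scale_retinex(scene)
print(scene.std(), corrected.std())  # variation is greatly reduced
```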
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroids clipped to a string agains brick wall" src="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Quick prints can be shared and displayed.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/hillaryandanna/760585681">Hillary Hartley</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Despite his brilliance, Land’s Polaroid Corp. eventually hit hard times in the decades after his death in 1991. Heavily invested in its film sales, Polaroid wasn’t prepared as all tiers of the imaging market went digital, with everyone from consumer photographers to high-end medical and optical imagers abandoning film and processing.</p>
<p>But rather than sink with the film market, Polaroid reinvented itself with new products that could turn the new world of digital images into prints. And in a case of history repeating itself, <a href="https://us.polaroid.com/collections/instant-cameras">Polaroid</a> and other manufacturers of instant cameras are enjoying renewed popularity with younger generations who had no exposure to the original versions. Just like little Jennifer Land, plenty of people today still want a tangible version of their pictures, right now.</p>
<p><em>This is an updated version of an article originally published on May 18, 2018. It corrects the year Jennifer Land inspired her father’s invention.</em></p>
<p class="fine-print"><em><span>Ann Elsner receives funding from NIDILRR and NIH. She owns shares in Aeon Imaging, LLC.</span></em></p>Whether at a family gathering or in a research lab, getting access to images immediately was a game-changer. And Land’s innovations went far beyond the instant photo.Ann Elsner, Professor of Optometry, Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/902582018-01-22T12:25:46Z2018-01-22T12:25:46ZThe next generation of cameras might see behind walls<figure><img src="https://images.theconversation.com/files/202620/original/file-20180119-110121-1wggumj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>You might be really pleased with the camera technology in your latest smartphone, which can recognise your face and take slow-mo video in ultra-high definition. But these technological feats are just the start of a larger revolution that is underway.</p>
<p>The latest camera research is shifting away from increasing the number of mega-pixels towards fusing camera data with computational processing. By that, we don’t mean the Photoshop style of processing where effects and filters are added to a picture, but rather a radical new approach where the incoming data may not actually look like an image at all. It only becomes an image after a series of computational steps that often involve complex mathematics and modelling how light travels through the scene or the camera.</p>
<p>This additional layer of computational processing magically frees us from the chains of conventional imaging techniques. One day we may not even need cameras in the conventional sense any more. Instead we will use light detectors that only a few years ago we would never have considered of any use for imaging. And they will be able to do incredible things, like see through fog, inside the human body and even behind walls.</p>
<h2>Single pixel cameras</h2>
<p>One extreme example is the <a href="https://dx.doi.org/10.1098%252Frsta.2016.0233">single pixel camera</a>, which relies on a beautifully simple principle. Typical cameras use lots of pixels (tiny sensor elements) to capture a scene that is likely illuminated by a single light source. But you can also do things the other way around, capturing information from many light sources with a single pixel. </p>
<p>To do this you need a controlled light source, for example a simple data projector that illuminates the scene one spot at a time or with a series of different patterns. For each illumination spot or pattern, you then measure the amount of light reflected and add everything together to create the final image. </p>
<p>Clearly the disadvantage of taking a photo in this way is that you have to send out lots of illumination spots or patterns in order to produce one image (which would take just one snapshot with a regular camera). But this form of imaging would allow you to create otherwise impossible cameras, for example ones that work at wavelengths of light beyond the visible spectrum, where good detectors <a href="https://www.nature.com/articles/ncomms12010">cannot be made into cameras</a>.</p>
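<p>To make the principle concrete, here is a toy single-pixel reconstruction in Python using Hadamard illumination patterns, a common choice (real systems must split each ±1 pattern into two non-negative ones, and compressive-sensing variants get away with far fewer measurements):</p>

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
scene = rng.random((16, 16))      # hypothetical 16x16 scene
n = scene.size                    # 256 pixels -> 256 patterns

# Each row of the Hadamard matrix is one +/-1 illumination pattern.
patterns = hadamard(n)

# One number per pattern: the total light the single detector
# collects is the dot product of the pattern with the scene.
measurements = patterns @ scene.ravel()

# Hadamard rows are orthogonal (H @ H.T = n * I), so the scene is
# recovered as a weighted sum of the same patterns.
recovered = (patterns.T @ measurements / n).reshape(scene.shape)
print(np.allclose(recovered, scene))  # True
```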
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-amazing-camera-that-can-see-around-corners-51948">The amazing camera that can see around corners</a>
</strong>
</em>
</p>
<hr>
<p>These cameras could be used to take photos through <a href="https://www.osapublishing.org/oe/abstract.cfm?uri=oe-23-11-14424">fog or thick falling snow</a>. Or they could <a href="http://advances.sciencemag.org/content/3/4/e1601782">mimic the eyes of some animals</a> and automatically increase an image’s resolution (the amount of detail it captures) depending on what’s in the scene.</p>
<p>It is even possible to capture images from light particles that have <a href="https://www.nature.com/articles/nature13586">never even interacted</a> with the object we want to photograph. This would take advantage of the idea of “quantum entanglement”, that two particles can be connected in a way that means whatever happens to one happens to the other, even if they are a long distance apart. This has intriguing possibilities for looking at objects whose properties might change when lit up, such as the eye. For example, does a retina look the same when in darkness as in light?</p>
<h2>Multi-sensor imaging</h2>
<p>Single-pixel imaging is just one of the simplest innovations in upcoming camera technology and relies, on the face of it, on the traditional concept of what forms a picture. But we are currently witnessing a surge of interest in systems that use lots of information, where traditional techniques collect only a small part of it.</p>
<p>This is where we could use multi-sensor approaches that involve many different detectors pointed at the same scene. <a href="https://www.nasa.gov/mission_pages/hubble/multimedia/index.html">The Hubble telescope</a> was a pioneering example of this, producing pictures made from combinations of many different images taken at different wavelengths. But now you can buy commercial versions of this kind of technology, such as the <a href="https://www.lytro.com/255D">Lytro camera</a> that collects information about light intensity and direction on the same sensor, to produce images that can be refocused after the picture has been taken.</p>
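<p>The refocusing trick itself is conceptually simple. Here is a hedged shift-and-add sketch, assuming the light field has already been resampled into a grid of sub-aperture views (real implementations use sub-pixel interpolation rather than whole-pixel rolls):</p>

```python
import numpy as np

def refocus(subviews, alpha):
    """Shift-and-add refocusing. subviews has shape (U, V, H, W):
    a grid of sub-aperture views of the same scene. Each view is
    translated in proportion to its offset from the lens centre,
    then all views are averaged; varying alpha moves the focal
    plane after the picture has been taken."""
    U, V, H, W = subviews.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            shift = (int(round(alpha * (u - U // 2))),
                     int(round(alpha * (v - V // 2))))
            out += np.roll(subviews[u, v], shift, axis=(0, 1))
    return out / (U * V)

# A 5x5 grid of 64x64 views; different alphas focus at different depths.
views = np.random.default_rng(1).random((5, 5, 64, 64))
print(refocus(views, alpha=1.0).shape)  # (64, 64)
```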
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Light L16.</span>
<span class="attribution"><span class="source">Light</span></span>
</figcaption>
</figure>
<p>The next generation camera will probably look something like the <a href="https://light.co/camera">Light L16 camera</a>, which features ground-breaking technology based on more than ten different sensors. Their data are combined using a computer to provide a 50-megapixel, re-focusable and re-zoomable, professional-quality image. The camera itself looks like a very exciting Picasso interpretation of a crazy cell-phone camera.</p>
<p>Yet these are just the first steps towards a new generation of cameras that will change the way in which we think of and take images. Researchers are also working hard on the problem of seeing through fog, <a href="https://www.nature.com/articles/ncomms1747">seeing behind walls</a>, and even imaging deep inside the <a href="https://www.nature.com/articles/nphoton.2014.107">human body and brain</a>. All of these techniques rely on combining images with models that explain how light travels through or around different substances.</p>
<p>Another interesting approach that is gaining ground relies on artificial intelligence to “learn” to <a href="https://www.osapublishing.org/optica/abstract.cfm?uri=optica-4-9-1117">recognise objects from the data</a>. These techniques are inspired by learning processes in the human brain and are likely to play a major role in <a href="https://arxiv.org/abs/1709.07244">future imaging systems</a>.</p>
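<p>As a hedged illustration of that idea (not the cited work), the toy Python example below trains a small neural network to recognise handwritten digits directly from simulated random-pattern detector readings, without ever reconstructing an image:</p>

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Simulate "data that doesn't look like an image": 32 random-pattern
# detector readings per scene, as a single-pixel camera might record.
digits = load_digits()
rng = np.random.default_rng(0)
patterns = rng.standard_normal((digits.data.shape[1], 32))
measurements = digits.data @ patterns

X_train, X_test, y_train, y_test = train_test_split(
    measurements, digits.target, random_state=0)

# A small network learns to recognise the digits straight from the
# raw measurements; no image is ever reconstructed.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                      random_state=0)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```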
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cDbGFT5rM0I?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Single photon and quantum imaging technologies are also maturing to the point that they can take pictures with incredibly low light levels and videos with incredibly fast speeds reaching a trillion frames per second. This is enough even to capture images <a href="https://www.nature.com/articles/ncomms7021">of light itself</a> travelling across a scene.</p>
<p>Some of these applications might require a little time to fully develop but we now know that the underlying physics should allow us to solve these and other problems through a clever combination of new technology and computational ingenuity.</p>
<p class="fine-print"><em><span>Daniele Faccio receives funding from EPSRC, QuantIC - The Quantum Hub for Imaging, The Leverhulme Trust, DSTL.</span></em></p><p class="fine-print"><em><span>Stephen McLaughlin receives funding from EPSRC for a variety of research grants which analyse data which require the computational imaging methods described in the article</span></em></p>Single-pixel cameras, multi-sensor imaging and quantum technologies will change the way we take photos.Daniele Faccio, Professor of Quantum Technologies, University of GlasgowStephen McLaughlin, Head of School of Engineering and Physical Sciences, Heriot-Watt UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/890772017-12-20T19:06:05Z2017-12-20T19:06:05ZNeuroscience in pictures: the best images of the year<figure><img src="https://images.theconversation.com/files/199811/original/file-20171218-27538-1dg7d1r.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Neuroscientists require images to understand what's happening in the brain.</span> <span class="attribution"><span class="source">Chase Sherwell/QBI</span>, <span class="license">Author provided</span></span></figcaption></figure><p>To understand how the healthy brain works and what occurs in brain disease, neuroscientists use many microscopy techniques, ranging from whole-brain human MRIs to imaging within a single neuron (brain cell), creating stunning images in the process.</p>
<p>Here is a selection of the best and brightest produced by scientists at the <a href="http://qbi.uq.edu.au">Queensland Brain Institute</a> at <a href="http://uq.edu.au">The University of Queensland</a> in 2017.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=379&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=379&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=379&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=476&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=476&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199634/original/file-20171218-17889-cefuph.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=476&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Wei 'Leon' Luan/QBI</span></span>
</figcaption>
</figure>
<p>This is a side view of a mouse embryo’s brain. The axons of neurons (dark blue) that release dopamine, a neurotransmitter involved in reward and pleasure, grow towards their target brain regions.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=602&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=602&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=602&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=757&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=757&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199652/original/file-20171218-27557-1h3so5s.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=757&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Chase Sherwell/QBI</span></span>
</figcaption>
</figure>
<p>As public interest in neuroscience grows, researchers are striving to make their findings accessible, here with a nod to the pop art movement. These are MRI images of the human brain.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=594&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=594&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=594&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=747&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=747&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199633/original/file-20171218-17878-1er5sm1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=747&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Abdalla Mohamed, PhD student/QBI</span></span>
</figcaption>
</figure>
<p>This image uses diffusion tensor imaging, an MRI-based neuroimaging technique, to reveal the fibre tracts through the corpus callosum in a rodent brain. The corpus callosum links the brain’s left and right hemispheres. The colours represent the different directions that the tracts are travelling through the brain. </p>
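<p>The colour convention is straightforward to compute. A minimal sketch, assuming one 3×3 diffusion tensor per voxel: take the eigenvector with the largest eigenvalue and use the absolute values of its x, y and z components as red, green and blue (real pipelines also weight the colour by fractional anisotropy):</p>

```python
import numpy as np

def direction_colour(tensor):
    """Standard DTI colour convention: the principal eigenvector's
    |x|, |y|, |z| components become red (left-right), green
    (front-back) and blue (up-down)."""
    eigvals, eigvecs = np.linalg.eigh(tensor)   # ascending eigenvalues
    principal = eigvecs[:, np.argmax(eigvals)]  # dominant diffusion axis
    return np.abs(principal)                    # RGB triple in [0, 1]

# Diffusion mostly along x: the voxel is coloured red.
print(direction_colour(np.diag([1.0, 0.2, 0.2])))  # ~[1. 0. 0.]
```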
<hr>
<h2>Small-scale wonders</h2>
<p>The colourful image below shows the nanoscale movements of individual molecules that are critical in mediating communication between neurons. Knowing how these molecules are organised, and how they move, is at the heart of understanding the brain in health and disease.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=505&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=505&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=505&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=634&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=634&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199629/original/file-20171218-17869-1ngdb1o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=634&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Ravikiran Kasula/QBI</span></span>
</figcaption>
</figure>
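<p>Analyses of such single-molecule data often start from the mean squared displacement of each track: free diffusion gives an MSD that grows linearly with time lag, while confined or directed motion bends the curve. A minimal Python sketch of that calculation:</p>

```python
import numpy as np

def mean_squared_displacement(track, max_lag=20):
    """MSD of one molecule's track (shape: n_steps x n_dims).
    For each time lag, average the squared distance moved."""
    msd = []
    for lag in range(1, max_lag + 1):
        steps = track[lag:] - track[:-lag]
        msd.append(np.mean(np.sum(steps ** 2, axis=1)))
    return np.array(msd)

# A simulated 2D Brownian track: MSD grows roughly linearly with lag,
# the signature of free diffusion.
rng = np.random.default_rng(0)
track = np.cumsum(rng.standard_normal((1000, 2)), axis=0)
print(mean_squared_displacement(track)[:5])
```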
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=495&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=495&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=495&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=621&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=621&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199632/original/file-20171218-17845-1uihenl.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=621&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Merja Joensuu/QBI</span></span>
</figcaption>
</figure>
<p>They may look like fireworks, but this image shows nanoscopic movements of single actin molecules. Actin is an essential protein found in all cells of plants and animals; here it was imaged in a neurosecretory cell, a specialised type of nerve cell that releases messenger molecules into the blood.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199659/original/file-20171218-27607-panpxs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Lee Fletcher/QBI</span></span>
</figcaption>
</figure>
<p>This image shows the activity of a single neuron (gold) in the cortex, recorded after the surrounding neurons (cream) are activated with a flash of light.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199630/original/file-20171218-17851-vlgreu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Amandine Grimm/QBI</span></span>
</figcaption>
</figure>
<p>The blue neuron, which could pass for a manta ray atop a coral reef, expresses a protein tagged with a fluorescent marker. The pink of the surrounding cells comes from the endoplasmic reticulum, a cell structure important for processing and transporting proteins.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=381&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=381&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=381&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=479&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=479&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199650/original/file-20171218-27547-43c7ws.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=479&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Eline van de Ven/QBI</span></span>
</figcaption>
</figure>
<p>This section of a mouse spinal cord shows a diversity of neuron types. The smaller neurons in pink are involved in pain and the large green neurons are involved in movement.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=215&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=215&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=215&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=270&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=270&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199639/original/file-20171218-27547-1tqz36f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=270&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Amandine Grimm/QBI</span></span>
</figcaption>
</figure>
<p>The organisation of neurons in the hippocampus, a brain region important for learning and memory, looks like a forest in snow. The “snow” is made of cell nuclei, which contain each cell’s genetic material. The “trees” are the neurons’ projections, along which electrical signals travel to enable communication with other cells.</p>
<hr>
<h2>The brain in disease</h2>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199653/original/file-20171218-27541-4q3umn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Adam Briner/QBI</span></span>
</figcaption>
</figure>
<p>In Alzheimer’s disease, tau protein (gold) becomes toxic as it builds up. It’s hard to believe these mesmerising, gem-like clusters can be so destructive.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=601&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=601&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=601&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=755&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=755&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199646/original/file-20171218-27585-4g58xw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=755&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Kok-Siong Chen/QBI</span></span>
</figcaption>
</figure>
<p>Understanding the characteristics of high-grade brain tumours is crucial to finding treatments for the disease. High-resolution fluorescent imaging allows us to investigate how normal brain cells become cancer cells and how they behave. This image shows cancer cells (red) infiltrating normal brain tissue (green).</p>
<hr>
<h2>Insights from nature</h2>
<p>Studying model organisms including sea creatures, zebrafish, and roundworms provides insight into vision, brain development, and nerve regeneration respectively. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199643/original/file-20171218-27585-1gvtptp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Wen-Sung Chung/QBI</span></span>
</figcaption>
</figure>
<p>Deep-sea creatures, including this jewel squid, emit their own light for defence, to attract prey, and even for camouflage. At a depth of 600m, the bioluminescent flashes emitted from the light organs of the jewel squid are deadly attractive to prey.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=745&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=745&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=745&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=936&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=936&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199644/original/file-20171218-27557-amwot1.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=936&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Miriam Henze/QBI</span></span>
</figcaption>
</figure>
<p>Two retinas are visible in each eye of this mantis shrimp. Mantis shrimp have the most complex visual system in the world; they can see visible and UV light, and can reflect and detect circularly polarised light, an extremely rare ability in nature.</p>
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=596&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=596&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=596&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=750&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=750&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199648/original/file-20171218-27554-1fyuszz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=750&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Rumelo Amor/QBI</span></span>
</figcaption>
</figure>
<p>These are neurons firing in the brain of a one-week-old zebrafish, recorded in 3D using a custom-built microscope and colour-coded for depth. Imaging activity in the brains of young zebrafish could lead to an understanding of how the brain is shaped for function.</p>
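<p>Depth colour-coding of a 3D recording can be done with a simple projection. A minimal sketch, assuming a (depth, height, width) image stack: colour each pixel by the depth at which it is brightest, scaled by that peak brightness:</p>

```python
import numpy as np
from matplotlib import cm

def depth_coded_projection(stack):
    """Collapse a (depth, height, width) stack into one colour image:
    each pixel takes the colormap colour of the depth at which it is
    brightest, scaled by that peak brightness."""
    depth = stack.argmax(axis=0) / max(stack.shape[0] - 1, 1)
    intensity = stack.max(axis=0)
    rgb = cm.viridis(depth)[..., :3]              # colour encodes depth
    return rgb * (intensity / intensity.max())[..., None]

# A random 32-slice stack produces a (128, 128, 3) colour image.
stack = np.random.default_rng(2).random((32, 128, 128))
print(depth_coded_projection(stack).shape)
```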
<hr>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=612&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=612&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=612&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=770&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=770&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199641/original/file-20171218-27538-8e1h5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=770&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Xue Yan Ho/QBI</span></span>
</figcaption>
</figure>
<p>A dish of <em>C. elegans</em> roundworms at different stages of their lifecycle. <em>C. elegans</em> is a simple, semi-transparent organism, making it an ideal model for researchers to study the nervous system.</p>
<hr>
<p><em>With thanks to QBI graphics designer Dr Nick Valmas, science writer Donna Lu and QBI PhD candidate Abdalla Z Mohamed.</em></p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Take a look at some of the amazing neuroscience images out of the Queensland Brain Institute this year.Wei Luan, Postdoctoral Researcher, The University of QueenslandMerja Joensuu, Postdoctoral Research Fellow, The University of QueenslandRavi Kiran Kasula, PhD Student, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/728012017-03-13T02:33:13Z2017-03-13T02:33:13ZWhy we’re wasting money on medical tests and how behavioural insights can help<figure><img src="https://images.theconversation.com/files/159708/original/image-20170307-20739-1wayi0i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Doctors know most scans for low back pain are useless, but they have trouble convincing patients. </span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>In 2013 and 2014, more than 314,000 <a href="https://www.radiologyinfo.org/en/info.cfm?pg=bodyct">CT scans</a> of the lower back were ordered in Australia, <a href="https://www.safetyandquality.gov.au/atlas/">most of which showed no abnormalities</a>. In routine cases of low back pain, X-rays and CT scans provide no meaningful information to guide treatment, exposing patients to unnecessary radiation. </p>
<p>A number of factors have contributed to this, including increased consumer expectations, an ageing population, financial incentives (where doctors have a stake in imaging services) and “defensive medicine”, which is doctors protecting themselves against possible litigation arising from missing a diagnosis. </p>
<p>This is one of numerous areas of wasted health-care expenditure around the world. Studies in the US have reported that 20 to 25% of all healthcare delivered <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2690270/">is either not needed or harmful</a>. The situation in Australia appears much the same. A conservative estimate of avoidable costs in Australia’s public hospital system is <a href="https://grattan.edu.au/wp-content/uploads/2014/03/806-costly-care.pdf">A$928 million</a>. </p>
<p>We can reduce some of this waste by looking at why doctors continue to order these tests and use behavioural techniques to change the situation.</p>
<h2>Why so much waste?</h2>
<p>One of the drivers of this waste is increasing consumer demand for medical tests. New technologies and increased public awareness have led to <a href="https://www.betterhealth.vic.gov.au/health/conditionsandtreatments/cancer-screening">increases in mass screening</a> for breast, bowel and cervical cancer. </p>
<p>Popular media further fuels demand; <a href="http://edition.cnn.com/2013/05/14/showbiz/angelina-jolie-double-mastectomy/">publicity of Angelina Jolie’s preventative mastectomy</a> in 2013 led to the “Angelina Jolie effect” – a <a href="http://breast-cancer-research.biomedcentral.com/articles/10.1186/s13058-015-0650-8">two-fold increase in consultations</a> for breast cancer genetic testing and risk-reduction surgery. While there is evidence to support screening in these cases, it has empowered consumers to request tests for a variety of other ailments, including X-rays and CT scans for routine low back pain. </p>
<p>Reducing healthcare waste relating to unnecessary tests has been a major priority for researchers, governments and health services for decades. Ironically, much of this effort has itself been wasted. Historical approaches to improving healthcare quality have revolved around the assumption that providing knowledge will solve the problem; if doctors are told X-rays and CT scans are not recommended in routine cases of low back pain, they will stop ordering them. </p>
<p>But the idea that knowledge leads to action is a flawed assumption. We know we should eat more vegetables and exercise more, but it doesn’t mean we do. In medicine, as in everyday life, there is a gap between what we know and what we do.</p>
<h2>How to use behavioural insights to help change doctors’ behaviour</h2>
<p>If it’s not just knowledge that drives human behaviour, how can we find out what does? The answer is deceptively simple: ask people why they do what they do.</p>
<p><a href="https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-7-37">Behavioural researchers</a> have identified 14 domains that influence our behaviour. In addition to knowledge, some of these influences include social influences, the environmental context, our professional identity and our beliefs about our capabilities.</p>
<p>When a team of researchers applied this psychological framework to the problem of overuse of X-rays in routine low back pain, they uncovered new insights into this behaviour. Some GPs reported they <a href="https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-7-38">lacked skills in communicating</a> to patients these investigations are of little or no value.</p>
<p>This was addressed through role play: using a prepared script to simulate a patient demanding an X-ray and giving doctors a response script suggesting alternative approaches, such as advice about appropriate activities and pain management strategies. </p>
<p>An example of such a script is: </p>
<blockquote>
<p>X-rays don’t really provide useful information that would change how we manage routine cases of back pain. They also expose you to radiation. Right now the best thing I can give you is some advice on how to manage your back pain. We can revisit the need for an X-ray or CT scan if more serious symptoms develop.</p>
</blockquote>
<p><a href="https://www.ncbi.nlm.nih.gov/pubmed/15805455">Studies have demonstrated</a> positive impacts of such techniques in changing low back pain health-care practices. But behaviour change should not stop at doctors. It’s also important to create more widespread public awareness that some tests are unnecessary and potentially harmful. <a href="http://www.choosingwisely.org.au/resources/consumers/5-questions-to-ask-your-doctor">NPS MedicineWise</a>, an independent, federal government-funded health organisation, developed a consumer resource outlining five questions to ask your doctor about tests. </p>
<h2>Studying the ‘why’</h2>
<p>The X-ray example shows that rather than continually producing and passively disseminating guidelines telling doctors what to do, it’s more worthwhile analysing why they do what they do. </p>
<p>Surprisingly, linking psychological theory to health-care improvement only began in earnest at the turn of the century. Alarmingly, over 15 years later, less than 10% of published quality-improvement studies explicitly report the use of such theory. </p>
<p>But this approach has demonstrated potential. For example, <a href="http://www.cochrane.org/news/support-health-professionals-reduces-unnecessary-use-antibiotics-hospitals">a recent review</a> of 29 studies aiming to reduce overuse of antibiotics found education alone was not as effective as interventions that employed additional behavioural techniques such as “enablement” (making it easier to do the right thing) and “restriction” (using rules such as restricting prescriptions to prevent doing the wrong thing). </p>
<p>However, much more research is needed in this area. An <a href="http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1000326">estimated 75 trials</a> testing new cures for diseases and injuries are published per day, equating to 319,000 since the year 2000. In comparison, roughly 7,000 studies have evaluated the effectiveness of the use of behavioural insights to make sure these cures are put into practice. </p>
<p>In other words, for every 45 trials designed to discover new cures, there is only one trial designed to test the use of behaviour change techniques to ensure these cures are applied to patients. </p>
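<p>As a quick back-of-the-envelope check of that ratio (a minimal Python sketch using only the estimates cited above, with no new data):</p>
<pre><code># Rough check of the cure-to-behaviour-change trial ratio, using the
# published estimates quoted above (319,000 cure trials since 2000;
# roughly 7,000 behaviour-change studies).
cure_trials = 319_000
behaviour_trials = 7_000
print(round(cure_trials / behaviour_trials))  # about 46, i.e. roughly 45-to-1
</code></pre>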
<p>Without re-balancing this equation, there’s a risk of compounding the problem of waste in health care. Knowledge of what to do isn’t enough. We need to explore why doctors, patients and health-care professionals behave the way they do, and how we can influence their behaviour for the better. Only this can harness the full potential of medical research breakthroughs.</p><img src="https://counter.theconversation.com/content/72801/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Peter Bragge receives funding from a variety of government and research granting organisations to conduct healthcare quality improvement research, all of which is paid to his employer, Monash University. He played no role in any of the research outlined in this article. </span></em></p>Reducing health-care waste relating to unnecessary tests has been a major priority for researchers, governments and health services for decades. But how do we change the behaviour of doctors?Peter Bragge, Associate Professor, Healthcare Quality Improvement (QI) at Behaviour Works, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/702492017-01-17T04:33:42Z2017-01-17T04:33:42ZDetecting methane leaks with infrared cameras: They’re fast, but are they effective?<figure><img src="https://images.theconversation.com/files/152924/original/image-20170116-9055-cuiylb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Damage from a 2010 explosion and fire in San Bruno, California, caused by a leaking natural gas pipeline. The disaster killed eight people.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/thomashawk/5006359844">Thomas Hawk/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>Methane is the major component of natural gas, which heats our homes and recently surpassed coal as the <a href="https://www.eia.gov/todayinenergy/detail.php?id=27072">top fuel for generating electricity</a> in the United States. But methane is also a <a href="https://www.epa.gov/ghgemissions/overview-greenhouse-gases">powerful greenhouse gas</a> that contributes to global warming. And because methane is highly flammable, gas leaks pose a significant safety hazard, as we saw in fatal explosions in 2010 in <a href="https://ww2.kqed.org/news/2015/09/08/five-years-after-deadly-san-bruno-explosion-are-we-safer/">San Bruno, California</a> and 2015 in <a href="http://www.nytimes.com/2015/03/27/nyregion/reports-of-explosion-in-east-village.html">New York City</a>. The massive gas leak from the <a href="https://www.nytimes.com/2016/04/03/magazine/the-invisible-catastrophe.html">Aliso Canyon storage facility</a> in Southern California in October 2015 led to <a href="http://www.dailynews.com/general-news/20170112/socalgas-opens-gates-to-reporters-for-first-ever-tour-of-aliso-canyon-gas-field">evacuations</a> of over 8,000 families after reports of serious health issues. </p>
<p>The United States, Canada and Mexico are working together to reduce methane emissions from the oil and gas sector through the <a href="https://www.whitehouse.gov/the-press-office/2016/06/29/leaders-statement-north-american-climate-clean-energy-and-environment">North American Climate, Clean Energy and Environment Partnership</a>. As one step, the U.S. Environmental Protection Agency (EPA) recently finalized <a href="https://www.epa.gov/controlling-air-pollution-oil-and-natural-gas-industry/new-source-performance-standards-and#Final%20rules">rules</a> that require oil and gas companies to adopt leak detection and repair programs. </p>
<p>EPA recommends that gas companies use infrared cameras, one of the most commonly available leak detection technologies. These cameras enable gas leaks to be detected rapidly and safely. Although other detection technologies are available, they’re either expensive, slow or otherwise unsuitable for routine leak surveys.</p>
<p>My research focuses on evaluating leak detection technologies and using those insights to inform emissions mitigation policy. In our most recent <a href="http://dx.doi.org/10.1021/acs.est.6b03906">work</a>, we analyzed the limits of infrared cameras in effectively detecting methane leaks. Some of these limitations have important policy implications.</p>
<h2>Cameras work better in some conditions than others</h2>
<p>Infrared cameras work much like an iPhone camera, with a key difference. While an iPhone camera is sensitive to visible light, infrared cameras are sensitive to infrared light, the portion of the sun’s light that is invisible to the naked eye and has wavelengths longer than red. Since methane absorbs infrared light, infrared cameras can detect it. Indeed, many production facility operators use them routinely for leak detection and repair procedures, with <a href="http://www.cpr.org/news/story/methane-hunt-tech-helps-colorado-oil-and-gas-operators-lead-way">anecdotal success</a>. </p>
<p>Despite such evidence, there were no systematic studies on the effectiveness of these cameras. And because all objects emit infrared light, we suspected that environmental conditions might play an important role in how the camera works. In the <a href="http://dx.doi.org/10.1021/acs.est.6b03906">study</a> I coauthored, we analyzed how environmental factors like temperature, wind, humidity and background conditions affected what the camera “sees.” </p>
<p>We developed a model to predict whether the camera will detect a leak of given size under different measurement conditions. In order to verify that this model is accurate, we intentionally released methane and compared the corresponding camera images with model results. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=303&fit=crop&dpr=1 600w, https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=303&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=303&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=381&fit=crop&dpr=1 754w, https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=381&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/152822/original/image-20170116-16922-1eckpm6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=381&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Infrared camera image (grayscale) of a controlled methane leak superimposed on the modeled leak image (color).</span>
</figcaption>
</figure>
<p>We found that the cameras’ effectiveness in detecting leaks varied widely with weather conditions. Also, the camera operator’s expertise and even properties of the facility, such as its location and gas composition, affected the readings. Under ideal conditions, the cameras detected over 80 percent of the total leakage at the facility. </p>
<p>But such a high success rate is possible only if the camera is operated in low wind, warm weather and clear skies, with leaks imaged from distances of about 30 feet. Under nonideal conditions (high wind, cold days or viewing distances greater than 100 feet), the cameras detected as little as 10 percent of the total leakage. </p>
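<p>To make this condition-dependence concrete, here is a purely illustrative toy model in Python – not the model from our study. The distance-squared scaling, the wind penalty and the constant are invented for illustration only:</p>
<pre><code># Hypothetical toy model of condition-dependent leak detection.
# The distance-squared scaling, the wind penalty and the constant k
# are invented for illustration; they are not fitted to study data.
def detectable(leak_rate_tpy, distance_ft, wind_mph, k=0.002):
    """True if a leak of leak_rate_tpy (tons/year) would be spotted."""
    min_detectable = k * distance_ft ** 2 * (1.0 + 0.5 * wind_mph)
    return leak_rate_tpy >= min_detectable

print(detectable(10, 30, 2))    # close up on a calm day: True
print(detectable(10, 100, 15))  # same leak, far away and windy: False
</code></pre>
<p>Even in this toy version, the same leak flips from detectable to invisible purely because of how and where the survey is done – which is why unspecified survey conditions matter.</p>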
<p>Recent EPA <a href="https://www.regulations.gov/docket?D=EPA-HQ-OAR-2010-0505">regulations</a> do not specify many crucial parameters. For example, operators decide the maximum viewing distance and acceptable wind conditions for testing leaks, and there are no requirements on temperature or cloud cover. </p>
<p>With such latitude in determining testing protocols, an operator can perform a leak detection survey without actually finding any leaks. This can be done by searching for leaks on cold or humid days, or by measuring from distances farther than about 100 feet. In order to consistently find leaks using this technology, we concluded that EPA’s rules need to be more specific, or use a different metric to track progress.</p>
<h2>A possible solution: Fix ‘super-emitters’</h2>
<p><a href="http://www.forbes.com/sites/jeffmcmahon/2016/04/03/theres-good-news-and-bad-news-about-americas-leaking-methane/#59ba8d3595b2">“Super-emitters,”</a> as the name implies, are very large leaks with a leak rate between 100 tons/year to over 1,000 tons/year. All tests that have been conducted at gas facilities – production wells, processing facilities and compressor stations – often find a small number of these super-emitters. </p>
<p>Infrared cameras are naturally suited to detect these super-emitters because it is easier to detect bigger leaks than smaller ones, even under unfavorable measurement conditions. Fixing super-emitters could be a cost-effective way for facility operators to significantly reduce leakage and improve safety.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/152926/original/image-20170116-9055-1ck92t5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Infrared imaging of the Aliso Canyon leak from a natural gas storage reservoir near Porter Ranch, California, January 12, 2016.</span>
<span class="attribution"><a class="source" href="http://earthobservatory.nasa.gov/IOTD/view.php?id=88245&src=eorss-iotd&utm_source=twitterfeed&utm_medium=twitter">NASA Earth Observatory</a></span>
</figcaption>
</figure>
<h2>Not all locations are created equal</h2>
<p>The number and sizes of leaks at any facility vary significantly across the country. For example, recent research at production facilities has shown that super-emitters contribute a larger fraction of total leakage in <a href="http://pubs.acs.org/doi/abs/10.1021/acs.est.5b05503">Pennsylvania</a> than in <a href="http://fortworthtexas.gov/gaswells/air-quality-study/final/">Texas</a>. Because super-emitters are easily detected by these cameras, leak detection will be more effective in Pennsylvania than in Texas. </p>
<p>In addition to its size, the composition of a leak can also significantly affect detection. Drilled wells can often release nonmethane gases like ethane and propane. Often termed “wet gas,” these nonmethane compounds are more sensitive to infrared light than methane. Therefore, a leak rich in these compounds will “look” brighter than a similar leak that contains only methane. Such situations commonly occur in regions where significant oil is extracted along with gas.</p>
<p>These subtleties have important consequences for policy. A camera’s effectiveness depends not only on its own properties, but also on the properties of the facility being tested. An 80 percent success rate in one location will not translate into a similar success rate at other locations. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/ClLP_Xv1buA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Using an infrared camera to detect methane emissions at an oil and natural gas corporation facility (video developed by EPA).</span></figcaption>
</figure>
<h2>Better options?</h2>
<p>Detecting methane leaks is a hard problem further complicated by the large number of potential sources and by a scarcity of cheap detection technologies. By effectively endorsing infrared cameras as a preferred option, EPA has rightly chosen the most efficient and, by some estimates, least-cost option. </p>
<p>However, new technologies and startups, many funded by the Department of Energy’s <a href="https://arpa-e.energy.gov/?q=arpa-e-programs/monitor">MONITOR</a> program, are being developed. Although many are in their early stages, these new technologies promise faster and more accurate leak detection. It would be a mistake for any mitigation policy to not take advantage of these new technologies, especially if they can further reduce costs and improve safety. We are currently researching broader questions on ways to design effective methane mitigation policy. How this will unfold in the <a href="https://www.washingtonpost.com/news/energy-environment/wp/2016/11/11/this-is-the-other-way-that-trump-could-worsen-global-warming/?utm_term=.cd475183f078">context of the incoming administration</a> remains to be seen.</p><img src="https://counter.theconversation.com/content/70249/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Arvind P. Ravikumar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Infrared cameras are the technology of choice for detecting gas leaks across the US. New research shows that these cameras can be quite inaccurate, and leaks can persist without being detected.Arvind P. Ravikumar, Post-doctoral Fellow: Energy systems analysis and energy policy, Stanford UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/574062016-05-31T01:04:47Z2016-05-31T01:04:47ZHow computing power can help us look deep within our bodies, and even the Earth<figure><img src="https://images.theconversation.com/files/122911/original/image-20160517-9476-w78fh8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The computer does more of the work than you might think.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-401715220/stock-photo-thessaloniki-greece-february-official-opening-of-the-first-ct-imaging-pet-ct-scanner.html?src=wcSemSkkJRQbjbDYm9SbKA-2-59">CT computer and scan room image via shutterstock.com</a></span></figcaption></figure><p>CAT scans, MRI, ultrasound. We are all pretty used to having machines – and doctors – peering into our bodies for a whole range of reasons. This equipment can help diagnose diseases, pinpoint injuries, or give expectant parents the first glimpse of their child.</p>
<p>As computational power has exploded in the past half-century, it has enabled a parallel expansion in the capabilities of these computer-aided imaging systems. What used to be pictures of two-dimensional “slices” have been assembled into high-resolution three-dimensional reconstructions. Stationary pictures of yesteryear are today’s real-time video of a beating heart. The advances have been truly revolutionary.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/EN5qgpVxrcU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">A cardiac MRI scan shows a heart beating.</span></figcaption>
</figure>
<p>Though different in their details, X-ray computed tomography, ultrasound and even MRI have a lot in common. The images produced by each of these systems derive from an elegant interplay of sensors, physics and computation. They do not operate like a digital camera, where the data captured by the sensor are basically identical to the image produced. Rather, a lot of processing must be applied to the raw data collected by a CAT scanner, MRI machine or ultrasound system before it yields the images a doctor needs to make a diagnosis. Sophisticated algorithms based on the underlying physics of the sensing process are required to put Humpty Dumpty back together again.</p>
<h2>Early scanning methods</h2>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=360&fit=crop&dpr=1 600w, https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=360&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=360&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=453&fit=crop&dpr=1 754w, https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=453&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/122907/original/image-20160517-9491-18otosr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=453&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">One of the first published X-rays (at right, with normal view of the hand at left), from 1896.</span>
<span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File%3AX-ray_1896_nouvelle_iconographie_de_salpetriere.jpg">Albert Londe</a></span>
</figcaption>
</figure>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=367&fit=crop&dpr=1 600w, https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=367&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=367&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=461&fit=crop&dpr=1 754w, https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=461&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/122908/original/image-20160517-9464-1m98rqs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=461&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A modern hand X-ray.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/golanlevin/19300737031/">golanlevin/flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Though we use X-rays in some cutting-edge imaging techniques, X-ray imaging actually <a href="https://www.nde-ed.org/EducationResources/CommunityCollege/Radiography/Introduction/history.htm">dates back to the late 1800s</a>. The shadowlike contrast in X-ray images, or projections, shows the density of the material between the X-ray source and the data sensor. (In the past this was a piece of X-ray film, but today is usually a digital detector.) Dense objects, such as bones, absorb and scatter many more X-ray photons than skin, muscle or other soft tissue, so the soft tissues appear darker in the projections.</p>
<p>But then in the early 1970s, X-ray CAT (which stands for Computerized Axial Tomography) scans were developed. Rather than taking just a single X-ray image from one angle, a CAT system rotates the X-ray sources and detectors to collect many images from different angles – a process known as tomography. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/yTDgFW2UZFI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Computerized tomography imagery of a hand.</span></figcaption>
</figure>
<p>The difficulty is how to take all the data, from all those X-rays from so many different angles, and get a computer to properly assemble them into 3D images of, say, a person’s hand, as in the video above. That problem had a mathematical solution that had been studied by the <a href="https://thatsmaths.com/2013/03/07/ct-scans-and-the-radon-transform/">Austrian mathematician Johann Radon</a> in 1917 and rediscovered by the American physicist (and Tufts professor) <a href="http://www.nytimes.com/1998/05/09/us/allan-cormack-74-nobelist-who-helped-invent-cat-scan.html">Allan Cormack</a> in the 1960s. Using Cormack’s work, <a href="http://dx.doi.org/10.1148/radiol.2343042584">Godfrey Hounsfield</a>, an English electrical engineer, was the first to demonstrate a working CAT scanner in 1971. For their work on CAT, Cormack and Hounsfield received the <a href="http://www.nobelprize.org/nobel_prizes/medicine/laureates/1979/">1979 Nobel Prize in Medicine</a>. </p>
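<p>The whole pipeline – many projections in, one image out – can be sketched in a few lines of Python. The example below assumes a recent version of the scikit-image library (its <code>radon</code> and <code>iradon</code> functions, and the standard Shepp-Logan test image); <code>iradon</code> performs the filtered back projection that descends from the Radon/Cormack mathematics:</p>
<pre><code># Minimal sketch of tomographic reconstruction, assuming scikit-image
# is installed. radon() simulates X-ray projections from many angles;
# iradon() inverts the stack of projections by filtered back projection.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()                # standard CT test image
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(image, theta=angles)        # one column per viewing angle
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
rmse = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"root-mean-square reconstruction error: {rmse:.4f}")
</code></pre>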
<h2>Extending the role of computers</h2>
<p>Until quite recently, these processing methods had more or less been constant since the 1970s and 1980s. Today, additional medical needs – and more powerful computers – are driving big changes. There is increased interest in CT systems that <a href="http://www.fda.gov/Radiation-EmittingProducts/RadiationEmittingProductsandProcedures/MedicalImaging/MedicalX-Rays/ucm115329.htm">minimize X-ray exposure</a>, yielding high-quality images from fewer projections. In addition, certain uses, such as breast imaging, encounter physical constraints on how much access the imager can have to the body part. This requires scanning from only a very limited set of angles around the subject. These situations have led to research into <a href="http://www.massgeneral.org/imaging/services/3D_mammography_tomosynthesis.aspx">what are called “tomosynthesis” systems</a> – in which limited data are interpreted by computers to form fuller images. </p>
<p>Similar problems arise, for example, in the context of imaging the ground to see what objects – such as pollutants, land mines or oil deposits – are hidden beneath our feet. In many cases, all we can do is <a href="http://physicsworld.com/cws/article/news/2016/feb/16/ground-penetrating-radar-boosts-asparagus-production">send signals from the surface</a>, or drill a few holes to take sampling measurements. <a href="https://www.ncjrs.gov/school/ch3c_5.html">Security scanning in airports</a> is constrained by cost and time, so those X-ray systems can take only a few images.</p>
<p>In these and a host of other fields, we are faced with less overall data, which means the Cormack-Hounsfield mathematics can’t work properly to form images. The effort to solve these problems has led to the rise of a new area of research, “computational sensing,” in which sensors, physics and computers are being brought together in new ways. </p>
<p>Sometimes this involves applying more computer processing power to the same data. In other cases, hardware engineers designing the equipment <a href="https://www.ecse.rpi.edu/homepages/saulnier/eit/eit.html">work closely with the mathematicians</a> figuring out how best to analyze the data provided. Together these systems can provide new capabilities that hold the promise of major changes in many research areas.</p>
<h2>New scanning capabilities</h2>
<p>One example of this potential is in bio-optics, the use of light to look deep within the human body. While visible light does not penetrate far into tissue, anyone who has shone a red laser pointer into their finger knows that red light does in fact make it through at least a couple of centimeters. Infrared light penetrates even farther into human tissue. This capability opens up entirely new ways to image the body than X-ray, MRI or ultrasound.</p>
<p>Again, it takes computing power to move from those measurements to a unified 3D portrayal of the body part being scanned. But the calculations are much more difficult, because the way in which light interacts with tissue is far more complex than it is for X-rays.</p>
<p>As a result we need to use a different method from that pioneered by Cormack, in which X-ray data are, more or less, directly turned into images of the body’s density. Now we construct an algorithm that follows a process over and over, feeding the result from one iteration back as input to the next. </p>
<p>The process starts by having the computer guess an image of the optical properties of the body area being scanned. Then it uses a computer model to calculate what data from the scanner would yield that image. Perhaps unsurprisingly, the initial guess is generally not so good: the calculated data don’t match the actual scans. </p>
<p>When that happens, the computer goes back and refines its guess of the image, recalculates the data associated with this guess and again compares with the actual scan results. While the algorithm guarantees that the match will be better, it is still likely that there will be room for improvement. So the process continues, and the computer generates a new and more improved guess. </p>
<p>Over time, its guesses get better and better: it creates output that looks more and more like the data collected by the actual scanner. Once this match is close enough, the algorithm provides the final image as a result for examination by the doctor or other professional.</p>
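<p>In skeleton form, this guess-and-refine loop is an iterative solver. Below is a minimal NumPy sketch under a deliberately simplified assumption – a random linear forward model standing in for the scanner physics – whereas real light-tissue models are nonlinear and far more expensive to compute:</p>
<pre><code># Minimal sketch of iterative reconstruction: simulate the data for the
# current guess, compare with the measurement, refine the guess, repeat.
# A random linear forward model stands in for the real (nonlinear) physics.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 100))    # toy forward model: image -> scanner data
x_true = rng.normal(size=100)      # "true" optical properties of the tissue
b = A @ x_true                     # the data the scanner would record

x = np.zeros(100)                  # initial guess of the image
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(500):
    residual = A @ x - b           # simulated data minus measured data
    x = x - step * A.T @ residual  # nudge the guess to shrink the mismatch
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
</code></pre>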
<p>The new frontiers of this type of research are still being explored. In the last 15 years or so, researchers – including my Tufts colleague <a href="https://ase.tufts.edu/biomedical/research/Fantini/">Professor Sergio Fantini</a> – have explored many potential uses of infrared light, such as <a href="http://dx.doi.org/10.1007/s10549-013-2802-9">detecting breast cancer</a>, functional brain imaging and <a href="http://dx.doi.org/10.1016/j.bbapap.2013.01.025">drug discovery</a>. Combining “big data” and “big physics” requires a close collaboration among electrical and biomedical engineers as well as mathematicians and doctors. As we’re able to develop these techniques – both mathematical and technological – we’re hoping to make major advances in the coming years, improving how we all live.</p><img src="https://counter.theconversation.com/content/57406/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Eric Miller receives funding from NSF, NIH, DHS. </span></em></p>Pairing more powerful computers with increasingly sensitive scanners can yield many benefits in medicine and other fields.Eric Miller, Professor and Chair of Electrical and Computer Engineering, Adjunct Professor of Computer Science, Adjunct Professor of Biomedical Engineering, Tufts UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/560072016-03-15T19:08:01Z2016-03-15T19:08:01ZAntibiotics for colds, x-rays for bronchitis, internal exams with pap tests – the latest list of tests to question<figure><img src="https://images.theconversation.com/files/115059/original/image-20160315-17766-1oucbfb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Among the 61 recommendations is: 'Don’t order chest x-rays in patients with uncomplicated acute bronchitis'.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/">Monkey Business Images/Shutterstock</a></span></figcaption></figure><p>The problem of questionable treatment and tests which may provide little or no benefit, yet may cause harm, is ubiquitous across all areas of health care. </p>
<p>Harm doesn’t just come in the form of side-effects or further testing. The “cons” of any treatment or test also include the costs, which can be financial, emotional, and the costs of the individual’s time. </p>
<p>The <a href="https://theconversation.com/less-is-the-new-more-choosing-medical-tests-and-treatments-wisely-40756">Choosing Wisely campaign</a> encourages patients and clinicians to question unnecessary treatments. First launched in America in 2012 and in Australia last year, the clinician-led initiative collates lists of tests, treatments and procedures that provide little or no value, and which may cause harm. </p>
<p>Today the Australian organisers, NPS MedicineWise, will release an additional 61 recommendations. These include:</p>
<ul>
<li><p>Don’t order chest x-rays in patients with uncomplicated acute bronchitis (<em>Routine chest x-rays don’t improve outcomes and may lead to false positives, further investigations and unnecessary radiation</em>)</p></li>
<li><p>Avoid prescribing antibiotics for upper respiratory tract infections, also known as the common cold (<em>Most uncomplicated upper respiratory infections are viral and antibiotic therapy isn’t suitable</em>)</p></li>
<li><p>Don’t initiate medicines to prevent disease in patients who have a limited life expectancy (<em>There is limited evidence to support the use of many medicines in frail, elderly patients who are more susceptible to the side-effects of medicines</em>) </p></li>
<li><p>Don’t routinely do a pelvic examination with a pap smear (<em>The procedure can cause pain, fear, anxiety and embarrassment and can lead to unnecessary, invasive and potentially harmful diagnostic procedures</em>)</p></li>
<li><p>Don’t request imaging for patients with non-specific low back pain (<em>Trials have consistently shown there is no advantage from routine imaging of non-specific low back pain and there are potential harms</em>).</p></li>
</ul>
<p>The need for informed conversations about potentially unnecessary treatments, tests and procedures is certainly not restricted to only the medical professions.</p>
<p>As well as the medical colleges and societies involved, it is encouraging that in this second release, organisations which represent nurses and allied health professionals such as physiotherapists and hospital pharmacists have participated. Hopefully in future releases, we will see more of Australia’s allied health organisations becoming involved in Choosing Wisely. </p>
<p>As with the <a href="https://theconversation.com/less-is-the-new-more-choosing-medical-tests-and-treatments-wisely-40756">2015 lists</a>, most of the recommendations are about doing less. Only a few are about encouraging a particular action to be done. An example is having an earlier conversation about prognosis, wishes, values and end of life in patients with advanced disease. </p>
<p>This may be because we clinicians are guilty more often of doing too much than too little. </p>
<p>This is counter-intuitive to most of us. Somehow, the thought that a clinician might have not done enough feels more reprehensible than their having done too much. And this is not just what patients might think – it’s probably true of many clinicians as well. </p>
<p>The memories of many junior hospital doctors probably include over-ordering tests (“just in case”, but also to demonstrate their knowledge of rare diagnostic possibilities) to avoid their seniors criticising them during an upcoming ward round. </p>
<p>The realisation that patients can actually be harmed more by receiving unnecessary tests, procedures, and treatments, than by not having received them has been painfully slow. </p>
<p>The Choosing Wisely campaign helps to signal a very important departure from normal business for clinicians and their organisations – thinking about <em>not</em> doing things. </p>
<p>While one of the drivers behind the Choosing Wisely campaign is reducing the tests and treatments people receive that provide little or no benefit, another is minimising the harm that can result from them. </p>
<p>For many of the recommendations, the harm is one that affects the individual. Quite a few of the recommendations are about not doing medical imaging and screening (such as not requesting imaging for non-specific low back pain). These typify individual harm – for example, unnecessary radiation exposure increases the risk of cancer. </p>
<p>Then there is a cluster of recommendations about the wise use of antibiotics. Antibiotic use has the interesting peculiarity of potentially causing harm to both individual patients and the community. We know that antibiotics – which can be life-saving for some serious infections such as meningitis and pneumonia – have little benefits to the common coughs and colds that make up a huge proportion of general practitioner visits. On balance, these benefits are of the same order as the common harms they cause (such as vaginal or oral thrush, diarrhoea, rashes, and so on). </p>
<p>But another important harm is the risk of inducing resistance. Antibiotic resistance – when bacteria adapt and antibiotics fail – is a deepening crisis that is <a href="https://theconversation.com/antibiotic-resistance-sorry-not-my-problem-44011">already killing thousands directly</a> and may soon disrupt many routine clinical procedures. </p>
<p>Antibiotic resistance is a direct result of antibiotic use. The more antibiotics are used when they are not needed, the less likely they are to be effective when needed for a bacterial infection. </p>
<p>So while the unnecessary use of antibiotics has potential harms to the individual, it can also contribute to the restricted use of antibiotics by others in the community who do need it. </p>
<p>For all of the recommendations, there is the harm to society that occurs from the wasted resources and cost of providing unnecessary tests and treatments, often at the expense of more effective uses of precious health care dollars. </p>
<p>But the premise behind Choosing Wisely is not about cost-cutting. It is one of the few existing processes for dealing with the one-way ratchet caused by more treatments and tests being generated every year, all of which increases the amount of things that can – but not necessarily should – be provided to patients. </p>
<p>No test or treatment should be provided to a patient <a href="https://www.mja.com.au/journal/2014/201/1/shared-decision-making-what-do-clinicians-need-know-and-why-should-they-bother">without a conversation</a> between the patient and clinician, during which the options (including the option of doing nothing), their benefits and harms, and the patient’s preferences and values <a href="http://www.choosingwisely.org.au/5-questions-to-ask-your-doctor">are discussed</a>.</p><img src="https://counter.theconversation.com/content/56007/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tammy Hoffmann has received funding from the NHMRC and ACSQHC for research about shared decision making.</span></em></p><p class="fine-print"><em><span>Chris Del Mar receives funding from the NHMRC and ACSQHC for related research </span></em></p>Harm doesn’t just come in the form of side-effects or further testing. The “cons” of any treatment also include the costs, which can be financial, emotional, and the costs of the individual’s time.Tammy Hoffmann, Professor of Clinical Epidemiology, Bond UniversityChris Del Mar, Professor of Public Health, Bond UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/519482015-12-07T18:28:48Z2015-12-07T18:28:48ZThe amazing camera that can see around corners<figure><img src="https://images.theconversation.com/files/104716/original/image-20151207-3133-79fjh0.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's just your average corner, but a far from average camera.</span> <span class="attribution"><span class="source">Gariepy et al./Heriott-Watt</span></span></figcaption></figure><p>How can a person see around a blind corner? One answer is to develop X-ray vision. A more mundane approach is to use a mirror. But if neither are an option, a group of scientists led by Genevieve Gariepy have developed a state-of-the-art detector which, with some clever data processing techniques, can turn walls and floors into a “virtual mirror”, giving the power to locate and track moving objects out of direct line of sight. </p>
<p>The shiny surface of a mirror works by reflecting scattered light from an object at a well-defined angle towards your eye. Because light scattered from different points on the object is reflected at the same angle, your eye sees a clear image of the object. In contrast, a non-reflective surface scatters light randomly in all directions, and creates no clear image. </p>
<p>However, as the researchers at Heriot-Watt University and the University of Edinburgh recognised, there is a way to tease out information on the object even from apparently random scattered light. Their method, <a href="http://www.nature.com/articles/doi:10.1038/nphoton.2015.234">published in Nature Photonics</a>, relies on laser range-finding technology, which measures the distance to an object based on the time it takes a pulse of light to travel to the object, scatter, and travel back to a detector.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=415&fit=crop&dpr=1 600w, https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=415&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=415&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=521&fit=crop&dpr=1 754w, https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=521&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/104700/original/image-20151207-3125-1ikdij2.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=521&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Laser light fired at the floor is scattered in a spherical wave. It reflects from the hidden object and makes its way back to the SPAD detector, where the noisy data is processed to reveal the hidden object.</span>
<span class="attribution"><a class="source" href="http://nature.com/articles/doi:10.1038/nphoton.2015.234">Gariepy et al./Nature Photonics</a></span>
</figcaption>
</figure>
<p>In principle, the measurement is quite simple. A laser pulse is bounced off the floor and scatters in all directions. A small fraction of the laser light strikes the object, and the back-scattered light is recorded on a patch of floor – the “virtual mirror” – next to the spot the laser strikes. Because the speed of light is known and constant, by measuring the time interval between the start of the laser pulse and the scattered light reaching the patch of floor, the position of the object can be triangulated. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Pi7iCUSXctY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>However, the devil is in the detail. The timing measurement needs to be accurate to within around 50 thousand-billionths of a second (5x10<sup>-11</sup> seconds, or 50 picoseconds), and the light levels that must be detected are extremely low. Overcoming both of these obstacles requires some serious laser and detector technology. The laser pulses used for the timing measurement are just ten femtoseconds (10<sup>-14</sup> seconds) long, and each pixel in the ultra-sensitive “camera” (known as a single-photon avalanche diode array, or <a href="http://www.micro-photon-devices.com/Products/Photon-Counters">SPAD</a>) used to image the patch of floor is essentially an ultrafast stopwatch that records the arrival time of the scattered light pulse to within a few hundred-billionths of a second.</p>
<p>The complications do not end there. Light scattered from the object of interest reaches the virtual mirror of the floor, but so does light scattered from every other object in the vicinity. The success of this technique requires that the two be separated, the “signal” of the hidden object from the background noise of everything else. </p>
<p>This is achieved by using the fact that the hidden object the device is trying to detect is moving, while other nearby objects are not. Because the moving object generates a signal in the virtual mirror that changes with time, it can be filtered from the constant background signal produced by the stationary objects of the surroundings.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=407&fit=crop&dpr=1 600w, https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=407&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=407&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=511&fit=crop&dpr=1 754w, https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=511&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/104705/original/image-20151207-3108-131s1o6.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=511&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A histogram is created of photon arrival times for every pixel, which must then be processed to remove background objects from the target object.</span>
<span class="attribution"><a class="source" href="http://nature.com/articles/doi:10.1038/nphoton.2015.234">Gariepy et al./Nature Photonics</a></span>
</figcaption>
</figure>
<p>The final complication is that the timing measurement for scattered light arriving at a single point on the virtual mirror and recorded by a single pixel in the detector unfortunately doesn’t locate the object to a single unique position. A similar time delay could result from objects located at any number of different positions located an appropriate distance from the virtual mirror. </p>
<p>While the timing data from a single pixel only locates the object to a range of positions, the range is different for each pixel. However, it turns out that there is only a single position at which the timing condition is satisfied simultaneously for all pixels, and this allows the object to be unambiguously identified from the background signals.</p>
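<p>A stripped-down two-dimensional illustration of that idea, with invented geometry (this is not the authors’ reconstruction code): predict the arrival time at every pixel for each candidate position, and keep the position that matches all pixels at once.</p>
<pre><code># Toy 2D time-of-flight localisation. The laser spot, pixel positions,
# object position and noise level are all invented for illustration.
import numpy as np

C = 0.3                                  # speed of light, metres per nanosecond
laser_spot = np.array([0.0, 0.0])        # where the pulse strikes the floor
pixels = np.array([[0.1 * i, 0.0] for i in range(1, 6)])  # "virtual mirror"
hidden = np.array([1.3, 0.8])            # object position we want to recover

def arrival_times(obj):
    """Travel time of the pulse: laser spot -> object -> each pixel."""
    out = np.linalg.norm(obj - laser_spot)
    back = np.linalg.norm(pixels - obj, axis=1)
    return (out + back) / C

measured = arrival_times(hidden) + np.random.default_rng(1).normal(0, 1e-3, 5)

# Each pixel alone is consistent with a whole curve of positions; only
# the true position fits the timing data from all pixels simultaneously.
xs, ys = np.meshgrid(np.linspace(0, 2, 201), np.linspace(0, 2, 201))
grid = np.stack([xs.ravel(), ys.ravel()], axis=1)
errors = [np.sum((arrival_times(p) - measured) ** 2) for p in grid]
print("estimated position:", grid[np.argmin(errors)])  # close to [1.3, 0.8]
</code></pre>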
<p>The prototype camera system allows the object’s position behind the wall to be localised to within a centimetre or two, and by making measurements every few seconds the camera can also detect the speed of a moving object. In contrast to previous methods, which required long data processing times, the new method can track moving objects in real time. At present it is limited to locating objects up to 60cm away from the virtual mirror on the floor, but this range should improve to around ten metres, and the technique should eventually recover the shapes of hidden objects as well as their positions.</p>
<p>So while it’s not quite as promising, or as convenient, as the science-fiction powers of X-ray vision, the study’s authors note that the technology has interesting future applications in areas such as surveillance – to detect a moving person behind a wall, for example – or in car safety systems to detect incoming vehicles approaching around corners.</p><img src="https://counter.theconversation.com/content/51948/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Claire Vallance receives funding from EPSRC, STFC, ERC, Royal Society, University of Oxford, Leverhulme Trust.</span></em></p>To see around a corner, all you need is a camera that can detect light at 100,000 billionths of a second.Claire Vallance, Professor of Physical Chemistry, University of OxfordLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/450722015-07-27T15:00:25Z2015-07-27T15:00:25ZWe transformed living cells into tiny lasers<figure><img src="https://images.theconversation.com/files/89538/original/image-20150723-22821-ernjsq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Green lasers glowing within cells.</span> <span class="attribution"><span class="source">Matjaž Humar and Seok Hyun Yun</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>In the last few decades, lasers have become an important part of our lives, with applications ranging from laser pointers and CD players to medical and research uses. Lasers typically have a very well-defined direction of propagation and very narrow and well-defined emission color. We usually imagine a laser as an electrical device we can hold in our hands or as a big box in the middle of a research laboratory.</p>
<p>Fluorescent dyes have also become commonplace, routinely used in research and diagnostics to identify specific cell and tissue types. Illuminating a fluorescent dye makes it emit light with a distinctive color. The color and intensity are used as a measure, for example, of concentrations of various chemical substances such as DNA and proteins, or to tag cells. The intrinsic disadvantage of fluorescent dyes is that only a few tens of different colors can be distinguished. </p>
<p>In a combination of the two technologies, researchers know that if a dye is placed in an optical cavity – a device that confines light, such as a pair of mirrors – they can create a laser.</p>
<p>Taking this a step further, our research, described in the journal Nature Photonics, shows we can create a miniature laser that can <a href="http://nature.com/articles/doi:10.1038/nphoton.2015.129">emit light inside a single live cell</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/SHbXDlnLIYA?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Tiny, tiny lasers</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=583&fit=crop&dpr=1 600w, https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=583&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=583&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=732&fit=crop&dpr=1 754w, https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=732&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/89689/original/image-20150724-8478-c0ljzm.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=732&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Green laser bead in a cell.</span>
<span class="attribution"><span class="source">Matjaž Humar and Seok Hyun Yun</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>We made our lasers out of solid polystyrene beads ten times smaller than the diameter of a human hair. The beads contain a fluorescent dye, and the surface of the bead confines light, creating an optical cavity. We fed these laser beads to live cells in culture; the cells engulf the lasers within a few hours. After that, we can operate the lasers by illuminating them with external light, without any harm to the cells.</p>
<p>Then we capture the light emitted from the cells via a spectrometer and analyze the spectrum. The lasers can act as very sensitive sensors, enabling us to better understand cellular processes. For example, we measured the change in the refractive index – the way light travels through the cell – while varying the concentration of salt in the medium surrounding the cells. The refractive index is directly related to the concentration of chemical constituents within the cells, such as DNA, proteins and lipids.</p>
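<p>The physics behind that sensitivity can be sketched with the standard whispering-gallery resonance condition: light circulating inside a bead of radius <em>r</em> and refractive index <em>n</em> lases at wavelengths where a whole number <em>m</em> of wavelengths fits around the circumference (2π<em>rn</em> ≈ <em>m</em>λ), so the emission lines shift in step with refractive-index changes. A small Python illustration with invented bead parameters:</p>
<pre><code># Whispering-gallery sketch: a mode satisfies 2*pi*r*n = m*lambda (to
# first order). Bead radius, index and index change are illustrative.
import numpy as np

r = 5e-6                                    # 5 micrometre bead radius
n = 1.59                                    # polystyrene, roughly
m = round(2 * np.pi * r * n / 540e-9)       # mode number nearest 540 nm
lam_before = 2 * np.pi * r * n / m          # lasing wavelength, metres
lam_after = 2 * np.pi * r * (n + 1e-4) / m  # same mode, index up by 1e-4
print("shift:", (lam_after - lam_before) * 1e9, "nm")   # about 0.03 nm
</code></pre>
<p>Even a tiny index change of one part in ten thousand produces a measurable spectral shift, which is what makes the beads such sensitive probes of a cell’s contents.</p>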
<p>Further, lasers can be used for cell tagging. Each laser within a cell emits light with a slightly different fingerprint that can be easily detected and used as a bar code to tag the cell. Since a laser has a very narrow spectral emission, a huge number of unique bar codes can be produced, something that was impossible before. </p>
<p>With careful laser design, up to a trillion cells (1,000,000,000,000) could be uniquely tagged. That’s comparable to the total number of cells in the human body. So in principle, it could be possible to individually tag and track every single cell in the human body. This is a huge leap from cell-tagging methods demonstrated until now, which can tag at most a few hundred cells. So far we’ve tagged cells only in Petri dishes, but there’s no reason it shouldn’t also work for cells within a living body.</p>
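<p>The arithmetic behind such large numbers comes from combinations of narrow spectral lines. The counts below are hypothetical, chosen purely to show how quickly the combinatorics grows:</p>
<pre><code># Illustrative barcode combinatorics. Both counts are hypothetical;
# they are not the parameters reported in the study.
from math import comb

distinguishable_lines = 400  # narrow laser peaks a spectrometer can separate
beads_per_cell = 6           # laser beads carried by a single cell
print(f"{comb(distinguishable_lines, beads_per_cell):.2e} unique barcodes")
# about 5e12 - several trillion distinct combinations
</code></pre>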
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/89691/original/image-20150724-8451-xrye2l.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Green cells with their blue nuclei were injected with red oil droplets that act as deformable lasers.</span>
<span class="attribution"><span class="source">Matjaž Humar and Seok Hyun Yun</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Alternative materials for cellular lasers</h2>
<p>Instead of a solid bead, we also used a droplet of oil as a laser inside cells. Using a micropipette, we injected a tiny drop of oil containing fluorescent dyes into a cell. In contrast to the solid bead, forces acting inside the cells can deform the droplets. By analyzing the light emitted by a droplet laser, we can measure that deformation and calculate the force acting on the droplet. It’s a way to get a very precise picture of the kinds of mechanical forces exerted within cells by processes such as cellular migration and division.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/89693/original/image-20150724-8457-1qsc6cy.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Yellow lipid cells within subcutaneous fat tissue, which can be used as natural lasers.</span>
<span class="attribution"><span class="source">Matjaž Humar and Seok Hyun Yun</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Finally, we realized that fat cells already contain lipid droplets that can work as natural lasers. These cells don’t need to swallow beads or be injected with droplets; they just need to be supplied with a nontoxic fluorescent dye. That means each of us already has millions of lasers inside our fat tissue, just waiting to be activated to produce laser light. Next time you’re thinking about trimming down, you could reconceptualize your body fat as a huge number of tiny lasers.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=547&fit=crop&dpr=1 600w, https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=547&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=547&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=687&fit=crop&dpr=1 754w, https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=687&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/89696/original/image-20150724-8457-11ux4ju.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=687&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Inserting an optical fibre into a piece of pig’s skin to excite and extract the laser light generated by subcutaneous fat cells.</span>
<span class="attribution"><span class="source">Matjaž Humar and Seok Hyun Yun</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>Our new cell laser technology will help us understand cellular processes and improve medical diagnosis and therapies. Cell lasers could eventually provide remote sensing inside the human body without the need for sample collection. A cell is a smart machine, equipped with a computer with “DNA Inside.” Specialized cells, such as immune cells, can find diseased tissue and sites of inflammation, carrying the laser to the target for laser-based diagnosis and therapy. Imagine cell lasers helping doctors determine what a suspicious lump is made of, rather than requiring a biopsy. Cell lasers also hold promise as a way of delivering laser light for therapies – for example, to activate a photosensitive drug at the target to kill microbes or cancerous cells.</p><img src="https://counter.theconversation.com/content/45072/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Matjaž Humar receives funding from Marie Curie International Outgoing Fellowship within the 7th European Community Framework Programme.</span></em></p><p class="fine-print"><em><span>Seok-Hyun Yun receives funding from National Science Foundation and National Institutes of Health.</span></em></p>Using fluorescent dye, researchers figured out how to turn cells into lasers – with applications for cell tagging and tracking as well as medical diagnoses and therapies.Matjaž Humar, Research Fellow in Dermatology, Harvard UniversitySeok-Hyun Yun, Associate Professor of Dermatology, Harvard UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/449412015-07-22T04:07:07Z2015-07-22T04:07:07ZHow drones can deliver tangible benefits to ordinary people in Africa<figure><img src="https://images.theconversation.com/files/89206/original/image-20150721-24275-zl8fa0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Unmanned aircraft, better known as drones, could have positive spin-offs for the African continent. </span> <span class="attribution"><span class="source">Reuters/Edward Echwalu</span></span></figcaption></figure><p>They are called unmanned aerial vehicles but are better known as <a href="http://dronewars.net/aboutdrone/">drones</a>. These are small aerial vehicles with fixed wings or small rotors, are usually powered with batteries, and equipped with a high resolution camera.</p>
<p>Drones range in cost from <a href="http://www.cnbc.com/2014/02/13/how-this-99-3-d-printed-drone-could-change-the-toy-industry.html">$99</a> to tens of thousands of dollars. But they are truly a disruptive technology in that they can do what piloted airplanes can but in cheaper, better, and – in many cases – more efficient ways.</p>
<p>In less than five years we will see unmanned aerial vehicles flying a myriad of beneficial missions. They could be a game-changer on the African continent.</p>
<h2>Practical applications</h2>
<p>As part of ongoing experimentation at the University of <a href="http://uas-test.umd.edu/">Maryland</a>, we have been flying small unmanned aerial vehicles in southern Africa for more than two years. We have found that they can be used in a number of practical ways in medicine, agriculture, tourism and environmental protection. </p>
<p>Drones have already shown their effectiveness in combating <a href="http://www.slate.com/blogs/wild_things/2015/01/28/drones_for_wildlife_conservation_rangers_uavs_and_math_protect_elephants.html">rhino</a> and elephant poaching. Equipped with thermal imaging cameras and virtually silent, drones can spot animals and poachers in the bush at night.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=398&fit=crop&dpr=1 600w, https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=398&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=398&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=500&fit=crop&dpr=1 754w, https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=500&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/89208/original/image-20150721-24304-ogglpb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=500&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Is this Sony drone the next must-have item that could make a difference for the good of humankind.</span>
<span class="attribution"><span class="source">Reuters/Fabrizio Bensch</span></span>
</figcaption>
</figure>
<p>But drones can be put to a range of other uses. They can be used for precision <a href="http://fortune.com/2015/05/18/drone-agriculture/">agriculture</a> to help farmers decide when and where to apply fertiliser or irrigate crops. Drones are also great for monitoring water levels in water holes.</p>
<p>We have found that unmanned aerial vehicles can be used to monitor fence lines so that instead of having individuals driving for hours every day to inspect the integrity of a fence, a low-flying unmanned aerial vehicle can videotape and analyse the structure of a fence in under an hour. If there are breaks in the fence, the drone’s computer can geo-tag the exact location. </p>
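<p>The processing involved can be quite simple. As a toy illustration (the flight-log format and detection routine below are invented for this sketch, not our actual system), pairing each video frame with the drone’s GPS fix is enough to geo-tag suspected breaks:</p>
<pre><code># Toy sketch: geo-tag fence breaks by pairing video frames with GPS.
# The log format and scores are invented for illustration.

def detect_break(frame):
    """Placeholder for the image analysis that flags a damaged fence."""
    return frame["gap_score"] > 0.8

flight_log = [
    {"gap_score": 0.1, "lat": -24.0101, "lon": 31.4852},
    {"gap_score": 0.9, "lat": -24.0117, "lon": 31.4855},  # a break
    {"gap_score": 0.2, "lat": -24.0133, "lon": 31.4858},
]

breaks = [(f["lat"], f["lon"]) for f in flight_log if detect_break(f)]
print("suspected fence breaks at:", breaks)
</code></pre>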
<p>In one park location we used an unmanned aerial vehicle instead of having two rangers drive the entire length of the fence. This saved 51 litres of fuel a day. Calculated over a year, the fuel savings paid for the drone.</p>
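<p>A rough check of that claim (the fuel price below is my assumption; only the 51 litres a day comes from our logs):</p>
<pre><code># Back-of-the-envelope check of the fuel-savings claim.
litres_saved_per_day = 51     # from our fence-patrol comparison
fuel_price = 1.20             # assumed price in US$ per litre
days_per_year = 365

annual_saving = litres_saved_per_day * fuel_price * days_per_year
print(f"annual fuel saving: ~${annual_saving:,.0f}")  # roughly $22,000

# At that rate, even a professional-grade drone costing in the low
# tens of thousands of dollars pays for itself within about a year.
</code></pre>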
<p>An unmanned aerial vehicle can also be dispatched when smoke is sighted in the sky. It can provide live video of a possible fire within minutes, when it could take rangers several hours to drive to the location. This use has proved exceptionally powerful, both by day and by night.</p>
<p>There are also many ways that drones can be used to provide benefits to eco-tourists visiting lodges. Rather than driving around for hours looking for animals, the unmanned aerial vehicles can be dispatched to fly in front of a safari vehicle to scan the area for sightings. Happy tourists are likely to recommend a lodge to future visitors.</p>
<p>Finally, we are working on a project to use longer range unmanned aerial vehicles for flights of up to 30 kilometres to deliver medicines to remote villages. High-value but lightweight medicines are the perfect items for delivery by drones. This could be extraordinarily important to areas that may be cut off during the rainy season. </p>
<h2>Barriers to approval</h2>
<p>As in many places around the world, the development of unmanned aerial vehicle technology in Africa has unfortunately outpaced the regulatory capability of national governments. As a result it is very difficult to obtain official permission to fly a drone – for any reason – in any African country. For example, the <a href="http://motherboard.vice.com/read/african-nations-are-banning-the-drones-that-could-stop-poachers">Kenyan government</a> has refused to grant permission to fly unmanned aerial vehicles in the highly threatened Tsavo West National Park. </p>
<p>Where we have tried to fly unmanned aerial vehicles, we have had to get permission from the host nation’s civil aviation authority, the national and local police, the military – usually the air force – and the intelligence community. The ministry of environment or tourism also has to be approached, but is usually the easiest place to obtain clearance to fly.</p>
<p>These efforts require many visits to Africa, countless forms that must be filled in, dozens of meetings with government agencies, and – in most cases – a denial based on an ill-informed understanding of unmanned aerial vehicle technology. Months and years have been wasted while the needs of many remain unmet.</p>
<p>Drones are a tool, nothing more. When used appropriately, they are a valuable tool with tangible benefits. Thirty years ago people feared computers; now the cellphone is ubiquitous. Drones will soon become just as common, for the good of the continent.</p><img src="https://counter.theconversation.com/content/44941/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Thomas Snitch does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Drones have a positive role to play on the African continent, from delivering medicines to fighting poaching and even giving visitors to game parks a head-up on where to spot the game.Thomas Snitch, Visiting Professor in Advanced Computer Studies, University of MarylandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/392722015-04-07T10:16:29Z2015-04-07T10:16:29ZFluorescent proteins light up science by making the invisible visible<figure><img src="https://images.theconversation.com/files/77059/original/image-20150403-9342-6eqpw2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Multiple fluorescent proteins illuminate the cells in a human brainstem.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/wbur/2926259123">Jeff Lichtman/Harvard University</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>When you look up at the blue sky, where are the stars that you see at night? They’re there but we can’t see them. A firefly flitting across a field is invisible to us during the day, but at night we can easily spot its flashes. Similarly, proteins, viruses, parasites and bacteria inside living cells can’t be seen by the naked eye under normal conditions. But a technique using a fluorescent protein can light up cells’ molecular machinations like a microscopic flashlight.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=911&fit=crop&dpr=1 600w, https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=911&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=911&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1144&fit=crop&dpr=1 754w, https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1144&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/77052/original/image-20150403-9306-19yekgr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1144&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The crystal jellyfish has about 300 photo organs on the bottom edge of the jellyfish’s umbrella.</span>
<span class="attribution"><span class="source">Courtesy Steven Haddock – http://biolum.eemb.ucsb.edu</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The first fluorescent protein found in nature comes from the crystal jellyfish, <em>Aequorea victoria</em>, where it is responsible for the green light emitted by its photo organs. It’s called green fluorescent protein (GFP). We still don’t know why these jellyfish glow. </p>
<p>Fluorescent proteins absorb light with short wavelengths, such as blue light, and immediately re-emit light of a longer wavelength, and therefore a different color, such as green. In <em>Aequorea victoria</em>, a protein named aequorin produces blue light, which GFP converts into the green light emitted by the jellyfish’s photo organs. This visibility under standard conditions is extremely rare; most other organisms have fluorescent proteins that are only visible if they are illuminated by external blue light sources. </p>
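<p>The color shift is simple energy bookkeeping: a photon’s energy is inversely proportional to its wavelength, so the emitted green photon carries less energy than the absorbed blue one, with the difference dissipated inside the protein. A quick calculation, using approximate literature wavelengths for GFP, makes this concrete:</p>
<pre><code># Photon energy E = h*c / wavelength, converted to electronvolts.
# Wavelengths are approximate literature values for GFP.
PLANCK = 6.626e-34       # J*s
LIGHT_SPEED = 2.998e8    # m/s
EV = 1.602e-19           # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength, in eV."""
    return PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV

absorbed = photon_energy_ev(488)   # blue excitation light
emitted = photon_energy_ev(509)    # green GFP emission

print(f"absorbed: {absorbed:.2f} eV, emitted: {emitted:.2f} eV")
print(f"energy dissipated in the protein: {absorbed - emitted:.2f} eV")
</code></pre>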
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=647&fit=crop&dpr=1 600w, https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=647&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=647&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=812&fit=crop&dpr=1 754w, https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=812&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/77053/original/image-20150403-9312-e4gcxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=812&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Close up of a few of the photo organs.</span>
<span class="attribution"><span class="source">Courtesy Steven Haddock – http://biolum.eemb.ucsb.edu</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Since the discovery of the jellyfish’s green fluorescent protein, many other fluorescent proteins have been found in nature or engineered in the lab. We now have a spectrum of fluorescent colors available to us that make previously invisible biological structures and processes visible in blazing fluorescent glory. New applications relying on these colors are published regularly.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/77054/original/image-20150403-9335-1x9xk6w.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Petri dish with bacterial colonies expressing differently colored fluorescent proteins. These fluorescent proteins developed by Roger Tsien’s group are called the mFruits and have names like mHoneydew, mTomato, mCherry, mRaspberry, and mPlum.</span>
<span class="attribution"><a class="source" href="http://en.wikipedia.org/wiki/File:FPbeachTsien.jpg">Paul Steinbach and Roger Y. Tsien, University of California, San Diego</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<h2>Shining a light on imaging</h2>
<p>Fluorescent protein technology has led to many other interesting developments designed to improve imaging with these glowing molecules.</p>
<p>CaMPARI is one new technique, short for calcium-modulated photoactivatable ratiometric integrator. By exploiting the fact that calcium concentrations change when nerve cells send signals, CaMPARI is able to <a href="http://dx.doi.org/10.1126/science.1260922">light up all the neurons that have fired</a> in a living organism. The technique is based on a fluorescent protein called EOS, which changes its fluorescence from green to red. In fruit flies, zebrafish and mice, neurons genetically modified to produce CaMPARI fluoresce red if they have been active and green if they have been less active.</p>
<p>Before CaMPARI, all the fluorescent calcium indicators available temporarily lit up when the neuron fired. They couldn’t record the firing history of neurons or indicate whether a neuron had fired in the past. According to Loren Looger, one of the researchers who worked on the development of CaMPARI, “The most enabling thing about this technology may be that you don’t have to have your organism under a microscope during your experiment. So we can now <a href="http://www.hhmi.org/news/new-fluorescent-protein-permanently-marks-neurons-fire">visualize neural activity</a> in fly larvae crawling on a plate or fish swimming in a dish.”</p>
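<p>The “ratiometric” part of the name hints at how simple the readout can be. The sketch below is my own illustration of the idea, not the published analysis pipeline: after photoconversion, the red-to-green fluorescence ratio of each neuron serves as a permanent record of how active it was. The numbers are invented.</p>
<pre><code>import numpy as np

# Toy per-neuron fluorescence measurements (arbitrary units).
green = np.array([900.0, 850.0, 120.0, 400.0])  # unconverted CaMPARI
red = np.array([30.0, 45.0, 980.0, 410.0])      # photoconverted CaMPARI

# Ratiometric activity index: 0 means quiet, 1 means highly active.
activity_index = red / (red + green)

for i, a in enumerate(activity_index):
    label = "active" if a > 0.5 else "less active"
    print(f"neuron {i}: index {a:.2f} ({label})")
</code></pre>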
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/c-NMfp13Uug?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The CLARITY technique removes opaque parts and makes the whole brain transparent.</span></figcaption>
</figure>
<h2>Expanding and transparent brains</h2>
<p>Even with the help of light emitted by fluorescent proteins, it’s difficult to image neurons tangled deep within the brain. Ed Boyden, a neuroscientist from MIT, has created a method to expand brains to make fluorescent neurons deep within the brain more visible. He uses acrylate, which forms a dense mesh that holds the brain’s components in place and swells in the presence of water, <a href="http://dx.doi.org/10.1126/science.1260088">thereby inflating the brain</a> equally by about 4.5 times in each direction. It’s a lot like a diaper expanding when it gets wet. Boyden thinks that this “expansion microscopy may provide a key tool for <a href="http://www.kurzweilai.net/expanding-the-brain-achieves-super-resolution-with-ordinary-confocal-microscopes">comprehensive, precise, circuit-wide, brain mapping</a>.”</p>
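<p>The arithmetic behind that figure is worth spelling out; the sketch below uses my own back-of-the-envelope numbers, not figures from the paper:</p>
<pre><code># What a 4.5x linear expansion buys you (rough numbers).
linear_expansion = 4.5

# Volume grows with the cube of the linear factor.
volume_expansion = linear_expansion ** 3
print(f"volume grows ~{volume_expansion:.0f}-fold")   # ~91-fold

# A conventional light microscope resolves features down to roughly
# 300 nm (the diffraction limit). Expanding the tissue first means
# that limit corresponds to finer detail in the original sample.
diffraction_limit_nm = 300
effective_resolution_nm = diffraction_limit_nm / linear_expansion
print(f"effective resolution: ~{effective_resolution_nm:.0f} nm")  # ~67 nm
</code></pre>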
<p>One of the reasons expansion microscopy is so useful is that the brain can be made see-through before it is blown up several sizes larger. In 2013 Karl Deisseroth and Viviana Gradinaru at Stanford published a method called CLARITY that removes opaque molecules such as fats and <a href="http://dx.doi.org/10.1038/nature12107">makes the brain transparent</a> without changing its shape. According to Thomas Insel, director of the US National Institute of Mental Health, “This is probably one of the most <a href="http://dx.doi.org/10.1038/496151a">important advances for doing neuroanatomy</a> in decades.” Since developing CLARITY for brains, Gradinaru has extended the method to other organs and even <a href="http://dx.doi.org/10.1016/j.cell.2014.07.017">an entire mouse</a>.</p>
<p>Both of these methods can be applied to brains that have been genetically modified with fluorescent proteins, thereby allowing the visualization of neurons deep within the brain.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=628&fit=crop&dpr=1 600w, https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=628&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=628&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=790&fit=crop&dpr=1 754w, https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=790&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/76988/original/image-20150402-9342-1lodej7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=790&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Mouse neurons labeled by GFPs.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/wellcomeimages/6880271634">Wellcome Images</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<p>In 2008, the three scientists responsible for taking GFP from the jellyfish and making it a common tool used in over a million experiments all over the world were awarded the <a href="http://www.nobelprize.org/nobel_prizes/chemistry/laureates/2008/">100th Nobel Prize in chemistry</a>. And in 2014 three other scientists were awarded the Nobel Prize for using fluorescent protein to <a href="http://www.nobelprize.org/nobel_prizes/chemistry/laureates/2014/">increase the resolution of light microscopes</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/76987/original/image-20150402-9306-wspqgb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"><em>E. coli</em> with GFPs glowing in their petri dishes.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/cdepaz/4979454151">Carlos de Paz</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<h2>Revolutionary and resilient</h2>
<p>I’ve been researching the photochemistry and photophysics of fluorescent proteins since they were first used in imaging technology in 1994. I’ve <a href="https://global.oup.com/academic/product/illuminating-disease-9780199362813?cc=us&lang=en&">written two books</a> on them, and still I’m stunned by the many different ways in which this fairly simple protein can be used. Perhaps I shouldn’t be surprised that plasmid DNA molecules coding for GFP have survived space flight – not inside the rocket, but on the outside, where they were exposed to temperatures of 1,800F (about 1,000C) and extreme friction. 53% of the DNA intentionally placed inside the screw heads in the TEXUS-49 rocket mission <a href="http://dx.doi.org/10.1371/journal.pone.0112979">expressed fully fluorescent GFP</a> when inserted into cells upon return to Earth.</p>
<p>Like stars at night, fluorescent proteins have been lighting up science for the last 20 years. And it won’t be long before they’re guiding surgeons to tumorous growths during surgery and allowing researchers to switch selected biomolecular processes on and off.</p><img src="https://counter.theconversation.com/content/39272/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Marc Zimmer receives funding from NIH.</span></em></p>First found in jellyfish, but now inserted into all kinds of organisms, GFPs illuminate biological structures and processes that researchers otherwise couldn’t see.Marc Zimmer, Professor of Chemistry and author of Illuminating Disease, Connecticut CollegeLicensed as Creative Commons – attribution, no derivatives.