tag:theconversation.com,2011:/ca-fr/topics/cameras-8443/articles
Cameras – La Conversation
2023-12-06T13:28:53Z
tag:theconversation.com,2011:article/213213
2023-12-06T13:28:53Z
2023-12-06T13:28:53Z
Your car might be watching you to keep you safe – at the expense of your privacy
<figure><img src="https://images.theconversation.com/files/563468/original/file-20231204-15-ei72ki.png?ixlib=rb-1.1.0&rect=0%2C0%2C1273%2C714&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Many modern cars watch occupants – a plus for safety but not so much for privacy.</span> <span class="attribution"><a class="source" href="https://www.lgnewsroom.com/2021/08/how-lgs-enhanced-in-vehicle-cabin-camera-makes-driving-and-riding-safer/">Courtesy LG</a></span></figcaption></figure><p>Depending on which late-model vehicle you own, your car <a href="https://www.consumerreports.org/cars/car-safety/driver-monitoring-systems-ford-gm-earn-points-in-cr-tests-a6530426322/">might be watching you</a> – literally and figuratively – as you drive down the road. It’s watching you with cameras that monitor the cabin and track where you’re looking, and with sensors that track your speed, lane position and rate of acceleration. </p>
<p>Your car uses this data to make your ride safe, comfortable and convenient. For example, the cameras <a href="https://www.wired.com/story/cars-that-watch-their-drivers-could-re-teach-the-world-to-drive/">can tell when you’ve been distracted</a> and need to bring your attention back to the road. They can also <a href="https://mycardoeswhat.org/safety-features/high-speed-alert/">identify when you are speeding</a> by verifying the speed limit from your GPS position or traffic signs along the road and warn you to slow down. Some carmakers are also beginning to incorporate similar features for convenience, such as unlocking your car by <a href="https://www.popsci.com/technology/genesis-gv60-facial-recognition/">scanning your face</a> <a href="https://www.techradar.com/news/fingerprint-scanners-are-now-being-used-to-unlock-and-start-your-car">or fingerprint</a>. Your car may also transmit some of this data to the manufacturer’s data centers, where the company uses it to improve your driving experience or provide you with personalized services.</p>
<p>In addition to providing these benefits, this data collection is a potential privacy nightmare. The information can reveal your identity, your habits when you’re in your car, how safely you drive, where you’ve been and where you regularly go. A report by the Mozilla Foundation, a nonprofit technology research and advocacy organization, found that <a href="https://foundation.mozilla.org/en/privacynotincluded/articles/its-official-cars-are-the-worst-product-category-we-have-ever-reviewed-for-privacy/">carmakers’ privacy policies are exceedingly lax</a>. The study identified cars as the “worst category of products for privacy that we have ever reviewed.” U.S. Sen. Ed Markey wrote a <a href="https://www.markey.senate.gov/imo/media/doc/senator_markey_letter_to_automakers_on_privacy.pdf">letter to U.S. automakers</a> on Nov. 30, 2023, asking a lengthy set of questions about their data practices.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/XKQ-uxTw11g?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Cars collect a lot of information about drivers and passengers.</span></figcaption>
</figure>
<p>Today’s smart cars present drivers with a trade-off between convenience and privacy, assuming drivers have the option of improving the data privacy of their cars. As a <a href="https://dblp.org/pid/172/0864.html">computer scientist who studies cybersecurity and resilience in transportation</a>, I see several technological routes to getting the best of both worlds: cars that make use of this collected data while also preserving users’ privacy.</p>
<h2>Driver data</h2>
<p>Today’s cars use a wide range of sensors to understand the environment, analyze the data and ensure the safety of passengers. For instance, cars are equipped with sensors that measure brake pedal position, vehicle speed, driver’s movements, surrounding vehicles and even traffic lights. The collected data is transmitted to the car’s electronic control units, the computers that operate the car’s many systems.</p>
<p>There are two types of sensors that <a href="https://doi.org/10.1016/j.jsr.2009.04.005">continuously monitor and predict a driver’s drowsiness</a>. The first is vehicle status monitoring sensors such as lane detection and steering wheel position tracking. This data is not directly tied to a specific person and is generally not considered personally identifiable information unless it is correlated with other data that identifies the driver.</p>
<p>The second type of sensor tracks drivers themselves. This category includes things like cameras to <a href="https://doi.org/10.1007/s11768-010-8043-0">track the driver’s eye movements to predict fatigue</a>. These sensors bear directly on the driver’s privacy because they collect personally identifiable information, such as images of the driver’s face.</p>
<h2>Protecting privacy</h2>
<p>There is a trade-off between the quality of the driving experience and the privacy of drivers, depending on the level of services and features. Some drivers may prefer to share their biometric data to facilitate accessing a car’s functions and automating a major part of their driving experience. Others may prefer to manually control the car’s systems, sharing less personally identifiable information or none at all.</p>
<p>At first glance, it seems the trade-off of privacy and driver comfort cannot be avoided. Car manufacturers tend to take measures to <a href="https://news.fiu.edu/2023/how-ai-will-protect-your-car-and-your-privacy">protect drivers’ data against data thieves</a>, but they collect a lot of data themselves. And as the Mozilla Foundation report showed, most car companies reserve the right to sell your data. Researchers are working on developing data analytics tools that better protect privacy and make progress on eliminating the trade-off.</p>
<p>For instance, over the past seven years, the concept of <a href="https://doi.org/10.48550/arXiv.1602.05629">federated machine learning</a> has attracted attention because it allows algorithms to learn from the data on your local device without copying the data to a central server. For example, Google’s Gboard keyboard benefits from federated learning to better guess the next word you are likely to type <a href="https://support.google.com/gboard/answer/12373137?hl=en#zippy=%2Cfederated-learning">without sharing your private data with a server</a>.</p>
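The mechanics can be sketched in a few lines of Python. In this toy example, every detail is invented for illustration (the two clients’ data, the one-parameter model and the learning rate), but the privacy-relevant pattern is real: each client runs gradient descent on data that never leaves it, and the server only averages the resulting model weights.

```python
def local_update(w, data, lr=0.1):
    """One gradient-descent step on a client's private data (model: y = w*x)."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(client_weights):
    """Server aggregates client models without ever seeing the raw data."""
    return sum(client_weights) / len(client_weights)

# Each client's samples stay on the client; only the updated weight leaves.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],   # client A's private samples (true slope ~2)
    [(1.0, 2.2), (3.0, 6.0)],   # client B's private samples
]
global_w = 0.0
for _ in range(50):  # communication rounds
    updates = [local_update(global_w, d) for d in clients]
    global_w = federated_average(updates)
# global_w ends up close to the slope 2.0 shared by both clients' data
```

A production system would average full neural-network weight tensors over thousands of devices, but the division of labor is the same: raw data local, model updates shared.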
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/zqv1eELa7fs?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Federated learning is a technique for training AI models that keeps people’s data private.</span></figcaption>
</figure>
<p>Research led by Ervin Moore, a Ph.D. student at Florida International University’s <a href="https://solidlab.network">Sustainability, Optimization, and Learning for InterDependent Networks laboratory</a>, and published in IEEE Internet of Things Journal explored the idea of using <a href="https://doi.org/10.1109/JIOT.2023.3313055">blockchain-based federated machine learning</a> to improve the privacy and security of users and their sensitive data. The technique could be used to protect drivers’ data. There are other techniques to preserve privacy as well, such as <a href="https://doi.org/10.1007/978-3-540-73538-0_4">location obfuscation</a>, which alters the user’s location data to prevent their location from being revealed.</p>
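The intuition behind location obfuscation can be shown with a short Python sketch. This is a toy version that simply reports a random point within a fixed radius of the true position; the mechanism in the cited paper is more sophisticated, and the coordinates and radius here are invented for illustration.

```python
import math
import random

def obfuscate_location(lat, lon, radius_m=500, seed=None):
    """Report a point perturbed by random noise within roughly `radius_m`
    meters of the true position, so the exact location is never shared.
    A toy illustration, not the scheme from the cited paper."""
    rng = random.Random(seed)
    # One degree of latitude is about 111,320 m; close enough for small
    # offsets (longitude scaling by cos(lat) is ignored in this sketch).
    max_deg = radius_m / 111_320
    angle = rng.uniform(0, 2 * math.pi)
    dist = max_deg * math.sqrt(rng.random())  # uniform over the disk
    return lat + dist * math.cos(angle), lon + dist * math.sin(angle)

# The car could report the blurred point instead of its true position.
blurred = obfuscate_location(40.0, -75.0, radius_m=500, seed=42)
```

The server still gets a usable, neighborhood-level location for services such as traffic or weather, but the driver's precise whereabouts stay private.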
<p>While there is still a trade-off between user privacy and quality of service, privacy-preserving data analytics techniques could pave the way for using data without leaking drivers’ and passengers’ personally identifiable information. This way, drivers could benefit from a wide range of modern cars’ services and features without paying the high cost of lost privacy.</p><img src="https://counter.theconversation.com/content/213213/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>M. Hadi Amini receives funding for researching privacy and security of transportation systems from the U.S. Department of Transportation. Opinions expressed represent the author's personal or professional opinions and do not represent or reflect the position of Florida International University.
His work on transportation system cybersecurity is in part supported by the National Center for Transportation Cybersecurity and Resiliency (TraCR). Any opinions, findings, conclusions, and recommendations expressed in this material are those of the author and do not necessarily reflect the views of TraCR or the U.S. Government generally. </span></em></p>
Your car’s safety technology takes you into account. But a lot of that technology helps car companies collect data about you. Researchers are working on closing the gap between safety and privacy.
M. Hadi Amini, Assistant Professor of Computing and Information Sciences, Florida International University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/204078
2023-06-09T12:29:14Z
2023-06-09T12:29:14Z
The US has a child labor problem – recalling an embarrassing past that Americans may think they’ve left behind
<figure><img src="https://images.theconversation.com/files/530946/original/file-20230608-2398-osoifr.jpg?ixlib=rb-1.1.0&rect=311%2C187%2C2993%2C2286&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Lewis Wickes Hine, 'A little spinner in a Georgia Cotton Mill, 1909.'</span> <span class="attribution"><span class="source">Gelatin silver print, 5 x 7 in. The Photography Collections, University of Maryland, Baltimore County (P545)</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure><p>At the University of Maryland, Baltimore County’s Special Collections, where I am head curator, we’ve recently completed <a href="https://umbc.edu/stories/preserving-the-photography-of-lewis-hine/">a major digitization and rehousing project</a> of our collection of over 5,400 photographs made by <a href="https://iphf.org/inductees/lewis-hine/">Lewis Wickes Hine</a> in the early 20th century.</p>
<p>Traveling the country with his camera, Hine captured the often oppressive working conditions of thousands of children – some as young as 3 years old. </p>
<p>As I’ve worked with this collection over the past two years, the social and political implications of Hine’s photographs have been very much on my mind. The patina of these black-and-white photographs suggests a bygone era – an embarrassing past that many Americans might imagine they’ve left behind. </p>
<p>But with <a href="https://www.marketplace.org/shows/make-me-smart/in-2023-america-has-a-child-labor-problem/">numerous reports</a> of <a href="https://www.reuters.com/business/us-crack-down-child-labor-amid-massive-uptick-2023-02-27/">child labor violations</a>, many involving immigrants, occurring in the U.S., along with an uptick in <a href="https://www.npr.org/2023/03/10/1162531885/arkansas-child-labor-law-under-16-years-old-sarah-huckabee-sanders">state legislation</a> <a href="https://apnews.com/article/iowa-child-labor-bill-d2546845dd6ad7ec0a2c74fb3fc0def3">rolling back the legal working age</a>, it’s clear that Hine’s work is as relevant today as it was a century ago.</p>
<h2>‘An investigator with a camera’</h2>
<p>A sociologist by training, Hine began making photographs in 1903 while working as a teacher at the progressive Ethical Culture School in New York City. </p>
<p>Between 1903 and 1908, he and his students photographed migrants at Ellis Island. Hine believed that the future of the U.S. rested in its identity as an immigrant nation – a position that contrasted with <a href="https://pluralism.org/xenophobia-closing-the-door">escalating xenophobic fears</a>. </p>
<p>Based on this work, the <a href="https://www.loc.gov/pictures/collection/nclc/background.html">National Child Labor Committee</a>, which advocated for child labor laws, hired Hine to document the living and working conditions of American children. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Boy covered in soot poses with his hands clasped behind his back." src="https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=827&fit=crop&dpr=1 600w, https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=827&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=827&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1040&fit=crop&dpr=1 754w, https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1040&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/530975/original/file-20230608-29-2g9rie.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1040&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Lewis Wickes Hine, ‘Trapper Boy, Turkey Knob Mine, MacDonald, West Virginia, 1908.’</span>
<span class="attribution"><span class="source">Gelatin silver print. 5 x 7 in. The Photography Collections, University of Maryland, Baltimore County (P148)</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>By the late 19th century, several states had passed <a href="https://www.bls.gov/opub/mlr/2017/article/history-of-child-labor-in-the-united-states-part-2-the-reform-movement.htm">laws limiting the age of child laborers</a> and establishing maximum working hours. But at the turn of the century, the <a href="https://www.bls.gov/opub/mlr/2017/article/history-of-child-labor-in-the-united-states-part-1.htm">number of working kids soared</a> – between 1890 and 1910, 18% of children ages 10 to 15 were employed.</p>
<p>In his work for the National Child Labor Committee, Hine journeyed to farms and mills in the industrializing South and the streets and factories of the Northeast. He <a href="https://90025031.weebly.com/uploads/2/2/9/4/22941172/6532401.png?256">used a Graflex camera</a> with 5-by-7-inch glass plate negatives and employed flash powder for nighttime and interior shots, hauling upward of 50 pounds of equipment on his slight frame. </p>
<p>To gain entry into factories and other facilities, Hine sometimes disguised himself as a Bible, postcard or insurance salesman. Other times he’d wait outside to catch workers arriving for or departing from their shifts.</p>
<p>Along with photographic records, Hine collected his subjects’ personal stories, including their ages and ethnicities. He documented their working lives, such as their typical hours and any injuries or ailments they incurred as a result of their labor. </p>
<p>Hine, who considered himself “<a href="https://openlibrary.org/books/OL2525831M/Lewis_Hine_in_Europe">an investigator with a camera</a>,” used this information to create what he termed “photo stories” – combinations of images and text that could be used on posters, in public lectures and in published reports to help the organization advance its mission.</p>
<figure class="align-center ">
<img alt="Boys standing at a table splayed with seafood as an older worker observes" src="https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=355&fit=crop&dpr=1 600w, https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=355&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=355&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=446&fit=crop&dpr=1 754w, https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=446&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/531002/original/file-20230608-21-jdp136.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=446&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Lewis Wickes Hine’s photograph of three young fish cutters working at the Seacoast Canning Co. in Eastport, Maine.</span>
<span class="attribution"><a class="source" href="https://tile.loc.gov/storage-services/service/pnp/nclc/00900/00972v.jpg">National Child Labor Committee collection, Library of Congress, Prints and Photographs Division</a></span>
</figcaption>
</figure>
<h2>Legislation follows</h2>
<p>Hine’s muckraking photographs exemplify the genre of <a href="https://www.metmuseum.org/toah/hd/edph/hd_edph.htm">documentary photography</a>, which relies upon the perceived truthfulness of photography to make a case for social change. </p>
<p>The camera serves as an eyewitness to a societal ill, a problem that needs a solution. Hine portrayed his subjects in a direct manner, typically frontally and looking straight into the camera, against the backdrop of the very factories, farmland or cities where they worked. </p>
<p>By capturing details of his sitters’ bare feet, tattered clothes, soiled faces and hands, and diminutive stature against hulking industrial equipment, Hine made a direct statement about the poor conditions and precarity of these children’s lives.</p>
<figure class="align-center ">
<img alt="Five young boys wearing caps and holding newspapers in front of an imposing white building." src="https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=409&fit=crop&dpr=1 600w, https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=409&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=409&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=514&fit=crop&dpr=1 754w, https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=514&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/530972/original/file-20230608-19-jlog7o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=514&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Lewis Wickes Hine, ‘Group of newsies selling on Capitol steps, April 11, 1912.’</span>
<span class="attribution"><span class="source">The Photography Collections, University of Maryland, Baltimore County (P2904)</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Hine’s photographs made a successful case for child labor reform. </p>
<p>Notably, the National Child Labor Committee’s efforts resulted in Congress establishing the <a href="https://www.childwelfare.gov/pubPDFs/Story_of_CB.pdf">Children’s Bureau</a> in 1912 and passing the <a href="https://www.archives.gov/milestone-documents/keating-owen-child-labor-act">Keating-Owen Act</a> in 1916, which limited working hours for children and prohibited the interstate sale of goods produced by child labor.</p>
<p>Although the <a href="http://sites.gsu.edu/us-constipedia/child-labor-law/">Supreme Court later ruled</a> it and a subsequent Child Labor Tax Law of 1919 unconstitutional, momentum for enshrining protections for child workers had been created. In 1938, Congress passed the <a href="https://www.dol.gov/agencies/whd/flsa">Fair Labor Standards Act</a>, which established restrictions and protections on employing children. </p>
<p>The National Child Labor Committee’s project also included advocacy for the enforcement of existing child labor regulations, a regulatory problem reemerging today as the Department of Labor – the agency tasked with enforcing labor laws – <a href="https://news.bloomberglaw.com/daily-labor-report/dols-wage-arm-vows-child-labor-focus-despite-no-rule-changes">comes under fire</a> for failing to protect child workers.</p>
<figure class="align-center ">
<img alt="Hooded girl in a field of cotton stares forlornly at the camera." src="https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=438&fit=crop&dpr=1 600w, https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=438&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=438&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=551&fit=crop&dpr=1 754w, https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=551&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/530998/original/file-20230608-29-alq94t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=551&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A young picker carries a large sack of cotton on her back.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/young-cotton-picker-carries-a-large-sack-of-cotton-on-her-news-photo/640486085?adppopup=true">Lewis Wickes Hine/Library of Congress via Getty Images</a></span>
</figcaption>
</figure>
<h2>The ethics of picturing child labor</h2>
<p>A recent surge of unaccompanied minors, primarily from Central America, has brought new attention to America’s old problem of child labor and has threatened the very laws Hine and the National Child Labor Committee worked to enact. </p>
<p>Some estimates suggest that one-third of migrants under 18 <a href="https://www.nytimes.com/2023/02/25/us/unaccompanied-migrant-child-workers-exploitation.html">are working illegally</a>, whether it’s laboring more hours than current laws permit, or working without the proper authorizations. Many of them perform hazardous jobs similar to those of Hine’s subjects: handling dangerous equipment and being exposed to noxious chemicals in factories, slaughterhouses and industrial farms.</p>
<p>While the content of Hine’s photographs remains pertinent to today’s child labor crisis, a key distinction between the subject of Hine’s photographs and working children today is race. </p>
<p>Hine focused his camera almost exclusively on white children who arrived in the country during waves of immigration from Europe during the late-19th and early-20th centuries. <a href="https://journalpanorama.org/wp-content/uploads/2022/10/Zelt-American-Photographs-Abroad.pdf">As art historian Natalie Zelt argues</a>, Hine’s pictorial treatment of Black children – either ignored or forced to the margins of his images – implied to viewers that the face of childhood in America was, by default, white. </p>
<p>The perceived racial hierarchies of Hine’s era reverberate into the present, where underage migrants of color live and work at the margins of society.</p>
<figure class="align-center ">
<img alt="A group of women hold drums and signs reading 'Popeyes Stop Exploiting Child Labor.'" src="https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=414&fit=crop&dpr=1 600w, https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=414&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=414&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=520&fit=crop&dpr=1 754w, https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=520&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/531004/original/file-20230608-29-lcdhg2.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=520&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Workers protest outside a Popeyes restaurant in Oakland, Calif., on May 18, 2023, after reports emerged of the franchise exploiting child labor.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/betty-escobar-left-and-other-fast-food-workers-protest-at-news-photo/1491552588?adppopup=true">Jane Tyska/Digital First Media/East Bay Times via Getty Images</a></span>
</figcaption>
</figure>
<p><a href="https://www.reuters.com/investigates/section/underage-workers/">Contemporary reports</a> of child labor violations offer few images to accompany their texts, graphs and statistics. There are legitimate reasons for this. By not including identifying personal information or portraits, news outlets protect a vulnerable population. <a href="https://www.unicef.org/eca/media/ethical-guidelines">Ethical guidelines</a> frown upon revealing private details of the lives of children interviewed. And, as Hine’s experience demonstrates, it can be difficult to infiltrate the sites of these labor violations, since they are typically kept secure.</p>
<p>Digital cameras and smartphones offer a workaround. Beginning in 2015, the International Labor Organization <a href="https://www.dol.gov/agencies/ilab/our-work/child-forced-labor-trafficking/My-PEC">urged child laborers in Myanmar</a> to become “young activists” and use their own images and words to create “photo stories” – echoing Hine’s use of the term – that the organization could then disseminate.</p>
<p>Photographs of child labor in foreign countries are far more common than those made in the U.S., which leaves the impression that child labor is someone else’s problem, not ours. Perhaps it’s too hard for Americans to look this domestic issue square in the eye. </p>
<p>A similar effect is at work when viewing Hine’s photographs today. While they were originally valued for their immediacy, they can seem to belong to a distant past.</p>
<p>But if Hine’s photographic archive of child laborers is evidence of the power of photography to sway public opinion, does the lack of images in today’s reporting – even if nobly intended – create a disconnect? </p>
<p>Is the public capable of understanding the harmful consequences of lack of labor enforcement when the faces of the people affected are missing from the picture?</p><img src="https://counter.theconversation.com/content/204078/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Beth Saunders does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
While Lewis Hine’s early-20th century photographs of working children compelled Congress to limit or ban child labor, the US Department of Labor is now under fire for failing to enforce these laws.
Beth Saunders, Curator and Head of Special Collections and Gallery, University of Maryland, Baltimore County
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/203736
2023-04-18T12:45:04Z
2023-04-18T12:45:04Z
Donald Trump and the dying art of the courtroom sketch
<figure><img src="https://images.theconversation.com/files/521346/original/file-20230417-14-llf0n.png?ixlib=rb-1.1.0&rect=8%2C3%2C1183%2C710&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Donald Trump appears in court in New York City, in a courtroom sketch by Jane Rosenberg.</span> <span class="attribution"><a class="source" href="https://i.guim.co.uk/img/media/6fa532e81c5de5a4afe9a2a30a38e58312e1a1d3/0_0_3024_1814/master/3024.jpg?width=1200&quality=85&auto=format&fit=max&s=4f81991d2b7e63b853af67c410069ade">Jane Rosenberg/Reuters</a></span></figcaption></figure><p>For the first time in its history, The New Yorker featured a courtroom sketch <a href="https://www.newyorker.com/culture/cover-story/cover-story-2023-04-17">on its cover</a>. </p>
<p>The image, which appears on its April 17, 2023, issue, gives viewers a glimpse of a historic court proceeding that could not be captured by cameras: the arraignment hearing of Donald Trump two weeks earlier. </p>
<p>Because Trump is the first former U.S. president to be <a href="https://www.npr.org/2023/04/05/1168256845/donald-trump-becomes-the-first-president-charged-with-criminal-activity">criminally indicted</a>, there is immense public interest in this case. However, when Trump pleaded not guilty to 34 felony counts of falsifying business records, his reactions and expressions could be visually recorded only by three approved courtroom artists.</p>
<p>In a way, it was a throwback to an era when only artists could provide the public with visual records of court proceedings. Yet with more and more jurisdictions allowing cameras into courtrooms, courtroom artists now find themselves working in a <a href="https://www.aetv.com/real-crime/exclusive-interview-with-the-courtroom-sketch-artist-from-the-cosby-trial">dying field</a>.</p>
<p>Having studied both <a href="https://www.taylorfrancis.com/books/edit/10.4324/9781315611693/synesthetic-legalities-sarah-marusek">courtroom sketches</a> and <a href="https://doi.org/10.1007/s11196-019-09676-7">tabloid crime photography</a>, I sometimes wonder what might be lost if courtroom art were to become extinct.</p>
<h2>The history of courtroom sketches</h2>
<p>Despite their dwindling numbers, courtroom artists are still able to pursue their craft because many judges continue to forbid photography in their courtrooms.</p>
<p>Yet a national standard for banning cameras in U.S. courtrooms is less than 100 years old.</p>
<p>When news photography flourished after World War I, courtroom photographs became a staple of tabloids such as the New York Daily News. These newspapers regularly sent their reporters to cover high-profile trials, taking advantage of the <a href="https://heinonline.org/HOL/LandingPage?handle=hein.journals/judica63&div=8&id=&page=">uneven patchwork of judicial positions</a> on whether cameras should be allowed in courtrooms.</p>
<p>The trial of <a href="https://www.britannica.com/biography/Bruno-Hauptmann">Bruno Richard Hauptmann</a> spurred a wave of regulations against cameras in courtrooms.</p>
<p>In 1935, Hauptmann was tried for kidnapping and murdering the child of Charles Lindbergh. To cover the so-called “<a href="https://slate.com/technology/2020/10/supreme-court-oral-arguments-cameras-lindbergh-baby-trial.html">Trial of the Century</a>,” an estimated 700 reporters and more than 130 cameramen rushed to Flemington, New Jersey, <a href="https://heinonline.org/HOL/LandingPage?handle=hein.journals/frdipm20&div=31&id=&page=">leading to reports</a> of photographers climbing on the counsel’s table, shoving their flashbulbs in witnesses’ faces and jockeying with one another to take pictures of Hauptmann.</p>
<figure class="align-center ">
<img alt="Black and white photograph of a large group of photographers posing outside a courtroom." src="https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=448&fit=crop&dpr=1 600w, https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=448&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=448&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=563&fit=crop&dpr=1 754w, https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=563&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/521335/original/file-20230417-974-47aqpw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=563&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The trial of Bruno Richard Hauptmann attracted hordes of photographers.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/les-photographes-attendent-devant-le-palais-de-justice-la-news-photo/843622860?adppopup=true">Keystone-France/Gamma-Rapho via Getty Images</a></span>
</figcaption>
</figure>
<p>After investigating the sensational publicity surrounding the Hauptmann trial, the American Bar Association went on to ban courtroom photography in <a href="https://uknowledge.uky.edu/cgi/viewcontent.cgi?article=3156&context=klj">Canon 35</a> of its 1937 Canons of Judicial Ethics. Following the American Bar Association’s lead, Congress enacted <a href="https://www.federalrulesofcriminalprocedure.org/title-ix/rule-53-courtroom-photographing-and-broadcasting-prohibited/">Rule 53</a> of the Federal Rules of Criminal Procedure in 1944, which prohibited photography in federal courtrooms during judicial proceedings. </p>
<p>This statutory ban remains in place today in American federal criminal courts and in the U.S. Supreme Court. </p>
<p>The bulky cameras of the past, <a href="https://supreme.justia.com/cases/federal/us/381/532/">along with their cables, microphones and wires</a>, required judges, witnesses, lawyers and jurors to navigate around them. Today’s cameras, however – whether in their compact, portable form or as remotely controlled, permanently mounted features in courtrooms – operate as less physically disruptive recorders of court proceedings.</p>
<p>Although cameras can give the general public direct access to what happens during a trial, they can also threaten what the American Bar Association has termed the “fitting dignity and decorum” of court proceedings. When cameras are permitted, <a href="https://www.nytimes.com/1994/11/08/us/judge-in-simpson-trial-allows-tv-camera-in-courtroom.html">as they were in the O.J. Simpson trial</a>, judges and lawyers sometimes worry that the proceedings will turn into a circuslike spectacle.</p>
<h2>An artistic flash</h2>
<p>Because the history of courtroom sketches cannot be separated from the history of prohibiting photography in the courtroom, cameras and human artists are often positioned as competitors in the production of courtroom images. </p>
<p>Working with a print or television news agency, freelance courtroom artists need to draw quickly to meet news deadlines. Notably, courtroom artist Mary Chaney was able to depict, <a href="https://loc.gov/item/prn-21-007/">through more than 260 sketches</a>, the criminal and civil trials of the four Los Angeles police officers charged with beating Rodney King.</p>
<figure class="align-center ">
<img alt="Drawing of man raising two fingers." src="https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=417&fit=crop&dpr=1 600w, https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=417&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=417&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=524&fit=crop&dpr=1 754w, https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=524&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/521341/original/file-20230417-982-akyh66.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=524&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Mary Chaney’s sketch of Rodney King on the witness stand during his 1994 trial.</span>
<span class="attribution"><a class="source" href="https://blogs.loc.gov/loc/files/2021/02/Screen-Shot-2021-02-23-at-4.24.49-PM-768x534.png">Library of Congress</a></span>
</figcaption>
</figure>
<p>When courtroom illustrators, such as David Rose, assert that “<a href="http://articles.latimes.com/1986-04-12/local/me-3545_1_bill-robles">the camera sees everything, but captures nothing</a>,” they are arguing that the camera’s mechanical eye is a poor substitute for – as Chicago courtroom artist Andy Austin <a href="https://hpherald.newsbank.com/doc/news/1752B9C870981428">puts it</a> – “the human eye, the human hand, dealing with a human subject for viewing by humans.” </p>
<p>While the camera can immediately generate highly detailed images of a trial, it cannot capture the emotional resonance of a courtroom moment. By funneling the emotional highs and lows of a trial through their body, courtroom artists can bring to their work irreplaceable sensory and dramatic insights.</p>
<p>Part of the drama stems from a courtroom artist’s ability to compress hours of court action into a single drawing. Artists can also manipulate the composition and perspective of their drawings to create “<a href="http://www.marilynchurch.com/book">artistic pull</a>.” Even though judges, lawyers, witnesses and the defendant may be physically spread out in the actual courtroom, the artist can bring them into close proximity with one another and the viewer.</p>
<p>It is in this way that courtroom sketches can make viewers feel the <a href="http://www.thestar.com/news/2007/03/24/drawn_to_the_law.html">emotional pull</a> of the trial’s main characters.</p>
<h2>One sketch goes viral</h2>
<p>This is what happened in Jane Rosenberg’s viral courtroom sketch of Trump. </p>
<p>Compared with the <a href="https://www.businessinsider.com/courtroom-sketches-capture-former-president-donald-trumps-arraignment-2023-4">drawings made by Christine Cornell and Elizabeth Williams</a>, Rosenberg’s image is the only one that depicts Trump <a href="https://www.newyorker.com/culture/cover-story/cover-story-2023-04-17">looking glum</a>, with his arms crossed as he eyes Manhattan District Attorney Alvin Bragg. </p>
<p>Because Bragg is not visible in the image, it appears as though Trump is fully facing the viewer with an expression that has been simultaneously described as despondent, disdainful and “<a href="https://hyperallergic.com/813359/courtroom-artist-jane-rosenberg-on-her-viral-sketch-of-trump/">pissed off</a>.”</p>
<p>To allow viewers to focus even further on Trump’s facial expression and body language, the New Yorker cover crops Rosenberg’s illustration, so that it becomes a portrait of a former president in criminal court. Made up of energetic pastel-chalk lines that are suggestive but ultimately unfinished, the rough sketch aesthetically aligns with the moral “sketchiness” that has long dogged Trump.</p>
<p><div data-react-class="Tweet" data-react-props='{"tweetId":"1645540940679987201"}'></div></p>
<h2>The afterlives of courtroom sketches</h2>
<p>When <a href="https://twitter.com/reuterspictures/status/1643357029753409541">Reuters tweeted Rosenberg’s courtroom sketch of Trump</a>, it jump-started the image’s afterlife. </p>
<p>Even though the practice of courtroom illustration has been described as a dying art form, courtroom sketches, like other cultural artifacts, are not only preserved in <a href="https://www.loc.gov/exhibitions/drawing-justice-courtroom-illustrations/about-this-exhibition/">special collections and exhibits</a>; they can also evolve through successive framings and interpretations. </p>
<p>In our current digital world, courtroom sketches can go viral on social media, especially if the artist fails to accurately capture the likeness of a high-profile, celebrity defendant. </p>
<p>Rosenberg herself is no stranger to creating viral courtroom sketches. When covering <a href="https://www.nbcsports.com/boston/new-england-patriots/deflategate-timeline">Deflategate</a> – the deflated ball controversy involving NFL star Tom Brady – she drew a portrait of the then-New England Patriots quarterback that elicited comparisons to <a href="https://www.newyorker.com/magazine/2015/08/31/sketchy">Quasimodo, Lurch and Thriller-era Michael Jackson</a>.</p>
<p><div data-react-class="Tweet" data-react-props='{"tweetId":"638428301829271552"}'></div></p>
<p>Courtroom sketches can also be creatively transformed into online memes. Rosenberg’s Trump sketch <a href="https://hyperallergic.com/813208/most-biting-memes-of-donald-trump-arraignment/">has been photo-edited</a> to evoke Edvard Munch’s “The Scream,” to include a bucket of KFC fried chicken and to make it appear as if Trump had been caught by the Scooby-Doo gang.</p>
<p>Trump’s fans and foes <a href="https://www.marketwatch.com/story/why-a-donald-trump-mug-shot-could-become-the-culture-icon-of-our-time-7297ed0e">may not have gotten their mugshot</a>. But they have a viral courtroom sketch, and what started as an image drawn under a courtroom’s tightly regulated conditions has since taken on a life of its own.</p>
<p class="fine-print"><em><span>Anita Lam does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Whereas ‘the camera sees everything, but captures nothing,’ courtroom artists can channel the emotional highs and lows of a trial through a single image.
Anita Lam, Associate Professor, York University, Canada
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/197490
2023-03-16T21:10:01Z
2023-03-16T21:10:01Z
There Will Be No More Night: Documentary raises ethical questions about using war footage
<figure><img src="https://images.theconversation.com/files/505704/original/file-20230121-24-oue55u.jpg?ixlib=rb-1.1.0&rect=16%2C9%2C1577%2C884&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Screenshot taken from 'There Will Be No More Night' by Éléonore Weber. </span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>In his book <a href="https://www.versobooks.com/books/416-war-and-cinema"><em>War and Cinema</em></a>, cultural theorist Paul Virilio noted that modern warfare depends on the “logistics of perception.” According to him, a new arena of conflict has emerged with the development of sophisticated imaging technology. Like better weaponry, the side with better cameras often gains superiority. </p>
<p>Virilio said new imaging technology “makes darkness transparent and gives to military contestants an image of what the night is no longer able to conceal.” With thermal and night-vision cameras, any moving presence glowing in darkness becomes susceptible to gunfire by combat helicopters hovering above conflict zones. </p>
<p>Éléonore Weber’s 2020 documentary, <em><a href="https://www.imdb.com/title/tt10917134/">There Will Be No More Night</a></em>, reflects on this phenomenon. It uses leaked military footage from U.S. and French helicopters during war missions in Iraq, Syria and Afghanistan. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/its-been-20-years-since-the-us-invaded-iraq-long-enough-for-my-undergraduate-students-to-see-it-as-a-relic-of-the-past-199460">It's been 20 years since the US invaded Iraq – long enough for my undergraduate students to see it as a relic of the past</a>
</strong>
</em>
</p>
<hr>
<p>The unnerving sequence of night-vision footage shows airstrikes on civilians suspected of being militants by pilots with shaky conviction. The blurry, grainy images accompany radio-transmitted exchanges between aircraft and machine gun operators, confessions of a pilot who suffers from chronic hallucinations and a scripted monologue. </p>
<p>Weber creatively uses forensic sources to contemplate the technology of modern warfare, where military-grade surveillance and imaging almost serve as a proxy for guns.</p>
<p>As we approach the 20th anniversary of the <a href="https://www.cfr.org/timeline/iraq-war">U.S.-led invasion of Iraq</a>, it is important to reflect on the use of war footage in media and the ethical questions around the use of footage depicting human death.</p>
<h2>Highlighting human rights abuses</h2>
<p><em>There Will Be No More Night</em> underscores the fallacy that advanced imaging provides accuracy and error-proof precision to modern war. The documentary shows how sophisticated war machines are driven by the personal idiosyncrasies of drone operators who launch deadly missiles using systems that resemble <a href="https://www.cbc.ca/news/science/turning-video-gamers-into-the-ultimate-drone-pilots-1.1398870">video games</a>.</p>
<p>While Virilio traced aesthetic similarities between the videography of war and cinema, Weber’s documentary film uses war footage to highlight the camera’s impairing role in contemporary conflicts.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/LSdQ8GGQtB8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Trailer for the documentary ‘There Will Be No More Night’ by Éléonore Weber.</span></figcaption>
</figure>
<p>The surveillance recordings document the offhanded killing of the people targeted. Weber includes the <a href="https://www.theguardian.com/us-news/2020/jun/15/all-lies-how-the-us-military-covered-up-gunning-down-two-journalists-in-iraq">infamous Wikileaks footage</a> showing the airstrike that killed Iraqi Reuters photographer Saeed Chmagh and his colleagues in 2007. According to the pilots, Chmagh’s camera tripod resembled an RPG grenade launcher in the grainy footage. </p>
<p>In other instances, farmers carrying ploughs get mistaken for militants. Another harrowing scene depicts a person showered with bullets because he appeared unusually calm when cornered by a helicopter pilot. </p>
<p>Advanced imaging technologies in warfare seemingly operate on a peculiar logic, where framing inevitably leads to the manufacturing and annihilation of suspects. According to media theorist Harun Farocki, they generate “<a href="https://doi.org/10.7228/manchester/9781526107213.003.0004">operational images</a>” that do not merely represent but execute the functions of operations they belong to. </p>
<p>Weber’s creative use of forensic materials records a series of war violations. Scholars Patrick Brian Smith and Ryan Watson use the term “<a href="https://doi.org/10.1177/01634437221088954">mediated forensics</a>” to describe the use of new media technologies and practices in human rights discourse. </p>
<p>Research-activist groups like <a href="https://forensic-architecture.org/">Forensic Architecture</a>, <a href="https://situ.nyc/research">SITU Research</a> and <a href="https://lab.witness.org/">WITNESS Media Lab</a> perform forensic analysis of raw media evidence to highlight human rights issues. They do so using techniques and technologies such as <a href="https://www.sciencedirect.com/topics/agricultural-and-biological-sciences/photogrammetry">photogrammetry</a>, geolocation mapping, 3D-imaging and pattern analysis to infer unseen viewpoints from limited visual evidence.</p>
<h2>A question of ethics</h2>
<p><em>There Will Be No More Night</em> sidesteps such principled forensic analysis. Instead of dissecting raw media evidence and disclosing new perspectives around specific events, it simply reproduces images of brutal killings for a generalized, self-absorbed reflection on modern warfare. </p>
<p>Consequently, the film becomes emotionally distressing and ethically dubious. One cannot dismiss the unease of witnessing 125 minutes of footage depicting brutal massacres from the cockpit.</p>
<p>The documentary also humanizes one pilot, Pierre V., as he reflects on his nightmares after controlling infrared and thermal cameras for several months. But nothing is heard from the other side: those who live under the perpetual threat of the weapons and cameras, and who must devise inventive ways to evade their thermal imagery. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/oiW55_48GuU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Irish photographer Richard Mosse discusses his documentary ‘Heat Maps’.</span></figcaption>
</figure>
<p>A related problem surfaces in the documentary project <a href="https://www.newyorker.com/culture/photo-booth/richard-mosses-heat-maps-a-military-grade-camera-repurposed-on-the-migrant-trail"><em>Heat Maps</em></a> by Irish photographer Richard Mosse. He uses thermal video cameras to construct composite images of refugee camps in and around the Mediterranean. </p>
<p>But the visually arresting photographs further expose the subjects and deny them self-representation. Mosse also enjoys freedom of movement and has control over the photographed images of the subjects — rights the subjects themselves do not have. </p>
<p>Despite its acute critique of modern warfare, <em>There Will Be No More Night</em> could have devised measures to work around the reproduction of visuals of death. Its distanced approach, driven by a voice-over commentary, fails to account for divergent perspectives. </p>
<p>What appears jarringly absent in the film are the voices of those people who are continually mapped by the imaging technologies of modern warfare and the social and psychological effects the technologies have on them.</p>
<p class="fine-print"><em><span>Santasil Mallik does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
As we approach the 20th anniversary of the invasion of Iraq, it is important to reflect on the use of war footage in media and the ethical questions around the use of footage depicting human death.
Santasil Mallik, PhD Student, Media Studies, Western University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/198854
2023-02-24T13:12:03Z
2023-02-24T13:12:03Z
Why are so many Gen Z-ers drawn to old digital cameras?
<figure><img src="https://images.theconversation.com/files/508381/original/file-20230206-19-4a5n1e.jpg?ixlib=rb-1.1.0&rect=287%2C473%2C4423%2C2925&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A student on a school bus holding a digital point-and-shoot camera.</span> <span class="attribution"><a class="source" href="https://commons.wikimedia.org/wiki/File:Student_on_school_bus_holding_camera.jpg">Jason Zhang/Wikimedia Commons</a></span></figcaption></figure><p>The latest digital cameras boast ever-higher resolutions, better performance in low light, smart focusing and shake reduction – and they’re <a href="https://www.tomsguide.com/us/best-phone-cameras,review-2272.html">built right into your smartphone</a>.</p>
<p>Even so, some Gen Z-ers are now opting for <a href="https://www.nytimes.com/2023/01/07/technology/digital-cameras-olympus-canon.html">point-and-shoot digital cameras from the early 2000s</a>, before many of them were born.</p>
<p>It’s something of a renaissance, and not just for older cameras. The digital camera industry as a whole is seeing a resurgence: industry revenue peaked in 2010 and shrank annually through 2021, then returned to growth in 2022, and <a href="https://www.statista.com/outlook/cmo/consumer-electronics/tv-radio-multimedia/digital-cameras/worldwide">it is projected to continue growing in the coming years</a>. </p>
<p>But why?</p>
<p><a href="https://www.lifewire.com/why-digital-cameras-are-cool-again-and-how-to-make-the-most-of-them-7099549">One explanation</a> is nostalgia, or a yearning for the past. And indeed, <a href="https://theconversation.com/how-coronavirus-made-us-nostalgic-for-a-past-that-held-the-promise-of-a-future-140651">nostalgia can be an effective coping strategy</a> in times of change and upheaval – the COVID-19 pandemic is just one of the disorienting shifts of the past few decades.</p>
<p>But my research on <a href="https://books.emeraldinsight.com/page/detail/information-experience-in-theory-and-design/?k=9781839093692">people’s experiences with technology</a>, which <a href="https://doi.org/10.3390/info10100297">includes photography</a>, suggests a deeper explanation: seeking meaning. </p>
<p>It’s not that these Gen Z-ers are longing to return to childhood, but that they are finding and expressing their values through their technological choices. And there’s a lesson here for everyone.</p>
<h2>The human need for meaning</h2>
<p>Humans have many needs – food, shelter, sex and so on. But humans also <a href="https://academic.oup.com/book/6301">feel the urge to find meaning in life</a>. </p>
<p>Meaning is <a href="https://blogs.scientificamerican.com/beautiful-minds/the-differences-between-happiness-and-meaning-in-life/">different from happiness</a>. Though happiness and meaning are <a href="https://doi.org/10.1080/17439760802303044">often correlated</a>, meaning doesn’t necessarily include the pleasure that characterizes happiness. Meaningful pursuits may involve struggle, suffering or even sacrifice. <a href="https://doi.org/10.1145/2858036.2858225">Meaning also lasts longer</a>, whereas happiness is fleeting.</p>
<p>What does meaning do for people? </p>
<p>At its core, meaning is about identifying one’s values and making choices to develop oneself as a person. It allows a person to engage with the various aspects of their personality – “the multitudes” contained therein, as <a href="https://theconversation.com/guide-to-the-classics-walt-whitmans-leaves-of-grass-and-the-complex-life-of-the-poet-of-america-116055">Walt Whitman</a> wrote. </p>
<p>Put differently, meaning is about weaving a personal narrative from the facts of life. And it really is a need, not just something that’s nice to have. <a href="https://global.oup.com/academic/product/finding-meaning-in-an-imperfect-world-9780190657666?cc=us&lang=en&">Meaning is what makes life feel valuable and worth living</a>.</p>
<h2>Seeking meaning with technology</h2>
<p>Why do people adopt one technology over another? According to what scholars call the <a href="https://www.sciencedirect.com/topics/social-sciences/technology-acceptance-model">technology acceptance model</a>, people consider two major aspects when choosing a technology: its perceived usefulness and its perceived ease of use.</p>
<p>But certainly there are other considerations, especially for personal technologies. People choose some technologies for <a href="https://aisel.aisnet.org/jais/vol15/iss2/1/">the way they contribute to meaning</a>. And the search for meaning extends beyond choosing a technology to the way a person uses and experiences it. For example, many people <a href="https://doi.org/10.1037/qup0000232">use social media in constructing their sense of self</a>.</p>
<p>In my own research, I <a href="https://doi.org/10.1002/asi.24142">discerned four themes involved</a> in people’s meaningful experiences with technology: </p>
<ol>
<li><strong>Presence</strong>: People choose formats and technologies that will help them be more present and attentive during the experience.</li>
<li><strong>Centripetal force</strong>: A person’s relationship with the technology begins with a central practice but gradually expands to become a bigger part of their life. For example, as a person’s photography practice becomes more meaningful, they may find themselves printing photos, curating their collection and shopping for more equipment.</li>
<li><strong>Curiosity</strong>: A sense of wonder and interest guides the experience. </li>
<li><strong>Self-construction</strong>: Meaningful experiences with technology contribute to the person’s sense of self.</li>
</ol>
<p>In <a href="https://doi.org/10.1177/0165551516670099">my research on ultra-distance runners</a>, who run races even longer than marathons, I saw all these elements at play. Runners chose particular shoes, GPS watches, sensors and software – or avoided them – in part to be more present with their bodies.</p>
<p>This can make the running itself more meaningful, along with other activities such as <a href="https://doi.org/10.1108/AJIM-03-2017-0071">writing race recaps</a>, keeping a training log and sharing photos. </p>
<figure class="align-center ">
<img alt="Runner wearing orange pinnie checks watch." src="https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/512084/original/file-20230223-5838-ucxxfs.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Marathoner Youssef Sbaai checks his watch after winning the Sofia Marathon in October 2020.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/youssef-sbaai-of-morocco-seen-checking-his-watch-after-news-photo/1229021181?phrase=runner%20checking%20watch&adppopup=true">Artur Widak/NurPhoto via Getty Images</a></span>
</figcaption>
</figure>
<p>Over time, running becomes a central part of a person’s identity – they become “a runner.” In the end, long-distance running is not always enjoyable, <a href="https://doi.org/10.1080/00948705.2016.1206826">but it is definitely meaningful</a>.</p>
<p>And so technology, whether it’s the kind associated with running or some other activity, becomes a key way people can discern their values and make choices that support and better embody those values. </p>
<h2>The meaning within old digital cameras</h2>
<p>In this context, using a standalone digital camera immediately enhances the meaningfulness of an experience. Meaning is about exercising choice, and nowadays most people don’t own a camera at all – they just use their smartphone. </p>
<p>Digital cameras also enable presence: You need to remember to carry the camera around, and in return it won’t give you notifications or show you other apps while you’re shooting.</p>
<figure class="align-right ">
<img alt="A sleek and minimalist point-and-shoot digital camera from 2008." src="https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=519&fit=crop&dpr=1 600w, https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=519&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=519&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=652&fit=crop&dpr=1 754w, https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=652&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/508397/original/file-20230206-15-tnvv3.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=652&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A 2008 Nikon Coolpix S520, one example of the kinds of digital cameras seeing a resurgence today.</span>
<span class="attribution"><span class="source">Simon Speed/Wikimedia Commons</span></span>
</figcaption>
</figure>
<p>That goes for any standalone camera. But old cameras, in particular, have a set of qualities that help users make meaning. </p>
<p>First, the image quality is poorer. But on social media, photos that get posted are less about polish and precision and more about sharing experiences and telling stories. As social media theorist Nathan Jurgenson writes in his book “<a href="https://www.versobooks.com/books/2947-the-social-photo">The Social Photo</a>,” “As a medium, social photography becomes an important means to experience something not representable as an image but instead as a social process: an appreciation of impermanence for its own sake.”</p>
<p>As a person chooses which photos to share and how to edit them, they are expressing their values and developing their sense of self. To some extent, smartphone photo filters allow for some of this expression, but old digital cameras produce different kinds of visual effects and lack <a href="https://store.google.com/intl/en/ideas/articles/what-is-an-ai-camera/">the automated features</a> designed to professionalize the look of each image.</p>
<p>Older cameras also introduce challenges in getting the images onto social media. They require cables, software and multiple steps to transfer the images. It’s a far cry from one-click <a href="https://theconversation.com/chatgpt-dall-e-2-and-the-collapse-of-the-creative-process-196461">image generation with artificial intelligence</a>. This means photography involves many more activities than simply taking photos; it becomes a bigger part of one’s life. </p>
<p>All this friction increases a person’s involvement in the process, inviting choices along the way. This is precisely the thinking behind <a href="https://doi.org/10.1145/2556288.2557178">the slow technology movement</a>, which aims to design technology for goals like self-reflection, rather than efficiency or productivity. Research on meaningful design shows <a href="https://doi.org/10.1145/3064857.3079126">people form stronger attachments to products</a> when they have to make more choices or get more involved. </p>
<p>When it comes to finding meaning in older forms of photography – whether you use a digital camera or a film camera – the slower process of creating and sharing images outweighs the speed, efficiency and crisp imagery of smartphone cameras. </p>
<h2>Crafting a more meaningful life</h2>
<p>The meaning hidden within old digital cameras contains broader lessons.</p>
<p>In recent years, critics have bemoaned <a href="https://www.tabletmag.com/sections/news/articles/everything-is-broken">the rupturing of social institutions</a> and the transformation of digital platforms into places that merely serve as <a href="https://www.wired.com/story/tiktok-platforms-cory-doctorow/">vehicles to sell ads and collect data from users</a>. During the pandemic, life itself threatened to go digital with all <a href="https://theconversation.com/what-is-the-metaverse-and-what-can-we-do-there-179200">the hype surrounding the metaverse</a>. </p>
<p>I believe that a key to living well in the near future is to identify where you can create choices, so you don’t feel like you’re drifting along at the mercy of algorithms and the whims of Big Tech.</p>
<p>Perhaps you could start <a href="https://www.nytimes.com/2022/12/15/style/teens-social-media.html">a chapter of the Luddite Club</a> – as a group of teens in Brooklyn recently did – and play board games in the park on weekends. Perhaps you could opt for a paper book rather than a podcast, specifically because you can’t do something else while you’re reading it.</p>
<p>On the surface, deliberately rejecting the latest, flashiest forms of technology may seem like a problem – “You’ll be left behind and miss out!” </p>
<p>But on the other hand, slowing down life by engaging with slower technology creates space to make choices more thoughtfully in relation to your values – and cultivate more meaningful involvement in your own life.</p><img src="https://counter.theconversation.com/content/198854/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tim Gorichanaz does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Smartphone cameras tend to be more advanced than their clunky, point-and-shoot predecessors. But the allure of cameras from the early 2000s reflects a broader search for meaning.
Tim Gorichanaz, Assistant Teaching Professor of Information Studies, Drexel University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/176292
2022-03-03T20:21:27Z
2022-03-03T20:21:27Z
Data from thousands of surveillance cameras confirms that protected areas safeguard species diversity
<figure><img src="https://images.theconversation.com/files/449836/original/file-20220303-21-144i2sq.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C4800%2C3197&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Camera traps capture information about an area's biodiversity.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>We have entered what some scientists refer to as <a href="https://dx.doi.org/10.1073/pnas.1922686117">Earth’s sixth major extinction</a>. Human disturbances, such as over-harvesting of crops, habitat destruction and invasive species, are the biggest drivers of biodiversity loss. Some studies estimate that the current species extinction rate is <a href="https://doi.org/10.1126/science.1246752">1,000 times the normal background rate</a>. </p>
<p>One of the central strategies for conserving biodiversity is setting aside areas for nature. Spaces like national parks, community conservation areas and nature reserves are designed to be protected areas for biodiversity to thrive. The <a href="https://www.cbd.int/">Convention on Biological Diversity</a> — the first global biodiversity treaty — <a href="https://www.cbd.int/sp/targets/">set a target of 17 per cent of total global land area</a> to be protected by 2020. </p>
<p>While this goal was <a href="https://doi.org/10.1111/conl.12158">not quite met</a>, the effectiveness of <a href="https://doi.org/10.1016/j.biocon.2015.08.029">existing protected areas has also been questioned</a>, especially for <a href="https://doi.org/10.1016/j.biocon.2013.02.018">their success in protecting animals</a>.</p>
<h2>Monitoring and enforcement</h2>
<p>Some parks lack effective enforcement of their protections. For instance, <a href="https://doi.org/10.1007/s10531-008-9368-6">Sierra Chinajá in Guatemala is a “paper park”</a> where the land is designated as protected, but no protections have been enforced.</p>
<p>In other cases, <a href="https://doi.org/10.1126/science.aap9565">ongoing human activity within these parks</a> has limited the effectiveness of conservation mandates. As the world discusses new targets, there is a clear need to better understand how well parks are working as a conservation strategy.</p>
<p>Our team set out to address this knowledge gap for terrestrial mammal species, which provide critical ecological services for ecosystems and people. To do so, we capitalized on a powerful tool that is gaining widespread use in wildlife conservation: the camera trap. </p>
<p>Advances in image-capturing technologies mean that researchers can install remote cameras (known as camera traps) in protected areas and leave them running for long periods. Camera traps are automatically triggered by a change in motion and heat in their immediate vicinity. For researchers, they’re like eyes in the woods, observing animals as they pass by.</p>
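The trigger mechanism described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of a PIR-style "motion plus heat" trigger; the sensor readings and thresholds are invented for demonstration and do not come from any particular camera-trap model.

```python
# Hypothetical sketch of a camera trap's trigger logic: fire only when the
# scene shows both a motion change and a heat change, as a passive infrared
# sensor effectively does (warm bodies moving in view). All values invented.

MOTION_THRESHOLD = 0.5   # normalized change in the sensed scene
HEAT_THRESHOLD = 2.0     # degrees C above the ambient baseline

def should_capture(motion_delta: float, heat_delta: float) -> bool:
    """Return True when both motion and heat exceed their thresholds."""
    return motion_delta >= MOTION_THRESHOLD and heat_delta >= HEAT_THRESHOLD

events = [
    (0.1, 0.2),  # wind moving grass: some motion, no heat -> no photo
    (0.8, 3.5),  # a deer walking past: motion and heat -> photo
    (0.0, 2.5),  # heat haze, nothing moving -> no photo
]
captures = [should_capture(m, h) for m, h in events]
print(captures)  # prints [False, True, False]
```

Requiring both signals is what lets real traps sit in the field for months without filling their storage with photos of swaying vegetation.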
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="a night capture photo of a deer looking back at the camera" src="https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/449838/original/file-20220303-15-poqma0.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Camera traps sense motion to automatically capture images of animals in the wild.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>We analyzed data from over <a href="https://doi.org/10.1111/conl.12865">8,600 camera traps deployed across the world</a>. We found that the amount of official protection an area has is an important determinant of mammal diversity.</p>
<p>As the use of camera traps has increased, <a href="https://doi.org/10.1002/fee.1448">so has the number of ecosystems surveyed</a>, allowing researchers to gain knowledge about wildlife. For instance, we now know more about the abundances and activities of animals living in <a href="https://doi.org/10.1002/fee.1807">Canada’s boreal forests</a> and <a href="https://doi.org/10.1111/cobi.13232">China’s tropical rainforests</a> than ever before.</p>
<p>Ecologists have called for a collaborative effort to <a href="https://doi.org/10.1002/fee.1448">put together camera trap data to look at the bigger picture</a>. Some current collaborations include <a href="https://www.wildlifeinsights.org/team-network">the Tropical Ecology, Assessment and Monitoring (TEAM) Network</a>, <a href="https://emammal.si.edu/">eMammal</a> and <a href="https://doi.org/10.1111/geb.12600">an assessment of global patterns</a> in mammalian carnivore diversity. Our research brought together 91 studies from camera trap surveys in more than 20 countries on four continents.</p>
<h2>Human impacts</h2>
<p>While environmental factors such as <a href="https://doi.org/10.1111/1365-2656.12313">temperature and vegetation productivity</a> are known to affect the distributions and diversity of species, the impact of human activities is not as well understood. </p>
<figure class="align-center ">
<img alt="a night time camera capture of someone walking with a shotgun" src="https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&rect=0%2C0%2C2560%2C1916&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/445522/original/file-20220209-17-647os6.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Human activities threaten the survival of animals living in protected areas.</span>
<span class="attribution"><span class="source">(Cheng Chen)</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>We analyzed camera trap data to determine the relative importance of protected area coverage, <a href="https://doi.org/10.1038/ncomms12558">human footprint (the cumulative human effect on the environment)</a> and <a href="https://doi.org/10.1038/nature25181">how easily people could access a given natural area</a>.</p>
<p>Our analysis illustrated the importance of protected areas in predicting the diversity of mammals, even when other types of human disturbances were present to some extent (such as logging or hunting). Also, over 60 per cent of the <a href="https://www.iucn.org/theme/protected-areas/about/protected-areas-categories/category-v-protected-landscapeseascape">protected areas in our study</a> were classified as areas where both commercial and traditional forms of human activity are allowed, suggesting that biodiversity protection may indeed be compatible with certain types and intensities of human use.</p>
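To make the shape of this kind of analysis concrete, here is a toy sketch of comparing mammal species richness across protection levels. The site records and numbers below are entirely invented for illustration; the actual study used far richer models and over 8,600 camera-trap deployments.

```python
# Toy illustration of summarizing camera-trap species richness by the
# protection level of each site. Data invented for demonstration only.
from statistics import mean

# Each record: (protection level, number of mammal species detected at a site)
sites = [
    ("strict", 14), ("strict", 17), ("strict", 12),
    ("mixed-use", 11), ("mixed-use", 13), ("mixed-use", 10),
    ("unprotected", 6), ("unprotected", 8), ("unprotected", 7),
]

# Group richness values by protection level
by_level = {}
for level, richness in sites:
    by_level.setdefault(level, []).append(richness)

for level, values in by_level.items():
    print(f"{level}: mean richness = {mean(values):.1f} (n={len(values)})")
```

In the real analysis, protection level would be one covariate alongside human footprint and accessibility, with their relative importance estimated by a statistical model rather than simple group means.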
<h2>Monitoring biodiversity</h2>
<p>The second half of this year’s <a href="https://www.cbd.int/cop/">Conference of the Parties to the Convention on Biological Diversity</a> is scheduled to be held in April, where one of the main goals is to discuss the post-2020 biodiversity framework. This framework will set new targets for global and national efforts to conserve biodiversity. </p>
<p>To inform these targets and evaluate their success, there is an urgent need for reliable indicators of biodiversity change, and rigorous assessments of conservation effectiveness. Our study highlights how camera trap surveys can generate standardized data on many species within mammal communities across varied ecosystems. This monitoring tool has great potential to become an integral part of global biodiversity monitoring systems designed to keep a closer watch on, and ultimately better protect, the Earth’s wild creatures.</p><img src="https://counter.theconversation.com/content/176292/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Cheng Chen receives funding from China Scholarship Council Doctoral Scholarships. </span></em></p><p class="fine-print"><em><span>Cole Burton receives funding from the Canada Research Chairs program and the Natural Sciences and Engineering Research Council of Canada (NSERC). </span></em></p>
Data from camera traps around the world provide a strong case to support the designation of protected wilderness areas.
Cheng Chen, PhD candidate, Forestry, University of British Columbia
Cole Burton, Canada Research Chair in Terrestrial Mammal Conservation, University of British Columbia
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/145700
2020-09-13T19:49:26Z
2020-09-13T19:49:26Z
Behind the new Samsung Fold: how the quest to maximise screen size is driving major innovation
<figure><img src="https://images.theconversation.com/files/357642/original/file-20200911-22-apy4aa.jpg?ixlib=rb-1.1.0&rect=40%2C209%2C1360%2C702&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Samsung</span></span></figcaption></figure><p>To enlarge a phone, or not to enlarge a phone? That is the question. In the world of flagship smartphones, there seems to be one clear trend: bigger is better. </p>
<p>Manufacturers are trying to strip away anything that might stand in the way of the largest possible slab of screen. There is also growing demand for thinner phones with diminishing <a href="https://www.lifewire.com/bezel-4155199">bezels</a> (the area surrounding a screen). </p>
<p>This trend has now culminated in the latest innovation in smartphone design, the <a href="https://www.t3.com/au/news/best-folding-phones">foldable screen phone</a>. These devices sport thin, self-illuminating <a href="https://www.techradar.com/au/news/what-is-oled">OLED</a> screens that can be folded in half.</p>
<p>The newest release is the <a href="https://www.theverge.com/21427462/samsung-galaxy-z-fold-2-review">Samsung Galaxy Z Fold 2</a> – a device that is almost three-quarters screen and has extravagant overtones rivalled only by a hefty <a href="https://www.samsung.com/au/smartphones/galaxy-z-fold2/buy/">A$2,999 price tag</a>.</p>
<p>But to prevent the phones themselves from growing to unwieldy size, manufacturers are having to find ways to balance size with usability and durability. This presents some interesting engineering challenges, as well as some innovative solutions. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A giant, old-style phone" src="https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=472&fit=crop&dpr=1 600w, https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=472&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=472&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=593&fit=crop&dpr=1 754w, https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=593&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/357605/original/file-20200911-22-1vlsst9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=593&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Why do we love large phones?</span>
<span class="attribution"><span class="source">Pixabay</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<h2>Internal design complexities of folding phones</h2>
<p>Modern phones still typically use a thin LCD or plastic OLED display covered by an outer glass panel. </p>
<p>Folding displays are a new category that exploits the flexibility of OLED display panels. Instead of simply fixing these panels to a rigid glass panel, manufacturers carefully engineer the panel so that it bends – but never quite tightly enough to snap or crack. </p>
<p>Internal structural support is needed to make sure the panel doesn’t crease, or isn’t stressed to the point of creating damage, discolouration or visible surface ripples. </p>
<p>Since this is a mechanical, moving system, reliability issues need to be considered. For instance, how long will the hinge last? How many times can it be <a href="https://www.theverge.com/2019/10/4/20898484/samsung-galaxy-fold-folding-test-failure-durability">folded and unfolded</a> before it malfunctions? Will dirt or dust make its way into the assembly during daily use and affect the screen?</p>
<p>Such devices need an added layer of reliability over traditional slab-like phones, which have no moving parts.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-new-iphone-se-is-the-cheapest-yet-smart-move-or-a-premium-tech-brand-losing-its-way-136507">The new iPhone SE is the cheapest yet: smart move, or a premium tech brand losing its way?</a>
</strong>
</em>
</p>
<hr>
<h2>Large screen, thin phone: a recipe for disaster?</h2>
<p>Each generation of smartphones becomes thinner, with smaller bezels, which improves the viewing experience but can make the phone harder to handle. </p>
<p>In such designs, the area of the device you can grip without touching the display screen is small. This leads to a higher chance of <a href="https://www.cnet.com/news/study-19-percent-of-people-drop-phones-down-toilet/">dropping the device</a> – a blunder even the best of us have made. </p>
<p>There’s an ongoing tussle between consumers and manufacturers. Consumers want a large, viewable surface as well as an easily portable and rugged device. But from an engineering point of view, these are usually competing requirements. </p>
<p>You’ll often see people in smartphone ads holding the device with two hands. In real life, however, most people use their phone with <a href="https://www.smartinsights.com/mobile-marketing/mobile-design/research-on-mobile-interaction-behaviour-and-design/">one</a> <a href="https://alistapart.com/article/how-we-hold-our-gadgets/">hand</a>. </p>
<p>Thus, the shift towards larger, thinner phones has also given rise to a boom in demand for assistive tools attached to the back, such as <a href="https://www.androidcentral.com/best-popsockets">pop-out grips and phone rings</a>.</p>
<p>In trying to maximise screen size, smartphone developers also have to account for interruptions in the display, such as the placement of cameras, laser scanners (for face or object identification), proximity sensors and speakers. All are placed to minimise visual intrusion.</p>
<h2>Now you see it, now you don’t</h2>
<p>In engineering, measuring the physical world requires either cameras or sensors, such as the one in a fingerprint scanner. </p>
<p>In the race to increase screen real estate, these cameras and scanners are typically placed somewhere around the screen. But they take up valuable space.</p>
<p>This is why we’ve recently seen tricks to carve out more space for them, such as <a href="https://www.techradar.com/au/news/this-is-the-worlds-first-smartphone-where-half-the-screen-is-a-fingerprint-scanner">pop up</a> cameras and <a href="https://www.google.com/search?q=phone+screen+hole+for+camera&source=lmns&bih=598&biw=1280&rlz=1C5CHFA_enAU871AU871&safe=active&hl=en&sa=X&ved=2ahUKEwjXvcyoveDrAhUwhUsFHXvqBYMQ_AUoAHoECAEQAA">punch-hole</a> cameras, in which the camera sits in a cutout hole allowing the display to extend to the corners. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Front view of Samsung Galaxy Note 10." src="https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=439&fit=crop&dpr=1 600w, https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=439&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=439&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=551&fit=crop&dpr=1 754w, https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=551&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/357640/original/file-20200911-18-r1bxyk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=551&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The Samsung Galaxy Note 10 has a centered punch-hole front-facing camera.</span>
<span class="attribution"><span class="source">Samsung</span></span>
</figcaption>
</figure>
<p>But another fantastic place for sensors is right in front of us: the screen. Or more specifically, under the screen.</p>
<p>Samsung is one company that has suggested placing selfie-cameras and fingerprint readers behind the screen. But how do you capture a photo or a face image through a layer of screen? </p>
<p>Up until recently, this has been put in the “too hard basket”. But that is changing: Xiaomi, Huawei and <a href="https://www.extremetech.com/mobile/262497-samsung-patent-shows-phone-camera-inside-display">Samsung</a> all have patents for <a href="https://www.phonearena.com/news/samsung-galaxy-s21-s30-under-display-camera_id125174">under-display cameras</a>.</p>
<p>There are a range of ways to do this, from allowing a camera to see through the screen, to using <a href="https://www.rp-photonics.com/microlenses.html">microlenses</a> and camera pixels distributed throughout the display itself – similar to an insect’s <a href="https://www.britannica.com/animal/insect/Nervous-system#ref250944">compound eye</a>. </p>
<p>In either case, the general engineering challenge is to implement the feature in a way that doesn’t impact screen image quality, nor majorly affect camera resolution or colour accuracy.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Close up of an insect's compound eyes" src="https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/357639/original/file-20200911-20-1vwk072.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Insects have compound eyes. These are made up of repeating units called ommatidia, sometimes numbering in the thousands per eye. Each ommatidium is a separate visual receptor.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Laptops in our pockets</h2>
<p>With up to 3.8 billion smartphone users <a href="https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/">expected by 2021</a>, mobile computing is a primary consumer technology area seeing significant growth and investment.</p>
<p>One driver for this is the professional market, where larger mobile devices allow more efficient on-the-go business transactions. The second market is individuals who <a href="https://www.statista.com/topics/779/mobile-internet/"><em>only</em> have a mobile device</a> and no laptop or desktop computer.</p>
<p>It’s all about choice, but also functionality. Whatever you choose has to get the job done and support a positive user experience, but also survive the rigours of the real world.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/apples-iphone-11-pro-wants-to-take-your-laptops-job-and-price-tag-123372">Apple's iPhone 11 Pro wants to take your laptop's job (and price tag)</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/145700/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Andrew Maxwell does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
The upcoming Galaxy Z Fold 2 is almost three-quarters screen. And while that’s convenient, it’s important to actually be able to hold the phone. As design evolves, how do manufacturers adapt?
Andrew Maxwell, Senior Lecturer, University of Southern Queensland
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/141010
2020-06-19T12:08:22Z
2020-06-19T12:08:22Z
Holding on and holding still, a son photographs his father with Alzheimer’s
<figure><img src="https://images.theconversation.com/files/342846/original/file-20200618-41230-509agp.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C1947%2C1555&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">'With Dad,' Marlborough, Massachusetts, Oct. 29, 1998.</span> <span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span></figcaption></figure><p><em>In 1985, when Stephen DiRado was just a few years out of college, he bought his first <a href="https://www.bhphotovideo.com/find/Product_Resources/largeformat1.jsp">large-format</a>, 8x10 camera. Since each exposure cost eight bucks in today’s dollars, the process required contemplation; he couldn’t simply snap 100 images and pick out the handful he liked best. The stakes were high, but the payoff was immense: A well-executed photograph could contain enough rich detail to tell a whole story.</em> </p>
<p><em>He was hooked. He would lug the 35-pound camera to places in Worcester, Massachusetts, like <a href="https://stephendirado.com/bell-pond/">Bell Pond</a> and <a href="https://stephendirado.com/mall-series/">the Worcester Center Galleria</a> to photograph people whom, as he put it, “I had no business being with.” The <a href="https://stephendirado.com/wp-content/uploads/2018/05/Worcester-MA-1985-aaa.jpg">neighborhood kids</a>, <a href="https://stephendirado.com/wp-content/uploads/2015/07/Mall39.jpg">cops</a>, <a href="https://stephendirado.com/wp-content/uploads/2015/07/Mall30.jpg">clerks</a>, <a href="https://stephendirado.com/wp-content/uploads/2018/05/Worcester-Argentos-Lenny-and-George-1985.jpg">butchers</a> and <a href="https://stephendirado.com/wp-content/uploads/2015/06/BellPond14.jpg">families</a> who let DiRado into their worlds were generous enough to pose – and hold still – so he could make a photograph.</em></p>
<p><em>“I think I disarmed everybody with the huge camera,” he explained, “because there was nothing to conceal, nothing to hide.”</em></p>
<p><em>He was also constantly photographing his family and friends, who became so used to seeing the big box on a tripod during dinners and holiday gatherings that it became “almost invisible.”</em></p>
<p><em>In 1993, DiRado noticed something didn’t seem quite right with his father, Gene, so he made an appointment to photograph him at his home in Marlborough, Massachusetts. It was the beginning of a 16-year project making photographs of his father, who was eventually diagnosed with Alzheimer’s disease. His book of the photographs, “<a href="https://www.davisart.com/products/davis-select/davis-select-art-books/with-dad/">With Dad</a>,” was published in November 2019.</em> </p>
<p><em>In an interview, which has been edited for length and clarity, Stephen DiRado describes the agony, anxiety and devotion he felt during those years. It was an entirely different kind of story and challenge: What do you do when the subject is a disease as much as a person, and when the disease then subsumes the person, to the point where he can’t remember his own son?</em> </p>
<p><em>Yet Stephen continued to show up, camera in tow. Miraculously, the camera remained as strong a conduit from son to father as it had ever been, a channel forged from thousands of photographs taken over decades.</em> </p>
<p><em>In the camera’s presence, even though Gene could no longer recognize Stephen, he knew enough to hold still.</em> </p>
<hr>
<p><strong>In the early stages of your dad’s disease, what kind of stories did the photographs tell?</strong></p>
<p>I grew up in a large Italian family that, at the drop of a pin, would get together. My mother always had extra plates at the dinner table. But when I was younger, I was very inhibited, while the rest of the family was so talkative. So I would just watch them and observe, and I started to understand more about body language. I was creating my own little stories about how what they were saying was not necessarily what their bodies were telling me.</p>
<p>Around the time my father was 57 or 58, I started noticing that something was off. He wasn’t as engaged anymore, and he started to isolate himself and sit in front of the TV, but not really watch. </p>
<p>That just didn’t seem like my father. So I started to make appointments to photograph him at his house in Marlborough, Massachusetts. I’d look through the photos wondering what was going on, what could be wrong. </p>
<p>I thought one might hold the answer. It’s from 1993 and it’s in his backyard. I put him in the center of the photo, like a bull’s-eye, and he’s holding his all-time love, his dog, Missie. My father’s manicured, the dog’s manicured. Those are my father’s hedges and bushes, they’re manicured. It’s a pretty put-together guy there. But there’s something about the look that was, for me, a little off. There’s something too manicured about it all. The surface is kind of fake. So I thought it must be depression.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342818/original/file-20200618-41234-1a7gf77.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">‘Gene and Missie,’ Marlborough, Massachusetts, Oct. 16, 1993.</span>
<span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p><strong>When did the seriousness of the disease really start to hit home?</strong></p>
<p>In 1998, he had a stroke. I went right to UMass Medical Center, and I stayed with him for the next three days, hanging out and photographing him. And at one point one of the nurses said, “I think your father has some form of dementia, and he might even have this thing called Alzheimer’s.” </p>
<p>So I remember saying to my father, “Dad, they say you might have this thing Alzheimer’s.” He said, “Well, how long do you think I’ll have it?” I said, “Dad, I don’t know. This is not good. But you can count to 10, right?” He said, “Of course I can count to 10.”</p>
<p>And he said, “One, two, three, four, five, six – oh I don’t want to do this.”</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342822/original/file-20200618-41234-i49xbi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">‘Recovering from First Stroke, RM 407,’ Worcester, Massachusetts, May 19, 1998.</span>
<span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>After the stroke, he was really starting to go downhill. He had a cognitive test and, sure enough, he flunked it. There was a high probability – but it wasn’t decisive – that he had Alzheimer’s.</p>
<p>My brother and sister and I decided that we would “daddy-sit” and take turns on weekends to give my mother some time off to go visit family or just get away. So one weekend in November 2003, it was my turn. I fed him dinner. We watched TV, and he sat there, in his pajamas, like he always did.</p>
<p>But I noticed that every hour or so, he’d get up and go into the bathroom. I started eavesdropping but didn’t hear anything.</p>
<p>An hour later, he’d return to the bathroom. I finally said, “I’m coming into the bathroom next time you go in there.” I followed him in and he walked up to the mirror, and he just stared at himself. </p>
<p>I thought that he must be holding on to himself, his sense of identity. So I said, “Dad, you know what? I’m gonna photograph you looking in the mirror.” I dropped the legs of the tripod and said, “Dad you know the deal, I’m gonna have to hit the flash off this ceiling, and I’m gonna photograph you inspecting yourself. You have to stay very still.” </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=750&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=750&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=750&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=943&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=943&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342824/original/file-20200618-41217-i50021.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=943&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">‘Stranger in the Mirror,’ Marlborough, Massachusetts, Nov. 2, 2003.</span>
<span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The lens was cocked. I had the cable in my hand. “Here we go,” I said, “One, two –” and on “two” he turned and looked at the camera, and he smiled. I asked him what he was doing. </p>
<p>He said, “The man in the mirror is looking at you. And I want to look at you.” </p>
<p>This was so far beyond what I had ever imagined. I guess I had been in denial. I wondered whether I should stop the project right then and there.</p>
<p>I eventually said, “Dad, what do we think about the man in the mirror?” </p>
<p>“He’s a good man,” he said. </p>
<p>“I think he’s a great man,” I said, “and I think we both need to look at the man in the mirror and make this photograph.”</p>
<p><strong>That sounds like a turning point – you were wondering whether you should stop the project. What were you afraid of and how did you push through?</strong></p>
<p>The thing about any project – it doesn’t matter which one – is that great trepidation. Is the work soft? Am I being indulgent? Am I photographing my father for selfish reasons? That never went away. </p>
<p>And you know what? It is a very selfish thing. All art is selfish. Don’t let anybody fool you. I make photos and my art because I’m telling a story to the best of my ability, and I’ll do everything in my powers to make it very powerful with the material that I have. I need to seize the moment and mold it. This is being offered to me right now. I have to deal with it.</p>
<p>But at the same time, I’m also making art for 100 years from now – forget vanity, forget about privacy. This is so 100 years from now, historians, doctors, kids, artists, whoever can look at these images. And I hope by then, there is no more Alzheimer’s, that it will be like looking at leper colony photos.</p>
<p><strong>Once your dad stopped being able to recognize you, how did he deal with the presence of this photographer and his camera?</strong></p>
<p>I’ve been photographing my family since I was 12 years old. I photograph 24/7. If you’re a part of my life, if we were hanging out in a room together, I’d be photographing you.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342825/original/file-20200618-41238-1aqetku.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">‘Ash Wednesday,’ Marlborough, Massachusetts, Feb. 9, 2005.</span>
<span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>By the time he went into the nursing home in July 2004, it was just the camera he recognized. To him, I no longer existed. But he recognized the camera and knew enough to stay still. I think that this was one of the hard, ingrained things – tens of thousands of times being photographed by me, saying “hold still, hold still, hold still.”</p>
<p><strong>How often did you photograph him once he was in the nursing home?</strong></p>
<p>I went two or three times a week during a five-year period. Whenever I would get in my car to leave, I would get all nervous, even though I had been doing this forever. I’d start thinking about how I needed to make some kind of statement of value, and I’d get a stomach ache. </p>
<p>I’d be like, “Oh, you’re so full of s— DiRado, you go through this every friggin’ week. I’m so fed up with you. Get in that car right now.” And I would drive there feeling like I was gonna throw up, but the minute I touched the door to the nursing home, it all went away. I became my father’s son, a soldier intent on making the best possible art I could.</p>
<p>That’s another thing about the camera: When you carry 35 pounds over your shoulder to some destination, you’re going to make a photo. You’re going to make something.</p>
<p>And then, about once a week, after leaving, I would take a back road to Worcester so I could stop at Newbury Comics, where I would treat myself to a used video. After all, I had just been a good boy, right? We’re always our parents’ kids.</p>
<p><strong>Towards the end, he looks so peaceful.</strong></p>
<p>He slept often. It definitely brought me back to being 5 years old and sneaking into my parents’ bedroom and watching them sleep. These are very peaceful, quiet moments for any child who has done this. </p>
<p>He became a human still life. I would study his ears, his face. I could take the time to light him, to notice his hands, his fingernails growing out.</p>
<p>During the last six months of his life, something happened. It was like he found some level of spirituality or calmness. He was always surrounded by these stuffed animals and holding on to them. And he was always smiling. He was someplace else, between Earth and heaven.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=480&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=480&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=480&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=603&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=603&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342827/original/file-20200618-41234-gsykjt.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=603&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Marlborough, Massachusetts, Nov. 11, 2009.</span>
<span class="attribution"><span class="source">Stephen DiRado</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
What does an artist do when the subject is a disease as much as a person, and when the disease then subsumes the person – to the point where he can’t recognize his own son?
Nick Lehr, Arts + Culture Editor
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/123372
2019-09-15T20:17:31Z
2019-09-15T20:17:31Z
Apple’s iPhone 11 Pro wants to take your laptop’s job (and price tag)
<p>What a week it has been in the Apple core. In recent days the tech giant has released a litany of products, including new phones, watches, tablets, and more.</p>
<p>The big-ticket items are clearly the new iPhone 11 range. These hint at some interesting technology directions, which will most likely spread across the mobile sector.</p>
<p>Of course, it’s hardly radical to create a phone that is also a camera, web browser, computer, and gaming device. That idea is as old as smartphones themselves.</p>
<p>But Apple’s continued progression down this road raises the question of whether this trend can be sustained indefinitely, or whether there is in fact a limit to what the market will bear in terms of functionality, aesthetics, and cost. The new iPhones are priced from <a href="https://www.apple.com/au/shop/buy-iphone/iphone-11">A$1,199 for the basic model</a> up to <a href="https://www.apple.com/au/shop/buy-iphone/iphone-11-pro">A$2,499 for a top-spec iPhone 11 Pro Max</a>.</p>
<h2>Cameras, computing and competition</h2>
<p>In keeping with its rivals, Apple has clearly made the camera system the focus (pardon the pun) of its new iPhones. Aesthetically minded users might find the cluster of camera lenses jarring – more function than form – and doubly distressing if you’re unlucky enough to suffer <a href="https://en.wikipedia.org/wiki/Trypophobia">trypophobia</a>, the fear of irregularly clustered bumps or holes. </p>
<p>The back of the 11 Pro sports three cameras with different focal lengths. Each is still only 12 megapixels, but in this era of filters and digital enhancements, pixel count is no longer the crucial metric. </p>
<p>Each camera, including the front-facing one, can be used simultaneously. It’s now conceivable to film an entire feature-length movie on a phone (should you ever actually want to). This requires a significant amount of internal coordination to ensure that colour grading and exposure blend seamlessly between these cameras, which in turn brings us to the question of computing power. </p>
<p>The new iPhones are equipped to handle not just complex computational photography but also advanced augmented reality and fast-learning artificial intelligence. </p>
<p>This level of highly integrated computing is one of the clearest direction changes in the iPhone lineup. It makes perfect sense from Apple’s point of view, not just because it helps to enhance performance, but because Apple controls its entire research, development and production line anyway.</p>
<p>But all of this integration comes with a couple of obvious downsides for the user. One is that it’s increasingly difficult to <a href="https://theconversation.com/sustainable-shopping-if-you-really-truly-need-a-new-phone-buy-one-with-replaceable-parts-93069">service your own phone</a>. The other is that for all their “multitasking” claims, it’s still only possible to do one thing at a time. One of the reasons I sound sceptical about filming feature-length movies on an iPhone is the question of what happens if you receive a phone call halfway through shooting a big scene.</p>
<h2>What are ‘pro’ phones really for?</h2>
<p>Despite the “Pro” moniker, and the suggestion that they can be used to produce commercial-standard creative work, even top-end iPhones are still inherently personal devices. Of course, Apple isn’t really pitching its phones as essential kit for film directors. The actual use case is somewhat more prosaic.</p>
<p>The top-end price tag of A$2,499 looks remarkably like laptop pricing. For professionals who do most of their work on their phone, Apple clearly thinks even this hefty price tag will represent a sensible investment for a versatile piece of kit. </p>
<p>Remember that mobile phones in the early 1990s were comparably astounding in price, yet they sold to professionals who were busy and affluent enough to require one (or at least wanted to look as if they were). </p>
<p>That said, flagship phone pricing is creating a digital divide between those who insist on the latest phone and those happy to make do with an older model. As a result, the budget and mid-range phone market has become as competitive as it is varied, with fantastic handsets available for less than A$400 outright, as well as a booming secondhand market.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/3-reasons-why-we-are-addicted-to-smartphones-84041">3 reasons why we are addicted to smartphones</a>
</strong>
</em>
</p>
<hr>
<p>I always consider repairability when buying technology. I maintain my phone by replacing screens and batteries, which anyone can do with the right guidance. But many manufacturers work hard to <a href="https://theconversation.com/repair-or-replace-how-to-fight-constant-demands-for-new-stuff-66299">thwart these home repair efforts</a>.</p>
<p>Many phone components, <a href="https://www.ifixit.com/News/apple-is-locking-batteries-to-iphones-now">including batteries</a>, are now often “authenticated” with the phone’s central processing unit, so that should an unofficial repair occur the device may refuse to work as intended. Sadly, users have little control over this.</p>
<p>If you buy a device, you should have the <a href="https://en.wikipedia.org/wiki/Electronics_right_to_repair">right to repair</a> it. When buying a flagship phone, remember you will almost undoubtedly one day drop it on the floor, so it pays to think about how you’ll get it fixed, and whether you’re happy to play by the manufacturer’s rules. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/sustainable-shopping-if-you-really-truly-need-a-new-phone-buy-one-with-replaceable-parts-93069">Sustainable shopping: if you really, truly need a new phone, buy one with replaceable parts</a>
</strong>
</em>
</p>
<hr>
<p>It is clear that modern mobile devices are trying to be the “everything” device, balancing functionality with aesthetics, and even trying to take a bite out of the laptop market (with a price tag to match). Premium pricing structures have been tested and appear set to stay. It seems that expensive phones bristling with high-performance cameras have become the new norm.</p>
<p class="fine-print"><em><span>Andrew Maxwell does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
The idea of a phone that can do everything is hardly new. But the premium pricing of Apple’s iPhone 11 raises the question of how far this trend can realistically be taken.
Andrew Maxwell, Senior Lecturer, University of Southern Queensland
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/118426
2019-07-10T10:02:17Z
2019-07-10T10:02:17Z
Moon landings footage would have been impossible to fake – a film expert explains why
<figure><img src="https://images.theconversation.com/files/281387/original/file-20190626-76713-hm0lk6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Buzz Aldrin on the moon.</span> <span class="attribution"><span class="source">NASA / Neil A. Armstrong</span></span></figcaption></figure><p>It’s been half a century since the magnificent <a href="https://theconversation.com/uk/topics/50th-anniversary-of-moon-landing-71605?utm_source=TC&utm_medium=linkback&utm_campaign=moonseries2019&utm_content=inlineasseta">Apollo 11 moon landing</a>, yet many people still don’t believe it actually happened. Conspiracy theories about the event dating back to the 1970s are in fact more popular than ever. A common theory is that film director Stanley Kubrick helped NASA fake the historic footage of its six successful moon landings. </p>
<p>But would it really have been possible to do that with the technology available at the time? I’m not a space travel expert, an engineer or a scientist. I am a filmmaker and lecturer in film post-production, and – while I can’t say how we landed on the moon in 1969 – I can say with some certainty that the footage would have been impossible to fake.</p>
<p>Here are some of the most common beliefs and questions – and why they don’t hold up.</p>
<p><strong>‘The moon landings were filmed in a TV studio.’</strong></p>
<p>There are <a href="http://www.elementsofcinema.com/general/film-digital.html">two different ways</a> of capturing moving images. One is film, actual strips of photographic material onto which a series of images are exposed. Another is video, which is an electronic method of recording onto various mediums, such as moving magnetic tape. With video, you can also broadcast to a television receiver. A standard motion picture film records images at 24 frames per second, while broadcast television is typically either 25 or 30 frames, depending on where you are in the world.</p>
<p>If we go along with the idea that the moon landings were taped in a TV studio, then we would expect them to be 30 frames per second video, which was the television standard at the time. However, we know that video from the first moon landing was recorded at <a href="http://news.bbc.co.uk/1/hi/sci/tech/4791883.stm">ten frames per second</a> in SSTV (Slow Scan television) with a <a href="https://en.wikipedia.org/wiki/Apollo_TV_camera">special camera</a>.</p>
<hr>
<p><em><strong>To the moon and beyond is a new podcast series from The Conversation marking the 50th anniversary of the moon landings. <a href="https://theconversation.com/uk/podcasts/moon-and-beyond">Listen and subscribe here</a>.</strong></em></p>
<hr>
<p><strong>‘They used the Apollo special camera in a studio and then slowed down the footage to make it look like there was less gravity.’</strong></p>
<p>Some people may contend that when you look at people moving in slow motion, they appear to be in a low gravity environment. Slowing down film requires more frames than usual, so you start with a camera capable of capturing more frames in a second than a normal one – this is called overcranking. When this is played back at the normal frame rate, this footage plays back for longer. If you can’t overcrank your camera, but you record at a normal frame rate, you can instead artificially slow down the footage, but you need a way to store the frames and generate new extra frames to slow it down.</p>
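The arithmetic of overcranking is straightforward. A quick sketch, using illustrative frame rates (these are example figures, not the Apollo ones):

```python
# Overcranking: capture at a high frame rate, play back at a normal one.
capture_fps = 120      # example high-speed capture rate (an assumption)
playback_fps = 24      # standard cinema playback rate
capture_seconds = 10   # length of the real event

frames = capture_fps * capture_seconds      # 1200 frames recorded
playback_seconds = frames / playback_fps    # those frames last 50 s on screen
slowdown = capture_fps / playback_fps       # motion appears 5x slower

print(frames, playback_seconds, slowdown)   # 1200 50.0 5.0
```

The ratio of capture rate to playback rate is exactly how much longer, and slower, the footage plays back.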
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=629&fit=crop&dpr=1 600w, https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=629&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=629&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=791&fit=crop&dpr=1 754w, https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=791&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/282034/original/file-20190701-105215-150r8f4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=791&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Apollo Lunar Television Camera, as it was mounted on the side of the Apollo 11 Lunar Module when it telecasted Armstrong’s ‘One small step’.</span>
<span class="attribution"><span class="source">NASA</span></span>
</figcaption>
</figure>
<p>At the time of the broadcast, magnetic disk recorders capable of storing slow motion footage <a href="https://youtu.be/-TelJ75pzP4?t=348">could only capture 30 seconds in total</a>, for a playback of 90 seconds of slow motion video. To capture 143 minutes in slow motion, you’d need to record and store 47 minutes of live action, which simply wasn’t possible.</p>
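Those figures are easy to check: 30 seconds of storage played back as 90 seconds is a threefold stretch, which is where the roughly 47 minutes comes from:

```python
# Slow-motion disk recorders of the era: 30 s of storage -> 90 s of playback.
storage_seconds = 30
playback_seconds = 90
stretch = playback_seconds / storage_seconds    # a 3x slowdown

broadcast_minutes = 143                         # length of the Apollo 11 footage
live_action_minutes = broadcast_minutes / stretch

print(stretch)                            # 3.0
print(round(live_action_minutes, 1))      # 47.7 minutes of storage needed
```

Nearly 48 minutes of stored live action, against hardware that could hold 30 seconds.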
<p><strong>‘They could have had an advanced storage recorder to create slow motion footage. Everyone knows NASA gets the tech before the public.’</strong></p>
<p>Well, maybe they did have a super secret extra storage recorder – but one almost 3,000 times more advanced? Doubtful. </p>
<p><strong>‘They shot it on film and slowed down the film instead. You can have as much film as you like to do this. Then they converted the film to be shown on TV.’</strong></p>
<p>That’s a bit of logic at last! But shooting it on film would require thousands of feet of film. A typical reel of 35mm film – at 24 frames per second – lasts 11 minutes and is <a href="https://en.wikipedia.org/wiki/Reel">1,000 feet long</a>. If we apply this to 12 frames per second film (as close to ten as we can get with standard film) running for 143 minutes (this is how long the Apollo 11 footage lasts), you would need six and a half reels.</p>
<p>These would then need to be put together. The splicing joins, transfer of negatives and printing – and potentially grains, specks of dust, hairs or scratches – would instantly give the game away. There are none of these artefacts present, which means it wasn’t shot on film. When you take into account that the subsequent Apollo landings were shot at 30 frames per second, then to fake those would be three times harder. So the Apollo 11 mission would have been the easy one.</p>
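The reel count above can be verified in a few lines, using the figures from the article:

```python
# Film stock needed to fake 143 minutes at 12 fps on standard 35mm reels.
reel_minutes_at_24fps = 11        # a 1,000 ft reel lasts ~11 min at 24 fps
fps = 12                          # closest standard rate to SSTV's 10 fps
footage_minutes = 143             # length of the Apollo 11 footage

# Halving the frame rate doubles how long each reel lasts.
reel_minutes = reel_minutes_at_24fps * (24 / fps)   # 22 minutes per reel
reels_needed = footage_minutes / reel_minutes

print(reel_minutes, reels_needed)   # 22.0 6.5
```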
<p><strong>‘But the flag is blowing in the wind, and there’s no wind on the moon. The wind is clearly from a cooling fan inside the studio. Or it was filmed in the desert.’</strong></p>
<p>It isn’t. After the flag is let go, it settles gently and then doesn’t move at all in the remaining footage. Also, how much wind is there inside a TV studio? </p>
<p>There’s wind in the desert, I’ll accept that. But in July, the desert is also very hot and you can normally see heat waves present in footage recorded in hot places. There are no heat waves on the moon landing footage, so it wasn’t filmed in the desert. And the flag still isn’t moving anyway.</p>
<hr>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/280931/original/file-20190624-97762-b4blia.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
</figcaption>
</figure>
<p><strong>MORE ON THE MOON AND BEYOND</strong>
<br><a href="https://theconversation.com/uk/topics/to-the-moon-and-beyond-72729?utm_source=TC&utm_medium=linkback&utm_campaign=moonseries2019&utm_content=inlineasseta">Join us as we delve into the last 50 years of space exploration and the 50 years to come. From Neil Armstrong’s historic first step onto the lunar surface to present-day plans to use the moon as a launchpad to Mars, hear from academic experts who’ve dedicated their lives to studying the wonders of space.</a></p>
<hr>
<p><strong>‘The lighting in the footage clearly comes from a spotlight. The shadows look weird.’</strong> </p>
<p>Yes, it’s a spotlight – one 93 million miles away. It’s called the sun. Look at the shadows in the footage. If the light source were a nearby spotlight, the shadows would originate from a central point. But because the source is so far away, the shadows are parallel in most places rather than diverging from a single point. That said, the sun isn’t the only source of illumination – light is reflected from the ground too. That can cause some shadows to appear not quite parallel. It also means we can see objects that are in shadow.</p>
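The geometry can be made concrete with a little trigonometry. A rough sketch, in which the distances are assumptions chosen for illustration, not measurements from the footage:

```python
import math

def shadow_divergence_deg(object_separation_m, light_distance_m):
    """Angle between the shadow directions of two objects lit by one point source."""
    return math.degrees(math.atan2(object_separation_m, light_distance_m))

# Two objects standing 10 m apart:
studio_spotlight = shadow_divergence_deg(10, 5)      # source only 5 m away
the_sun = shadow_divergence_deg(10, 1.5e11)          # ~150 million km away

print(round(studio_spotlight, 1))   # 63.4 - the shadows fan out visibly
print(the_sun)                      # a few billionths of a degree: parallel
```

A nearby lamp produces shadows that diverge by tens of degrees; the sun’s are parallel to within any measurement a camera could make.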
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=410&fit=crop&dpr=1 600w, https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=410&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=410&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=515&fit=crop&dpr=1 754w, https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=515&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/282033/original/file-20190701-105164-1h7w5g9.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=515&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Stanley Kubrick.</span>
<span class="attribution"><span class="source">Instituto María Auxiliadora Neuquén/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p><strong>‘Well, we all know Stanley Kubrick filmed it.’</strong></p>
<p>Stanley Kubrick could have been asked to fake the moon landings. But as he was such a perfectionist, he would have insisted on shooting it on location. And it’s well documented <a href="http://flipthemoviescript.com/the-fear-of-flying-affected-stanley-kubricks-career/">he didn’t like to fly</a>, so that about wraps that one up… Next? </p>
<p><strong>‘It’s possible to recreate dinosaurs from mosquitoes the way they did in Jurassic Park, but the government is keeping it a secret.’</strong> </p>
<p>I give up.</p>
<p class="fine-print"><em><span>Howard Berry does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Conspiracy theorists claim NASA used a special camera to stage the moon landings in a studio, then slowed down the footage to make it look like there was less gravity.
Howard Berry, Head of Post-Production and Programme Leader for MA Film and Television Production, University of Hertfordshire
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/107957
2019-01-08T19:12:56Z
2019-01-08T19:12:56Z
How to take better photos with your smartphone, thanks to computational photography
<figure><img src="https://images.theconversation.com/files/248327/original/file-20181203-194925-mysryb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A light-trails long exposure of London's Tower Bridge, shot on iPhone8Plus using the NightCap app.</span> <span class="attribution"><span class="source">Rob Layton</span>, <span class="license">Author provided</span></span></figcaption></figure><p>Each time you snap a photo with your smartphone – depending on the make and model – it may perform <a href="https://www.theguardian.com/commentisfree/2018/oct/28/has-apple-given-its-iphone-xs-camera-worthy-of-its-name">more than a trillion operations</a> for just that single image.</p>
<p>Yes, you expect it to do the usual auto-focus/auto-exposure functions that are the hallmark of point-and-shoot photography. </p>
<p>But your phone may also capture and stack multiple frames (sometimes before you even press the button), capture the brightest and darkest parts of the scene, average and merge exposures, and render your composition into a three-dimensional map to artificially blur the background.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/robots-can-learn-a-lot-from-nature-if-they-want-to-see-the-world-92838">Robots can learn a lot from nature if they want to 'see' the world</a>
</strong>
</em>
</p>
<hr>
<p>The term for this is computational photography, which means that image capture relies on a series of digital processes rather than purely optical ones. Image adjustment and manipulation take place in real time, in the camera, rather than afterwards in editing software. </p>
<p>Computational photography streamlines image production so everything – capture, editing and delivery – can be done in the phone, with much of the heavy lifting done as the picture is taken.</p>
<h2>A smartphone or a camera?</h2>
<p>What this means for the everyday user is that your smartphone now rivals, and in many cases surpasses, expensive DSLR cameras. The ability to create professional-looking photos is in the palm of your hand.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=799&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=799&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=799&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1005&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1005&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248608/original/file-20181204-23246-1vl98cd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1005&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Low-light photography shot on iPhone 8 Plus.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>I started in photography more than 30 years ago with film, darkrooms, a bagful of cameras and lenses, and later the inevitable switch to <a href="https://www.pcmag.com/encyclopedia/term/42047/dslr">DSLRs</a> (in a digital single-lens reflex camera, light travels through the lens to a mirror – the reflex – that sends the image to the viewfinder and flips up when the shutter fires so the sensor can capture the image). </p>
<p>But <a href="https://roblayton.com.au/">my photography</a> now is done exclusively with an iPhone – because it’s cheaper and always with me. I have two accessory lenses, two rigs (one for <a href="https://www.instagram.com/isurf_burleigh/">underwater</a>, the other for land), a tripod and a bunch of photography apps.</p>
<p>It’s the apps that often are the powerhouse of computational smartphone photography. Think of it like a hotted-up car. Apps are bespoke add-ons that harness and enhance existing engine performance. And, as with car racing, the best add-ons usually end up in mass production.</p>
<p>That certainly seems to be the case with Apple’s <a href="https://www.apple.com/au/iphone-xs/cameras/">iPhone Xs</a>. It has supercharged computational photography through its advances in low-light performance, smart HDR (<a href="https://www.pcmag.com/encyclopedia/term/70384/hdr-for-photos">High Dynamic Range</a>) and artificial depth-of-field: this is arguably the best camera phone on the market right now.</p>
<p>A few months ago that <a href="https://mashable.com/article/huawei-p20-pro-review/">title was held</a> by the <a href="https://consumer.huawei.com/au/phones/p20-pro/">Huawei P20 Pro</a>. Before the Huawei it was <a href="https://www.businessinsider.com.au/google-pixel-2-camera-photos-2018-7">probably</a> <a href="https://store.google.com/au/product/pixel_2">Google’s Pixel 2</a> – <a href="https://www.theverge.com/2018/10/15/17973484/google-pixel-3-xl-review-camera-features-screen-battery-price-photos">until</a> the <a href="https://store.google.com/au/product/pixel_3">Pixel 3</a> came out.</p>
<p>The point is, manufacturers are leapfrogging each other in the race to be the best smartphone camera in an image-obsessed society (when was the last time you saw a smartphone marketed as a phone?).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=800&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=800&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=800&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1005&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1005&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248632/original/file-20181204-126662-18nk5xd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1005&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Stars are discernible in this image, which shows astrophotography is possible on a smartphone.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>Phone producers are pulling the rug from beneath traditional camera manufacturers. It’s a bit like the dynamic between newspapers and digital media: newspapers have the legacy of quality and trust, but digital media are responding better and faster to market demands. So too are smartphone manufacturers.</p>
<p>So, right now, the main areas of smartphone computational photography that you may be able to employ for better pictures are: portrait mode; smart HDR; low light and <a href="https://shuttermuse.com/glossary/long-exposure/">long exposure</a>.</p>
<h2>Portrait mode</h2>
<p>Conventional cameras use long lenses and large apertures (openings for light) to blur the background and emphasise the subject. Smartphones have short focal lengths and fixed apertures, so the solution is computational – provided your device has more than one rear camera (some, including the Huawei, have three).</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=270&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=270&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=270&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=340&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=340&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248613/original/file-20181204-23249-k2e3a5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=340&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An image in portrait mode that shows the 3D depth map generated to control the bokeh (blur).</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>It works by using both cameras to capture two images (one wide angle, the other telephoto) that are merged. Your phone looks at both images and determines a depth map – the distance between objects in the overall image. Objects and entire areas can then be artificially blurred to precise points, depending on where on that depth map they reside.</p>
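The idea of blurring by depth can be sketched in a few lines of NumPy. This is a toy illustration under stated assumptions – a grayscale image, a normalised depth map, and a simple box blur – not the actual pipeline any phone uses; the function name and parameters are hypothetical:

```python
import numpy as np

def portrait_blur(image, depth, focus_depth, threshold=0.1, kernel=5):
    """Toy depth-map bokeh: pixels whose depth differs from the focus
    plane by more than `threshold` get a box blur; the rest stay sharp.
    `image` and `depth` are (H, W) arrays, depth values in [0, 1]."""
    pad = kernel // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    # Box blur via a sliding-window sum over the padded image
    blurred = np.zeros((h, w), dtype=float)
    for dy in range(kernel):
        for dx in range(kernel):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= kernel ** 2
    # Keep pixels near the focus plane sharp, blur the rest
    out_of_focus = np.abs(depth - focus_depth) > threshold
    return np.where(out_of_focus, blurred, image)
```

A subject whose depth matches `focus_depth` stays untouched, while the background is replaced by its blurred version – the same principle, in miniature, as the depth-map bokeh described above.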
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=800&fit=crop&dpr=1 600w, https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=800&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=800&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1005&fit=crop&dpr=1 754w, https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1005&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/250857/original/file-20181217-185252-jqbhw.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1005&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">This portrait of a young longbow archer was shot with the Halide app, the background blurred in Focos app, and final editing done in Lightroom CC for mobile. Notice the bowstring disappears in low-contrast areas on the depth map, showing limitations in a technology not yet perfected.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>This is how portrait mode works. A number of third-party camera and editing apps allow fine adjustment so you can determine exactly how much and where to put the <a href="https://photographylife.com/what-is-bokeh">bokeh</a> (the blurred part of the image, also known as depth-of-field).</p>
<p>Other than what’s already in a smartphone, (iOS) apps for this include <a href="https://itunes.apple.com/us/app/focos/id1274938524">Focos</a>, <a href="https://itunes.apple.com/us/app/halide-camera/id885697368">Halide</a>, <a href="https://itunes.apple.com/au/app/procam-6/id730712409">ProCam6</a>, <a href="https://itunes.apple.com/au/app/darkroom-photo-editor/id953286746">Darkroom</a>.</p>
<p>Android apps are harder to recommend because the platform is a more uneven playing field at the moment; many developers stick to Apple because it is a standardised environment. That said, you may try <a href="https://play.google.com/store/apps/details?id=com.google.android.GoogleCamera">Google Camera</a> or <a href="https://play.google.com/store/apps/details?id=net.sourceforge.opencamera">Open Camera</a>.</p>
<h2>Smart HDR</h2>
<p>The human eye can perceive a far greater range of contrast than any camera. To bring more highlight and shadow detail into your photo (the dynamic range), HDR (High Dynamic Range) is a standard feature on most newer smartphones.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248627/original/file-20181204-126677-bgpw88.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">HDR exposes for shadow and highlight details to extend the dynamic range.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>It draws on a traditional photography technique by which multiple frames are exposed from shadows to highlights and then merged. How well this performs depends on the speed of your phone’s sensor and ISP (image signal processor).</p>
<p>A number of HDR apps are also available, some of which will take up to 100 frames of a single scene, but you may need to keep your phone steady to avoid blurring. Try (iOS) <a href="https://itunes.apple.com/au/app/hydra-amazing-photography/id947824428">Hydra</a>, <a href="https://itunes.apple.com/au/app/pro-hdr-x/id927823151">ProHDRx</a> or (Android) <a href="https://play.google.com/store/apps/details?id=com.eyeappsllc.prohdr">Pro HDR Camera</a>. </p>
<h2>Low-light and long exposure</h2>
<p>Smartphones have small <a href="https://www.cambridgeincolour.com/tutorials/camera-sensors.htm">image sensors</a> and limited pixel depth, so they struggle in low light. The computational trend among developers and manufacturers is to take multiple exposures, stack them on top of each other, and then average the stack to reduce <a href="https://www.cambridgeincolour.com/tutorials/image-noise.htm">noise</a> (random variations in pixel values).</p>
<p>It’s a traditional (and manual) technique in Photoshop that’s now automatic in smartphones and is an evolution of HDR. This is how the Google Pixel 3 and Huawei P20 see so well in the dark.</p>
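Why averaging a stack helps can be shown with a simulated example: noise that is random from frame to frame partially cancels, so averaging N frames cuts the noise by roughly the square root of N. A minimal NumPy demonstration (synthetic data, not a real camera pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)      # the "true" low-light scene
# 16 noisy captures of the same scene (noise std 0.1)
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(16)]

stacked = np.mean(frames, axis=0)   # average the stack

single_noise = np.std(frames[0] - scene)
stacked_noise = np.std(stacked - scene)
print(single_noise, stacked_noise)  # stacked noise ~ 0.1 / sqrt(16)
```

With 16 frames the residual noise drops to about a quarter of a single frame's – the same maths that lets the Pixel 3 and P20 see in the dark.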
<p>It also means that long exposures can be shot in daylight (prohibitive with a DSLR or film) without risk of the image <a href="https://shuttermuse.com/glossary/overexposure/">overexposing</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248620/original/file-20181204-23237-1nrxjmi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A three-second exposure of passing storm clouds at midday, made possible through computation.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>In an app such as <a href="https://itunes.apple.com/us/app/nightcap-camera/id754105884">NightCap</a> (Android, try <a href="https://play.google.com/store/apps/details?id=com.flavionet.android.camera.pro">Camera FV-5</a>), long exposures are an averaged process, such as this (image above) three-second exposure of storm clouds travelling past a clock tower.</p>
<p>Light trails, such as the main image (top) of London’s Tower Bridge and these images (below) of downtown San Francisco and a fire-twirler are an additive process to capture emerging highlights. </p>
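The difference between the two blend modes is easy to state in code. An averaged exposure smooths motion away, while a lighten-style blend keeps the brightest value each pixel ever reached, so moving headlights leave trails. A minimal NumPy sketch of both (the function names are mine, and real apps are more sophisticated):

```python
import numpy as np

def average_blend(frames):
    """Averaged long exposure: moving clouds and water smooth out."""
    return np.mean(frames, axis=0)

def light_trails(frames):
    """Lighten/additive-style blend: keep the brightest value each
    pixel ever reached, so moving lights leave trails."""
    return np.max(frames, axis=0)
```

A headlight that lights a pixel in only one of many frames survives at full brightness in `light_trails` but is diluted toward the background in `average_blend` – which is why trail shots need the additive mode.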
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=701&fit=crop&dpr=1 600w, https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=701&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=701&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=880&fit=crop&dpr=1 754w, https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=880&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/250860/original/file-20181217-185252-1qqex9g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=880&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Light Trails mode was used to capture passing traffic in this long exposure of downtown San Francisco.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=653&fit=crop&dpr=1 600w, https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=653&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=653&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=821&fit=crop&dpr=1 754w, https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=821&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/250861/original/file-20181217-185252-14u7z8o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=821&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Light Trails mode was used to capture this fire twirler at Burleigh Heads on the Gold Coast.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>A tripod is essential unless you use Adobe’s free editing app Lightroom (<a href="https://itunes.apple.com/us/app/adobe-lightroom-cc/id878783582">iOS</a> and <a href="https://play.google.com/store/apps/details?id=com.adobe.lrmobile">Android</a>), which has a very good camera with a long exposure feature that adds auto-alignment to its image stacking.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/want-a-better-camera-just-copy-bees-and-their-extra-light-sensing-eyes-80385">Want a better camera? Just copy bees and their extra light-sensing eyes</a>
</strong>
</em>
</p>
<hr>
<p>Long exposure in iPhone’s native camera app can be made by tapping the Live mode button. The iPhone records before you press the shutter, so you need to keep the camera stable before and after you take the picture. Then, in the Photos app, swipe the image up to reveal four modes: Live, Loop, Bounce and Long Exposure. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=527&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=527&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=527&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=662&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=662&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248623/original/file-20181204-23258-gp315i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=662&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A long exposure made with iPhone’s Live photo mode.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p>The key to successful smartphone photography is to understand not just what your phone can do, but also its limitations, such as true <a href="https://www.nikonusa.com/en/learn-and-explore/a/tips-and-techniques/understanding-focal-length.html">optical focal length</a> (although this <a href="https://light.co/camera">device</a> by Light is challenging that). However, the advances in computational photography are making this a dynamic and compelling space.</p>
<p>It is worth remembering, too, that smartphones are merely a tool, and computational photography the technology that powers the tool. This old adage still rings true: it is the photographer who takes the picture, not the camera. Mind you, the taking is becoming so much easier.</p>
<p>Happy snapping.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/248625/original/file-20181204-23264-1goeif8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An underwater housing for iPhone (AxisGo by Aquatech) was used to capture this picture of a father and daughter swimming in the ocean.</span>
<span class="attribution"><span class="source">Rob Layton</span></span>
</figcaption>
</figure>
<p class="fine-print"><em><span>Rob Layton does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Today’s smartphones have the technology to help you take amazing photographs – so long as you do it right.
Rob Layton, Senior Teaching Fellow (Journalism), Bond University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/100217
2018-08-15T10:23:02Z
2018-08-15T10:23:02Z
Cameras can catch cars that run red lights, but that doesn’t make streets safer
<figure><img src="https://images.theconversation.com/files/230920/original/file-20180807-160647-3xl3gv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Many major U.S. cities have hidden cameras to catch drivers who run red lights.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/tram-traffic-light-showing-red-1019625316?src=WtTd1QIh_LMaFLLiUrBfNg-1-3">Gints Ivuskans/shutterstock</a></span></figcaption></figure><p><em><a href="https://theconversation.com/camaras-que-identifican-a-infractores-no-suponen-una-mejora-para-la-seguridad-vial-101645">Leer en español</a></em>.</p>
<p>The automobile is a killer. In the U.S., <a href="https://www.economist.com/united-states/2015/07/04/road-kill">32,675 people died</a> in traffic accidents in 2014. The year before, 2.3 million people were injured in traffic accidents. </p>
<p>During the past decade, <a href="http://www.iihs.org/iihs/topics/laws/automated_enforcement?topicName=red-light-running">438 U.S. municipalities</a>, including 36 of the 50 most populous cities, have employed electronic monitoring programs to reduce the number of accidents. Red light camera programs specifically target drivers who run red lights.</p>
<p><a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3078079">In a study</a> I co-authored with economist Paul J. Fisher, we examined all police-recorded traffic accidents for three large Texas cities over a 12-year period – hundreds of thousands of accidents. We found no evidence that red light cameras improve public safety. They don’t reduce the total number of vehicle accidents, the total number of individuals injured in accidents or the total number of incapacitating injuries that involve ambulance transport to a hospital.</p>
<h2>Red light cameras</h2>
<p>In a red light camera program, a camera is installed in a location where it can take photos or video of vehicles as they pass through the intersection. City employees or private contractors then review the photos. If a vehicle is in the intersection when the light is red, then a ticket is sent to the person who registered the vehicle. </p>
<p>These programs aim to reduce cross-street collisions. The idea is that drivers, fearing a higher chance that they will be fined, will be more likely to stop, lowering the number of angle, or “T-bone,” accidents.</p>
<p>Evidence clearly shows that camera programs are effective at decreasing the number of vehicles running red lights. <a href="https://www.sciencedirect.com/science/article/pii/S0001457506000273">In one study in Virginia</a>, red light cameras reduced the total number of drivers running red lights by 67 percent. </p>
<p>However, cameras can have contradictory effects on traffic safety. Some drivers who would otherwise have proceeded through the intersection on a yellow or red light will now attempt to stop. That means the number of accidents caused by vehicles not stopping at a red light will likely decrease.</p>
<p>But the number of accidents from stopping at a red light – such as rear-end accidents – is likely to increase. That’s not an inconsequential side effect. Some drivers will attempt to stop, accepting a higher risk of a non-angle accident like getting rear-ended, in order to avoid the expected fine.</p>
<p>The overall effect of a camera program on vehicle accidents and injuries depends on the net impact of these two effects. Overall driver safety could increase or decrease.</p>
<h2>Our study</h2>
<p>In our study, we focused on Houston, a major U.S. city that operated a large camera program at 66 intersections between 2006 and 2010. </p>
<p>One reason we chose Houston is to take advantage of the natural experiment that occurred when city residents <a href="https://www.houstonchronicle.com/news/houston-texas/houston/article/Opposition-putting-a-stop-to-red-light-cameras-4461447.php">passed a referendum in November 2010</a> to ban the cameras. </p>
<p>We accessed detailed accident information on every traffic accident in Texas from 2003 to 2014 through a public records information request. The data included the accident’s precise geocoded location; the type of accident; whether the driver ran a red light; and details on any injuries.</p>
<p>When the Houston cameras were removed, angle accidents increased by 26 percent. However, all other types of accidents decreased by 18 percent. Approximately one-third of all Houston intersection accidents are angle accidents. This suggests that the program’s drawbacks canceled out its benefits. </p>
<p>Our study showed no evidence that cameras reduce the total number of accidents. We estimate that total accidents are reduced by a statistically insignificant 3 percent after the cameras are turned off. </p>
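<p>The offsetting effects described above can be checked with back-of-the-envelope arithmetic using the article’s rounded figures (a sketch for illustration, not the study’s actual estimation):</p>

```python
# Back-of-the-envelope check using the rounded figures reported above:
# after Houston's cameras were removed, angle accidents rose ~26% and
# all other accidents fell ~18%; roughly one-third of intersection
# accidents are angle accidents.

angle_share = 1 / 3              # share of accidents that are angle accidents
other_share = 1 - angle_share    # share that are all other types

angle_change = +0.26             # change in angle accidents after removal
other_change = -0.18             # change in all other accidents after removal

# Weighted net change in total accidents
net_change = angle_share * angle_change + other_share * other_change
print(f"net change in total accidents: {net_change:+.1%}")  # about -3.3%
```

<p>The weighted sum lands near the statistically insignificant 3 percent figure, consistent with the two effects roughly canceling out.</p>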
<p>Likewise, there’s no evidence that the camera program reduced the number of traffic-related injuries or the likelihood of incurring an incapacitating injury.</p>
<p>The elevated number of traffic accidents at urban intersections is a serious public health issue. But our study shows that Houston’s camera program was ineffective in improving traffic safety. Electronic monitoring is not the solution.</p>
<p class="fine-print"><em><span>Justin Gallagher does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Hundreds of US cities have red light cameras to try to catch traffic violations and prevent accidents. But research shows that the cameras may encourage other types of accidents.
Justin Gallagher, Assistant Professor of Economics, Case Western Reserve University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/97695
2018-06-18T05:17:01Z
2018-06-18T05:17:01Z
The privacy problem with camera traps: you don’t know who else could be watching
<figure><img src="https://images.theconversation.com/files/221688/original/file-20180605-175438-skeqeo.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A spotted-tailed Quoll detected during a small mammal survey at Carrai Plateau, New South Wales.</span> <span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span></figcaption></figure><p>We use remotely activated cameras – known as camera traps – to study the ecology and population responses of wildlife and pest species in management programs across Australia.</p>
<p>These devices are used widely by scientists, researchers and managers to detect rare wildlife, monitor populations, study behaviour and measure long term wildlife population health. </p>
<p>But the lack of transparency surrounding how these images are transmitted, where they are stored, and who has access to them in transit, has scientists worried.</p>
<p>We’ve discovered that images captured by these devices may be accessed by more people than intended, posing potential privacy breaches and even poaching risks.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/publish-and-dont-perish-how-to-keep-rare-species-data-away-from-poachers-80239">Publish and don’t perish – how to keep rare species' data away from poachers</a>
</strong>
</em>
</p>
<hr>
<h2>A chance discovery</h2>
<p>It was an accidental discovery that our images can travel from the field to big overseas internet servers. We had not considered the transmission path of our images, and who may have access to them along the way.</p>
<p>Manufacturers have developed camera traps that are capable of transmitting image data using the telecommunications network (in Australia this is 3G and soon to move to 4G). </p>
<p>Most of these camera trap models can transmit images using both MMS (Multimedia Messaging Service), where the image is delivered as a multimedia message to a smartphone, and SMTP (Simple Mail Transfer Protocol), where the image is transmitted to an email address. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=800&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=800&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=800&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1005&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1005&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221689/original/file-20180605-175438-1bugqwc.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1005&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A 3G camera trap set in the Strzelecki Desert and sending images to the author’s email and phone.</span>
<span class="attribution"><span class="source">PM</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>In Australia, when you buy a 3G compatible camera trap you just need to add a SIM card from a service provider. The images will then be sent from the camera trap at a field site to your work or home in seconds. This process is made simple for users by manufacturers who set up default settings to assist you in programming the camera trap. </p>
<p>If, like most people, you don’t override the default settings, then your data will be managed for you. It’s an attractive offer, especially for people who are not tech-savvy or who don’t have time to fiddle around with programming equipment. </p>
<p>But where are your images going? Who has the legal right to access and store them? How secure is each stage of the transmission path, and are your images being used without your knowledge?</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/explainer-what-is-4g-9448">Explainer: what is 4G?</a>
</strong>
</em>
</p>
<hr>
<h2>An evaluation process</h2>
<p>Our research team has been evaluating the transmission of images via SMTP for a larger research project, aimed at developing camera trap transmission via satellite.</p>
<p>We have been testing and comparing several models of 3G camera trap, which includes evaluating the message structure and headers.</p>
<p>It was these investigations that revealed some alarming information: setting up a camera trap with the default settings for SMTP transmission poses several potential risks to users. </p>
<p>Each manufacturer uses different methods, but in essence, when an image is transferred through a 3G telecommunications service, the image is sent to one or more web servers, where it may be stored, before being sent to the recipient email address or phone. </p>
<p>These servers can be in any country. Our investigations of the five models we tested identified that images are being sent via some large, well-known Asian and North American companies. The exact location of each server, and the full transmission pathway cannot be fully known. </p>
<p>Exactly what happens to these images during transmission also remains unknown. But most practitioners we have spoken to have no idea their images could potentially be going to servers overseas, so it raises several concerns for users.</p>
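<p>One way to see part of an image’s journey is to inspect the “Received:” headers of the delivered email, since each mail server that relays an SMTP message prepends one. The sketch below uses Python’s standard email parser on an invented message; the server names are hypothetical, not taken from any real camera trap:</p>

```python
from email import message_from_string

# Each mail server that relays an SMTP message prepends a "Received:"
# header, so the headers, read bottom to top, trace the delivery path.
# The message below is an invented illustration (hypothetical servers).
raw_message = """\
Received: from mx.example-mail.com (mx.example-mail.com [203.0.113.9])
    by inbox.example.org; Mon, 4 Jun 2018 10:02:11 +1000
Received: from relay.example-cloud.net (relay.example-cloud.net [198.51.100.7])
    by mx.example-mail.com; Mon, 4 Jun 2018 10:02:08 +1000
Received: from cameratrap.example.au ([192.0.2.44])
    by relay.example-cloud.net; Mon, 4 Jun 2018 10:02:03 +1000
From: trap@example.au
To: researcher@example.org
Subject: Camera trap image

(image attachment omitted)
"""

msg = message_from_string(raw_message)

# Reverse so the hops read from the camera trap toward the recipient.
for hop, header in enumerate(reversed(msg.get_all("Received")), start=1):
    route = " ".join(header.split(";")[0].split())  # unfold wrapped header
    print(f"hop {hop}: {route}")
```

<p>Run against a real message from a 3G camera trap, a trace like this shows which relay servers, and therefore which companies and jurisdictions, handled the image in transit, although servers can omit or rewrite these headers.</p>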
<h2>A privacy concern</h2>
<p>One of our foremost concerns is how legal professionals would interpret ownership and distribution of images of people under privacy legislation. Camera traps deployed to detect wildlife often detect unsuspecting people walking past.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221702/original/file-20180605-175451-nug60m.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A harmless image of an unsuspecting person walking past a camera trap could end up in a court of law if the image is used without their permission.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>It’s a <a href="https://www.publish.csiro.au/book/7150">legal mine field</a> when a camera trap user potentially distributes an image of a person without their permission.</p>
<p>It was an issue raised back in 2012 when an unnamed Austrian politician was <a href="http://www.spiegel.de/international/europe/forest-sex-footage-sparks-debate-in-austria-a-838691.html">caught in a sexual encounter by a camera trap</a>. In that case the image wasn’t released publicly but it raised concerns over a potential breach of privacy.</p>
<p>In Australia, such an image belongs to the person who is photographed irrespective of where the images were taken, so strictly speaking they could pursue legal action against anyone distributing it. </p>
<p>Clearly there would be extenuating circumstances, but whether or not there is a case to be answered is yet to be tested and would depend on the country and legislation involved.</p>
<p>Camera traps are also used for security purposes by authorities, farmers and members of the public, so potential legal and sensitive data could be distributed over the internet. As there is a lack of transparency surrounding the transmission pathway, storage, and usage of the data, this could be a huge concern.</p>
<p>In Australia, this might constitute a breach under the <a href="https://www.oaic.gov.au/privacy-law/privacy-act/">Privacy Act 1988</a>, depending on whether any personal data is disclosed and on the potential for serious harm that might result.</p>
<h2>All in the cloud</h2>
<p>The Australian government has <a href="https://www.oaic.gov.au/privacy-law/privacy-archive/privacy-speeches-archive/privacy-and-the-cloud">released policy and guidelines</a> concerning the protection of data privacy when using cloud services. </p>
<p>But these requirements might not extend to, or have not been adopted in, the context of technology-based ecological monitoring, so valuable data could currently be leaving Australian shores.</p>
<p>How this data is used is also largely unknown. It may serve many commercial purposes for companies, such as data mining, advertising, and machine learning and artificial intelligence development, to name but a few. Exactly where, and how securely, the data is stored remains a mystery.</p>
<p>Of real concern for many international wildlife conservation groups is the potential misuse of wildlife images that could identify threatened species and locations. This information could be illegally accessed by poachers, or those looking to sell the data for profit. </p>
<p>Our disclaimer here is that we have no evidence that such practices are occurring, but the potential exists and the lack of transparency is alarming.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/scientists-are-accidentally-helping-poachers-drive-rare-species-to-extinction-78342">Scientists are accidentally helping poachers drive rare species to extinction</a>
</strong>
</em>
</p>
<hr>
<h2>Reducing the risk</h2>
<p>Until recently we did not fully comprehend the risks we were taking by using 3G camera traps without precautions. Like most users, we assumed that our data was safe and controlled by Australian telecommunications systems, and had no idea that the images might be transmitted or stored by servers overseas.</p>
<p>We now know the risks, and that in many cases this image management protocol can be circumvented by overriding the camera’s default settings. In an ideal world every user would know the full transmission pathway of the image and could take steps to make it as secure as practically possible. Given this is not possible, we recommend that where possible, users program camera traps to send SMTP images directly to an email address they have more control over. </p>
<p>It will take a little extra time to program the camera traps, but at least users will have more control over the path of their image from the field to any receiving device.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221695/original/file-20180605-175438-lfx34e.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The right thing captured in the camera trap: a spotted-tailed Quoll.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p class="fine-print"><em><span>Paul D Meek receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia</span></em></p><p class="fine-print"><em><span>Greg Falzon receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia</span></em></p><p class="fine-print"><em><span>James Bishop receives funding from Dept Agriculture and Water Resources, Centre for Invasive Species Solutions, Australian Wool Innovation and Meat and Livestock Australia.
James Bishop receives PhD research funding from an Australian Postgraduate Award (APA).
</span></em></p>
Remote cameras used to track wildlife in Australia could pose a privacy risk, especially if the images they capture fall into the wrong hands.
Paul D Meek, Adjunct Lecturer in School of Environmental and Rural Science, University of New England
Greg Falzon, Lecturer in Computational Science, University of New England
James Bishop, PhD candidate, software engineer, University of New England
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/94282
2018-05-18T10:41:43Z
2018-05-18T10:41:43Z
75 years of instant photos, thanks to inventor Edwin Land’s Polaroid camera
<figure><img src="https://images.theconversation.com/files/219447/original/file-20180517-26274-1f6mmvc.jpg?ixlib=rb-1.1.0&rect=0%2C0%2C2618%2C2070&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Edwin Land, on the left, invented and commercialized a number of technologies, most of which centered on light.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-OH-USA-APHS150797-Polaroid-Land-Camera/155ca24494f748d3aae778e1db3f8755/2/0">AP Photo</a></span></figcaption></figure><p>It probably happens every minute of the day: A little girl demands to see the photo her parent has just taken of her. Today, thanks to smartphones and other digital cameras, we can see snapshots immediately, whether we want to or not. But in 1943 when <a href="https://www.acs.org/content/acs/en/education/whatischemistry/landmarks/land-instant-photography.html">3-year-old Jennifer Land</a> asked to see the family vacation photo that her dad had just taken, the <a href="https://www.library.hbs.edu/hc/polaroid/instant-photography/the-idea-of-instant-photography/">technology didn’t exist</a>. So her dad, <a href="https://www2.rowland.harvard.edu/book/export/html/16141">Edwin Land, went to work inventing it</a>.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroid camera faces the viewer" src="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=884&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=884&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=884&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1111&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1111&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218832/original/file-20180514-100703-7r2u85.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1111&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The original Polaroid camera freed users from needing to trek to a darkroom to develop their images.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/cNomGxIq6MI">Lindsay Moe/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Three years later, after plenty of scientific development, Land and his Polaroid Corp. realized the miracle of nearly instant imaging. The film exposure and processing hardware are contained within the camera; there’s no muss or fuss for the photographer, who just points and shoots and then watches the image materialize on the photo once it spools out of the camera. Land demonstrated his new technology publicly for the first time on <a href="https://mobile.twitter.com/OpticaWorldwide/status/1098613395765501955">Feb. 21, 1947, at a meeting</a> of the Optical Society of America.</p>
<p>Land is probably best known for the “instant photo” – or the spiritual progenitor of today’s <a href="http://www.dailymail.co.uk/sciencetech/article-3619679/What-vain-bunch-really-24-billion-selfies-uploaded-Google-year.html">ubiquitous selfie</a>. His Polaroid camera was first released commercially in 1948 at retail locations and prices aimed at the postwar middle class. But this is just one of a host of technological breakthroughs Land invented and commercialized, most of which centered around light and how it interacts with materials. The technology used to show a 3D movie and the goggles we wear in the theater were made possible by Land and his colleagues. The camera aboard the U-2 spy plane, as featured in the movie “<a href="https://www.imdb.com/title/tt3682448/">Bridge of Spies</a>,” was a Land product, as were even some aspects of the plane’s mechanics. He also worked on theoretical problems, drawing on a deep understanding of both chemistry and physics.</p>
<p><a href="https://scholar.google.com/citations?user=8hzH2SoAAAAJ&hl=en&oi=ao">I’m a vision scientist</a> who has touched many of the fields in which Land made great advances, through my own work on new imaging methods, image processing techniques and human color vision. As the 2018 recipient of the <a href="https://www.osa.org/en-us/awards_and_grants/awards/award_description/edwinland/">Edwin H. Land Medal</a>, awarded by the Optical Society of America and the <a href="https://www.optica.org//en-us/about/newsroom/news_releases/2018/the_optical_society_and_society_for_imaging_scienc/">Society for Imaging Science and Technology</a>, my own work relies on Land’s technological innovations that made modern imaging possible.</p>
<h2>Controlling light’s properties</h2>
<p>Edwin Land had his first optics breakthrough as a young man, when he figured out a convenient and affordable method to control one of the fundamental properties of light: polarization.</p>
<p>You can think of light as waves propagating from a source. Most light sources produce a mixture of waves with different physical properties, such as wavelength and amplitude of vibration. Light is considered polarized if its waves vibrate in a consistent orientation perpendicular to the direction the wave is traveling.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="diagram of only vertical lightwaves passing through filter" src="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=280&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=280&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=280&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=352&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=352&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219275/original/file-20180516-155569-1a1sjoe.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=352&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A polarizing filter can block all the light waves that don’t match its orientation.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/ko/image-vector/polarization-light-waves-421267105">Fouad A. Saad/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Given the right material for the light waves to pass through, the light waves may be rotated into another plane, slowed down or blocked. Modern 3D goggles work because one eye receives light waves vibrating along the horizontal plane while the other eye receives the light vibrating along the vertical plane. </p>
<p>Before Land, researchers built components to control polarization from rock crystals, which were assigned almost magical names and properties, though they merely decreased the velocity or amplitude of light waves traveling at specific orientations. Land created “polarizers” by growing small crystals and embedding them in plastic sheets, altering the light passing through depending on its orientation in relation to the rows of crystals. His inexpensive polarizer made it possible to reliably and practically filter light so only wavelengths with a particular orientation would pass through.</p>
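<p>The filtering described above follows what physicists call Malus’s law (not named in the article): an ideal polarizer transmits a fraction cos²θ of the light’s intensity, where θ is the angle between the light’s polarization plane and the filter’s axis. A minimal sketch:</p>

```python
import math

def transmitted_intensity(i0, angle_deg):
    """Malus's law: intensity an ideal polarizer passes when its axis is
    angle_deg away from the incoming light's plane of polarization."""
    return i0 * math.cos(math.radians(angle_deg)) ** 2

# An aligned filter passes everything; a crossed filter blocks everything.
print(transmitted_intensity(1.0, 0))    # 1.0
print(transmitted_intensity(1.0, 45))   # ~0.5
print(transmitted_intensity(1.0, 90))   # ~0.0 (crossed polarizers)
```

<p>This cos²θ dependence is why rotating a polarizing filter against glare makes the glare fade and return.</p>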
<p>Land founded the Polaroid Corp. in 1937 to commercialize his new technology. His sheet polarizers found applications ranging from the identification of chemical compounds to adjustable sunglasses. Polarizing filters became standard in photography to reduce glare. Today the principles of polarized light are used in most computer and cellphone screens to enhance contrast, decrease glare and even turn on or off individual pixels.</p>
<p><a href="https://doi.org/10.1167/iovs.03-0124">Polarizing filters help researchers visualize structures</a> that might not be seen otherwise – from astronomical features to biological structures. In my own field of vision science, polarization imaging localizes classes of chemicals, such as <a href="https://doi.org/10.1364/JOSAA.24.001468">protein molecules leaking from blood vessels</a> in diseased eyes. Polarization is also combined with high-resolution imaging techniques to detect <a href="https://doi.org/10.1038/s41598-017-03529-8">cellular damage</a> beneath the reflective retinal surface. </p>
<h2>A new way to get the data out</h2>
<p>Before the days of high-speed digital capture of data and affordable high-resolution displays, or use of videotape, Polaroid photography was the method of choice to obtain output in many scientific labs. Experiments or medical tests needed graphical or pictorial output for interpretation, often from an analog oscilloscope which plotted out a voltage or current change over time. The oscilloscope was fast enough to capture key features of the data – but recording the output for later analysis was a challenge before Land’s instant camera came along.</p>
<p>A common example in vision science is the recording of eye movements. A research study reported in 1960 plotted light reflected from an observer’s moving eye on an oscilloscope screen, which was photographed with a <a href="https://doi.org/10.1364/JOSA.50.000245">mounted Polaroid camera</a> – not unlike the consumer Polaroid camera a family might pull out at a birthday party. For decades, research labs and medical facilities used <a href="https://www.ebay.com/p/Tektronix-C-5c-Oscilloscope-Camera-for-Polaroid-Film-B054450/1437576020">setups consisting of a Polaroid camera and a mounting rig</a> to collect electrical signals displayed on oscilloscope screens. The format sizes are less than dazzling compared to modern digital resolutions, but they were revolutionary at the time.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=599&fit=crop&dpr=1 600w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=599&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=599&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=753&fit=crop&dpr=1 754w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=753&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/218867/original/file-20180514-100693-jtafii.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=753&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Land’s inventions led to the widespread use of polarized light to characterize tissues and objects, as in this pseudo-color image of a diabetic patient’s retina that unmasks irregular structures caused by edema.</span>
<span class="attribution"><span class="source">Ann Elsner</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<p>In 1987, with the founding of my new retinal imaging laboratory, there was no inexpensive method to provide shareable output of our <a href="https://doi.org/10.1016/0042-6989(95)00100-E">novel images</a>. After a few years of struggling to obtain high-quality output for conferences and publications, the Polaroid Corp. came to our rescue, with the donation of a printer, allowing our scientific contributions to reach an audience beyond our lab.</p>
<h2>Eyes are not cameras</h2>
<p>Land’s contributions go beyond patenting over 500 innovations and inventing products that millions purchased. His understanding of the interaction of light and matter promoted novel ways of characterizing chemicals with polarized light. And he provided insights into the workings of the human visual system that had seemed to defy the laws of physics, coming up with what he called the <a href="https://pdfs.semanticscholar.org/8b2a/d82ce40117417fa36ba16941ce022f2185f3.pdf">Retinex theory</a> of color vision to explain how people perceive a broad range of color <a href="https://doi.org/10.1364/JOSAA.3.000916">without the expected wavelengths</a> being present in the room.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Polaroids clipped to a string against a brick wall" src="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/219101/original/file-20180515-195311-6j3cax.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Quick prints can be shared and displayed.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/hillaryandanna/760585681">Hillary Hartley</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Despite his brilliance, Land’s Polaroid Corp. eventually hit hard times in the decades after his death in 1991. Heavily invested in its film sales, Polaroid wasn’t prepared as all tiers of the imaging market went digital, with everyone from consumer photographers to high-end medical and optical imagers abandoning film and processing.</p>
<p>But rather than sink with the film market, Polaroid reinvented itself with new products that could help output the new world of digital images. And in a case of history repeating itself, <a href="https://us.polaroid.com/collections/instant-cameras">Polaroid</a> and other manufacturers of instant cameras are enjoying renewed popularity with younger generations who had no exposure to the original versions. Just like little Jennifer Land, plenty of people today still want a tangible version of their pictures, right now.</p>
<p><em>This is an updated version of an article originally published on May 18, 2018. It corrects the year Jennifer Land inspired her father’s invention.</em></p><img src="https://counter.theconversation.com/content/94282/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ann Elsner receives funding from NIDILRR and NIH. She owns shares in Aeon Imaging, LLC.</span></em></p>
Whether at a family gathering or in a research lab, getting access to images immediately was a game-changer. And Land’s innovations went far beyond the instant photo.
Ann Elsner, Professor of Optometry, Indiana University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/90258
2018-01-22T12:25:46Z
2018-01-22T12:25:46Z
The next generation of cameras might see behind walls
<figure><img src="https://images.theconversation.com/files/202620/original/file-20180119-110121-1wggumj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="license">Author provided</span></span></figcaption></figure><p>You might be really pleased with the camera technology in your latest smartphone, which can recognise your face and take slow-mo video in ultra-high definition. But these technological feats are just the start of a larger revolution that is underway.</p>
<p>The latest camera research is shifting away from increasing the number of mega-pixels towards fusing camera data with computational processing. By that, we don’t mean the Photoshop style of processing where effects and filters are added to a picture, but rather a radical new approach where the incoming data may not actually look like an image at all. It only becomes an image after a series of computational steps that often involve complex mathematics and modelling of how light travels through the scene or the camera.</p>
<p>This additional layer of computational processing magically frees us from the chains of conventional imaging techniques. One day we may not even need cameras in the conventional sense any more. Instead we will use light detectors that only a few years ago we would never have considered of any use for imaging. And they will be able to do incredible things, like see through fog, inside the human body and even behind walls.</p>
<h2>Single pixel cameras</h2>
<p>One extreme example is the <a href="https://dx.doi.org/10.1098%252Frsta.2016.0233">single pixel camera</a>, which relies on a beautifully simple principle. Typical cameras use lots of pixels (tiny sensor elements) to capture a scene that is likely illuminated by a single light source. But you can also do things the other way around, capturing information from many light sources with a single pixel. </p>
<p>To do this you need a controlled light source, for example a simple data projector that illuminates the scene one spot at a time or with a series of different patterns. For each illumination spot or pattern, you then measure the amount of light reflected and add everything together to create the final image. </p>
<p>Clearly the disadvantage of taking a photo in this way is that you have to send out lots of illumination spots or patterns to produce one image (which would take just one snapshot with a regular camera). But this form of imaging would allow you to create otherwise impossible cameras, for example ones that work at wavelengths of light beyond the visible spectrum, where good detectors <a href="https://www.nature.com/articles/ncomms12010">cannot be made into cameras</a>.</p>
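<p>The pattern-based scheme described above can be sketched as a toy simulation in Python (an illustrative model with an invented scene, not real hardware code): each projected pattern yields a single detector reading, the total reflected light, and the image is recovered by solving the resulting linear system.</p>

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 8, 8
n = h * w

# A toy scene for the "single-pixel camera" to image: a bright rectangle.
scene = np.zeros((h, w))
scene[2:6, 3:5] = 1.0

# Project a series of random binary illumination patterns. For each one,
# the single-pixel detector records only the total reflected light.
patterns = rng.integers(0, 2, size=(2 * n, n)).astype(float)
readings = patterns @ scene.ravel()  # one number per pattern

# Recover the image by solving the linear system patterns @ x = readings.
recovered, *_ = np.linalg.lstsq(patterns, readings, rcond=None)
recovered = recovered.reshape(h, w)
```

<p>In this noise-free sketch the recovered image matches the scene exactly; real single-pixel systems must cope with detector noise, and compressed-sensing methods let them get away with far fewer patterns than pixels.</p>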
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-amazing-camera-that-can-see-around-corners-51948">The amazing camera that can see around corners</a>
</strong>
</em>
</p>
<hr>
<p>These cameras could be used to take photos through <a href="https://www.osapublishing.org/oe/abstract.cfm?uri=oe-23-11-14424">fog or thick falling snow</a>. Or they could <a href="http://advances.sciencemag.org/content/3/4/e1601782">mimic the eyes of some animals</a> and automatically increase an image’s resolution (the amount of detail it captures) depending on what’s in the scene.</p>
<p>It is even possible to capture images from light particles that have <a href="https://www.nature.com/articles/nature13586">never even interacted</a> with the object we want to photograph. This would take advantage of the idea of “quantum entanglement”, that two particles can be connected in a way that means whatever happens to one happens to the other, even if they are a long distance apart. This has intriguing possibilities for looking at objects whose properties might change when lit up, such as the eye. For example, does a retina look the same when in darkness as in light?</p>
<h2>Multi-sensor imaging</h2>
<p>Single-pixel imaging is just one of the simplest innovations in upcoming camera technology and relies, on the face of it, on the traditional concept of what forms a picture. But we are currently witnessing a surge of interest in systems that use lots of information about a scene, where traditional techniques collect only a small part of it.</p>
<p>This is where we could use multi-sensor approaches that involve many different detectors pointed at the same scene. <a href="https://www.nasa.gov/mission_pages/hubble/multimedia/index.html">The Hubble telescope</a> was a pioneering example of this, producing pictures made from combinations of many different images taken at different wavelengths. But now you can buy commercial versions of this kind of technology, such as the <a href="https://www.lytro.com">Lytro camera</a> that collects information about light intensity and direction on the same sensor, to produce images that can be refocused after the image has been taken.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/202582/original/file-20180119-80168-3gleod.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Light L16.</span>
<span class="attribution"><span class="source">Light</span></span>
</figcaption>
</figure>
<p>The next generation camera will probably look something like the <a href="https://light.co/camera">Light L16 camera</a>, which features ground-breaking technology based on more than ten different sensors. Their data are combined using a computer to provide a 50Mb, re-focusable and re-zoomable, professional-quality image. The camera itself looks like a very exciting Picasso interpretation of a crazy cell-phone camera.</p>
<p>Yet these are just the first steps towards a new generation of cameras that will change the way in which we think of and take images. Researchers are also working hard on the problem of seeing through fog, <a href="https://www.nature.com/articles/ncomms1747">seeing behind walls</a>, and even imaging deep inside the <a href="https://www.nature.com/articles/nphoton.2014.107">human body and brain</a>. All of these techniques rely on combining images with models that explain how light travels through or around different substances.</p>
<p>Another interesting approach that is gaining ground relies on artificial intelligence to “learn” to <a href="https://www.osapublishing.org/optica/abstract.cfm?uri=optica-4-9-1117">recognise objects from the data</a>. These techniques are inspired by learning processes in the human brain and are likely to play a major role in <a href="https://arxiv.org/abs/1709.07244">future imaging systems</a>.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cDbGFT5rM0I?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Single photon and quantum imaging technologies are also maturing to the point that they can take pictures at incredibly low light levels and record videos at incredibly fast speeds, reaching a trillion frames per second. This is fast enough to capture images <a href="https://www.nature.com/articles/ncomms7021">of light itself</a> travelling across a scene.</p>
<p>Some of these applications might require a little time to fully develop but we now know that the underlying physics should allow us to solve these and other problems through a clever combination of new technology and computational ingenuity.</p><img src="https://counter.theconversation.com/content/90258/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Daniele Faccio receives funding from EPSRC, QuantIC - The Quantum Hub for Imaging, The Leverhulme Trust, DSTL.</span></em></p><p class="fine-print"><em><span>Stephen McLaughlin receives funding from EPSRC for a variety of research grants which analyse data which require the computational imaging methods described in the article</span></em></p>
Single-pixel cameras, multi-sensor imaging and quantum technologies will change the way we take photos.
Daniele Faccio, Professor of Quantum Technologies, University of Glasgow
Stephen McLaughlin, Head of School of Engineering and Physical Sciences, Heriot-Watt University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/73855
2017-03-21T00:09:52Z
2017-03-21T00:09:52Z
How to stop the thieves when all we want to capture is wildlife in action
<figure><img src="https://images.theconversation.com/files/159436/original/image-20170305-29017-umuzub.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The camera traps that help monitor animals, so long as the cameras don't get stolen.</span> <span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span></figcaption></figure><p>Many Australian field scientists, including myself, have been swayed in recent years by the attraction of using camera traps to survey <a href="http://www.publish.csiro.au/AM/AM14021">wildlife</a>.</p>
<p>But we’ve also attracted some unsavoury characters of the human form who are seriously threatening the viability of our research. </p>
<p>Camera trap devices (typically costing A$300 to A$900 each) don’t require a human operator to push the button; they can be remotely deployed in the bush taking photos for months with the sole function of recording animals.</p>
<p>They were used by the BBC to capture rare footage of the snow leopard for David Attenborough’s <a href="http://www.bbcearth.com/planetearth2/">Planet Earth II</a> series.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/V-ekvBYHLJc?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Camera traps have led to some fantastic ecological findings, including newly discovered species such as the <a href="http://zookeys.pensoft.net/articles.php?id=3550">Olinguito</a> from South America; extensions of the species range of rare predators such as Pallas’ cat in Iran; and previously unrecorded animal behaviours such as dingo <a href="http://www.publish.csiro.au/AM/AM16018">cannibalism</a>.</p>
<p>Importantly, they have reduced the cost of conducting surveys for wildlife that are normally expensive and resource-intensive.</p>
<p>Camera traps are now the “go-to” survey tool for scientists and a device used by farmers, land managers and citizen scientists alike. They are showing us pictures of animals doing things they only do when humans are not present.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161700/original/image-20170321-9144-n945fl.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">This black coastal wild dog was captured on camera trap walking along a forest track on the fringe of Coffs Harbour township.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>But these unaccompanied pieces of technology are vulnerable to antisocial behaviours that are normally associated with humans in urban environments.</p>
<h2>Camera thefts and vandalism</h2>
<p>The theft and vandalism of remote scientific equipment is not new, but the exponential adoption of camera traps across the <a href="http://www.italian-journal-of-mammalogy.it/article/view/8789">world</a> has significantly increased the chance of an encounter between human and device.</p>
<p>The level of risk and financial loss has reached breaking point and is causing us scientists to revisit how we use camera traps to prevent theft.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161512/original/image-20170320-8880-znmag9.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Gone: Thieves cut the back off the security post, then peeled open the steel with a pneumatic device, cut the lock and stole the camera trap.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>In the cohort of dedicated wildlife researchers I work with, we have lost more than A$70,000 worth of equipment in recent years due to theft.</p>
<p>Camera traps are usually fixed to posts or trees at sites where animals are most likely to pass and therefore be detected. </p>
<p>In our research, we primarily study predators – such as dingoes, foxes and feral cats – so our camera traps are set in transects along roads and trails in the bush. This makes them very obvious to would-be thieves.</p>
<p>Following the theft of 15 camera traps in 2011 from one of our research trials in New South Wales, we grappled with how we might prevent our equipment, and the precious data contained within, from being stolen.</p>
<h2>An elevated approach</h2>
<p>We initially thought we could put them <a href="http://onlinelibrary.wiley.com/doi/10.1002/rse2.28/full">3 metres high in trees</a>, hoping thieves would not see them because they were out of their line of sight.</p>
<p>Initial trials confirmed that most people had no idea that our camera traps were in the trees, so it looked promising.</p>
<p>That was until we conducted a further trial to compare the wildlife detected at different camera trap heights and realised that the higher the devices, the less we detected. So that option was off the table.</p>
<p>Following this, we <a href="http://www.publish.csiro.au/am/am12014">designed a security post</a> that encased the camera trap in a steel box welded to a steel post with a lock shield (to prevent them from being ground off), and sunk 1m into the ground with two bags of concrete.</p>
<p>We attached friendly notes to the posts letting people know that we were only interested in animals and that pictures of people would be deleted.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161509/original/image-20170320-8849-1ejhzkv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The security post design with explanatory sign used by the research team to monitor predators.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Initial field trials looked promising. Some people kicked them or bumped into them with their vehicles but no one removed the posts. So we set up several 25km transects of camera trap security posts in the forest in the Northern NSW Tablelands and Coffs Coast.</p>
<h2>A determined attack</h2>
<p>But within a few days of deployment, one post at Coffs Harbour was pulled out of the ground by a large machine, leaving a large chasm in the ground. </p>
<p>Within a week, 11 more had been put through every conceivable extraction method – all failed. So the thieves took to our posts with battery-powered grinders and we were 12 camera traps short.</p>
<p>Not to be outsmarted, we replaced the posts with Mad Max-style, heavily reinforced security posts. We even mixed cheap nylon rope into the posts’ concrete slurry, hoping the plastic might melt and jam the grinding discs.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161510/original/image-20170320-8887-su0a4g.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In a newer security post version, nicknamed Mad Max, steel rods have been welded across the weak spots and grinder jamming rubber has been added.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>While this seemed to have worked, the thieves recently found another weak spot. They located some of the old-style posts that had never been damaged, ground off sections of metal, inserted pneumatic jacks into the gaps and levered the posts apart like a can opener before extracting our code-locked camera traps along with our irreplaceable data.</p>
<p>In desperation, we have had to resort to placing camera traps in suboptimal sites to avoid areas we believe to be high-risk. Begrudgingly, guarding against theft has become an important factor in our experimental design.</p>
<h2>Why steal the cameras?</h2>
<p>We are often asked, why do people steal our camera traps? It’s likely that at some sites we have intercepted some illegal activity and the criminals don’t want to be identified.</p>
<p>In one case we know we recorded some illegal dumping, so the perpetrator removed the evidence by pulling up the whole post. </p>
<p>But overall it’s puzzling why people go to such lengths to steal devices that we clearly advertise as code-locked and therefore unusable. What is it that drives people to theft and vandalism? </p>
<p>Most of the time I think these people just don’t think about their actions and steal because they can, with no intellectual evaluation of the activity. </p>
<p>Sadly for practitioners, these mindless acts of transgression have dire consequences. Not only do we lose valuable and difficult-to-obtain financial resources, but we also lose irreplaceable data on memory cards.</p>
<h2>We need a solution</h2>
<p>We have considered numerous solutions to these attacks, such as putting out nicely worded signs explaining our study, advising the community about our work in the media so they don’t feel threatened, training camera traps on other camera traps to try to catch the thieves, and even gluing leaves and bark on the camera traps as a form of camouflage. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161511/original/image-20170320-8871-1oid7i6.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Battery pack grinders made easy work of this security post, the thieves cut through the lock shield, then cut open the padlock and stole the camera trap.</span>
<span class="attribution"><span class="source">Paul Meek</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>We are testing tiny tracking systems but the technology is not suitable yet. We have tried it all and been thwarted. </p>
<p>We have now resorted to <a href="https://www.surveymonkey.com/r/BB75PX5">surveying our colleagues</a> in the hope of gathering further evidence of the significance of this threat to science.</p>
<p>More than 300 camera trap practitioners have shared their economic and data losses, along with their ideas, with our team so far. </p>
<p>It is our hope that we can generate enough information on the effects of this degenerate behaviour to find technological solutions that will ultimately allow us to continue contributing to global science, without fear of losing our equipment and data.</p><img src="https://counter.theconversation.com/content/73855/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Paul D Meek does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
One of the problems with using automatic cameras to track wildlife is that people keep stealing them. And they go to great efforts to do so. But why?
Paul D Meek, Adjunct Lecturer in School of Environmental and Rural Science, University of New England
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/52562
2016-02-10T11:07:11Z
2016-02-10T11:07:11Z
Exposed to a deluge of digital photos, we’re feeling the psychological effects of image overload
<p><a href="http://www.pewinternet.org/2015/04/09/teens-social-media-technology-2015/">Twenty-four percent of U.S. teens</a> say they’re online “almost constantly.” Now much of that time, it seems, is spent incessantly compiling and navigating vast collections and streams of images. </p>
<p>In a 2014 survey, the photo sharing app Instagram <a href="http://blogs.wsj.com/digits/2015/10/16/survey-finds-teens-prefer-instagram-snapchat-among-social-networks/">supplanted Twitter</a> as the social media platform considered “most important” by U.S. teens. </p>
<p>These results stayed the same for 2015, confirming just how crucial image sharing and consumption have become to young people’s everyday online experiences. Not surprisingly, Facebook and Twitter have since <a href="http://www.usatoday.com/story/tech/2013/03/07/facebook-news-feed/1970495/">become</a> more <a href="https://blog.twitter.com/2013/picture-this-more-visual-tweets">image-driven</a>. And Snapchat – which enables users to create and share ephemeral photographs and short videos – is <a href="http://www.ibtimes.com/snapchat-outranks-instagram-fastest-growing-social-network-american-millennials-2220089">one of the fastest-growing</a> social networks.</p>
<p>Indeed, our relationship with photographs is rapidly changing. As we snap, store and communicate with thousands of images on our phones and computers, a number of researchers and theorists are already beginning to point to some of the unintended consequences of this “image overload,” which range from heightened anxiety to memory impairment.</p>
<h2>Overwhelmed – and distracted – by images</h2>
<p>In the Rhetoric of Photography course that I’ve taught at the University of Texas at Austin over the past few years, image glut was a constant topic of discussion among my students. </p>
<p>They repeatedly expressed feeling overrun by photographs and addicted to posting images. They even waxed nostalgic about the clunky plastic cameras of their childhoods, wistfully recalling the days of limited exposures and a waiting period before seeing their developed prints. </p>
<p>“Images are produced, commodified, made public and circulated on an unprecedented scale,” sociologist Martin Hand writes in his book <a href="http://www.polity.co.uk/book.asp?ref=9780745647159"><em>Ubiquitous Photography</em></a>. </p>
<p>Image overload hinges on feeling visually saturated – the sense that because there’s so much visual material to see, remembering an individual photograph becomes nearly impossible. </p>
<p>For my students, this feeling was marked at times by general frustration, low-grade anxiety and flat-out fatigue. Image overload also suggests a level of exhaustion with the process of monitoring and creating photo streams – surviving the pressure to digitally document one’s everyday life and to bear witness to others’ ever-growing image banks. </p>
<p>Many accumulate thousands of images on their phones and digital cameras. The daunting task of organizing, altering and deleting these can evoke feelings of dread. Indeed, according to a 2015 report, the average smartphone user has <a href="https://gigaom.com/2015/01/23/personal-photos-videos-user-generated-content-statistics/">630 photos</a> stored on his or her device.</p>
<p>Martin Hand also notes the “degrees of anxiety, concern and fascination” that his own students demonstrated in response to the daunting demands of public image proliferation and upkeep. </p>
<p>“Aside from anxieties about accidental deletion or irrevocable loss,” Hand continues, “people often express concern over the inability to organize, classify or even look at all their digital images in ways that are meaningful for them.”</p>
<p>Meanwhile, Fred Ritchin, the Dean of the School at the International Center of Photography, <a href="http://books.wwnorton.com/books/After-Photography/">argues</a> that the constant stream of visual information contributes to the kind of fragmented focus that former Microsoft executive Linda Stone calls <a href="http://lindastone.net/qa/continuous-partial-attention/">“continuous partial attention.”</a> </p>
<p>In other words, by always being tuned in and responsive to digital technologies, we become less aware of our surroundings. As our attention succumbs to the allure of being someplace else, our concentration suffers.</p>
<h2>Photo-taking memory impairment</h2>
<p><a href="http://www.npr.org/2014/05/22/314592247/overexposed-camera-phones-could-be-washing-out-our-memories">According to psychologist Maryanne Garry</a>, the overabundance of digital images may be detrimental to memory formation. </p>
<p>Garry argues that a constant flood of photographs doesn’t actively inspire remembrance or generate understanding. As Garry explains, narratives are crucial to memory formation. When viewing a barrage of images, unless there’s some sort of timeline, contextualization or intense focus, we’ll fail to place the image within an overarching story – and it becomes that much more difficult to retain the memory of the image. </p>
<p>Meanwhile, through her <a href="http://pss.sagepub.com/content/25/2/396">research</a>, psychologist Linda Henkel has encountered what she describes as the “photo-taking impairment effect” – the idea that photographing may discourage remembering.</p>
<p>In Henkel’s study, students who visited an art museum with cameras in tow remembered fewer of the objects they photographed than those they simply observed. And if they <em>did</em> remember the photographed object, they were less likely to recall specific details. </p>
<p>However, a second study found that if a student took the time to zoom in on an object, their memory was not impaired – an indication that increased attention and cognitive engagement can counteract this effect.</p>
<h2>Snapping photos in the here and now</h2>
<p>During my third semester teaching The Rhetoric of Photography, I created an assignment to allow my students to explore their concerns about image overload. </p>
<p>Students would spend at least a week shooting with a disposable camera before developing their film and writing about the experience. I specifically asked them to comment on film scarcity, the inability to digitally manipulate or review images, the feel of the camera and the delay between shooting and seeing their photographs. </p>
<p>Reflecting on the disposable camera assignment, many students delighted in the deliberately slow pace of the process. </p>
<p>“Without the option to manipulate or review each of these photos, I had to think even further about the size of my frame, lighting orientation and subject proximity to the camera lens,” one student wrote. “Capturing in this way was satisfying and relaxing. Despite the fact that I could not alter or delete exposures, I had the opportunity to breathe and set up the perfect shot.”</p>
<p>Another commented, “While modern technology has given us the comfort of not having to physically move around as much to take a photograph, when you actually do it you feel more in the moment.”</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/110679/original/image-20160208-2617-kwm2wq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">A limited number of frames meant a more thoughtful approach.</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-372180121/stock-photo-twisted-film-from-the-camera-vintage-toning.html?src=6zyCaLmrtBwjh3h04rz18g-5-87">'Film' via www.shutterstock.com</a></span>
</figcaption>
</figure>
<p>Students seemed able to achieve the type of heightened focus that Henkel argues may enhance memories. Many students simply felt liberated. The pressure to alter an image until it was just right for public consumption was lifted. </p>
<p>The stress and anxiety my students routinely referred to speaks to the changing role of images, especially for younger generations. No longer do photographs primarily function as works of art or memorial objects.</p>
<p>Instead, as media studies professor José van Dijck explains in “<a href="http://www.sup.org/books/title/?id=10395">Mediated Memories in the Digital Age</a>”: </p>
<blockquote>
<p>Even though photography may still capitalize on its primary function as a memory tool for documenting a person’s past, we are witnessing a significant shift, especially among the younger generation, toward using it as an instrument for interaction and peer bonding. </p>
</blockquote>
<p>Part of what image overload may well register, then, is the regular pressure to <em>communicate</em> through photographs, which requires a series of ensuing steps beyond simply clicking the shutter: editing, posting, promoting, and responding. </p>
<p>At the end of the assignment, students had roughly 24 pictures to show for their week (fewer than some might post online in a typical day). But they came away with a clearer sense of their own patterns of perception and photographic engagement. They also gained confidence in their capacity to step back (if only slightly) from nonstop image feeds. </p>
<p>With photo streams continuing to proliferate, greater self-awareness can counteract feelings of drowning amidst a flood of images. And by engaging with analog technologies like disposable cameras, we’ll be better equipped to foster a slower, more intentional form of attention that’s crucial to defending our memories and sensations from being washed away.</p>
<p class="fine-print"><em><span>Rebecca Macmillan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Snapping and sharing photographs has never been easier. But being inundated with images can have a host of unintended consequences, from heightened anxiety to impaired memory.
Rebecca Macmillan, Ph.D. Candidate in English, The University of Texas at Austin
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/53525
2016-01-27T12:19:01Z
2016-01-27T12:19:01Z
How your smartphone is changing cinema
<figure><img src="https://images.theconversation.com/files/109216/original/image-20160126-19649-oy5b0s.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Starvecrow</span></span></figcaption></figure><p>Smartphone technologies are increasingly playing a major part in film production, distribution and reception. This month sees the launch of what is being <a href="https://www.youtube.com/watch?v=3mRZB9ufahQ">billed</a> as the “<a href="http://www.starvecrow.com/">world’s first selfie movie</a>”. And next, a <a href="http://www.shield5.com/">series</a> is to air on Instagram. Last year, Tangerine became the first film shot on an iPhone to feature at the <a href="https://theconversation.com/tangerine-the-film-that-takes-trans-issues-mainstream-on-an-iphone-50427">Sundance film festival</a>. </p>
<p>The first known film shot entirely on an iPhone was <a href="http://www.imdb.com/title/tt1817229/">Night Fishing</a> (2011). The director attached a 35mm lens to the iPhone’s camera in order to achieve a cinematic look. Night Fishing draws on the framing and grammar of traditional film, eschewing the characteristics traditionally associated with portable recording such as unstable imagery, shaky camera moves, distorted audio, and sickness-invoking motion.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/109217/original/image-20160126-19651-1gnjy4l.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The iPhone mounted onset of Night Fishing, 2011.</span>
<span class="attribution"><span class="source">Image courtesy of Moho Film</span></span>
</figcaption>
</figure>
<p>More recently, <a href="https://theconversation.com/tangerine-the-film-that-takes-trans-issues-mainstream-on-an-iphone-50427">Tangerine</a> presented a blend of traditional codes and conventions associated with cinematic storytelling (such as cross-cutting and using an <a href="https://vimeo.com/77109842">anamorphic adapter</a> to achieve a wide screen) with newer mobile-specific features (such as continual takes, long tracking shots, and hand-held fluid camera work). The result is an absorbing on-screen intimacy with the characters, and a unique screen aesthetic – a hybrid of old and new methods of cinematic storytelling.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/_YJxN8hoQbQ?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>New modes of viewing</h2>
<p><a href="http://ragethemovie.com">Rage</a> (2009) was the first feature film to be designed for mobile phone viewing, and one which embedded the mobile phone symbiotically into the processes of production, distribution and consumption. Although the film itself was shot using a conventional video camera held by the director, it clearly considers the mobile phone in its creation: each of the protagonists addresses a fictional camera operator who is filming each of their private exchanges using his mobile phone. Rage was distributed simultaneously as a theatrical release and as a downloadable film via <a href="http://www.babelgum.com/">Babelgum</a> (for free) to be watched on a mobile phone.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/109219/original/image-20160126-19660-np4qte.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The Silver Goat premiere, 2012.</span>
<span class="attribution"><span class="source">© Sam Pearce</span></span>
</figcaption>
</figure>
<p>The launch of the iPad in April 2010 opened up further possibilities for cinematic-style storytelling. <a href="http://www.imdb.com/title/tt1872206/">The Silver Goat</a> (2012) was the first feature film to be created exclusively for the iPad, the first to be released as an app in the UK and several other countries, and the first in the world to have an iPad-only premiere. This took place on a London Routemaster bus which traversed many of the film’s locations throughout the city while the audience members watched the film on their individual iPads.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=570&fit=crop&dpr=1 600w, https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=570&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=570&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=716&fit=crop&dpr=1 754w, https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=716&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/109218/original/image-20160126-20387-yso5or.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=716&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">APP, 2013.</span>
<span class="attribution"><span class="source">© Raymond van der Bas</span></span>
</figcaption>
</figure>
<p>Then there’s the film <a href="http://www.imdb.com/title/tt2536436/?ref_=fn_al_tt_1">APP</a> (2013), for which audiences were required to download the accompanying app – Iris – prior to entering the cinema, and were then encouraged to interact with it in the auditorium. A horror film in which an app takes over the main protagonist’s phone (and the audience’s phones), it alludes to the consequences of our new reliance on smartphone devices and their subversion of our privacy. </p>
<p>APP exemplifies how new mobile cinema forms – through choice of story, subject matter and style – can explore the impact of computer mediated communications on our everyday life. This is also a recent topic of documentary cinema in Werner Herzog’s soon to be released <a href="https://www.youtube.com/watch?v=TXyjgeM6CEQ">Lo and Behold: Reveries of the Connected World</a>.</p>
<h2>New ways of filmmaking</h2>
<p>The consequences of mobile technologies on our everyday lives are most explicitly explored in the newly released <a href="http://www.starvecrow.com/">#Starvecrow</a>, the <a href="https://www.youtube.com/watch?v=3mRZB9ufahQ">“world’s first selfie movie”</a>. The film is a blend of improvised footage, shot entirely on the actors’ mobile devices, with characters turning their cameras on themselves and each other. Blurring reality with fiction, the improvised material is cut with found footage from the actors’ own mobile phone film libraries and personal home video archives, culminating in over 70 hours of footage.</p>
<p>This material was then mined and assembled, and further semi-scripted scenes were shot to create multi-streamed narratives in an 85-minute feature film. This style, coupled with the challenging themes of the film, makes for an uncomfortable viewing experience, and is an unapologetic social comment on the darker side of the mass-uptake of new technologies – the pervasiveness of self documentation, self-surveillance, narcissism and social voyeurism.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/uV4iBy3MMu8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Today, the ubiquity of the smartphone means that a generation’s behaviour is being recorded and made publicly available on social media for future audiences. Through these social media channels, lives are now characterised, shaped and sometimes ruined by naive behaviour and past misdemeanours – the implications of which we, as a society, are yet to fully comprehend.</p>
<p>As with all emergent media forms, the content and themes reflect and exemplify the tools of their making – ultimately creating new ways of storytelling, new modes of production and new types of audience engagement. As such, these pioneering and visionary examples of smartphone films will undoubtedly take their place as significant innovations in the history of cinema.</p>
<p class="fine-print"><em><span>Sarah Atkinson receives funding from the Arts and Humanities Research Council (AHRC). </span></em></p>
This year has already seen the first selfie movie and the first series to air on Instagram – mobile phones are increasingly playing a major role in the film world.
Sarah Atkinson, Senior Lecturer in Digital Cultures, King's College London
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/52906
2016-01-12T19:33:47Z
2016-01-12T19:33:47Z
How to find a meteorite that’s fallen to Earth
<figure><img src="https://images.theconversation.com/files/107625/original/image-20160108-3334-hpekae.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The November 27 fireball as photographed by the Desert Fireball Network observatory at William Creek, South Australia.</span> <span class="attribution"><span class="source">Desert Fireball Network</span>, <span class="license">Author provided</span></span></figcaption></figure><p>A bright fireball lit up the night sky around <a href="http://www.environment.sa.gov.au/parks/Find_a_Park/Browse_by_region/Flinders_Ranges_and_Outback/Kati_Thanda-Lake_Eyre_National_Park">Kati Thanda</a> (Lake Eyre South) in South Australia on November 27, 2015.</p>
<p>But how to find the impact site of that meteorite? And how can we know where in the solar system the object came from?</p>
<p>Thankfully, a new meteorite tracking system we’ve installed in Australia has enabled us to answer these questions, helping us better understand the history and composition of our solar system.</p>
<p>Meteorites are the oldest rocks in existence. They contain a unique physical record of the formation and evolution of the solar system, and the processes that led to terrestrial planets.</p>
<p>They sample hundreds of different heavenly bodies, a compositional diversity that spans the entire inner solar system.</p>
<p>But the most basic piece of data – context – is absent. In almost all cases, meteorite researchers have no idea where their samples came from.</p>
<p>What they need are orbits and the ability to track meteorites back to their place of origin in the solar system. The goal of the <a href="http://fireballsinthesky.com.au/fact-sheets/what-is-the-desert-fireball-network/">Desert Fireball Network</a> is to provide that data.</p>
<h2>A network of ‘eyes’</h2>
<p>The project started in 2012, and since then we’ve installed a network of 32 automated observatories in remote areas of Australia. They are capable of operating for 12 months without maintenance, storing all imagery collected over that period.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=527&fit=crop&dpr=1 600w, https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=527&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=527&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=663&fit=crop&dpr=1 754w, https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=663&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/107711/original/image-20160111-16074-lytnz3.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=663&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The locations of some of the automated camera stations.</span>
<span class="attribution"><a class="source" href="http://fireballsinthesky.com.au/maps/dfn-cameras-map/">Desert Fireball Network (clickable map available)</a>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>Although they are high-resolution intelligent imaging systems, they cost around A$5,000 each – only a fraction of the cost of previous systems. We’ve completely automated data reduction, so we can potentially scale the system up to arbitrary size without needing hordes of poor PhD students doing manual labour.</p>
<p>And members of the public can contribute by sending in their own reports via a smartphone app that we’ve developed called <a href="http://fireballsinthesky.com.au/download-app/">Fireballs in the Sky</a>.</p>
<p>Trying to track an object moving at many kilometres a second, from the edge of the Earth’s atmosphere to the surface, isn’t easy. You have to account for everything from minor distortions in the camera lenses, to the effect of winds blowing the object off course when the light has gone out. </p>
<p>We would only know that it worked when we found a rock on the ground.</p>
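The core geometric step can be illustrated with a toy sketch. This is my own minimal example, not the Desert Fireball Network's actual pipeline, and the station layout below is invented: it assumes each observatory's imagery has already been reduced to a position and a line-of-sight direction in a shared coordinate frame, then estimates the fireball's position as the midpoint of the shortest segment joining the two sight lines.

```python
def triangulate(p1, d1, p2, d2):
    """Estimate a fireball's 3D position from two observatories.

    p1, p2: station positions; d1, d2: direction vectors toward the
    fireball (need not be unit length). Returns the midpoint of the
    closest approach of the two (generally skew) sight lines.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # near zero => sight lines almost parallel
    t1 = (b * e - c * d) / denom   # parameter of closest point on line 1
    t2 = (a * e - b * d) / denom   # parameter of closest point on line 2
    q1 = tuple(p + t1 * x for p, x in zip(p1, d1))
    q2 = tuple(p + t2 * x for p, x in zip(p2, d2))
    return tuple((u + v) / 2 for u, v in zip(q1, q2))

# Two hypothetical stations 100km apart (coordinates in km) both sight a
# point 18km up, midway between them:
pos = triangulate((0, 0, 0), (50, 0, 18), (100, 0, 0), (-50, 0, 18))
# pos is (50.0, 0.0, 18.0)
```

In practice the network fits a whole trajectory to many timestamped observations from several cameras, then models the wind-blown dark flight after the fireball extinguishes – the intersection of sight lines is only the starting point.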
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/107859/original/image-20160111-6964-ub4mzd.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">One of the automated cameras keeping watch on the sky.</span>
<span class="attribution"><span class="source">Desert Fireball Network, Curtin University</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>A green flash in the sky</h2>
<p>When that fireball lit up the skies above South Australia in November, it was imaged by five Desert Fireball Network automatic observatories. The stations sent alerts to our server in Perth, attaching thumbnails of the fireball image.</p>
<p>With data from just a couple of cameras, we could tell pretty quickly that we had a meteorite on the ground. First, we had to get out to South Australia to pick up additional data from cameras that weren’t online, so that we could precisely triangulate the fireball. </p>
<p>We took a light aircraft flight from William Creek, which showed us that there was a feature on the surface that might be where the rock plunged into the mud. Now we had to get out on the lake.</p>
<p>Some of our team set to work pulling together all the data. The more accurately we could pinpoint the fall position, the easier any search would be. Their analysis showed that the object came in at a very steep angle, with a velocity of 50,000km/h (nearly 14km per second), and punched down low in the atmosphere, still visible as a fireball at 18km altitude.</p>
<p>When it entered the atmosphere, it was about 80kg. By the end of the fireball it had most likely been whittled down to between 2kg and 6kg.</p>
<p>Alongside the effort to work all this out, we were putting together logistics for the trip. We knew we had to get there quickly. There had already been rain. Much more of it and any trace of the rock might be wiped away.</p>
<p>In addition, Kati Thanda has spiritual significance for the Arabana people. We would need their permission before we could go out on the lake. But the Arabana understood the urgency, and gave consent almost immediately. The Arabana guides, Dean Stuart and Dave Strangway, who came with us on the trip were a huge help.</p>
<h2>The search is on</h2>
<p>We got to the lake shore on December 29. But the lake doesn’t have a firm surface; it’s thick mud. We had to pick our way out to the fall site – almost at the centre of the lake – trying to find a route that would support a quad bike. Eventually, we found a way in.</p>
<p>The next day we got to the site and searched the area, but didn’t find any trace of the feature that we’d seen a couple of weeks before from the air. Time was running out. Rain was coming in. We figured we might have just one more day left.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=384&fit=crop&dpr=1 600w, https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=384&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=384&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=483&fit=crop&dpr=1 754w, https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=483&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/107626/original/image-20160108-3323-kn74ti.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=483&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Professor Phil Bland and PhD student Robert Howie digging the meteorite out of the mud in the middle of Kati Thanda (Lake Eyre) South.</span>
<span class="attribution"><span class="source">Jonathan Paxman, Desert Fireball Network</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>So we decided to double down: one of our team would fly over the site, while two of us would search on the ground. If they saw anything from the air they would radio, circle the spot, and we could check it immediately.</p>
<p>It was overcast and drizzling as we headed out to the shore, but heavy rain held off long enough for us to get to the fall site. For an hour, the plane just circled.</p>
<p>Then we got a call that they’d seen it. We ran to the spot, and found the last remnant of the feature that our friend had seen a couple of weeks before. The meteorite had punched a deep hole in the mud.</p>
<p>Digging down through that pipe, my fingers eventually touched a rock. We’d found our meteorite. It weighs 1.6kg, a bit lighter than we’d expected, and it’s probably an ordinary <a href="http://dawn.jpl.nasa.gov/meteorite/explore_meteorites_chondrites.asp">chondrite</a>, the most common type of meteorite. But we need to do some analyses to tell for sure.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/107717/original/image-20160111-6972-5uq6tk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The 1.6kg meteorite close up.</span>
<span class="attribution"><span class="source">Desert Fireball Network, Curtin University</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<h2>An unexpected bonus</h2>
<p>We didn’t know it when we built the network, but it turns out it can do a lot more than we ever expected. We can track satellites, space debris and rocket launches. We’ve even tested systems that will let us do fundamental astronomy. And, with a minor upgrade, we’ll have a facility that can spot supernovae and optical counterparts to gamma ray bursts.</p>
<p>But it’s the potential for planetary research that still gets us excited. Already, we’ve seen more fireballs than had ever been recorded before, giving us a unique window on what’s hitting the Earth.</p>
<p>As we recover more rocks, we will gradually build a geological map of the inner solar system. If we can link a meteorite to an asteroid, then we essentially have a sample-return mission to near-Earth asteroids, without the need for spacecraft. </p>
<p>This first rock we’ve recovered is just the start. In itself, it’s a research gold mine. But it also proves that our system works, so there should be many more.</p>
<p class="fine-print"><em><span>Phil Bland receives funding from the Australian Research Council, via their Australian Laureate Fellowship scheme. </span></em></p>
It’s no easy task to find a meteorite that’s just been seen flashing across the sky. But it helps if you have an automatic network of “eyes” on the night sky.
Phil Bland, ARC Laureate Fellow, Curtin University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/47286
2015-10-16T03:54:36Z
2015-10-16T03:54:36Z
South Africa mulls body cameras to improve police accountability, safety
<figure><img src="https://images.theconversation.com/files/98540/original/image-20151015-30734-ebhozy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A Colorado Springs officer with a body-worn camera. There is growing support to introduce the technology in South Africa.
</span> <span class="attribution"><span class="source">Reuters/Rick Wilking</span></span></figcaption></figure><p>The argument for the use of police body cameras is gaining momentum in South Africa, amid growing demand for greater police accountability, especially in the wake of the <a href="http://www.sahistory.org.za/article/marikana-massacre-16-august-2012">Marikana massacre</a>. </p>
<p>The cameras are typically worn on the chests of police officers. Their use is becoming <a href="https://www.opendemocracy.net/robert-muggah/cop-cams-go-global">common globally</a>. </p>
<p>The chairperson of parliament’s portfolio committee on policing, <a href="http://www.parliament.gov.za/live/content.php?Item_ID=215&CommitteeID=99">Francois Beukman</a>, recently asked for <a href="http://www.sabc.co.za/news/a/77a0510049aaa037bbe6bba84320b537/SAPS-asked-to-use-more-technology-in-combating-police-killings-20153008">serious discussions</a> on introducing body cameras in the <a href="http://www.saps.gov.za/">South African Police Service</a>. </p>
<p>The police say they are discussing the use of <a href="https://pmg.org.za/committee-meeting/21400/">technology</a>, including body cameras, to improve communications and for the <a href="http://www.sabc.co.za/news/a/77a0510049aaa037bbe6bba84320b537/SAPS-asked-to-use-more-technology-in-combating-police-killings-20153008">safety of officers</a>. South Africa has a high rate of <a href="https://www.issafrica.org/iss-today/iss-today-how-to-stop-police-brutality-and-the-killing-of-police-officers-in-south-africa">police murders</a>.</p>
<h2>The case for police-worn body cameras</h2>
<p>A recent <a href="https://www.ojpdiagnosticcenter.org/sites/default/files/spotlight/download/Police%20Officer%20Body-Worn%20Cameras.pdf">study</a> involving the random assignment of body cameras to half of the 54 patrol officers in Rialto, California, showed:</p>
<blockquote>
<p>… shifts without cameras experienced twice as many incidents of use of force as shifts with cameras.</p>
</blockquote>
<p>and</p>
<blockquote>
<p>… the rate of use of force incidents per 1000 contacts was reduced by 2.5 times.</p>
</blockquote>
<p>In a recent examination of allegations of police inefficiency in <a href="http://www.khayelitshacommission.org.za/images/towards_khaye_docs/Khayelitsha_Commission_Report_WEB_FULL_TEXT_C.pdf">Khayelitsha</a>, a township for black people in Cape Town, criminologist <a href="http://www.criminology.uct.ac.za/dr-andrew-faull">Andrew Faull</a> suggested body cameras could improve the ability of police management to monitor officers’ <a href="http://www.khayelitshacommission.org.za/bundles/bundle-twelve/category/266-1-expert-reports.html?start=20">interactions</a> with civilians. </p>
<p>In March 2015, the <a href="http://www.apcof.org/home/">African Policing Civilian Oversight Forum</a> produced a <a href="http://igarape.org.br/wp-content/uploads/2015/02/AE-14_SMART-POLICING1.pdf">comprehensive report</a> on smart policing technologies. The report highlighted the need for body cameras on local police.</p>
<p>It also cited the Igarapé Institute’s <a href="http://www.igarape.org.br/en/smart-policing/">Smart Policing Project</a>. The project consists of an app for smartphones that tracks video, audio and GPS coordinates passively and in real time. It aims to improve police accountability and strengthen public safety in low- and middle-income settings in Brazil, Kenya and South Africa. </p>
<p>South Africa ran a one-month pilot scheme under the Smart Policing Project in October 2014. The <a href="http://www.saps.gov.za/">South African Police Service</a> and the Cape Town municipal police declined to participate because the technology was <a href="http://igarape.org.br/wp-content/uploads/2015/02/AE-14_SMART-POLICING1.pdf">not stable</a>. According to Gideon Morris, the provincial secretary for police, the focus of the pilot was:</p>
<blockquote>
<p>… to get the technology stable enough and to try and expand the use of it once they know that the prototype is working fully. Once this had been achieved the discussion with SAPS will be resumed.</p>
</blockquote>
<p>Nevertheless, traffic officers from the National Department of Transport who took part in the pilot were not averse to wearing body cameras. They responded positively, saying their actions were questioned less by the public who knew they were being recorded.</p>
<h2>Police brutality a rising problem</h2>
<p>Incidents of <a href="http://www.csvr.org.za/index.php/media-articles/latest-csvr-in-the-media/2494-why-sa-cops-are-so-brutal.html">police brutality</a> are an ongoing problem in South Africa and body camera technology may help reduce them. Among the most recent high-profile cases:</p>
<ul>
<li><p>In 2013, cellphone images showed <a href="http://www.news24.com/Tags/People/mido_macia">Mido Macia</a>, a Mozambican national working in South Africa as a taxi driver, handcuffed and being dragged behind a <a href="https://www.google.co.za/search?q=Mido+Macia&rlz=1C1CHWA_enZA634ZA634&espv=2&biw=1366&bih=633&tbm=isch&tbo=u&source=univ&sa=X&ved=0CDQQ7AlqFQoTCO3IxeqIxMgCFYdcFAod6qYLfg">police van</a>. The images were shot by members of the public and were <a href="http://www.timeslive.co.za/local/2015/08/25/Judge-throws-book-at-cops-who-murdered-Mido-Macia">used in court</a>. </p></li>
<li><p>In 2011, the South African Broadcasting Corporation broadcast images of the police killing of protester <a href="http://www.sabc.co.za/news/a/d1a81b804d5cbf7f8f72ffe570eb4ca2/SAHRCundefinedunpacksundefinedreportundefinedonundefinedTatanesundefinedmurder-20120711">Andries Tatane</a>. Despite the evidence, the inability to accurately identify the officers involved resulted in their <a href="http://www.bdlive.co.za/national/2013/03/28/court-acquits-police-officers-in-andries-tatane-case">acquittal</a>. </p></li>
</ul>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/oL-FuBGioHw?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<h2>Use of technology not without risks</h2>
<p>The advantages of body cameras seem obvious. The civilian oversight forum report highlighted the potential benefits of recordings, including for police training. <a href="http://www.bwvsg.com/wp-content/uploads/2013/07/BWV-Scottish-Report.pdf">Research</a> from Scotland supports the evidentiary benefits of body cameras, saying they expedite resolution of cases. </p>
<p>They also provide additional protection for the police from public complaints. That way, they may even reduce the costs to the state from civil claims.</p>
<p>But the use of footage is not without <a href="https://theconversation.com/beware-the-unintended-consequences-of-police-worn-body-cameras-47882">unintended consequences</a>. In Australia, it was found that a victim’s demeanour at a scene often differed greatly from what was presented in the courtroom. Footage depicting a victim’s initial reactions, compared with their post-traumatic demeanour in court, can result in the victim essentially being re-victimised in the process. </p>
<p>The disadvantages of cameras also relate to <a href="http://harvardlawreview.org/2015/04/considering-police-body-cameras/">privacy issues</a> for both the police and the public.</p>
<h2>Legislative changes required</h2>
<p>Among the hurdles to the use of technology in policing is the need for enabling legislation, especially to cover cases where privacy and evidential matters arise. Unfortunately, the law will always remain light years behind technology.</p>
<p>The <a href="http://www.gov.za/sites/www.gov.za/files/a25-02.pdf">Electronic Communications and Transactions Act</a> governs the admissibility and evidential weight of electronic evidence in court. </p>
<p>South African law relating to electronic evidence is, however, hampered by the lack of procedures governing the collection, storage and presentation of electronic evidence for purposes of <a href="https://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2009_1/watney/watney.pdf">criminal proceedings</a>. </p>
<p>The <a href="http://www.justice.gov.za/salrc/media/20141210-DP131.pdf">South African Law Reform Commission</a> has recommended changes to legislation to address the issue.</p>
<h2>Clearing hurdles to progress</h2>
<p>The use of body cameras in South Africa is an inevitable, welcome progression. But for this to happen, both the public and the police must actively embrace the technology for their mutual benefit.</p>
<p>There also needs to be discussion about the extent to which these technological advances can help or hinder academic research on the police. Police are notoriously difficult for researchers to access. Will body cameras give them a legitimate reason to exclude personal access and require researchers to study camera footage instead?</p>
<p class="fine-print"><em><span>Gráinne Perkins does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Police brutality is an ongoing problem in South Africa. Police-worn body cameras may help reduce such incidents by improving accountability. They may also contribute to the safety of officers.
Gráinne Perkins, PhD Student, Centre of Criminology, University of Cape Town
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/47267
2015-09-24T05:38:47Z
2015-09-24T05:38:47Z
Snap: smartphones give dedicated digital cameras a run for their money
<figure><img src="https://images.theconversation.com/files/95768/original/image-20150923-25782-1cq0f1x.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Smartphone cameras do have their uses but can they rival a traditional digital camera?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/ashtonpal/9624479689/">Flickr/AshtonPal</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Apple’s recent offering of <a href="https://theconversation.com/apple-provides-a-technology-spritz-keeping-their-products-fresh-but-familiar-47360">new tech toys</a> includes the latest <a href="http://www.apple.com/au/iphone/compare/">iPhone</a> – available in <a href="http://www.apple.com/pr/library/2015/09/21iPhone-6s-iPhone-6s-Plus-Arrive-on-Friday-September-25.html">stores from Friday</a> – and it boasts some mighty camera power.</p>
<p>Looking at the 12-megapixel (MP) still image size and 4K video in the <a href="http://www.apple.com/au/iphone-6s/">iPhone 6S</a>, you have to wonder whether this is the only camera/video recorder you’ll ever need. In fact, how long will we continue to call these digital pocket-size computers “smartphones”, when the ability to use them as a phone seems less important than their other functions?</p>
<p>But can a smartphone camera outsmart the more traditional digital single-lens reflex (DSLR) or the mirrorless cameras favoured by enthusiasts and professional photographers?</p>
<p>The new iPhone is getting plenty of <a href="http://www.news.com.au/technology/gadgets/first-review-apples-iphone-6s-has-two-killer-features-that-will-change-everything/story-fn6vihic-1227539486975">reviews this week</a>, but let’s take a deeper look at what Apple is offering in the camera department.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/95666/original/image-20150922-16682-x0jdzg.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The leap from 8MP to 12MP is not that great and far from the 24MP of a typical DSLR.</span>
<span class="attribution"><span class="source">The Conversation (Background image: Flickr/Reto Fetz)</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<p>It’s worth examining the megapixel arms race a bit more closely: the jump from the iPhone 5’s 8MP to 12MP is not that great. So before you run off to upgrade your iPhone or Android device, keep the pixel count in mind. </p>
<p>A 12MP image on the new iPhone is not that much bigger, especially when compared to some of the typical DSLRs on the market at the moment. These can take <a href="http://www.digitaltrends.com/photography/best-dslr-cameras/">images up to 24MP</a> and even higher, some even <a href="https://photographylife.com/are-you-ready-for-50-mp-cameras">pushing 50MP</a>.</p>
<p>Those iPhone publicity photographs displayed on large billboards are a bit of a stretch. We view billboards from great distances, and they are generally printed at 50 to 150 DPI. Much of the credit belongs to the printer and how it handles the data, rather than to the data the iPhone sends it.</p>
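<p>The arithmetic behind the billboard point is simple enough to sketch. Printable size is just pixel dimensions divided by DPI; the pixel dimensions below are assumed, typical values for a 12MP phone sensor and a 24MP DSLR, used purely for illustration:</p>

```python
# Rough print-size arithmetic: printable dimensions at a given DPI
# are the pixel dimensions divided by dots per inch.

def print_size_inches(px_w, px_h, dpi):
    """Return (width, height) in inches for an image printed at `dpi`."""
    return px_w / dpi, px_h / dpi

# Assumed sensor outputs (approximate, for illustration only):
phone_12mp = (4032, 3024)   # ~12.2MP
dslr_24mp  = (6000, 4000)   # ~24MP

for name, (w, h) in [("12MP phone", phone_12mp), ("24MP DSLR", dslr_24mp)]:
    for dpi in (300, 150, 50):          # gallery print vs billboard
        pw, ph = print_size_inches(w, h, dpi)
        print(f"{name} at {dpi} DPI: {pw:.1f} x {ph:.1f} inches")
```

<p>At a gallery-quality 300 DPI, the 12MP image prints at only about 13 inches wide; it is the billboard’s 50 DPI and long viewing distance that let it stretch past 80 inches.</p>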
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/95769/original/image-20150923-25794-y6z9v3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Shot on an iPhone – the billboard campaign by Apple to showcase photos taken on iPhones.</span>
<span class="attribution"><span class="source">The Conversation</span>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<p>If we were to put a two-metre-wide photographic quality print made from a good-quality DSLR camera, such as a Nikon or a Canon, next to a print made from an 8MP iPhone in a gallery setting, the quality of the DSLR’s image would be obvious, especially when you can walk up to the print and look into the detail.</p>
<p>So if your images need to live as print, as well as on screen, then the DSLR is still the way to go. But if your images are only going to live on a screen of some sort, then maybe you never need to use a DSLR again. </p>
<h2>DSLR vs the smartphone</h2>
<p>Of course there are plenty of things to consider other than just the megapixels, such as image sensor size, zoom and focus options, low light conditions and other functions and options. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/FHUf6yE-Hts?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>This globally networked lens can also expose the less friendly corners of the planet via apps such as <a href="http://www.ibanet.org/Article/Detail.aspx?ArticleUid=11E76B66-D949-4738-9347-E67FBFBB9441">eyeWitness to atrocities</a>, developed by the International Bar Association. </p>
<p>The app permits sound, video and photo recording, locks the data so it can’t be manipulated and sends it to a secure cloud. The data can then be verified and distributed to global media. </p>
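<p>The general technique behind “locking” captured media can be illustrated with a minimal sketch. This is an illustration of content fingerprinting with a cryptographic hash, not the eyeWitness app’s actual implementation: a digest is computed at capture time, so any later modification of the bytes becomes detectable.</p>

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a SHA-256 digest of the captured media: any later
    change to the bytes produces a different digest."""
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical captured data, for illustration.
original = b"video frame data..."
digest_at_capture = fingerprint(original)

# Verification later: recompute and compare against the stored digest.
tampered = original + b"edited"
print(fingerprint(original) == digest_at_capture)   # True: untouched
print(fingerprint(tampered) == digest_at_capture)   # False: altered
```

<p>In practice a system like this would also need to protect the stored digest itself (for example by uploading it immediately to a trusted server), which is where the secure cloud comes in.</p>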
<p>So when it comes to deciding which camera to use, I feel it’s about intent. Most of the time photographers will go for the device that gives them the ability to best craft an image.</p>
<p>Control of light, depth of field, quality of focus, bokeh, focal length, framing and that crucial capturing of the moment in time and space are all second nature to many photographers.</p>
<p>A photographer will also make good use of a smartphone when needed. Benjamin Lowy’s iPhone photos have made it onto <a href="http://www.engadget.com/2012/11/06/time-magazine-cover-shot-with-iphone/">the cover of Time magazine</a>. He also found the iPhone made him less of a target in <a href="http://www.benlowy.com/editorial/libya--revolution/libya--the-fall-of-tripoli/">war zones such as Libya</a>. Using an iPhone, he could move quickly and blend with the crowd unencumbered by camera bags. </p>
<p>“It’s a fast little camera and I do like that on a tough assignment,” he said. Although he added that the “pros will push me aside”, assuming he is a tourist or an amateur.</p>
<h2>The choice is yours</h2>
<p>One thing new smartphones have yet to address is battery life. Many years ago traditional cameras had no batteries at all, so they never suffered a flat one. Today’s digital cameras do need batteries, but theirs can last many weeks, far exceeding the day or so a smartphone manages.</p>
<p>There was a time when cameras were purchases for life; they were handed down from generation to generation. No need for a megapixel count, they were resolution-independent. They just relied on good film and quality lenses. </p>
<p>Today’s smartphones and digital cameras will never be handed down to a new generation – unless that generation is one of digital archaeologists. </p>
<p>In the end, no matter what device you choose to use, it’s who’s behind the technology that counts. And if you wait around long enough we may see the best apps of the smartphone and the best qualities of a DSLR converge and then what will we call it?</p>
<p class="fine-print"><em><span>Phillip George does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
The latest iPhone from Apple is out Friday and it offers a bigger and better camera than previous models. But will smartphone cameras ever replace the traditional digital cameras?
Phillip George, Associate Professor, UNSW Art & Design , UNSW Sydney
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/34837
2014-12-18T10:34:57Z
2014-12-18T10:34:57Z
Welcome to Politics4K
<p>While much of the 2014 midterm election analysis centered on the Republican takeover of the Senate, the pundits may have overlooked an important development: the end of a time when politicians looked a little less lifelike, even to viewers in HD.</p>
<p>Thanks to bigger and better processors inside journalists’ cameras, and, especially, a fourfold increase in resolution on viewers’ digital displays, the next era in political campaigning – let’s call it “Politics4K” – has arrived. </p>
<p>Earlier this fall, New York Times technology columnist Molly Wood <a href="http://www.nytimes.com/2014/10/09/technology/personaltech/sharper-image-4k-tv-gimmick-worth-having.html?_r=0">explained 4K</a>:</p>
<blockquote>
<p>From a technical perspective, the term 4K refers to displays with twice the vertical resolution and twice the horizontal resolution of high-definition TVs. The UHD designation combines the higher pixel count of 4K with improvements to on-screen colors that make the on-screen picture brighter and more realistic.</p>
</blockquote>
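<p>The arithmetic behind that definition is worth spelling out: doubling both the horizontal and vertical resolution of 1080p HD quadruples the pixel count, which is the “fourfold increase in resolution” at issue here:</p>

```python
# UHD "4K" doubles both dimensions of 1080p HD,
# so the total pixel count goes up fourfold.

hd  = (1920, 1080)   # full HD
uhd = (3840, 2160)   # consumer UHD "4K"

hd_pixels  = hd[0] * hd[1]
uhd_pixels = uhd[0] * uhd[1]

print(hd_pixels)                # 2073600
print(uhd_pixels)               # 8294400
print(uhd_pixels // hd_pixels)  # 4
```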
<p>So by the 2016 presidential election, voters will be able to screen their candidates in unprecedented clarity and color. With nothing less than the White House in the balance, campaigns of all political stripes now need to rethink their campaign optics – or watch their rivals come shining through.</p>
<h2>A milestone moment in campaign optics</h2>
<p>Presidential campaign adviser William P. Wilson – who died last week – may have been the first to understand the importance of campaign optics; <a href="http://www.nytimes.com/2014/12/12/us/william-p-wilson-kennedys-tv-aide-for-historic-1960-debate-is-dead-at-86.html">according to his obituary</a>:</p>
<blockquote>
<p>In 1960 little was understood about the potential reach of television in American politics. Still, though he was just 32 at the time, Mr. Wilson was as experienced with the medium as anyone in the field. He already had the distinction of being the first television consultant ever hired by a presidential campaign.</p>
</blockquote>
<p>In his classic 1979 media study “The Powers That Be,” David Halberstam explains how Wilson – minutes before Senator John F. Kennedy’s first debate against sitting Vice President Richard M. Nixon – convinced a reluctant Kennedy that his face needed some touching-up.</p>
<blockquote>
<p>…Wilson insisted he needed some kind of makeup, mostly to close the pores and keep the shine down, and Kennedy asked if Wilson could do it, and Wilson, who knew the neighborhood, ran two blocks to a pharmacy, bought Max Factor Creme Puff, and made Kennedy up very lightly… On such decisions – Max Factor Creme Puff instead of Shavestick – rode the future leadership of the United States and the free world.</p>
</blockquote>
<p>The Kennedy-Nixon debates in 1960 launched presidential politics into the television age; the medium became a game-changer, even though network broadcasts were black and white, analog and low-resolution by contemporary standards.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/QazmVHAO0os?wmode=transparent&start=48" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">The first televised Kennedy-Nixon debate in 1960 was a milestone moment in campaign optics.</span></figcaption>
</figure>
<h2>Image control key for pols</h2>
<p>As the 20th century progressed, camera and television technology improved significantly – and became increasingly unforgiving.</p>
<p>Trust me: as a documentary filmmaker who has worked on a number of political films, I’ve come to realize that nothing correlates with campaign control more than optics. </p>
<p>The staff of former President Gerald Ford expressed displeasure with the close-up I framed up before a two-camera interview in his library studio. After we wrapped, his staff apologized; somehow, the videotape of his preferred wide-shot (think “White House Briefing”) had been perfectly recorded, but my tight-shot (think “60 Minutes”) suffered “technical difficulties” throughout.</p>
<p>In the middle of another interview – this one with sitting Vice President Al Gore – a staffer looking over my shoulder sucker-punched me when I quietly asked my cinematographer to “push in” for an extreme close-up.</p>
<p>Nothing like a shot to the kidney to prove how politics remains a perpetual exercise in control. </p>
<p>During the final year of the Clinton Administration, High-Definition television was in its infancy. After the White House granted me the first access to the Oval Office by a documentary filmmaker since the Kennedy administration, I was awarded a grant to produce my project in HD.</p>
<p>When I showed President Clinton’s special assistant some of our footage on (what was then) Washington’s only HD display, her jaw dropped: never before had she seen her boss depicted so vividly on screen.</p>
<p>In that instant we both realized the game had changed again; politicians would appear even more life-like on television.</p>
<p>Fifteen years later – as the prospects for another Clinton White House loom – another digital technology has reached new heights.</p>
<h2>Optics influences outcomes</h2>
<p>As of October 2014, the market penetration of Ultra HD television was only 7% of American homes. But due to steadily dropping prices for 4K displays – along with the availability of more 4K media – that number <a href="http://www.nytimes.com/2014/10/09/technology/personaltech/sharper-image-4k-tv-gimmick-worth-having.html?_r=0">is expected to grow exponentially</a> by the next presidential election.</p>
<p>Following Netflix’s lead, Amazon Prime commenced streaming 4K media in December. Election Night 2016 broadcast coverage in 4K should be a foregone conclusion. Furthermore, reasonably priced 4K camcorders are already available to the reporters who will be embedded inside the 2016 primary campaigns.</p>
<p>While journalists wielding bulky cameras could once be corralled, the proliferation of these camcorders will make it impossible for aides to shield their candidates from unflattering, high-resolution shots.</p>
<p>There’s a reason why this makes political operatives anxious. <a href="http://www.politico.com/news/stories/0810/40590.html">Study</a> after <a href="https://www.uni-muenster.de/imperia/md/content/psyifp/aeechterhoff/wintersemester2011-12/vorlesungkommperskonflikt/efranpatterson_effphysappnationelect_canadjbehsc1974.pdf">study</a> has shown that to voters, the candidates’ looks matter – in many cases, more than their party affiliation or policy stances. </p>
<p>In the world of politics, optics reign.</p>
<p>So while the next set of presidential candidates can run, they can’t hide from revealing 4K coverage – under all kinds of conditions, indoors and out, many less-than-flattering.</p>
<p>The likeliest prediction is that the Politics4K era will usher in plenty of unintended political consequences. With an electorate getting younger and more tech-savvy every year, how will politicians manage to maintain a youthful, energetic image?</p>
<p>Will the adage “the camera adds 10 pounds” become “the UltraHD camera adds 20 years” for certain candidates?</p>
<p>And will Politics4K become the great equalizer – or will age, gender and racial differences emerge in sharper contrast?</p>
<p>Too bad William P. Wilson didn’t live to see the day that UltraHD politics could be practiced in earnest. My guess is he’d already be working with the younger, more telegenic candidate, just as he did in 1960.</p>
<p class="fine-print"><em><span>Ted Bogosian does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
While much of the 2014 midterm election analysis centered on the Republican takeover of the Senate, the pundits may have overlooked an important development: the end of a time when politicians looked a…
Ted Bogosian, Instructor and Visiting Filmmaker, Duke University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/20741
2013-12-26T09:58:29Z
2013-12-26T09:58:29Z
Out! Goal! The ball was in! But could Hawk-Eye get it wrong?
<figure><img src="https://images.theconversation.com/files/38390/original/n2yn3c3c-1387797781.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The eye sees what the camera can't.</span> <span class="attribution"><span class="source">Scot Campbell</span></span></figcaption></figure><p>Hawk-Eye is a device used to reconstruct the track of the ball for LBW decisions in cricket and for line calls in tennis. It will be much in evidence during the remaining Ashes tests and is now being used for goal-line decisions in Premier League football. The technology is at its best when officials make a really bad decision. </p>
<p>But there are things you might not know about Hawk-Eye. For instance, it cannot track the ball to the millimetre, even though one might get that impression from some replays; in tennis, shots shown to be touching the line by a hair’s breadth and called in might actually be out, and vice versa.</p>
<p>Few people realised that there was an issue with accuracy until my colleagues and I <a href="http://www.cf.ac.uk/socsi/contactsandpeople/harrycollins/expertise-project/hawk-eye-debate.html">wrote about it in 2008</a>; even top scientists were quite surprised until they thought about it. </p>
<h2>How it works</h2>
<p>Reconstructed-track devices such as Hawk-Eye work by using a number of TV cameras to record the position of the ball in each frame; a computer then reconstructs the path and projects it forward from the last frame. </p>
<p>These devices were first used to aid leg-before-wicket decisions in cricket. The projection-forward principle is the same in tennis: it is unlikely that a camera shutter will be open at the exact moment the ball hits the ground next to the line, so the crucial position has to be estimated from a series of previous positions.</p>
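The projection step described above can be illustrated with a toy calculation. The sketch below fits a quadratic curve through three hypothetical triangulated ball positions and evaluates it one frame beyond the last observation. The timestamps, coordinates and 60 fps frame rate are invented for illustration; this is not Hawk-Eye’s actual algorithm, whose details are proprietary.

```python
# Illustrative sketch only -- not Hawk-Eye's proprietary algorithm.
# Given the ball's position in the last three camera frames, fit a
# quadratic (Lagrange form) through them and project it forward to
# estimate the position at an instant no shutter actually captured.

def quad_extrapolate(ts, vs, t):
    """Evaluate the quadratic through (ts[0],vs[0])..(ts[2],vs[2]) at time t."""
    (t0, t1, t2), (v0, v1, v2) = ts, vs
    l0 = (t - t1) * (t - t2) / ((t0 - t1) * (t0 - t2))
    l1 = (t - t0) * (t - t2) / ((t1 - t0) * (t1 - t2))
    l2 = (t - t0) * (t - t1) / ((t2 - t0) * (t2 - t1))
    return v0 * l0 + v1 * l1 + v2 * l2

# Hypothetical triangulated positions (metres) from a 60 fps camera:
frames = [0.0, 1 / 60, 2 / 60]          # timestamps of the last three frames
x_along_court = [10.00, 10.75, 11.50]   # ball travelling at about 45 m/s
height = [0.60, 0.35, 0.12]             # dropping towards the bounce

t_next = 3 / 60                         # one frame beyond the last observation
print(quad_extrapolate(frames, x_along_court, t_next))  # ~12.25 m
print(quad_extrapolate(frames, height, t_next))         # ~-0.09 m: the bounce
                                                        # fell between frames
```

The negative projected height shows the crux of the problem: the bounce happened between two shutter openings, so its exact location is always an estimate rather than a measurement.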
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=396&fit=crop&dpr=1 600w, https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=396&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=396&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=498&fit=crop&dpr=1 754w, https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=498&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/38394/original/7cqjgmb5-1387798891.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=498&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Coming to a football stadium near you.</span>
<span class="attribution"><span class="source">Shuji Kajiyama/AP</span></span>
</figcaption>
</figure>
<h2>What we uncovered</h2>
<p>From the frame-rate of the cameras and the speed of the ball, a back-of-an-envelope calculation gave the range of possible accuracy, and it turned out to be lower than the replays suggested. So we telephoned the firm to talk about it and hit a wall. As sociologists of science we had spent decades chatting with scientists about this kind of thing, but suddenly we were told this information was private and lawyers were on call. Before we could publish our first paper we had to ask Cardiff University to back us in case we were hauled into court.</p>
<p>Our results were based on the range of possibilities for frame-rate and other technical parameters we could glean from the internet, but detailed data for these devices was, and still is, secret. The International Tennis Federation refuses to release the details of its tests, and the International Cricket Council also keeps its results under wraps. I have tried repeatedly to get the information from them and from the scientists they commissioned to do the testing, but am always met with the claim that the information is commercially sensitive.</p>
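The kind of back-of-an-envelope calculation mentioned above comes down to asking how far the ball travels between successive frames. The speed and frame rates below are illustrative assumptions, not Hawk-Eye’s actual specifications:

```python
# How far does the ball move between two consecutive camera frames?
# Illustrative figures only -- not Hawk-Eye's actual camera specifications.

def gap_between_frames(speed_kmh, fps):
    """Distance (mm) the ball covers between two consecutive frames."""
    speed_ms = speed_kmh / 3.6            # km/h -> m/s
    return speed_ms / fps * 1000          # metres per frame -> millimetres

# A 200 km/h tennis serve filmed at 100 frames per second:
print(round(gap_between_frames(200, 100)))   # 556 mm between frames
# Even at 500 fps the gap is still about 111 mm -- orders of magnitude
# larger than a millimetre, so any position near the bounce must be
# interpolated rather than directly observed.
print(round(gap_between_frames(200, 500)))   # 111 mm
```

Whatever the real frame rates are, the gaps between observations are vastly larger than the claimed millimetre-scale accuracy, which is why the projection step matters so much.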
<h2>Margins of error</h2>
<p>The problem with reconstructed-track devices is that their output is based on estimates. The position of the ball in any one frame is a blob of pixels. The future path of the ball must be extrapolated from at least three frames if the ball is swerving, but if it is moving fast and the bounce point is near the crucial impact point, there may not be three frames. </p>
<p>Even with three frames, projections have errors, and if, as in tennis, the ball distorts on impact, the footprint on which the line call is based is, again, the result of an inexact calculation – and so on. Hawk-Eye itself used to claim an average error of 3.6 millimetres; more recently it has claimed this has been improved to an average of 2.2mm. However, particularly in tennis, the reliance on this technology to provide a definitive call means the margin of error isn’t reflected in the replays, leading most fans to assume it is 100% accurate.</p>
<p>Accuracy, of course, will depend on the speed and angle of the ball and many other factors, which is why these are average figures – and, as with all averages, on occasion the error will be bigger, sometimes much bigger. To know what is going on, one needs details of the tests and the distribution of errors that resulted.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=381&fit=crop&dpr=1 600w, https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=381&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=381&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=479&fit=crop&dpr=1 754w, https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=479&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/38395/original/nntjhhzn-1387798945.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=479&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Or is it?</span>
<span class="attribution"><span class="source">Anja Niedringhaus/AP</span></span>
</figcaption>
</figure>
<h2>Tech and circuses</h2>
<p>Assuming that tennis and football lovers, unlike enthusiasts for, say, the professional wrestling circus, want to see fairness as well as an entertaining spectacle, they ought to know more about how the technology is trying to work out what happened to the ball. </p>
<p>When the ball is really close to the line we should see something like a spinning coin to indicate that the final judgement has a lot of chance in it. The crowd would still get its decision and fun but something closer to the truth would be on display. </p>
<p>More and more, computers are able to simulate what looks like reality and this is dangerous for the future of society. The public needs to learn to question technological claims such as those that have been made for anti-missile weapons systems. In certain sports some spectators think that technology is infallible when it is not.</p>
<p>Paul Hawkins, the founder of the Hawk-Eye company, <a href="http://blogs.wsj.com/numbersguy/sports-make-final-call-on-technology-1292/">recently said</a> our arguments were “typical of people who spent a lot of time in universities rather than on the tennis circuit”. He’s right, and thank goodness for that.</p>
<p class="fine-print"><em><span>Harry Collins does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
Hawk-Eye is a device used to reconstruct the track of the ball for LBW decisions in cricket and for line calls in tennis. It will be much in evidence during the remaining Ashes tests and is now being used…
Harry Collins, Professor of Social Science, Cardiff University
Licensed as Creative Commons – attribution, no derivatives.