<h1>AI scores in the top percentile of creative thinking</h1><figure><img src="https://images.theconversation.com/files/544631/original/file-20230824-19-dofq41.jpg?ixlib=rb-1.1.0&rect=0%2C6%2C4071%2C2986&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Creativity involves generating something new – a product or solution that didn’t previously exist.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/red-apple-on-a-background-of-green-apples-royalty-free-image/536687143?phrase=repeated+objects+with+one+unique+object&adppopup=true">Maestria_diz/iStock via Getty Images</a></span></figcaption></figure><p>Of all the forms of human intellect that one might expect artificial intelligence to emulate, few people would likely place creativity at the top of their list. Creativity is wonderfully mysterious – and frustratingly fleeting. It defines us as human beings – and seemingly defies the cold logic that lies behind the silicon curtain of machines. </p>
<p>Yet, the use of AI for creative endeavors is now growing. </p>
<p>New AI tools like DALL-E and Midjourney are increasingly part of creative production, and some have started <a href="https://www.nytimes.com/2022/09/02/technology/ai-artificial-intelligence-artists.html">to win awards for their creative output</a>. The growing impact is both social and economic – as just one example, the potential of AI to generate new, creative content is a defining flashpoint behind the <a href="https://theconversation.com/what-are-hollywood-actors-and-writers-afraid-of-a-cinema-scholar-explains-how-ai-is-upending-the-movie-and-tv-business-210360">Hollywood writers strike</a>.</p>
<p>And if our recent study into the <a href="https://doi.org/10.1016/j.yjoc.2023.100065">striking originality of AI</a> is any indication, the emergence of AI-based creativity – along with examples of both its promise and peril – is likely just beginning. </p>
<h2>A blend of novelty and utility</h2>
<p>When people are at their most creative, they’re responding to a need, goal or problem by generating something new – a product or solution that didn’t previously exist. </p>
<p>In this sense, creativity is an act of combining existing resources – ideas, materials, knowledge – in a novel way that’s useful or gratifying. Quite often, the result of creative thinking is also surprising, leading to something that the creator did not – and perhaps could not – foresee. </p>
<p>It might involve an invention, an unexpected punchline to a joke or a groundbreaking theory in physics. It might be a unique arrangement of notes, tempo, sounds and lyrics that results in a new song. </p>
<p>So, as a researcher of creative thinking, I immediately noticed something interesting about the content generated by the latest versions of AI, including GPT-4. </p>
<p>When prompted with tasks requiring creative thinking, the novelty and usefulness of GPT-4’s output reminded me of the creative types of ideas submitted by students and colleagues I had worked with as a teacher and entrepreneur. </p>
<p>The ideas were different and surprising, yet relevant and useful. And, when required, quite imaginative. </p>
<p>Consider the following prompt offered to GPT-4: “Suppose all children became giants for one day out of the week. What would happen?” The ideas generated by GPT-4 touched on culture, economics, psychology, politics, interpersonal communication, transportation, recreation and much more – many surprising and unique in terms of the novel connections generated. </p>
<p>This combination of novelty and utility is difficult to pull off, as most scientists, artists, writers, musicians, poets, chefs, founders, engineers and academics can attest. </p>
<p>Yet AI seemed to be doing it – and doing it well.</p>
<h2>Putting AI to the test</h2>
<p>With researchers in creativity and entrepreneurship <a href="https://www.vm.vu.lt/apie/destytojai/2-uncategorised/637-christian-byrge">Christian Byrge</a> and <a href="https://www.umwestern.edu/directory/christian-gilde/">Christian Gilde</a>, I decided to put AI’s creative abilities to the test by having it take the Torrance Tests of Creative Thinking, <a href="https://www.sciencedirect.com/topics/psychology/torrance-test">or TTCT</a>. </p>
<p>The TTCT prompts the test-taker to engage in <a href="https://theconversation.com/how-to-unlock-your-creativity-even-if-you-see-yourself-as-a-conventional-thinker-196198">the kinds of creativity required for real-life tasks</a>: asking questions, being more resourceful or efficient, guessing cause and effect, or improving a product. It might ask a test-taker to suggest ways to improve a children’s toy or imagine the consequences of a hypothetical situation, as the above example demonstrates.</p>
<p>The tests are not designed to measure <a href="https://doi.org/10.1098/rstb.2014.0099">historical creativity</a>, the term some researchers use to describe the transformative brilliance of figures like Mozart and Einstein. Rather, they assess the general creative abilities of individuals, often referred to as <a href="https://doi.org/10.1098/rstb.2014.0099">psychological or personal creativity</a>. </p>
<p>In addition to running the TTCT through GPT-4 eight times, we also administered the test to 24 of our undergraduate students. </p>
<p>All of the results were evaluated by trained reviewers at Scholastic Testing Service, a private testing company that provides scoring for the TTCT. They didn’t know in advance that some of the tests they’d be scoring had been completed by AI. </p>
<p>Since Scholastic Testing Service is a private company, it does not share its prompts with the public. This ensured that GPT-4 would not have been able to scrape the internet for past prompts and their responses. In addition, the company has a database of thousands of tests completed by college students and adults, providing a large, additional control group with which to compare AI scores.</p>
<p>Our results? </p>
<p>GPT-4 scored in the top 1% of test-takers for the originality of its ideas. From our research, we believe this marks one of the first examples of AI meeting or exceeding the human ability for original thinking. </p>
<p>In short, we believe that AI models like GPT-4 are capable of producing ideas that people see as unexpected, novel and unique. Other researchers are arriving at similar conclusions in <a href="https://doi.org/10.48550/arXiv.2303.12003">their research of AI and creativity</a>. </p>
<h2>Yes, creativity can be evaluated</h2>
<p>The emerging creative ability of AI is surprising for a number of reasons. </p>
<p>For one, many outside of the research community continue to believe that creativity <a href="https://www.ted.com/talks/yoel_tawil_why_creativity_has_no_definition">cannot be defined</a>, let alone scored. Yet products of human novelty and ingenuity have been prized – and bought and sold – for thousands of years. And creative work has been defined and scored in fields like psychology since at least the 1950s. </p>
<p><a href="https://www.idsa.org/education-paper/exchanging-the-4ps-of-creativity/">The person, product, process, press model of creativity</a>, which researcher Mel Rhodes introduced in 1961, was an attempt to categorize the myriad ways in which creativity had been understood and evaluated until that point. Since then, the understanding of creativity has only grown. </p>
<p>Still others are surprised that the term “creativity” might be applied to nonhuman entities like computers. On this point, we tend to agree with cognitive scientist Margaret Boden, who has argued that the question of whether the term creativity should be applied to AI is a <a href="https://doi.org/10.1609/aimag.v30i3.2254">philosophical rather than scientific question</a>. </p>
<h2>AI’s founders foresaw its creative abilities</h2>
<p>It’s worth noting that we studied only the output of AI in our research. We didn’t study <a href="https://theconversation.com/chatgpt-dall-e-2-and-the-collapse-of-the-creative-process-196461">its creative process</a>, which is likely very different from human thinking processes, or the environment in which the ideas were generated. And had we defined creativity as requiring a human person, then we would have had to conclude, by definition, that AI cannot possibly be creative. </p>
<p>But regardless of the debate over definitions of creativity and the creative process, the products generated by the latest versions of AI are novel and useful. We believe this satisfies the definition of creativity that is now dominant in the fields of psychology and science.</p>
<p>Furthermore, the creative abilities of AI’s current iterations are not entirely unexpected. </p>
<p>In their now famous proposal for the <a href="https://home.dartmouth.edu/about/artificial-intelligence-ai-coined-dartmouth">1956 Dartmouth Summer Research Project on Artificial Intelligence</a>, the founders of AI highlighted their desire to simulate “every aspect of learning or any other feature of intelligence” – including creativity.</p>
<p>In this same proposal, computer scientist Nathaniel Rochester <a href="http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html">revealed his motivation</a>: “How can I make a machine which will exhibit originality in its solution of problems?” </p>
<p>Apparently, AI’s founders believed that creativity, including the originality of ideas, was among the specific forms of human intelligence that machines could emulate.</p>
<p>To me, the surprising creativity scores of GPT-4 and other AI models highlight a more pressing concern: Within U.S. schools, very few official programs and curricula have been implemented to date that specifically target human creativity and <a href="https://www.ted.com/talks/sir_ken_robinson_do_schools_kill_creativity?language=en">cultivate its development</a>. </p>
<p>In this sense, the creative abilities now realized by AI may provide a “<a href="https://www.space.com/10437-sputnik-moment.html">Sputnik moment</a>” for educators and others interested in furthering human creative abilities, including those who see creativity as an essential condition of individual, social and economic growth.</p>
<p class="fine-print"><em><span>Erik Guzik does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Erik Guzik, Assistant Clinical Professor of Management, University of Montana. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Are people lying more since the rise of social media and smartphones?</h1><figure><img src="https://images.theconversation.com/files/430554/original/file-20211105-16752-1hf2une.jpg?ixlib=rb-1.1.0&rect=0%2C2%2C1595%2C1420&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Some forms of technology seem to facilitate lying more than others.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/lies-concept-royalty-free-image/465104450?adppopup=true">solitude72/iStock via Getty Images</a></span></figcaption></figure><p>Technology has given people more ways to connect, but has it also given them more opportunities to lie?</p>
<p>You might text your friend a white lie to <a href="https://dl.acm.org/doi/10.1145/1518701.1518782">get out of going to dinner</a>, exaggerate your height on a dating profile <a href="https://doi.org/10.1111/j.1460-2466.2011.01619.x">to appear more attractive</a> or invent an excuse to your boss over email to <a href="https://kuscholarworks.ku.edu/bitstream/handle/1808/6098/KJSV11N1A6.pdf?sequence=3&isAllowed=y">save face</a>. </p>
<p>Social psychologists and communication scholars have long wondered not just who lies the most, but where people tend to lie the most – that is, in person or through some other communication medium. </p>
<p>A seminal <a href="https://dl.acm.org/doi/10.1145/985692.985709">2004 study</a> was among the first to investigate the connection between deception rates and technology. Since then, the ways we communicate have shifted – fewer phone calls and more social media messaging, for example – and I wanted to see how well earlier results held up. </p>
<h2>The link between deception and technology</h2>
<p>Back in 2004, communication researcher <a href="https://comm.stanford.edu/faculty-hancock/">Jeff Hancock</a> and his colleagues had 28 students report the number of social interactions they had via face-to-face communication, the phone, instant messaging and email over seven days. Students also reported the number of times they lied in each social interaction.</p>
<p>The results suggested people told the most lies per social interaction on the phone. The fewest were told via email. </p>
<p>The findings aligned with a framework Hancock called the “<a href="https://dl.acm.org/doi/10.1145/985692.985709">feature-based model</a>.” According to this model, specific aspects of a technology – whether people can communicate back and forth seamlessly, whether the messages are fleeting and whether communicators are distant – predict where people tend to lie the most.</p>
<p>In Hancock’s study, the most lies per social interaction occurred via the technology with all of these features: the phone. The fewest occurred on email, where people couldn’t communicate synchronously and the messages were recorded.</p>
<h2>The Hancock study, revisited</h2>
<p>When Hancock conducted his study, <a href="https://www.history.com/this-day-in-history/facebook-launches-mark-zuckerberg">only students at a few select universities</a> could create a Facebook account. The iPhone was in its early stages of development, a highly confidential project nicknamed “<a href="https://www.theverge.com/2017/6/13/15782200/one-device-secret-history-iphone-brian-merchant-book-excerpt">Project Purple</a>.” </p>
<p>What would his results look like nearly 20 years later?</p>
<p><a href="https://academic.oup.com/hcr/advance-article-abstract/doi/10.1093/hcr/hqab019/6423102">In a new study</a>, I recruited a larger group of participants and studied interactions from more forms of technology. A total of 250 people recorded their social interactions and number of interactions with a lie over seven days, across face-to-face communication, social media, the phone, texting, video chat and email.</p>
<p>As in Hancock’s study, people told the most lies per social interaction over media that were synchronous and recordless and when communicators were distant: over the phone or on video chat. They told the fewest lies per social interaction via email. Interestingly, though, the differences across the forms of communication were small. Differences among participants – how much people varied in their lying tendencies – were more predictive of deception rates than differences among media.</p>
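<p>The headline contrast here – that <em>who</em> is lying matters more than <em>where</em> – can be pictured with a toy variance decomposition. Everything below is synthetic and invented purely for illustration; none of the numbers come from the study itself:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy lies-per-interaction rates: 250 people x 6 media
# (face-to-face, social media, phone, texting, video chat, email).
# Person-level effects are given a wide spread and medium-level
# effects a narrow one, mirroring only the qualitative finding.
person_effect = rng.gamma(shape=2.0, scale=0.05, size=(250, 1))
medium_effect = np.array([0.02, 0.02, 0.03, 0.02, 0.03, 0.01])
rates = person_effect + medium_effect + rng.normal(0.0, 0.005, size=(250, 6))

between_person = rates.mean(axis=1).var()  # spread across individuals
between_medium = rates.mean(axis=0).var()  # spread across media

# In this toy setup, individual differences dwarf media differences.
print(between_person > between_medium)
```

<p>In a setup like this, knowing which person is communicating tells you far more about the expected lying rate than knowing which medium they are using.</p>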
<p><iframe id="zcjE1" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/zcjE1/4/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Despite changes in the way people communicate over the past two decades – along with ways the COVID-19 pandemic changed <a href="https://time.com/5835818/socializing-coronavirus-social-distancing/">how people socialize</a> – people seem to lie systematically and in alignment with the feature-based model.</p>
<p>There are several possible explanations for these results, though more work is needed to understand exactly why different media lead to different lying rates. It’s possible that certain media are better <a href="https://link.springer.com/chapter/10.1007/978-3-319-96334-1_31">facilitators of deception</a> than others. Some media – the phone, video chat – might make deception feel easier or less costly to a social relationship if caught. </p>
<p>Deception rates might also differ across technology because people use some forms of technology for certain social relationships. For example, people might only email their professional colleagues, while video chat might be a better fit for more personal relationships.</p>
<h2>Technology misunderstood</h2>
<p>To me, there are two key takeaways.</p>
<p>First, there are, overall, small differences in lying rates across media. An individual’s tendency to lie matters more than whether someone is emailing or talking on the phone.</p>
<p>Second, there’s a low rate of lying across the board. Most people are honest – a premise consistent with <a href="https://doi.org/10.1177/0261927X14535916">truth-default theory</a>, which suggests most people report being honest most of the time and there are only a few <a href="https://doi.org/10.1177/0261927X14528804">prolific liars</a> in a population.</p>
<p>Since 2004, social media have become a primary place for <a href="https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/">interacting with other people</a>. Yet a common misperception persists that communicating online or via technology, as opposed to in person, leads to social interactions that are <a href="https://www.penguinrandomhouse.com/books/313732/reclaiming-conversation-by-sherry-turkle/">lower in quantity and quality</a>.</p>
<p>People often believe that just because we use technology to interact, honesty is harder to come by and users aren’t well served. </p>
<p>Not only is this perception misguided, but it is also unsupported by empirical evidence. The <a href="https://www.sciencedirect.com/science/article/abs/pii/S0747563216304800">belief that lying is rampant</a> in the digital age just doesn’t match the data.</p>
<p class="fine-print"><em><span>David Markowitz does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>David Markowitz, Assistant Professor of Social Media Data Analytics, University of Oregon. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>How setting a schedule can make you less productive</h1><figure><img src="https://images.theconversation.com/files/223500/original/file-20180618-85849-1tynayx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Ticking away the moments that make up a dull day ...</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/man-filling-out-monthly-planner-on-67869547?src=ubq7OnNOSEhcZEzYlyfU9Q-1-0">NAN728/Shutterstock.com</a></span></figcaption></figure><p>It can seem like there’s never enough time – not enough for sleep and not enough for play, not enough for cooking and not enough for exercise. </p>
<p>There’s a relatively new term to describe this feeling: <a href="https://www.cnn.com/2017/07/24/health/time-famine-stress-happiness-study/index.html">time famine</a>, or the sensation of having too much to do without enough time to do it.</p>
<p>In order to structure what little time we feel we have, one strategy we deploy is scheduling. In fact, reliance on organizational tools like daily planners has been on the rise. In two recent surveys, <a href="https://www.statista.com/forecasts/805867/regularly-used-smartphone-apps-in-the-us">51 percent</a> of respondents said they regularly used their calendar app, while <a href="https://www.yumpu.com/en/document/view/37642904/a-study-of-calendar-usage-in-the-workplace-2011-promotional-">63 percent</a> of office workers consider calendars “very important.”</p>
<p>The idea is that scheduling will make you more efficient: When you allocate your time, it organizes your day into a series of appointments, meetings and calls, while blocking off free time for other activities or tasks. </p>
<p>But in a series of <a href="https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucy043/4996321?redirectedFrom=fulltext">eight studies</a>, <a href="http://www.business.rutgers.edu/faculty/gabriela-tonietto">Gabriela Tonietto</a>, <a href="https://olin.wustl.edu/EN-US/Faculty-Research/Faculty/Pages/FacultyDetail.aspx?username=nowlis">Steve Nowlis</a> and <a href="https://scholar.google.com/citations?user=a5FL_fYAAAAJ&hl=en">I</a> found that scheduling can sometimes backfire – and actually make us less productive.</p>
<h2>An appointment approaches – and time ‘shrinks’</h2>
<p>Much of scheduling’s downside has to do with the anticipation of a meeting or appointment. When we know a scheduled meeting or phone call is looming, it can make us feel like we have less time to do what we need to do.</p>
<p>In <a href="https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucy043/4996321?redirectedFrom=fulltext">one study</a>, we asked attendees of an academic conference whether they would go to the presidential address taking place about an hour later. Some said they would, and others said they wouldn’t. Those who planned to attend the address reported that the hour leading up to it felt shorter. </p>
<p>In <a href="https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucy043/4996321?redirectedFrom=fulltext">another study</a>, we had half of the participants imagine that a friend would be coming over in an hour, while the other half were told to imagine they had no plans. We asked all of the participants how many minutes they “subjectively” felt like they could spend reading during that same hour.</p>
<p>Those who were told to imagine that a friend would be coming over reported that the hour leading up to the visit had only 40 usable minutes for reading. Those who were told to imagine they had no plans felt as if they had 49 minutes to read.</p>
<p>So the presence of an upcoming activity seems to have shrunk how much time people felt they had to do something. </p>
<p>Why might this happen?</p>
<p>We believe that when there’s an appointment looming, we direct our attention to it, whether it’s mentally preparing for it or simply dreading it. This makes the future appointment feel more substantial; as a result, the time interval leading up to the scheduled activity feels limited and insufficient.</p>
<h2>Free to do … less?</h2>
<p>But in the end, you still have the same amount of time leading up to a scheduled event.</p>
<p>So feeling like you have less time shouldn’t really matter, right? But it does. The feeling by itself can influence what people decide to do. </p>
<p>We know that <a href="https://ac-els-cdn-com.proxy.lib.ohio-state.edu/0167487082900344/1-s2.0-0167487082900344-main.pdf?_tid=7eacc858-25d0-4288-ac57-5e44ce112eb3&acdnat=1529346476_84f6b7b769a6de250709bffd1fd440a2">when something is scarce</a>, people consider it more valuable and are <a href="https://academic.oup.com/qje/article-abstract/112/2/341/1870915">less willing to part with it</a>. </p>
<p>The same is true for time. If time feels limited, people are less likely to use it – even when it’s in their best interest. </p>
<p>So in <a href="https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucy043/4996321?redirectedFrom=fulltext">another study</a>, we gave participants an empty calendar for the next day and asked them to fill it up, as accurately as possible, with what they had scheduled (including preparation or transition times). This allowed us to correctly calculate how much free time they had in between each planned event. </p>
<p>We then gave participants an opportunity to participate in a second study. Everyone chose between participating in a 30-minute online study that would earn them US$2.50 and signing up for a 45-minute online study to receive $5.00. Each would take place during an hourlong window. </p>
<p>On our end, we strategically chose the hourlong window for the participants. We told half of them that the study would take place within an hour of an event they’d scheduled. The other half would take the study during a time period that concluded with a half-hour cushion before their scheduled event.</p>
<p>We found that participants in the first group were much less likely to choose the longer but more lucrative study – despite having more than enough time to complete the study.</p>
<p>In yet <a href="https://academic.oup.com/jcr/advance-article-abstract/doi/10.1093/jcr/ucy043/4996321?redirectedFrom=fulltext">another study</a>, we wondered if the mere reminder of an upcoming event could have a similar effect. </p>
<p>Before beginning an unrelated study, we told half of the participants that they would have about five minutes to do whatever they wanted. We told the other half they had about five minutes before we would “get started.” </p>
<p>For those in the latter group, the simple mention of “starting something” was enough to change their behavior. We found that they engaged in significantly fewer activities, whether it was answering emails or checking social media, in this short five-minute period.</p>
<h2>You’re less famished than you think</h2>
<p>Some might think that time famine arises because they have too much to do and not enough time to do it. </p>
<p>While this may certainly be the case at times, our results suggest that the fault also lies in our own perception of what we feel can be done with the time we have. In other words, it’s important to realize that we might be contributing to our time famine. </p>
<p>If you love scheduling and planning out your days, a trick could be to schedule events or tasks back-to-back, which leaves you with larger chunks of unscheduled time. Several uninterrupted hours of unscheduled time will feel longer, especially if no scheduled event is looming.</p>
<p>It can be effective to remind yourself that time isn’t as short as it feels. Even if you don’t think you’ll have enough time to complete something, you can still start a task and finish it later. </p>
<p>As Aristotle <a href="https://www.brainyquote.com/quotes/aristotle_109750">once said</a>, “Well begun is half done.”</p>
<p class="fine-print"><em><span>Selin Malkoc works for the Ohio State University. She has not received any relevant external funding and has no conflict of interest.</span></em></p>
<p class="fine-print"><em>Selin Malkoc, Associate Professor of Marketing, The Ohio State University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Keeping staff satisfied really is good business, says new study</h1><figure><img src="https://images.theconversation.com/files/196146/original/file-20171123-18021-1879kih.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/business-people-modern-office-celebrating-good-397897816?src=HZkK18abGGzZGd3aD_vnnA-1-3">Shutterstock</a></span></figcaption></figure><p>Imagine sitting down to your business meeting in a <a href="http://www.telegraph.co.uk/news/newstopics/howaboutthat/12192007/Ridiculous-offices-10-utterly-absurd-workplaces.html">ball-pool room</a> with multi-coloured walls and bean bags instead of chairs. If that’s not crazy enough, how about a massive hammock to take a breather on during a hectic nine-hour shift?</p>
<p>Many companies are coming up with imaginative ways to keep their staff satisfied. The theory goes that with more entertaining and exciting work spaces, employees no longer feel like they are stuck at work. Instead, they adopt a “work hard, play hard” attitude and the company enjoys greater productivity.</p>
<p>But do companies actually benefit from investing in the satisfaction of their employees? Shareholders are sceptical. Their common <a href="https://hbr.org/2016/03/28-years-of-stock-market-data-shows-a-link-between-employee-satisfaction-and-long-term-value">view</a> is that for every dollar invested in staff satisfaction, a dollar is taken away from them. But my colleagues Efthymia Symitsi, Panagiotis Stamolampros and I have just completed a <a href="https://www.sciencedirect.com/science/article/pii/S0165176517304433">new study</a> that shows employee satisfaction really does affect the long-term financial success of a business.</p>
<p>To do this, we examined the relationship between reviews by employees of a company and how successful it was, using measures of profitability (<a href="https://www.investopedia.com/terms/r/returnonassets.asp?ad=dirN&qo=investopediaSiteSearch&qsrc=0&o=40186">return on equity</a>) and value (<a href="https://www.investopedia.com/terms/q/qratio.asp?ad=dirN&qo=serpSearchTopBox&qsrc=1&o=40186">Tobin’s Q</a>). We found that companies whose employees said they were highly satisfied performed better financially than those whose employees were not. The more reviews per employee that a company had, the more pronounced this effect seemed to be.</p>
<h2>What makes this study different?</h2>
<p>Previous <a href="http://www.sciencedirect.com/science/article/pii/S0304405X11000869">studies</a> on this topic have mainly used Fortune magazine’s “<a href="http://fortune.com/best-companies/">100 Best Places to Work for in America</a>” list to measure staff satisfaction. This rates workplaces using an extensive anonymous employee survey. The <a href="https://www.greatplacetowork.com/best-workplaces/100-best">survey</a> includes questions relating to the support employees get in their personal and professional lives, the quality of communication by management and relationships with colleagues. The problem is that companies have to pay a fee to participate in the survey and be included in this list. So they are only likely to do this if they believe their employees are satisfied, an issue known in statistics as self-selection bias.</p>
<p>To avoid this, we gathered data from a collection of reviews posted by employees on jobs website <a href="https://www.glassdoor.com">Glassdoor</a>. This also meant our analysis wasn’t limited to a small number of companies. To prevent disgruntled ex-employees from unfairly skewing the picture of their old companies, we focused on reviews from employees who were still working at each company. In total, we used approximately 326,000 “overall” satisfaction ratings for 313 public US companies posted from 2009 to 2016 on Glassdoor.</p>
<h2>Does the stock market agree?</h2>
<p>We also looked at whether investors in the stock market recognised the value staff satisfaction brings to a business. The answer was a resounding no. An investment portfolio that included stocks of the top 25% of companies in terms of employee satisfaction produced an “<a href="https://www.investopedia.com/terms/a/abnormalreturn.asp">abnormal return</a>” over the period we studied. That is, given the portfolio’s risk, the rate of return was significantly higher than expected according to standard asset pricing models. This supported our finding that employee satisfaction is important for companies and investors, but also revealed it is not fully reflected in companies’ stock prices.</p>
<p>If it was, the portfolio wouldn’t have achieved this abnormal return. The reviews from Glassdoor are public information so if investors recognised the importance of staff satisfaction, they could easily use them when deciding which stocks to buy. The increased demand would translate to an increase in the stock price of companies with high employee satisfaction and so the return of the portfolio wouldn’t be abnormal.</p>
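<p>For readers unfamiliar with the term, “abnormal return” is usually measured as the intercept (Jensen’s alpha) from regressing a portfolio’s excess returns on one or more risk factors. The sketch below uses invented monthly numbers and the simplest one-factor, CAPM-style regression, not the study’s actual data or pricing models:</p>

```python
import numpy as np

# Hypothetical monthly excess returns, in percent (invented numbers).
market = np.array([1.0, -0.5, 2.0, 0.3, -1.2, 1.5, 0.8, -0.4])
noise = np.array([0.05, -0.02, 0.01, 0.03, -0.04, 0.02, 0.00, -0.01])
portfolio = 0.25 + 1.1 * market + noise  # built to carry roughly 0.25% alpha

# OLS fit of portfolio = alpha + beta * market; polyfit returns
# coefficients highest-degree first, so the slope (beta) comes first.
beta, alpha = np.polyfit(market, portfolio, 1)

# A persistently positive alpha is the "abnormal return": performance
# beyond what exposure to market risk (beta) alone would predict.
print(f"beta = {beta:.2f}, alpha = {alpha:.2f}% per month")
```

<p>In this framing, a high-satisfaction portfolio delivering a significantly positive alpha is exactly what it means for the stock market to have left employee satisfaction out of its prices.</p>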
<h2>What are the implications for managers?</h2>
<p>Some economists have been arguing <a href="http://faculty.chicagobooth.edu/luigi.zingales/papers/research/search.pdf">for almost two decades</a> that employees are becoming more important to modern corporations. They aren’t just in charge of company assets but are assets themselves. Our findings, which are <a href="http://faculty.london.edu/aedmans/RoweAMP.pdf">consistent with previous studies</a>, support this idea that looking after employees’ job satisfaction is very important to the company’s financial success.</p>
<p>This is especially true in a knowledge and service-based economy where innovation and customer relationships are key for creating value. Managers need to recognise this even when shareholders don’t. Investing in staff satisfaction will pay dividends in the long-term.</p>
<p>Finally, it’s worth noting that doing this doesn’t have to cost a lot of money. Despite the popularity of quirky office design, spending millions of dollars transforming the physical surroundings of a business <a href="https://theconversation.com/heres-why-cool-offices-dont-always-make-for-a-happier-workforce-77361">isn’t what keeps employees satisfied</a>. The answer could be that employees simply <a href="https://moneyish.com/ish/this-is-the-no-1-thing-that-will-keep-you-happy-at-work">want to be appreciated</a>.</p>
<p class="fine-print"><em><span>George Daskalakis does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>George Daskalakis, Lecturer in Finance, University of East Anglia. Licensed as Creative Commons – attribution, no derivatives.</em></p>