[Image caption: It’s a shame the adverts aren’t displaying a real product. Bahio would’ve won over a mesmerised customer. Credit: Clear Channel]

Now advertising billboards can read your emotions … and that’s just the start

Advertising giant M&C Saatchi is currently testing billboards fitted with hidden Microsoft Kinect cameras that read viewers’ emotions and react according to whether a person’s facial expression is happy, sad or neutral.

The test adverts – which feature a fictitious coffee brand named Bahio – have already appeared on Oxford Street and Clapham Common in London. So we now have adverts that can read the reactions of those who view them and adapt accordingly, cycling through different images, designs, fonts and colours. With partners Clear Channel and Posterscope, Saatchi has made advertising history. When future media historians look back, they will see 2015 as a landmark year.
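The campaign’s creators have not published how their system works, but the basic loop – classify a passer-by’s expression, then pick a matching creative treatment – can be sketched in a few lines. The Python below is a minimal, hypothetical illustration: the variant names are invented, and detect_expression is a stand-in for whatever classifier the Kinect camera actually feeds.

```python
import random
from enum import Enum


class Expression(Enum):
    HAPPY = "happy"
    SAD = "sad"
    NEUTRAL = "neutral"


# Hypothetical creative treatments the billboard could cycle through.
AD_VARIANTS = {
    Expression.HAPPY: ["bright_palette_v1", "bold_font_v2"],
    Expression.SAD: ["warm_palette_v1", "soft_font_v1"],
    Expression.NEUTRAL: ["default_layout_v1", "high_contrast_v1"],
}


def detect_expression(frame) -> Expression:
    """Placeholder for the camera's expression classifier.

    The real campaign reportedly uses a Kinect camera; here we simply
    return a random label so the sketch runs without any hardware.
    """
    return random.choice(list(Expression))


def choose_variant(expression: Expression) -> str:
    """Pick one of the variants associated with the detected expression."""
    return random.choice(AD_VARIANTS[expression])


if __name__ == "__main__":
    # Simulate a handful of passers-by looking at the billboard.
    for _ in range(5):
        expr = detect_expression(frame=None)
        print(f"Detected {expr.value}: showing variant {choose_variant(expr)}")
```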

There are three key things to recognise here: these adverts can read our behaviour; they do so on the basis of our emotions rather than our website browsing history; and they use what they learn to improve themselves.

What are we to make of this? Is it a bit creepy? The answer is both yes and no. What the campaign represents is an attempt to get closer to us, something that’s a defining characteristic of the advertising and audience research industries. They want to know us more intimately so as to be able to craft messages that will affect and resonate with us. It’s an example of what I call “empathic media” because, through reading facial expressions, adverts are able to bypass the guesswork and make direct use of our emotions.

Evolution of the ad

While uncanny and creative, Saatchi’s adverts are not a threat to privacy. After all, unlike our PCs, phones and tablets, these posters neither know nor care who we are. The adverts’ creators say they do not store images or data, and there is little reason to disagree. All their adverts do is react to facial shapes – the truly creepy stuff is online and in the mobile phone apps tracking our habits. For example, one study records the Eurosport Player app as having 810 data trackers collecting not only hardware and software information but also navigation (which sites a person visits), behaviour, visit times, visitor actions and geolocation (where a person is located in real space).

The real genius of the new campaign is in using our facial expressions to learn from and alter the advert’s own design. In responding to our expressions, the adverts acquire a purpose – an evolutionary urge to improve and become more effective.

This idea of adaptable advertising was foreseen around 100 years ago by advertising luminaries such as Daniel Starch and Claude Hopkins. They insisted advertising should be treated as a science based on collecting information, analysing it and using these insights to improve campaigns. Starch and Hopkins both sought to understand which techniques do and don’t work in order to make the business of advertising subject to laws of cause and effect. The grandfathers of advertising would be very pleased with today’s progeny.

Although the logic is old, processing feedback to self-correct in real time is new. For years, Google has masterfully led the way in serving adverts automatically based on our interests; self-improving adverts in the physical world are another step forward.
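The campaign’s actual optimisation method has not been disclosed, but one standard way to make an advert “self-correct” from feedback is a simple bandit algorithm: keep showing the variant that has drawn the most smiles so far, while occasionally trying the alternatives. The Python sketch below is purely illustrative – the variant names and simulated smile rates are invented, not drawn from the campaign.

```python
import random


class EpsilonGreedyAd:
    """Toy self-correcting advert: favour the variant that draws the most smiles.

    Not the campaign's actual algorithm (which isn't public); just one common
    way to turn viewer feedback into automatic improvement.
    """

    def __init__(self, variants, epsilon=0.1):
        self.epsilon = epsilon
        self.shows = {v: 0 for v in variants}
        self.smiles = {v: 0 for v in variants}

    def pick_variant(self):
        # Occasionally explore a random variant; otherwise exploit the best one so far.
        if random.random() < self.epsilon:
            return random.choice(list(self.shows))
        return max(
            self.shows,
            key=lambda v: self.smiles[v] / self.shows[v] if self.shows[v] else float("inf"),
        )

    def record_feedback(self, variant, viewer_smiled):
        # A smile from the camera counts as positive feedback for that variant.
        self.shows[variant] += 1
        self.smiles[variant] += int(viewer_smiled)


if __name__ == "__main__":
    ad = EpsilonGreedyAd(["layout_a", "layout_b", "layout_c"])
    # Simulated audience: pretend layout_b genuinely works best.
    true_smile_rates = {"layout_a": 0.2, "layout_b": 0.5, "layout_c": 0.3}
    for _ in range(1000):
        variant = ad.pick_variant()
        smiled = random.random() < true_smile_rates[variant]
        ad.record_feedback(variant, smiled)
    # Observed smile rate per variant; the best layout ends up shown most often.
    print({v: ad.smiles[v] / max(ad.shows[v], 1) for v in ad.shows})
```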

Connecting with the subject

Much of the media coverage surrounding M&C Saatchi’s adverts lauds the campaign as artificially intelligent. While this is true to an extent, the advert is actually quite mechanical: it has no understanding of why we are smiling, grimacing or straight-faced, or of what these expressions imply. It simply matches the shapes and reacts.

So what would intelligent advertising look like? It would have to be able to engage with the context of our lives, in real time. What that consists of is a somewhat philosophical question, but it might encompass our individual life histories, our natural spoken language, human values, politics, current affairs, popular culture, and aesthetic trends – all topics that human ad creatives consider when putting campaigns together.

Clearly, these adverts don’t – but others in the advertising business may have the technological muscle to do so. For an insight into tomorrow’s artificially intelligent advertising, have a look at Google’s DeepMind, which promises to “combine the best techniques from machine learning and systems neuroscience to build powerful general-purpose learning algorithms”. When we remember that Google is first and foremost an advertising company, DeepMind is one to watch.

Then there are the sensors. We will soon wear and carry more of them, and have more of them around us. Empathic media will grant advertisers even deeper insight into our emotions: through how we speak to our mobile devices, through more granular facial recognition, and through readings of our heart rates, respiration patterns and how our skin responds to stimuli. And if that sounds far-fetched, remember you’ve just read a true story about adverts that recognise your emotions.
