Affective computing is the idea that technology can better assist us if it can read, process and simulate human experience. One aspect of this is the recognition of human emotion through motion and physiological sensors that track verbal and nonverbal information. Computing techniques such as artificial intelligence can then be applied to understand and model human emotions. Simply put, the technology will be able to tell whether the target is fearful, happy, sad, triumphant or concentrating.
The technology used in affective computing (AC) and human-computer interaction (HCI) – an allied discipline that studies links and relationships between man and machine – is also used to track body and facial expressions of actors to create animated characters in films like Gollum in Lord of the Rings and the apes in Dawn of the Planet of the Apes.
Out in the real world
But this technology has real applications. Every day across the world, firefighters earn our respect and admiration by carrying out dangerous jobs in extreme conditions. In the US, there have been 51 fatalities already this year and official figures record 81 in 2012. These “line of duty deaths” or LODDs aren’t just from fire – stress and extraordinary conditions often lead to injury and heart attacks. On top of that, the everyday stresses on the body – lifting, climbing, wearing heavy protective gear and using heavy machinery – can cause long-term physical damage.
But firefighters don’t just fight fires. Many brigades have an additional elite group of specialist rescue experts who work in the field of Urban Search and Rescue (USAR). These technicians receive extra training, have higher standards of fitness and are called into action to deal with major rescue events – events such as collapsed buildings, major transport accidents or terrorist attacks.
The first USAR teams were created by America’s Federal Emergency Management Agency in 1989, and played a leading role in the response to 9/11. In the UK, Fire Service USAR teams were created in its aftermath.
They work at height, suspended from cranes or the sides of skyscrapers, and in hazardous environments, such as under the rubble of buildings. USAR responders face the constant risk of being trapped or crushed by further collapse, so they build shoring and employ structural engineers to limit the risk. They tend to work in cramped environments, operating heavy tools such as nail guns and breaking equipment – more commonly seen in the hands of workers digging up the road. This equipment is heavy, vibrates and is often used in unnatural working positions.
In some cases the only way of entering a building is from above, so USAR responders are trained to work at height using complex rope systems. And to protect them from dust, noxious fumes or debris, heavy protective clothing often has to be worn.
The stress involved in working in such an environment is difficult enough, but when coupled with the likelihood of having to treat catastrophically injured casualties, traumatised survivors and the bodies of victims, many of whom could be children, the result can be serious psychological trauma.
But what if we could monitor, in an unobtrusive way, the emotional state of USAR workers – to tell if they are approaching a damaging level of psychological trauma, or are too physically tired to continue effectively? What if we could see – in real time – the stresses on their bodies of using breaking machinery in a confined space?
If we could build a system that alerted a crew manager when a firefighter was reaching maximum endurance, a simple rotation system might be enough to protect their long-term welfare – especially as firefighters are being asked to work later into their careers. Affective computing could solve many of these problems.
A firefighter, wearing a series of motion sensors sewn into their tunics and leggings or built into their helmets, could be represented on a screen at the control point as a stick figure showing their posture at that time. If the firefighter was prone for too long, that may suggest they were trapped or unconscious. If they were using a tool in a way that could lead to long-term injury, a warning icon could flash over their avatar, or a timer could start, prompting the crew manager to rotate personnel.
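The monitoring logic described above could be sketched in a few lines. This is a minimal illustration, not a real system: the posture labels, thresholds and alert messages are all assumptions, standing in for what would come from trained models and ergonomic studies.

```python
from dataclasses import dataclass, field

# Hypothetical thresholds -- real values would come from ergonomic research.
PRONE_ALERT_SECONDS = 120      # prone this long may mean trapped or unconscious
TOOL_USE_LIMIT_SECONDS = 600   # continuous heavy-tool use before rotation

@dataclass
class FirefighterMonitor:
    """Tracks one responder's posture stream and raises simple alerts."""
    name: str
    prone_seconds: float = 0.0
    tool_use_seconds: float = 0.0
    alerts: list = field(default_factory=list)

    def update(self, posture: str, using_tool: bool, dt: float) -> None:
        # Accumulate time spent prone; reset the clock when they move again.
        self.prone_seconds = self.prone_seconds + dt if posture == "prone" else 0.0
        # Likewise for continuous heavy-tool use.
        self.tool_use_seconds = self.tool_use_seconds + dt if using_tool else 0.0

        if self.prone_seconds >= PRONE_ALERT_SECONDS:
            self.alerts.append(f"{self.name}: prone too long -- check welfare")
        if self.tool_use_seconds >= TOOL_USE_LIMIT_SECONDS:
            self.alerts.append(f"{self.name}: tool-use limit reached -- rotate")
```

In practice the `update` calls would be fed by the sensor stream driving the on-screen avatar, and the alerts would surface as the flashing icon or timer the article describes.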
Using AC technology, the movement of the firefighters could be interpreted to detect fatigue or psychological trauma, again warning the crew manager that there was an issue with that individual and prompting a rotation.
Similar behaviour analysis technology could also be used to detect if a target is fatigued, drunk or injured. It could theoretically be employed within “smart” machines that refuse to activate for an operator who shows signs that they won’t be able to use them safely. And this isn’t confined just to firefighters – lorry drivers, factory operators and crane drivers are just some other groups where this technology could be used.
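A “smart” machine of the kind described could gate its power switch on the output of a behaviour-analysis model. The sketch below is purely illustrative: the threshold-based classifier is a toy stand-in for a trained AC model, and the feature names (postural sway, reaction time) are assumptions.

```python
# Hypothetical "smart tool" interlock: the tool refuses to activate when a
# behaviour-analysis model flags the operator as impaired.

def operator_state(sway: float, reaction_ms: float) -> str:
    """Toy stand-in for a trained model: crude thresholds, illustrative only."""
    if sway > 0.5 or reaction_ms > 800:
        return "impaired"   # could indicate fatigue, intoxication or injury
    return "fit"

def can_activate(sway: float, reaction_ms: float) -> bool:
    # The power switch consults the model before energising the tool.
    return operator_state(sway, reaction_ms) == "fit"
```

The same interlock pattern would apply equally to lorries, factory machinery or cranes – the model changes, the gate does not.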
There has already been some work with AC and human-robot interaction to build robots that can express emotion in an attempt to calm trapped casualties. However, as yet there has been very little work on monitoring and protecting the rescuers themselves.
The emerging technologies of AC and its allies – wearable computing, location-based systems, ubiquitous computing (which means computing that can occur anywhere and in any device) – all seem ideally suited to addressing the problems of USAR firefighting. With firefighters being asked to work later into their careers, and firefighting budgets across the globe being cut, there is a pressing need to reduce long-term debilitating injuries as much as possible.
And in addition to the ongoing, everyday risk of major transportation or construction accidents, more unstable weather patterns and the continuing risk of terrorism mean the need for the specialist skills of USAR technicians is more pressing than ever.