
Revealed: the unparalleled scale of big data analysis behind Santa’s naughty or nice list

Santa’s so busy he has no time to play, capturing data at 1.25Gb/s, the NSA way. Shannon-Long/knowyourmeme

After a year in which further details of national intelligence agencies’ shadowy surveillance networks were laid bare, a fresh leak of documents reveals the obsessive surveillance that extends as far north as Lapland…

Santa’s Arctic Workshop (SAW) has deployed a multinational panopticon surveillance programme, according to leaked documents being called “the Snowman Files”.

According to these documents, the Arctic workshop runs a behavioural monitoring project, part of a previously undisclosed wider programme called “Operation Naughty or Nice”, which allows chief executive Santa Claus and a group of moralising elves to collect behavioural evidence on children worldwide through conventional and “enhanced” techniques.

Claus is purported to have been inspired by the conventional surveillance techniques used by agencies such as the US National Security Agency (NSA) and the UK’s Government Communications Headquarters (GCHQ). These organisations run extensive programmes that capture content and metadata worldwide, gleaned from search histories, emails, chat records, and even webcam images.

The Snowman Files contain documents classified by Claus as Top Secret, not to be distributed outside the grotto. These are intended to train elves in intelligence gathering operations and big data analysis capabilities. They reveal claims of “data collection directly from the minds and bodies of everybody everywhere.”

SAW conducts real-time data collection, storage and processing directly from everyone in order to identify “naughty or nice” markers. Fuzzy algorithms are then applied to these markers to produce overall moral scores, which determine whether children deserve presents, or coal.
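The documents don’t spell out the scoring algorithm itself, so the sketch below is purely illustrative: a minimal weighted-marker scorer standing in for whatever fuzzy logic the elves actually run, with marker names, weights and the 0.5 threshold all invented for the example.

```python
# Purely hypothetical sketch of marker-based moral scoring.
# Marker names, weights and the 0.5 threshold are invented for illustration;
# the real "fuzzy algorithms" in the leaked documents are not described.

MARKER_WEIGHTS = {
    "shared_toys": 0.4,
    "ate_vegetables": 0.2,
    "teased_sibling": -0.3,
    "left_out_carrots": 0.1,
}

def moral_score(markers):
    """Combine marker observations (each 0-1) into a score clamped to [0, 1]."""
    raw = 0.5 + sum(MARKER_WEIGHTS.get(name, 0.0) * value
                    for name, value in markers.items())
    return max(0.0, min(1.0, raw))

def verdict(markers, threshold=0.5):
    return "presents" if moral_score(markers) >= threshold else "coal"

print(verdict({"shared_toys": 0.9, "teased_sibling": 0.2}))  # presents
```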

However, SAW is also said to employ “enhanced” techniques involving mapping neuronal activity when children are sleeping in order to perform whole brain emulation, according to the source of the leaked documents, Eddie the Snowman.

These techniques use metabolomic analysis, the study of the chemical fingerprints left behind by biological processes, to emulate human brains. This captures not only the state of the brain’s neurons, but also its chemical environment – in particular the hormones associated with naughtiness and niceness.

“Operation Naughty or Nice is a major upgrade from SAW’s previous methods. This supplements self-reports at shopping centres, handwriting analysis of want lists sent to the grotto, and the use of elves to watch children when they are awake,” Eddie the Snowman said.

Santa’s Arctic Workshop panopticon

He revealed details of a truly astounding data collection operation. “In 2013, as the NSA Prism program came to light,” said Snowman, “Santa decided it was time for an upgrade. I was tasked with setting up the datacentre. Whereas the NSA only stores exabytes of information, we needed to store Santabytes of data.”

An exabyte is one quintillion bytes (10^18), the equivalent of one million terabytes. An average personal computer hard drive might hold half a terabyte (500 gigabytes), while the NSA Intelligence Community Comprehensive National Cybersecurity Initiative Data Center in Utah is estimated to store 12 exabytes (12m terabytes) of data.

To put that into perspective, GCHQ taps and stores data it intercepts from 200 internet links worldwide for three days, with data captured at around 1.25 gigabytes per second per link. With 1,000 gigabytes in a terabyte, each link yields around one terabyte every 13 minutes, requiring a storage capacity of around 65,000 terabytes for the three days of collection.
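Those figures can be sanity-checked with a few lines of back-of-envelope arithmetic (decimal units, 1 TB = 1,000 GB; the rate and link count are as quoted above):

```python
# Back-of-envelope check of the GCHQ figures quoted above.

RATE_GB_PER_S = 1.25            # capture rate per tapped link
LINKS = 200                     # tapped internet links
RETENTION_S = 3 * 24 * 3600     # three days of rolling storage

seconds_per_tb = 1_000 / RATE_GB_PER_S                     # 800 s per link
storage_tb = RATE_GB_PER_S * LINKS * RETENTION_S / 1_000   # total for 3 days

print(f"One terabyte every {seconds_per_tb / 60:.0f} minutes per link")
print(f"Storage for three days: about {storage_tb:,.0f} TB")  # ~64,800 TB
```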

Santabytes: not appearing in this list. Research Trends

It has been reported that the Pentagon would like to be able to process yottabytes of data, where a yottabyte is one million exabytes, the sort of size estimated to be necessary to contain all the communications surveillance material of all US intelligence agencies. Even at £1 a terabyte (a very low estimate), one yottabyte would cost around one trillion pounds.
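The arithmetic behind that price tag is easy to verify: a yottabyte is a trillion terabytes, so even at a rock-bottom £1 per terabyte the bill comes to about £1 trillion.

```python
# Cost of storing one yottabyte at £1 per terabyte (the low estimate above).

TERABYTE = 10**12
YOTTABYTE = 10**24

cost_gbp = (YOTTABYTE // TERABYTE) * 1   # one trillion terabytes at £1 each
print(f"£{cost_gbp:,}")                  # £1,000,000,000,000
```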

There is no internationally accepted name for a quantity of data bigger than a yottabyte, so SAW had to invent their own. One Santabyte is one million yottabytes.

The metabolomic brain-modelling techniques deployed by SAW require around 10,000 exabytes per emulated brain. With 7.3 billion people currently on the planet, SAW’s storage runs to around 73 Santabytes of brain data, with a further Santabyte for all of the communications data from all of the world’s intelligence agencies.
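A quick tally shows how the 73 Santabytes figure falls out of those numbers (decimal SI prefixes; the per-brain requirement and population are as quoted above):

```python
# Rough check of SAW's claimed brain-data storage requirement.

EXABYTE = 10**18
YOTTABYTE = 10**24
SANTABYTE = 10**6 * YOTTABYTE       # SAW's invented unit: 10**30 bytes

bytes_per_brain = 10_000 * EXABYTE  # metabolome emulation per person
population = 7.3e9

brain_data = bytes_per_brain * population
print(f"Brain data: about {brain_data / SANTABYTE:.0f} Santabytes")  # ~73
```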

This Santa-scale datacentre requires a tremendous amount of power to run. The NSA’s Utah data centre “only” uses 65 megawatts of electricity and costs around US$40m a year, whereas SAW’s facility requires exawatts of electricity – more power than the Earth receives from the sun. But given NORAD’s confirmation that Santa Claus can travel faster than light, and “functions within his own time-space continuum”, it comes as no surprise that his facilities can accommodate such power.
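For scale, a single exawatt already dwarfs the commonly cited figure of roughly 174 petawatts of sunlight reaching the Earth (that solar number is a standard estimate, not something in the leaked documents):

```python
# How an exawatt compares with total incoming solar radiation (~174 PW).

SOLAR_INPUT_W = 174e15   # commonly cited estimate of sunlight reaching Earth
ONE_EXAWATT = 1e18

ratio = ONE_EXAWATT / SOLAR_INPUT_W
print(f"One exawatt is about {ratio:.1f}x the sunlight hitting the planet")  # ~5.7x
```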

Alleged elf surveillance network

Concerns have been raised about the Elf on the Shelf toy, and the normalisation of surveillance among the younger generation. Described as a “special scout elf sent from the North Pole” to “help Santa Claus manage his naughty and nice lists”, the “toy” is presented by parents to their children as a “Christmas tradition”.

Laura Pinto, assistant professor of digital education at the University of Ontario Institute of Technology, recently wrote that the Elf was “a capillary form of power that normalises the voluntary surrender of privacy, teaching young people to blindly accept panoptic surveillance.”

With what we’ve learned from the Snowman Files, it’s clear that Santa Claus does indeed have an exceptional surveillance network and the data-analysis capacity to know if you’ve been bad or good. Given the levels of surveillance we are now all under, it is as wise as ever to learn to “be good, for goodness sake”.
