
Predicting the technology future from disillusionment to enlightenment

Gartner Hype Cycle 2014 (Gartner, 2014)

For 20 years, consulting firm Gartner has been calling the future of technology using its now iconic “Hype Cycle”.

The Hype Cycle: from hype to reality

The Hype Cycle breaks the introduction of new technologies into five phases, starting with the “Technology Trigger”, the first point at which a technology comes to the attention of the press and businesses. Technologies then rapidly become oversold, or hyped. This is the point at which expansive claims are made about how technology X is going to radically transform and disrupt, and early innovators push to be amongst the first to ride the wave of excitement the technology generates.

The initial hype eventually leads to a “Peak of Inflated Expectations”, which is followed by a crash as it is realised that the technology isn’t going to be adopted in quite the way everyone predicted, nor is it generally as useful as claimed. This leads to a “Trough of Disillusionment”, accompanied by an increasing number of negative articles, project failures and a general lessening of interest in the technology.

For some technologies however, the disillusionment is followed by a gradual increase in a more realistic adoption of the technology which eventually results in a “Plateau of Productivity”.

Technologies for the next 10 years

For Gartner’s 2014 Hype Cycle, one notable technology is speech recognition, which Gartner claims is well into the productive phase. Certainly mobile phones, and increasingly wearables, have driven the adoption of voice control and interaction, and it is definitely usable on a day-to-day basis.

Having said that, Gartner also puts wearable user interfaces as having passed the peak of inflated expectations and as heading rapidly towards the trough of disillusionment. Given that Google has based its interface for wearables very heavily on the use of voice, it seems odd that these two technologies would be so far apart according to Gartner.

The position of the Internet of Things at the peak of inflated expectations will also come as a disappointment to all of the companies like Cisco that are claiming that we are already well and truly in the era of billions of interconnected and independently communicating devices.

The future is lumpy

Although the Hype Cycle is a convenient way of visualising the progress of technology from invention to universal use, it over-simplifies the way progress is made in innovation. As science fiction writer William Gibson once said:

“The future is already here — it’s just not very evenly distributed”

Technology innovation is never smooth and never takes a single path. There can be businesses and individuals using technologies to radically improve productivity at the same time as almost everyone else is failing to do the same. A good example of this is the hype around “Big Data”. Whilst everyone acknowledges that we are creating enormous amounts of data that must ultimately hold valuable information and knowledge, very few organisations are attempting, let alone succeeding, in finding it. Those that are experts in Big Data are the companies that have made massive digital infrastructure their entire existence: companies like Google, Facebook and Twitter.

Whilst Gartner has predicted that Big Data will reach the plateau of productivity within 5 to 10 years, it is also possible that it will never get there and that very few companies will ever have the skills to take advantage of their amassed data.

The other issue with Gartner’s representation of the technologies it surveys is that it doesn’t distinguish between different categories of technology: those aimed at consumers as opposed to those aimed at the business sector. Here again, we are likely to see very different paths to adoption and acceptance, with very different time frames.

What we are increasingly seeing is technology being used to enable the market to concentrate into a very small number of very large companies. In turn, these companies are able to focus their resources on introducing new technologies for the public, rapidly iterating on designs until they work. Wearables from Apple, Google and companies like Samsung are a good example of this.

As always with predictions around technology, it is very hard to tell what will be the key technologies next year, let alone in 5 to 10 years’ time. Given that the Hype Cycle has been with us for 20 years, however, my prediction is that it will still be here for the next 20.

What politicians say about Metadata. Bad metaphors and a bad idea.

An envelope revealing contents?

Watching politicians explain metadata has descended into black comedy, as both Brandis and Abbott have tried to make distinctions about something they clearly don’t understand.

Both have resorted to an analogy with letters: “metadata is the material on the front of the envelope, and the contents of the letter will remain private”.

Well, no.

Web metadata is not like the material on the front of a letter. A URL, the address of a website that you type into the address bar, encodes far more information than a simple postal address. First, the address points to information that can be publicly accessed, so if you know the URL, you know what people have seen. Second, URLs quite often carry other information that tells the website things like what you are doing. As I am writing this, the URL contains information saying that I am editing a document, and in fact details the exact document that I am editing:

Anatomy of a URL
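To make this concrete, a URL can be pulled apart with a few lines of Python. The URL below is entirely made up, a sketch of the kind of editing address described above; it is the structure, not the specific names, that matters:

```python
from urllib.parse import urlsplit, parse_qs

# A hypothetical URL of the kind you might see while editing a document online.
url = "https://docs.example.com/document/d/abc123/edit?mode=editing&user=42"

parts = urlsplit(url)
print(parts.netloc)           # the "address on the envelope": docs.example.com
print(parts.path)             # what you are doing, and to what: /document/d/abc123/edit
print(parse_qs(parts.query))  # extra details: {'mode': ['editing'], 'user': ['42']}
```

Only the first of these three components resembles an envelope’s address; the path and query parameters describe the activity itself.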

Of course even with letters, we know something about the content and this is why the security agencies are happy with just having the addresses. If I receive a letter from the tax office or a gas company, I have a pretty good idea what the letter is about. And so it is with websites, just knowing that you have visited Amazon tells you something about what you intend to do. If you have the other addresses that were used to access the site, then you would know exactly what someone was doing.

Brandis and Abbott could have saved themselves some angst, and caused less confusion, if they had talked about IP addresses rather than web addresses, or URLs. If it is the intention of the security agencies to record only the numeric address of the sites people visit, then that is different from the text address, which encodes other information. The problem with recording an IP address is that it can correspond to any number of actual web addresses: many entirely unrelated websites can be hosted at the same IP address.
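The many-to-one relationship between an IP address and the websites it serves can be sketched in a few lines of Python. All of the addresses and site names here are invented for illustration (203.0.113.0/24 is a documentation-only IP range):

```python
# Illustration only: with shared hosting and content delivery networks,
# one IP address can serve many unrelated websites.
hosted_at = {
    "203.0.113.7": ["news-site.example", "recipe-blog.example", "forum.example"],
}

def possible_sites(ip):
    """A logged IP address only narrows a visit down to a list of candidate sites."""
    return hosted_at.get(ip, [])

print(possible_sites("203.0.113.7"))
```

A log of bare IP addresses would therefore be genuinely ambiguous in a way that a log of URLs is not.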

However, it is not clear that the Government is proposing keeping just the IP addresses and it is far more likely that they mean the URLs.

Other arguments that have been put forward are that keeping the address of a site is different from keeping the browsing history. Again, this shows a fundamental lack of understanding of how the basics of the web work. If you search for something on Amazon, you will click on a link and are likely to go straight into the site via a long URL that identifies the exact product page. Although an attempt could be made to strip off everything other than the site’s domain name, it is unlikely that this will be done, and the entire URL will be kept. The processing required to analyse and make sense of URLs is not trivial and is going to prove even more burdensome than simply recording the entire URL.
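Stripping a URL back to just the site is itself simple, which underlines that keeping the full URL would be a choice rather than a technical necessity. A minimal sketch in Python, using a made-up product URL:

```python
from urllib.parse import urlsplit

def strip_to_site(url):
    """Keep only the scheme and host, discarding the path and query string
    that reveal what a visitor actually did on the site."""
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}/"

# Hypothetical product URL of the kind a search result might produce.
full = "https://www.amazon.example/gp/product/B00EXAMPLE/ref=sr_1_1?keywords=privacy"
print(strip_to_site(full))  # https://www.amazon.example/
```

The hard part is not the stripping itself but doing it reliably, at scale, across every site’s URL conventions, which is the burden referred to above.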

It is of concern when politicians (and business leaders) talk about technologies that they clearly don’t understand and then try to argue their merits. At best, it ends up confusing and alarming industry and the public; at worst, it signals another activity that is going to threaten privacy and end up as another form of “tax” that consumers will be asked to pay. If ISPs have to collect this data, it will prove expensive to manage, and that cost will be passed on to their customers.

In terms of the benefits, will this help catch terrorists? Again, in a word, no. The average high school student understands how to use technologies to get around filters that block certain sites. The technologies they use, proxies and VPNs, allow anyone to browse whilst masking what they have been browsing. Using technologies like Tor, someone can go even further to mask what they are doing.

The idea that the cost of this proposed system, even if the Government can ever settle on a clear design, would be justified by what it might find is a very long stretch.

Why governments will always fail with technology projects like the NBN

NBN (AdelaideNow)

The National Broadband Network (NBN) was one of former Prime Minister Kevin Rudd’s grand gestures. Sweeping to power in 2007, he quickly set to fixing Australia’s problems in education, health and productivity. Although it was clear that he only ever understood technology superficially, he nevertheless saw it as the answer to all of his country’s ills. Children in schools would get laptops, the health system would be reinvented through the Personally Controlled Electronic Health Record (PCEHR) and the nation would have the NBN to set it on a path to productivity levels rivalling China’s.

Rudd commands “Make it so”

Rudd rode roughshod over the detail of these plans. He epitomised the spirit of Star Trek character Captain Picard by uttering the command “Make it so!”. Unlike Picard, however, who never carried out any action before considering it from every angle, Rudd was convinced that these gestures would both immortalise him and transform society and industry at a stroke.

As we now know however, nothing is that simple, and the digital education revolution died a relatively quick death leaving the NBN and PCEHR struggling for survival on life-support.

The NBN independent audit

The most recent audit report of the NBN, released this week, has ultimately judged the entire project as “rushed, chaotic and inadequate” and the stewards of the project, NBN Co as “not fit for purpose”.

The summary findings and recommendations of the audit do not specifically deal with the NBN project itself, nor with the technological decisions that underpinned it. Instead, it recommends a range of measures to ensure proper oversight of election and other party promises, and the way that the public service and government should work to ensure this happens in future. One of the key proposals is that any project over $1 billion should be subject to a cost benefit analysis. The problem with this, however, is the same as for most of the recommendations made in the report: they sound good in principle, but are actually fairly meaningless in practice.

The devil in the benefits

Cost benefit analyses of even small projects are dubious at best. They are often heavily biased, depending on whether the people conducting them want a positive or a negative outcome. This is reflected both in how one decides what benefits will flow from a project and in how much value those benefits will generate. It is especially true of technology projects, where the benefits are largely supposed.

With the NBN, the benefits of having universal high-speed broadband were assumed to come from all of the things this technology would enable government, public services, industry and the public at large to do. Hospitals would save $190 million over 10 years. Households would save $3,800 a year by 2020. This last figure included savings of $74 a year on “communications through social engagement and social media”. Clearly, you can claim almost any benefit, especially if it comes from a reputable analyst firm like Deloitte Access Economics, as this latter figure did.

Would an analysis have got it right?

Would a cost benefit analysis in 2008 have accurately predicted the technical landscape in 2014? Absolutely not. In 2008, nobody knew that the iPhone would completely change smartphone usage and drive universal adoption of highly powerful portable computers with 4G wireless connectivity. Nobody would have foreseen the rapid change in how people access media, especially TV. Despite the lack of the NBN, Australia leads the world in illegal downloads of US TV shows like Game of Thrones.

Mix into this the role of government in making technological bets that they barely understand and the establishment of a completely new company that is going to implement all of this within political, and not corporate, constraints, and you have a recipe for the disaster that unfolded.

Should governments be involved?

The findings of the audit are not surprising. Pretty much any analyst who has watched the NBN disaster unfold over the past six years could have arrived at the same findings, or similar ones, without too much trouble. Ultimately, however, the question remains whether governments, and especially the Australian Federal Government, should even be in this business at all. Given the particular geographic and social challenges that Australia faces, and its particular mix of telecommunications industries, the answer is very likely no.

There is no disputing that Australia needs high-speed networks. Clearly, they are not going to be provided by the Federal Government, and so it is up to industry to step in and build them for commercial reasons. After all, the ultimate business case for building something is that there are customers wanting the service and that it makes enough money to support the business.