Ask a parent what a school should teach and they’ll tell you, “When my child leaves school, I want them to be able to understand money, to work well with others, go to university or to get a good job.”
Ask an inspirational teacher and they’ll give a different answer. In physics they’ll say students should be inspired by questions about how the universe started and whether there’s life out there. In chemistry, to learn how to build new drugs and materials that will revolutionise our lives. In history, to learn from our mistakes and successes to build a better future.
What should an inspirational teacher say about computing, the discipline that defined the late 20th century and that is already constructing the 21st? And what computing knowledge and skills should a parent expect their child to know when they leave school? Answers to these questions should inspire and define the content of any computing curriculum, including that to be delivered for the first time in the UK in September 2014.
The beginning of the modern information age can arguably be traced back to Manchester University’s “Baby”, the first stored-program computer. Baby ran its first, hand-crafted 17-line program on midsummer’s day 1948. The program – which finds an integer’s highest factor – is very hard to read. It was written in raw machine code, without the bells and whistles that modern programming languages have today.
Even so, a student of computing should be able to read and understand how that first program works. Computational thinking – the computer-science-led basis of the UK’s new computing curriculum – provides some great tools to do so.
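To make the example concrete, here is a minimal sketch in Python of what Baby’s program computed – the largest proper factor of a number. This is an illustrative modern re-creation, not Baby’s actual code: the original worked in machine code and tested candidates by repeated subtraction rather than a division operation.

```python
def highest_proper_factor(n):
    """Return the largest factor of n that is smaller than n itself.

    A hypothetical modern equivalent of Baby's 1948 program, which
    searched downwards for the first number that divides n exactly.
    """
    for candidate in range(n // 2, 0, -1):  # no proper factor exceeds n/2
        if n % candidate == 0:
            return candidate
    return None  # n < 2 has no proper factor

# Baby's famous first run used 2**18 = 262144 as its input.
print(highest_proper_factor(262144))
```

Reading a dozen lines like these is a far smaller task than the computational thinking needed to write them in 1948, on a machine with 32 words of memory.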
But being able to read and understand Baby’s first program clearly isn’t the same as being able to write it – in the same way that being able to read and understand Shakespeare isn’t the same as being able to write Romeo and Juliet.
And what if we need to bring Baby’s program up to date? How about making it work on the web, with password protection to guard its use, or on a smartphone? What would it look like then?
Solving problems like this requires different skills. Great thinkers, such as the mathematician George Pólya, have taught us that to solve problems you need a “method”. And because much of the real-world value of modern computing is delivered through its modern methods, students at each stage should be introduced, in an age-appropriate way, to the modern ways that computing builds solutions – in addition to the computational thinking and creativity that the curriculum recommends. But this view of computing appears to be missing from the new curriculum.
Schools caught in the schism
Soon after Baby’s first program, computers started to leave academia and make their way into business and beyond. Those were heady days when everything seemed to fall within computing’s grasp. There were breakthroughs on both sides.
Business computing invented ways of thinking about real-world problems, fuelling the development of methods able to cope with increasing complexity and a high pace of change: reaching a better understanding of users’ needs, capturing good software design and practice for reuse, and using software to drive business change and underpin today’s digital economy. With these and more, software and information systems engineering took shape.
Academia, too, made great strides: researchers pushed the boundaries of computation, learnt how to build error-free software and how to bring human logic, reasoning and other capabilities into the machine – with artificial intelligence. They learned how quickly algorithms could run and developed new ways of thinking about computation. With these and more, computer science took shape.
Unfortunately, the divide between business computing’s need for engineering and academia’s wish for computer science grew. Much of what the two approaches had in common was lost and something of a schism opened up.
Schools, which should have been the beneficiaries of their synergy, emerged from this schism with the lowest common denominator: “digital literacy”. Their job was to teach children how to use office applications such as Word and PowerPoint – devoid of either engineering or science.
As the country’s ability to do computing faltered, the realisation grew that digital literacy was not enough. But it took Eric Schmidt, executive chairman of Google, to get politicians to listen. And so a new computing curriculum was born.
The excitement and backlash in computing circles was palpable. What would schools teach and how would we introduce the next generation to computing?
A curriculum for 21st century computing
With computational thinking, the new curriculum certainly meets the challenge to go beyond digital literacy. Yet its emphasis on code and computation appears to neglect the real-world context of much of today’s computing and the hard lessons of decades spent engineering software systems fit for real-world problems.
Full-blown, industrial-strength software engineering is like industrial chemical engineering: too complex to be taught in schools. But coding without an appreciation of computing’s methods and their associated tools simply misses the point. It has as much chance of preparing someone for the complex and volatile world of computing as an English curriculum that teaches just spelling and grammar has of preparing a student of Shakespeare who wishes to write like the playwright.
What should the next generation hear when they meet their first computing teacher? It should sound something like this:
Good morning children!
If you want to solve the world’s problems with computers you’re going to have to know about how to build software with the computing method. You’re going to be able to help spacecraft fly to the stars, to discover new drugs that will make people well again, to build thinking machines that rival the human brain, to build new businesses that no-one has ever thought of before. With computing, you can dream of solving the world’s problems.
Now, here’s a problem I want you to solve with the three “C”s of computing: computational thinking, creativity and computing methods.