Matthew Hancock, Parliamentary Under-Secretary of State for further education, skills and lifelong learning, announced in the Sunday Times yesterday that the government was setting up a Whitehall unit “to examine how children can be taught by computers that use sophisticated algorithms to set the pace according to individual ability”. After three and a half years of virtual silence on ed-tech, this is a welcome and exciting announcement. Online tuition, he says, is the key that “could help to raise Britain from the bottom of the international educational league tables”, using technology in a complementary role to teachers, so that “computers [will] take the lead in ‘imparting knowledge’ while teachers focus on ‘mentoring, coaching and motivating’”. Not only does this statement put education technology back on the political agenda in the UK, but it does so on completely different terms from those previously proposed by the advocates of independent learning, twenty-first century skills, and the wisdom of the crowd. Instead, it reflects something very similar to the position that I have been arguing on this blog. There is even a sense of urgency detectable in the fact that Mr Hancock wants “the changes…to be implemented as soon as possible”.
A copy of a comment regarding the difference between customisation and adaptation, and the importance of the latter to learning content that encapsulates pedagogy.
It is a central argument of this blog that the attempt to apply technology to the improvement of education has been held back by the lack of education-specific software. Such software will generally encapsulate pedagogy. An objection to this approach was recently raised by Peter Twining in a useful discussion on his blog, EdFutures. It is a little difficult to link directly to the part of the conversation where this occurs – the best way is probably to follow the link to the discussion page and then to search for “Re Technology Enhanced Learning”, which is the title of the thread in which this discussion occurs.
To paraphrase the general objection to software that encapsulates pedagogy: such software might be seen as a way of scripting lessons that disempowers the teacher. At the top level, I would respond that many teachers have a pretty shaky understanding of pedagogy, so the ability to put pedagogically proven tools into their hands is a key way in which we will empower (not disempower) teachers (see my Education’s coming revolution). As for the nature of those tools, I certainly accept that the way in which software is used in the classroom needs to be flexible, allowing the teacher (the professional on the spot) to apply the software in the right way. This provides the background to my conversation with Peter Twining regarding the customisation or adaptation of education-specific software.
Peter’s argument is that, according to an OU project in the 1990s called SoURCE, in which he was involved, the pedagogy encapsulated in software often needed to be subverted by the teacher—and that this suggested that the encapsulation of pedagogy was something of a blind alley. I copy below my reply to Peter, followed by my conclusion.
Following my previous posts on the review of the National Curriculum (Digital literacy and the new ICT curriculum and Good lord! Where’s the digital literacy?), I submitted the following response to the DfE’s consultation on the National Curriculum, with particular reference to Computing.
The requirement for education technology rests, not on spurious arguments about “21st century skills”, but on a long-standing need to find a way of teaching traditional skills systematically and at scale. To succeed, education has to go through its own industrial revolution, which will introduce systematic processes, backed by effective quality controls and robust quantitative evidence of effectiveness.
At a recent event in London reported by Merlin John, Tim Oates of Cambridge Assessment suggested that the British textbooks produced in the 1970s by the School Mathematics Project (SMP) and the Nuffield Science series still represented the best resources available in their respective fields. This is a startling claim, coming as it does after 40 years in which we have seen a revolution in information technology and the expenditure of billions of pounds on technology in schools.
Two of the leading figures in the textbook publishing movement of the 1970s were successive Headmasters at my old school in Sevenoaks, a commuter town 25 miles south of London. At the time that I was at the school in the late 1970s, the Headmaster was Alan Tammadge, a principal author of the SMP series for Maths. The previous Headmaster had been Kim (L C) Taylor, who had resigned from Sevenoaks in 1970 to become Director of the Nuffield Resources for Learning project.
Resources for Learning was also the name of a book, written in the following year, in which Taylor provided a justification for the Nuffield programme. He questioned whether the comprehensive education that was being introduced in the UK at the time was realistic. His concern was not about the phasing out of selection: the problem that caught Taylor’s eye was the fact that the new system of secondary education was to be universal. Traditional education had always been provided to a small elite based on a model of the teacher-as-craftsman. So long as we clung to that model, Taylor argued, there would not be enough sufficiently well-qualified teachers to go around.
Now that the use of the term “ICT” is coming under increasing scrutiny in the schools sector, many are making more use of the term “TEL”. But “TEL” has similar flaws to “ICT”, as was brought home to me when I attended the Online Educa Berlin conference last week.
Technology Enhanced Learning (TEL) has for some time been the preferred term in the academic community when referring to the application of technology to the improvement of education. It has been the title of various funding streams of the European Commission (such as TeLearn and TELNET). The only slightly different “Technology Supported Learning” appears in the strap-lines of conferences such as Online Educa Berlin (which I attended last week) and Learning Technologies, to be held in London in January.
This post makes the case that the HE “TEL” community has been just as ineffective as the schools-level “ICT” community at delivering real improvements in education—and that some of the key reasons for this failure are embedded in the terminology itself.
Interoperability is critical if we are to build a market for educational technology, a market which will in turn enable the pedagogical innovations capable of transforming education. This post identifies six interoperability specifications which would take the first steps in this direction.
I will start by painting a quick picture of the overall education technology ecosystem towards which I think we should be aiming. I will then describe the six standards for data interoperability that I think will provide the foundation for the market that will be needed to deliver that ecosystem. These are standards for:
- Digital Learning Activities
- Reporting of performance metrics
- Declarative sequencing
- Managed use of creative tools
- Competency definitions
- Open classroom response systems
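To make the second of these concrete, a standard for reporting performance metrics would let any learning activity send results to any analytics store in an agreed shape. The Python sketch below is purely illustrative: the field names and the overall actor/verb/activity/result shape are my assumptions (loosely modelled on the style of xAPI statements), not part of any published specification.

```python
import json

# Hypothetical performance-metrics report from a learning activity.
# All field names here are illustrative assumptions, not a published spec.
statement = {
    "actor": {"student_id": "S-1024"},                 # who performed the activity
    "verb": "completed",                               # what they did
    "activity": {"id": "urn:example:fractions-quiz"},  # which learning activity
    "result": {
        "score": 0.85,      # normalised score in [0, 1]
        "duration_s": 312,  # time on task, in seconds
        "completion": True,
    },
}

# Serialise for exchange between an activity player and an analytics store.
payload = json.dumps(statement)
print(payload)
```

The point of such a standard is not the particular fields, but that both ends of the exchange agree on them in advance, so that content from one supplier can report into management software from another.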
The clue to the mystery of the missing racehorse, Silver Blaze, was provided by “the dog that did nothing in the night-time”. It was the absence of any barking as Silver Blaze was removed from her stable that aroused Sherlock Holmes’ suspicion that it was the stable manager himself who had taken the horse.
When called upon by Michael Gove to engage in “a serious, intelligent conversation about how technology will transform education”, the education technology community proved almost as unresponsive as the dog in Silver Blaze’s stable. If it woke up at all, it was only to wag its tail.
Michael Gove did not only call for a “serious, intelligent conversation” in his BETT 2012 speech; he also told people where that conversation was to happen. Naace and ALT had already set up a discussion site at www.SchoolsTech.org.uk, where they hosted the conversation over the second half of January and February 2012, with the collaboration of the DfE, which provided the stimulus questions. In July 2012, Naace and ALT published the conclusions of the conversation in a joint report, Better learning through technology (BLTT).
Both the level and quality of the debate were disappointing: the respected ed-tech journalist, Merlin John, rated most of the contributions to the debate “lacklustre”.
This post will ask three questions:
- why did the “serious, intelligent debate” not happen as we all might have hoped?
- to what extent does Better learning through technology make good the deficit?
- now that the Naace/ALT report has been published, what conclusions should we draw and how can we now move forwards again?
A copy of a response to a thoughtful New Statesman article. The article claims that Gove’s reputation is built on a myth because (1) his claim to be reintroducing rigour will turn out to be bogus; (2) he is centralising power in Whitehall and not, as he claims, in the hands of parents; (3) the benefits of academies will not spread beyond a few model schools; and (4) the claim to put an end to Labour’s white elephants (ICT and BSF) fails to recognise the continuing need, at the very least, to update the school estate.
The jury is still out on point (1). With respect to (2), it is faulty logic to argue that, because Whitehall is becoming more powerful at the expense of local authorities, parents cannot also become more powerful. But although I am a supporter of what Gove is doing, I tend to agree with the New Statesman on points (3) and (4). Below is a copy of my comment submitted on their website.
A presentation given to an Ad Hoc group in ISO/IEC SC36, responsible for scoping future standards work for digital learning content
Learning content is a divisive concept. Over the last few years it has become increasingly fashionable to criticise “content-driven” systems as encouraging transmissive or instructionalist styles of teaching. Ian Usher from Buckinghamshire County Council reported in 2008 that “the best work we’ve seen within our Moodles in Buckinghamshire hasn’t come from great swathes of pre-produced content but from interactions…between learners and other learners (with teachers in there as well)”. This echoes a 2006 article by Stephen Heppell stating that “Content isn’t king any more, but community might just be sovereign”.
There are two questionable assumptions that lie behind this now established orthodoxy:
- the assumption that content and community are opposed to one another;
- the assumption that we know what we mean by “content” in the first place.
The following presentation argues that the problem with the concept of learning content is not that it is pedagogically flawed, but that it is misunderstood.