An infographic summarizing what needs to happen if edtech is to play its part in improving education provision
I’m told that everyone who’s anyone does infographics these days—and also that most of my posts are too long and difficult to understand. Well, here is my first effort at an infographic and I hope it makes things clearer.
The consensus is that we should not mind the technology but that we should focus instead on the learning. The consensus is wrong.
This is the transcript of a presentation I gave at the EdExec conference, held by ICT Matters in London on 6 November 2013. The ostensible argument of my talk is that “procurement matters”, which, I will admit, probably isn’t going to set your heart racing. But perhaps it should. The reason why procurement matters is that technology matters – a point that much of the ICT community does not generally admit. Time and again, you hear the old saw repeated: “never mind the technology, where’s the learning?” Most of my talk addressed this point, and in doing so, I take on (as is my wont in this blog) a lot of shibboleths. I summarise some arguments with which those of you who have read previous posts may be familiar, and I also foreshadow some arguments that I will develop in greater detail in future. And I return to a promise that I made in my first post to this blog in January 2012: to discuss in rather more depth than I have done before why Becta’s approach to procurement was so lamentable.
A copy of a comment regarding the difference between customisation and adaptation, and the importance of the latter to learning content that encapsulates pedagogy.
It is a central argument of this blog that the attempt to apply technology to the improvement of education has been held back by the lack of education-specific software. Such software will generally encapsulate pedagogy. An objection to this approach was recently raised by Peter Twining in a useful discussion on his blog, EdFutures. It is a little difficult to link directly to the part of the conversation where this occurs – the best way is probably to follow the link to the discussion page and then to search for “Re Technology Enhanced Learning”, which is the title of the thread in which this discussion occurs.
To paraphrase the general objection to software that encapsulates pedagogy, such software might be seen as a way of scripting lessons that dis-empower the teacher. At the top level, I would respond that many teachers have a pretty shaky understanding of pedagogy, so the ability to put pedagogically proven tools into their hands is a key way in which we will empower (not dis-empower) teachers (see my Education’s coming revolution). As for the nature of those tools, I certainly accept that the way in which software is used in the classroom needs to be flexible, allowing the teacher (the professional on the spot) to apply the software in the right way. This provides the background to my conversation with Peter Twining regarding the customisation or adaptation of education-specific software.
Peter’s argument is that, according to an OU project in the 1990s called SoURCE, in which he was involved, the pedagogy encapsulated in software often needed to be subverted by the teacher—and that this suggested that the encapsulation of pedagogy was something of a blind alley. I copy below my reply to Peter, followed by my conclusion.
Recognising that there was a problem with quality control, Daniel advocates an education equivalent of the app stores run by Apple and Google. This would enable OER authors to market their products as a sort of cottage industry. The micro-market would produce a selection process, sorting the wheat from the chaff, and would incentivise authors to improve the best.
The most fundamental of all pedagogical patterns is the conversation—and it is this paradigm that needs to inform the implementation of education technology.
Grab a cup of coffee and get comfortable! At 12,000 words this is the longest of my posts so far. But right now, it seems as if it is my most important, so I think it will be worth the read.
Over the course of 2012, I have addressed what I see as deficiencies in many of the current ed-tech theories and processes. Last month, in Education’s coming revolution, I made the general argument that education technology provided the only plausible, long-term solution to what are endemic problems in our schools, introducing a systematic approach to education that contrasted with the model of teacher-as-craftsman.
This post describes what I think those systems will look like. They will be grounded in reputable educational theory, and in particular in what is the essential design paradigm for all learning: the conversation.
The requirement for education technology rests, not on spurious arguments about “21st century skills”, but on a long-standing need to find a way of teaching traditional skills systematically and at scale. To succeed, education has to go through its own industrial revolution, which will introduce systematic processes, backed by effective quality controls and robust quantitative evidence of effectiveness.
At a recent event in London reported by Merlin John, Tim Oates of Cambridge Assessment suggested that the British textbooks produced in the 1970s by the School Mathematics Project (SMP) and the Nuffield Science series still represented the best resources around in their respective fields. This is a startling claim, coming as it does after 40 years in which we have seen a revolution in information technology and the expenditure of billions of pounds on technology in schools.
Two of the leading figures in the textbook publishing movement of the 1970s were Headmasters at my old school, Sevenoaks, a commuter town 25 miles south of London. At the time that I was at the school in the late 1970s, the Headmaster was Alan Tammadge, a principal author of the SMP series for Maths. The previous Headmaster had been Kim (L C) Taylor, who had resigned from Sevenoaks in 1970 to become Director of the Nuffield Resources for Learning project.
Resources for Learning was also the name of a book, written in the following year, in which Taylor provided a justification for the Nuffield programme. He questioned whether the comprehensive education that was being introduced in the UK at the time was realistic. His concern was not about the phasing out of selection: the problem that caught Taylor’s eye was the fact that the new system of secondary education was to be universal. Traditional education had always been provided to a small elite based on a model of the teacher-as-craftsman. So long as we clung to that model, Taylor argued, there would not be enough sufficiently well qualified teachers to go around.
Why most of what currently excites the ed-tech world is hot air: MOOCs, Learning Analytics and Open Educational Resources, amongst other fads.
I already know what my new year’s resolution will be. As well as losing a stone in weight (the same resolution every year), it will be to stop writing almost exclusively on why education technology has so far failed to transform education, and to focus more on arguing how education technology will transform education, when it is properly implemented. As the song has it:
You’ve got to accentuate the positive
Eliminate the negative
Latch on to the affirmative
Don’t mess with Mr In-between.
The predominantly negative copy of 2012 has been no more satisfying to write than I imagine it has been to read. But it has been necessary. It is impossible to make progress with a cogent argument for how education technology will transform education while most of the community accepts as self-evident half-baked notions of “independent learners” and “21st century skills”, believes that creativity is possible without knowledge, or treats testing as a dirty word. Before making a start on constructing the new, you need to demolish the old.
That will be my resolution on 1st January—but for the last few days of 2012, I will follow the prayer of St Augustine (“Lord make me chaste but not yet”) and take one last swing with the old ball and chain.
A summary of a break-out discussion at the JISC/CETIS annual conference
For me, the highlight of the 2012 JISC/CETIS annual conference was Adam Cooper’s session on “Mapping Cause and Effect”. Adam asked participants to create diagrams which traced chains of causality (both negative and positive) through to a final, desirable, pre-defined outcome.
I joined a break-out group with Colin Smythe (Chief Architect at IMS), Tore Hoel (Oslo and Akershus University, Norway), Malcolm Batchelor (JISC/CETIS), and Seyoung Kim (Carnegie Mellon University, USA). We chose to analyse what preconditions would favour or disfavour the use of analytics to improve the quality of course materials, taking course materials to be synonymous with learning content. We envisaged a scenario in which the author of a digital course might be able to track the performance of students taking the course. Having discovered from this data which parts of the course worked well and which worked less well, the author could improve the quality of the course materials.
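To make the scenario we discussed a little more concrete, the core of it can be sketched in a few lines of Python. All of the data and names below (the students, the course sections, the scores) are hypothetical, invented purely for illustration: the idea is simply that if the author can see per-student results broken down by course section, then averaging and ranking those results points to the sections most in need of revision.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical results: (student, course section, score out of 100).
results = [
    ("alice", "fractions", 82), ("alice", "decimals", 45),
    ("bob",   "fractions", 74), ("bob",   "decimals", 51),
    ("carol", "fractions", 90), ("carol", "decimals", 38),
]

# Group the scores by course section.
scores_by_section = defaultdict(list)
for student, section, score in results:
    scores_by_section[section].append(score)

# Average score per section, weakest first: the sections at the top of
# this list are the ones the author would look at revising first.
ranked = sorted(
    ((section, mean(scores)) for section, scores in scores_by_section.items()),
    key=lambda pair: pair[1],
)

for section, avg in ranked:
    print(f"{section}: {avg:.1f}")
```

In practice, of course, the preconditions we identified in the session (data capture, comparability of scores, and so on) are exactly what make the real version of this analysis hard; the sketch only shows the final, easy step.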