Design principles for edtech

An infographic summarizing what needs to happen if edtech is to play its part in improving education provision

I’m told that everyone who’s anyone does infographics these days—and also that most of my posts are too long and difficult to understand. Well, here is my first effort at an infographic and I hope it makes things clearer.

Continue reading

It's the technology, stupid!

The consensus is that we should not mind the technology but that we should focus instead on the learning. The consensus is wrong.

This is the transcript of a presentation I gave at the EdExec conference, held by ICT Matters in London on 6 November 2013. The ostensible argument in my talk is that "procurement matters", which, I will admit, probably isn't going to set your heart racing. But perhaps it should. The reason why procurement matters is that technology matters, and this is a point that much of the ICT community does not generally admit. Time and again, you hear the old saw being repeated, "never mind the technology, where's the learning?" Most of my talk addressed this point, and in doing so, I take on (as is my wont in this blog) a lot of shibboleths. I summarise some arguments with which those of you who have read previous posts may be familiar, and I also shadow some arguments that I will develop in greater detail in future. And I return to a promise that I made in my first post to this blog in January 2012, which is to discuss in rather more depth than I have done before why Becta's approach to procurement was so lamentable.

Continue reading

The iTunes model in education

Developing a marketplace for micro educational software and content

In response to my post MOOCs and other ed-tech bubbles (which listed OER as one of three significant “bubbles”), Daniel Clark (LearningShrew) posted an interesting piece on Key issues in OER and how we might overcome them.

Recognising that there was a problem with quality control, Daniel advocates an education equivalent of Google's app store. This would enable OER authors to market their products as a sort of cottage industry. The micro-market would produce a selection process, sorting the wheat from the chaff, and would incentivise authors to improve the best.

Continue reading

In the beginning was the conversation

The most fundamental of all pedagogical patterns is the conversation—and it is this paradigm that needs to inform the implementation of education technology.

Grab a cup of coffee and get comfortable! At 12,000 words this is the longest of my posts so far. But right now, it seems as if it is my most important, so I think it will be worth the read.

Over the course of 2012, I have addressed what I see as deficiencies in many of the current ed-tech theories and processes. Last month, in Education's coming revolution, I made the general argument that education technology provided the only plausible, long-term solution to what are endemic problems in our schools, introducing a systematic approach to education that contrasted with the model of teacher-as-craftsman.

This post describes what I think those systems will look like. They will be grounded in reputable educational theory, and in particular in what is the essential design paradigm for all learning: the conversation.

Continue reading

Education’s coming revolution

The requirement for education technology rests, not on spurious arguments about “21st century skills”, but on a long-standing need to find a way of teaching traditional skills systematically and at scale. To succeed, education has to go through its own industrial revolution, which will introduce systematic processes, backed by effective quality controls and robust quantitative evidence of effectiveness.

At a recent event in London reported by Merlin John[1], Tim Oates of Cambridge Assessment suggested that the British textbooks produced in the 1970s by the School Mathematics Project (SMP) and the Nuffield Science series still represented the best resources around in their respective fields. This is a startling claim, coming as it does after 40 years in which we have seen a revolution in information technology and the expenditure of billions of pounds on technology in schools.

Two of the leading figures in the textbook publishing movement of the 1970s were successive Headmasters at my old school in Sevenoaks, a commuter town 25 miles south of London. At the time that I was at the school in the late 1970s, the Headmaster was Alan Tammadge, a principal author of the SMP series for Maths. The previous Headmaster had been Kim (L C) Taylor, who had resigned from Sevenoaks in 1970 to become Director of the Nuffield Resources for Learning project.

Resources for Learning[2] was also the name of a book, written in the following year, in which Taylor provided a justification for the Nuffield programme. He questioned whether the comprehensive education that was being introduced in the UK at the time was realistic. His concern was not about the phasing out of selection: the problem that caught Taylor’s eye was the fact that the new system of secondary education was to be universal. Traditional education had always been provided to a small elite based on a model of the teacher-as-craftsman. So long as we clung to that model, Taylor argued that there would not be enough sufficiently well qualified teachers to go around.

Continue reading

What do we mean by “content”?

A presentation given to an Ad Hoc group in ISO/IEC SC36, responsible for scoping future standards work for digital learning content

Learning content is a divisive concept. Over the last few years it has become increasingly fashionable to criticize "content-driven" systems as encouraging transmissive or instructionalist styles of teaching. Ian Usher from Buckinghamshire County Council reported in 2008 that "the best work we've seen within our Moodles in Buckinghamshire hasn't come from great swathes of pre-produced content but from interactions…between learners and other learners (with teachers in there as well)". This echoes a 2006 article by Stephen Heppell stating that "Content isn't king any more, but community might just be sovereign".

There are two questionable assumptions that lie behind this now-established orthodoxy:

  • the assumption that content and community are opposed to one another;
  • the assumption that we know what we mean by “content” in the first place.

The following presentation argues that the problem with the concept of learning content is not that it is pedagogically flawed—but that it is misunderstood.

Continue reading

Learning analytics for better learning content

A summary of a break-out discussion at the JISC/CETIS annual conference

For me, the highlight of the 2012 JISC/CETIS annual conference[1] was Adam Cooper’s session on “Mapping Cause and Effect”. Adam asked participants to create diagrams which traced chains of causality (both negative and positive) through to a final, desirable, pre-defined outcome.

I joined a break-out group with Colin Smythe (Chief Architect at IMS), Tore Hoel (Oslo and Akershus University, Norway), Malcolm Batchelor (JISC/CETIS), and Seyoung Kim (Carnegie Mellon University, USA). We chose to analyse what preconditions would favour or disfavour the use of analytics to improve the quality of course materials, taking course materials to be synonymous with learning content. We envisaged a scenario in which the author of a digital course might be able to track the performance of students taking the course. Having discovered from this data which parts of the course worked well and which worked less well, the author could improve the quality of the course materials.
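The scenario we envisaged can be sketched in a few lines of code. This is a purely illustrative mock-up, not anything our group specified: all names (`weak_sections`, the section ids, the threshold) are my own invented placeholders. It aggregates per-question attempt records by course section and flags the sections where students' success rate falls below a threshold, which is the minimal analytics loop an author would need before deciding which materials to revise.

```python
# Hypothetical sketch of the analytics loop: flag course sections whose
# students underperform, so the author knows which materials to revise.
# All identifiers here are illustrative placeholders.
from collections import defaultdict

def weak_sections(attempts, threshold=0.6):
    """attempts: iterable of (section_id, correct: bool) records.

    Returns the section ids whose success rate falls below `threshold`,
    sorted worst-first."""
    totals = defaultdict(lambda: [0, 0])  # section -> [correct, attempted]
    for section, correct in attempts:
        totals[section][0] += int(correct)
        totals[section][1] += 1
    rates = {s: c / n for s, (c, n) in totals.items()}
    return sorted((s for s, r in rates.items() if r < threshold),
                  key=rates.get)

# Toy data: "quadratics" succeeds 1/3 of the time, "logs" 0/1.
attempts = [("intro", True), ("intro", True), ("quadratics", False),
            ("quadratics", True), ("quadratics", False), ("logs", False)]
print(weak_sections(attempts))  # → ['logs', 'quadratics']
```

In practice the hard problems are the ones our group actually discussed — getting clean attempt data out of the delivery platform and attributing poor performance to the materials rather than the cohort — but the computation itself is this simple.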

Continue reading