Now that the use of the term “ICT” is coming under increasing scrutiny in the schools sector, many are making more use of the term “TEL”. But “TEL” has similar flaws to “ICT”, as was brought home to me when attending the Online Educa Berlin conference last week.
Technology Enhanced Learning (TEL) has for some time been the preferred term in the academic community when referring to the application of technology to the improvement of education. It has been the title of various funding streams of the European Commission (such as TeLearn and TELNET). The only slightly different “Technology Supported Learning” appears in the strap-lines of conferences such as Online Educa Berlin (which I attended last week) and Learning Technologies, to be held in London in January.
This post makes the case that the HE “TEL” community has been just as ineffective as the schools-level “ICT” community at delivering real improvements in education—and that some of the key reasons for this failure are embedded in the terminology itself.
The semantic argument
As Douglas Adams’ Hitchhiker’s Guide to the Galaxy observed, terminology matters: “It is of course perfectly natural to assume that everyone else is having a far more exciting time than you. Human beings, for instance, have a phrase which describes this phenomenon: ‘The other man’s grass is always greener’. The Shaltenak race of Broupkedron 13 had a similar phrase but since their planet is somewhat eccentric, botanically speaking, the best they could manage was, ‘The other Shaltenak’s jupelberry shrub is always a more mauvey shade of pinky russet’—and so the expression soon fell into disuse and the Shaltenaks had little option but to become terribly happy and contented with their lot, much to the surprise of everyone else in the galaxy, who had not realised that the best way not to be unhappy is not to have a word for it.”
The wrong terminology can make it impossible to say some things while other, unexamined assumptions become hard-wired into the discourse.
I suggest that there are four problems with “TEL” as a term: one for each of the first two words in the acronym and two that are associated with the third word.
In the case of “technology”, the problem lies not in the word but in the context in which it is used. The reference to technology tout court, without any qualifier, suggests a conception of technology as a kind of generic commodity, the sort of thing you might go down to the local supermarket to buy in large jars, like peanut butter. “Could I have some technology on my learning please?”
Technology isn’t like that: “technology” is an abstract noun, not just a collective one. Technology would be better thought of as an “opportunity to innovate” rather than a commodity. Yet the reference to technology, without any qualifier, suggests that any technology will do.
This is not just a semantic argument: it is in just this sense that the “technology” in TEL is thought of. Motivational speeches at Online Educa Berlin tried to persuade attendees that they were living through a revolutionary moment in education by describing the impact of mobile devices on the businesses of taxi companies and hotels. Projects trying to apply technology to our classrooms focus on digital video and internet discussion. And the discussion forums buzz with excitement over the potential impact on education of tablet computers, Second Life, or commercial games. In every case, people are transfixed by generic, off-the-shelf technology, developed by other people for other purposes—and the importance (even the possibility) of education-specific technology is ignored. In the words of Richard Noss, Director of the London Knowledge Lab at the Institute of Education in London, “For too long learning has been subsisting on the crumbs of technologies designed for other purposes”.
The notion that we need education-specific technology does not, of course, mean that we need to bin the generic technologies that we already have and start again. The development of sector-specific technologies is about refining and adapting more generic technologies, about dwarves sitting on the shoulders of giants. When I suggested the need for education-specific technology in the SchoolTech discussion, Theo Kuechel replied: “Mmmm…. I haven’t seen an educational version of the most important ‘real world’ technology yet… namely The Internet. ps Thank Goodness! : )”
This misses the point. The purpose of sector-specific technologies is not to duplicate generic technologies but to supplement them by addressing the sector-specific requirements that generic technologies overlook. It would be absurd to suggest that we try to reinvent the internet—but this point of view ignores the nested nature of technological developments (see my previous post Aristotle’s Saddlemaker). What is needed is to refine the internet to serve our sector-specific requirements, to grow some outer branches on our particular (and currently stunted) part of the technology tree—just as Facebook, Google or the Worldwide Web itself extend the Internet in their particular parts of the technology tree.
The problem with “enhanced” is the same as with “ICT”: it confuses the use of technology to improve education and the teaching of technology as an end of education. It would be perfectly reasonable to claim that learning could be enhanced in both senses:
- by using technology to ensure that people learn more quickly and effectively;
- by teaching students a technology-enhanced curriculum.
In practice, “TEL” is generally used in the first of these two senses—but the ambiguity in the terminology is still significant as the distinction between the use and teaching of technology is still widely ignored. We need a terminology which makes this distinction clear.
The ambivalence, noted above, as to what sort of enhancement we are talking about, is partly the responsibility of the word “learning”, which can refer both to the process of acquiring competencies and to the substance of the competencies acquired—as when you might say of someone that “he has much learning”. Both “teaching” and “education”, by contrast, refer unambiguously to processes.
My objections to the third word in the “TEL” acronym take on a more widely revered shibboleth in the educational community: the perception that we should be talking about “learning” rather than “teaching” or “education”. Although learning constitutes part of the objective of teaching or educating, I suggest that there are two reasons why this emphasis on “learning” is unhelpful.
Learning is not necessarily good
Learning is something humans do naturally all the time. But much of our learning is harmful, not beneficial: becoming addicted to hard drugs is a learnt behaviour; habitually beating up your girlfriend is a learnt behaviour; not getting out of bed before mid-day is a learnt behaviour. The job of the educationalist is to lead—educare in the Latin—students to the right sorts of learning and to avoid the wrong sorts of learning. Educationalists who promote “learning” as an intrinsic good ignore this responsibility.
Teaching (not learning) is what we do
However much we want our students to learn what we are teaching them, we (qua teachers) cannot induce them to do so except by teaching them. Isaac Newton denied the possibility that a force could operate at a distance, without any particle, wave or other influence travelling between the affecting and affected objects. He was talking about gravity but education is the same. We cannot induce learning to occur by magic: if we as teachers can have any beneficial effect at all, it will be through the medium of teaching.
Teachers who constantly talk about learning are therefore like motor manufacturers who constantly talk about driving, or advertisers who constantly talk about selling. Learning, driving and selling might be the ultimate objectives of these three different types of expertise—but it is not what they themselves, in respect of their expertise, actually do. Such discussions may reflect aspirations but they will not improve the efficiency of any operations under the control of the agents concerned.
Talk of “learners” rather than “students” or “pupils” is similarly aspirational. By assuming they are learning, ex officio as it were, we feel less urgency in considering the measures that are needed to ensure that they do learn. The possibility that “the learners are not learning” becomes—by virtue of the poorly chosen words that we use—a contradiction in terms.
Use of the word “teaching” does not imply any particular style. You do not have to stand at the front of the class and use chalk and talk. You can be an ever-so-subtle facilitator of learning, you can create micro-worlds and other creative opportunities, designed to maximise the opportunities for the student to discover particular insights, as if for themselves. What you are doing is still teaching—and the underlying process is still fundamentally transmissive—and rightly so. If we condemned our students to go back to square one and discover for themselves all of human knowledge, it is unlikely that they would ever get much further than the stage of civilization attained by the early hominids, roaming the savannahs of East Africa.
Finally, there are many parts of our teaching practice that do not directly involve learning by the student: designing programmes of study, marking student work, tracking student performance and planning interventions and remediation. All these suggest requirements for teaching technology, which will be entirely missed by a discipline which focuses only on learning technology.
Constructivism: the intellectual hinterland of the doctrine of learning
The two specific problems discussed above need to be supplemented by a discussion of a deeper orthodoxy still that underpins the focus on learning—the theory of constructivism—the observation that learners “construct” their own knowledge. It is this theory that has spawned an approach to child-centred, personalised learning in which the learner becomes responsible both for what and how they learn. Although it sometimes seems hard to find an educationalist who does not subscribe to the doctrine of constructivism, the theory seems to me to be deeply flawed, at least at the fundamental, epistemological level.
Those who assert that learners construct their own knowledge had better start with a clear understanding of what they mean by “knowledge”. Epistemologists define knowledge as “justified, true belief” (this is not the end of the story—but it is a good first step). You cannot claim to “know” something unless:
- you believe it to be true;
- it is true;
- you have a reason for believing it is true.
We may “construct” our own belief, at least in the sense that belief must somehow be encoded in our own brains and most of us would think of our brains as being part of “me”. That itself is a simplification because the overwhelming majority of information encoded in our brains is not only unknown to the conscious “me” but is encoded in our brains without our conscious participation. Belief happens not only without, but sometimes despite the conscious “I”. I might like very much to believe in God or fairies or Father Christmas—but as every vicar who has suffered a crisis of faith has found out, the conscious “I” cannot choose what it wants to believe in.
Even if the ego could construct its own belief, belief is not the same as knowledge. The acquisition of beliefs that are true and justified requires more than construction—it requires belief that has been constructed in the right way. What constitutes such a “right way” cannot be learnt once and applied for ever thereafter. Every discipline, every topic and every individual assertion will have its own criteria against which proper justification must be measured (truth is, in the final analysis, beyond our ability to prove conclusively). In abstract academic disciplines, these “criteria” cannot always be learnt by exploring a microworld. The student can fail to write a poem in iambic pentameter and he does not burn his fingers, fall off his bicycle or have the sky fall in. He needs a teacher to point out the mistake.
So constructivism itself, while it points out some homely pedagogical truths about the need for student engagement in his own learning, is—judged as a grand theory—based on some very dodgy foundations. It should not be taken to provide a justification for continuing to talk about “learning” at the expense of “teaching”.
This is perhaps why Seymour Papert invented his own term, “constructionism”, to encapsulate the idea that children learn through creative play, in particular using microworlds, programming environments and robotics, while avoiding the relativist implications of the theory of “constructivism”.
The empirical argument
You may think that the argument above is a bit pedantic. You might say “Who cares what we call the thing? What really matters is whether it works”.
The answer is that it doesn’t—at least not reliably, sustainably and scalably. A detailed examination of the evidence is for another post but the general conclusion should not surprise anyone. Occasionally, news is heard that TEL has been responsible for a dramatic improvement in results in some remote and probably well-funded project, but firm, statistically significant evidence seems to be as hard to track down as the Abominable Snowman. A presentation at a pre-conference workshop on Wednesday by the European Commission, which has funded a series of large academic research projects into TEL, explicitly recognised that there was a general “absence of evidence” that the projects had achieved lasting impact. A report out last month from the London Knowledge Lab is based on the premise that “evidence of digital technologies producing real transformation in learning and teaching remains elusive”.
Another session during the Online Educa conference discussed Massive Open Online Courses (MOOCs), with presentations from Gary Matkin of the University of California and Robert Cummings of the University of Mississippi. MOOCs are about the most exciting kid on the block at the moment from the point of view of TEL and many people are speculating on how companies like Coursera, edX and Udacity are going to revolutionise Higher Education. The New York Times called 2012 “the year of the MOOC”. But the more the speakers evangelised at Online Educa, the more it became clear to me that MOOCs were much more likely to be a bubble than a revolution. Everyone is excited by the fact that they are free—but this does not mean much so long as none of them have yet worked out a sustainable business model. Although they claim to accommodate hundreds of thousands of students, their method of delivery, in the main, “simply involves videotaping lectures and putting them online”. This regressive, transmissive pedagogy was amply reflected in the Online Educa Exhibition Hall, which was dominated by companies providing online video streaming services. And the most telling statistic of all about MOOCs is that they have a 93% drop-out rate. When I questioned Gary Matkin on this disappointing statistic, he replied that the metric was to blame. High drop-out rates were fine: the 93% of people who did not finish were presumably only interested in Chapter 1 (though no-one appeared to have bothered to establish this by actually asking them).
The key to making the MOOC even halfway successful is in providing a scalable, machine-delivered pedagogy (I say “halfway successful” because my own presumption would be that blended options will always be preferable). In the absence of education-specific technology, this is just what the academic TEL community has failed to do. Donald Clark of the University for Industry, speaking at an earlier session, The Empty Campus, admitted that the delivery of traditional e-learning courses was no cheaper than face-to-face delivery, owing to the requirement for intensive, one-to-one online tutoring. The use of generic digital communications technologies to deliver courses remotely has for many decades proved useful for isolated learners (be they adults in the workplace, excluded pupils, or children living in the Outer Hebrides). But as a way of improving the efficiency of mainstream education, overcoming isolation does not constitute a sufficient benefit.
I had numerous conversations, with speakers, with the Question Time panel run by Graham Attwell, and with old e-learning campaigners whom I met in different lunch queues, who all agreed that the claim being made by the “TEL” acronym did not really stack up. Many also volunteered or agreed with the view that the TEL community represented an introspective bunch of academics saying the same things, year after year, and responding with intolerance to criticism.
One cause of the confusion lies in what exactly the role of academics should be in this field. No-one seems quite sure whether they are speaking as:
- practitioners, who use TEL in their own courses (which is why conferences like Online Educa are not only given by academics but are predominantly about HE);
- experts, who hold transferable and marketable TEL skills;
- visionaries, who pronounce on how technology could or should change the aims of education and its wider contribution to society.
In practice, the first and last of these three roles tend to predominate. Many speakers come across as opinionated practitioners working with off-the-shelf or home-made technologies, rather than genuine experts working with industry.
The lack of obvious expertise links to the apparent lack of interest in new technologies. On one of the exhibition stands I came across a company selling eye-tracking technology. An attachment at the bottom of the screen could monitor where on the screen you were looking, allowing you to play a game of Asteroids just by looking at any threatening rock. Here is a technology which I suspect might have considerable potential for education:
- improving accessibility for students who cannot handle a mouse;
- providing immersive simulations, by which students not only spoke to voice recognition software, but also directed what they were saying to a particular character (perhaps in the context of an intelligent language laboratory);
- enabling rapid fire drills, increasing significantly the speed of student responses (and therefore the intensity of the drill) by cutting out the need for fiddling around with the mouse;
- diagnosing the way students were reacting to a problem on screen, analysing cognitive processes and levels of understanding, and giving contextual hints.
These possibilities might offer genuine opportunities for new education technologies—but were there any sessions giving feedback on research into any of them? Or into the statistical reliability with which competency could be measured? Or the use of automatic sequencing software to aid reinforcement? You bet there weren’t.
TEL is a poorly conceived acronym. It is used by a community whose discourse is introspective and uninspiring—and after more than ten years of trying, that has not produced any measurable improvements in teaching and learning.
Building a new approach that focuses on education-specific technologies will take time. But the first step should be to get our terminology straight. Just as at school level we should park “ICT” and talk instead about “Computer Science” and “Digital Literacy”, so at HE and theoretical levels, we should leave “TEL” behind and talk instead about “education technology”. Not only is this the preferred term in the US, but it is also superior to TEL because:
- it clearly references the creation and application of technologies that are appropriate to education;
- it clearly addresses education, which is the process that falls within the responsibility of the profession, avoiding the ambivalence and contradictions implied by the discussion of “learning”.
Academics have an important role in the development of education technology—but they will only be effective in that role when:
- they separate out into those who are experts and those who are merely practitioners;
- they stop trying to apply off-the-shelf technology to the complex business of education, but start instead to work with industry (as would any academic engineer, physicist or chemist) on the development and application of education-specific technologies.
The feature picture for this post is Oscar Pistorius, coming off the starting blocks on his running blades. The special techniques required to run with carbon fibre blades (the application of the technology) are doubtless a matter requiring considerable skill, the lessons from which need to be tracked carefully by the manufacturers of the blades themselves (the creators of the technology). But no-one will win the Paralympics 100 metres, however good their technique, if their running blades are no good. It is with the application-specific technology that we must start.
Create the technology first and then, in an iterative cycle of application and optimisation, requiring a close relationship between developers and users, you will be able to find out how that technology can be used to best effect.
This conclusion is being reached by the more thoughtful parts of academia itself, which seems to be particularly well represented at the Institute of Education in London. According to a recent report from Richard Noss’ London Knowledge Lab: “Academic, and practitioner research particularly, is poorly connected and is typically conducted in isolation from the technology developers whose products grace our schools and homes.”
So to any academic who claims to be an expert in education technology, the most obvious opening question should be, “Which technology companies are you working with?”
- Scrapping “ICT” argued that the term “ICT” was no longer useful and should be scrapped. I did not know at the time that the Royal Society had published a report 5 days earlier which came to the same conclusion.
- Aristotle’s Saddlemaker makes the argument for education-specific software, based on a discussion of the relationship between ends and means found originally in Aristotle’s Nichomachean Ethics.
- Learning Analytics for better Learning Content explores the preconditions for enabling learning analytics to be used to improve the quality of learning content (one of which is commercial provision of content).
Nevertheless, they do exist: see “Why minimal guidance during instruction does not work” at http://igitur-archive.library.uu.nl/fss/2006-1214-211848/kirschner_06_minimal_guidance.pdf. I should stress that I am not arguing against many practical conclusions of constructivist theory—such as the importance of experiential and contextualised learning—but against the elevation of constructivism into a doctrine that mixes up many different positions, many of them with relativist implications, and that, in the accounts of many educationalists, places this theory beyond criticism.
 Decoding Learning, November 2012, by Rosemary Luckin, Brett Bligh, Andrew Manches, Shaaron Ainsworth, Charles Crook and Richard Noss, http://www.nesta.org.uk/home1/assets/features/decoding_learning_report
 Audrey Watters at http://www.hackeducation.com/2012/12/03/top-ed-tech-trends-of-2012-moocs/
 Rose Luckin and Richard Noss at http://ioelondonblog.wordpress.com/category/richard-noss/.