
Memory cannot be outsourced

A couple of weeks ago, I wrote a post arguing that the ‘traditional’ model of a hierarchical, teacher-centric classroom has not existed in English schools since at least the 1960s. In the comments thread, I am trying to list examples of people claiming that this model of teaching still does exist (normally in the context of arguing that it should be abolished). Andrew Old has been particularly diligent in finding examples. Thank you, Andrew.

In this post, I want to do something similar: list examples of people claiming that we don’t need to learn facts because we have the internet. I mean things like this:

What matters today is how to process and manipulate knowledge, rather than absorbing and memorising facts from within a narrow specialism…Facts learned at school become irrelevant to most of life’s challenges since the internet makes knowledge universal and immediately accessible.

And this:

Why teach them about the Battle of Hastings when they have got Google?

And this:

We are no longer in an age where a substantial ‘fact bank’ in our heads is required.

This is another example of a completely false idea which is nevertheless extremely pervasive. What is so frustrating about this one is that the evidence for why we need to memorise facts is solid and well-documented, but not nearly so well-known. The best article about this available on the internet is, I think, E.D. Hirsch’s You Can Always Look It Up…Or Can You? I have also written about it here, here and here.


Why 21st century skills are not that 21st century

Whenever I hear anyone talk about preparing students for the 21st century, I am always sceptical. Partly this is because it is never made clear exactly what is so different about the 21st century that requires such different preparation. For the American organisation Partnership for 21st Century Skills (P21), which is sponsored by a number of multinational corporations, the four important 21st century skills are ‘critical thinking and problem solving; communication, collaboration; and creativity and innovation.’[i] For the Royal Society of Arts, the skills that are needed for the future are: ‘citizenship, learning, managing information, relating to people and managing situations’.[ii] For Sir Ken Robinson, in the 21st century people need to be able to ‘adapt, see connections, innovate, communicate and work with others’.[iii]

Of course, I would agree that these skills are important. But I fail to see what is so uniquely 21st century about them. Mycenaean Greek craftsmen had to work with others, adapt and innovate. It is quite patronising to suggest that no-one before the year 2000 ever needed to think critically, solve problems, communicate, collaborate, create, innovate or read. Human beings have been doing most of these things for quite a long time. The alphabet, a fairly late development of civilisation, was invented in the 21st century BC.

It probably is true that the future will require more and more people to have these skills, and that there will be fewer economic opportunities for people who don’t have them. But that would suggest to me that we need to make sure that everyone gets the education that was in the past reserved for the elite. That’s not redefining education for the 21st century; it’s giving everyone the chance to get a traditional education.

And that is where my real problem with the concept of 21st century education lies. To the extent that it says that creativity and problem solving are important, it is merely banal and meaningless; to the extent that it says such skills are unique to the 21st century, it is false but harmless; to the extent that it proposes certain ways of achieving these aims, it is actually pernicious. This is because very often, the movement for ‘21st century skills’ is a codeword for an attack on knowledge.

Of course, one way the 21st century really is different to other eras is in the incredible power of technology. But this difference, whilst real, tends to lead on to two more educational fallacies. Firstly, it is used to support the idea that traditional bodies of knowledge are outmoded: there is just so much knowledge nowadays, and it is changing all the time, that (so the argument goes) there is no point learning any of it to begin with. The Association of Teachers and Lecturers argue, for example, that: ‘A twenty-first century curriculum cannot have the transfer of knowledge at its core for the simple reason that the selection of what is required has become problematic in an information rich age’.[iv] The popular YouTube video ‘Shift Happens’ tells us that 1.5 exabytes of unique new information are generated each year, and that the amount of new technical information is doubling each year.[v] It then concludes that this flow of new information means that for students starting a four-year college or technical degree, half of what they learn in their first year will be outdated by their third year of study.

This is simply not true. Of course people make new discoveries all the time, but a lot of those new discoveries don’t disprove or supersede the old ones; in fact, they’re more likely to build on the old discoveries and require intimate knowledge of them. The fundamental foundations of most disciplines are rarely, if ever, completely disproved. Universities can turn out as many exabytes of information as they like; they are unlikely to disprove Pythagoras’s theorem or improve on Euripides’s tragedies. And there are very many such ancient, fundamental ideas and inventions which have stood the test of time: perhaps more than we are willing to admit. The alphabet and the numbering system, for example, are two of the most valuable inventions we have. As far as we know, these were invented in about 2000 BC and 3000 BC respectively. So far they show no signs of wearing out or being superseded. All of the most modern and advanced technological devices depend on them in one way or another. Indeed, if anything, the sheer proliferation of knowledge should lead to selective bodies of knowledge becoming more important, as mechanisms for sorting the wheat from the vast amounts of chaff.

Secondly, advances in technology are used to do down knowledge on the grounds that they remove the need for pupils to memorise anything. This is the ‘Just Google It’ fallacy, which I dealt with briefly here and here, and which E.D. Hirsch deals with comprehensively here.[vi] Put simply, looking things up effectively on the internet requires a great deal of knowledge to begin with.

What I think you can see from this is that too often the idea of 21st century skills is just a codeword for an attack on knowledge and memory. This is ironic because, as I now want to explain, the message of late 20th century and 21st century science is that knowledge and memory are unbelievably important.

As Kirschner, Sweller and Clark put it:

‘our understanding of the role of long-term memory in human cognition has altered dramatically over the last few decades. It is no longer seen as a passive repository of discrete, isolated fragments of information that permit us to repeat what we have learned. Nor is it seen only as a component of human cognitive architecture that has merely peripheral influence on complex cognitive processes such as thinking and problem solving. Rather, long-term memory is now viewed as the central, dominant structure of human cognition. Everything we see, hear, and think about is critically dependent on and influenced by our long-term memory.’[vii]

You will see that Kirschner et al. say that our understanding of human cognition has altered dramatically over the last few decades. A large part of this is down to the work of artificial intelligence pioneers. In the 1950s and 60s, scientists set out to create artificial intelligence in computers, and they realised as they did so that their understanding of real, human intelligence was incredibly hazy. The research they did to try to understand real intelligence is fascinating and has huge implications for the classroom. And as Kirschner et al. suggest, one of their strongest findings was that knowledge plays a central part in all human cognition. The evidence on this is solid. Much of the early research was done with chess players, including one fascinating experiment by Adriaan de Groot; the electronic chess games that can beat you are based on the research these AI pioneers did. All the research in different fields confirms this. Dan Willingham sums it up with this line:

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not just because you need something to think about. The very processes that teachers care about most – critical thinking processes such as reasoning and problem solving – are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).[viii]

And yet, as we have seen, many advocates of ‘21st century skills’ speak disparagingly of knowledge and want to marginalise its place in the curriculum. This is despite the fact that there is no research or evidence backing up their ideas. Indeed, the guilty secret of the 21st century skills advocates is that it is their ideas which are rather old hat. Diane Ravitch notes how, at the beginning of the 20th century, many educators wanted to throw away traditional knowledge and embrace ‘20th century skills’.[ix]

The most depressing thing about all of this, therefore, is that old ideas which have been thoroughly discredited are being warmed over and presented as being at the cutting edge. And it is particularly ironic that the actual cutting-edge science is telling us to do the complete opposite of what most of the ‘21st century skills’ advocates want.

 


[i]   ‘Shift Happens’, http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 19 July 2011.

[ii] Royal Society for Arts Opening Minds. What is RSA Opening Minds? http://www.rsaopeningminds.org.uk/about-rsa-openingminds/ Accessed 19 February 2011.

[iii] National Advisory Committee on Creative and Cultural Education. All Our Futures: Creativity, Culture and Education. 1999, p.14. http://www.cypni.org.uk/downloads/alloutfutures.pdf Accessed 19 February 2011.

[iv] Association of Teachers and Lecturers. Subject to Change: New Thinking on the Curriculum. London, 2006. http://www.atl.org.uk/Images/Subject%20to%20change%20-%20curriculum%20PS%202006.pdf Accessed 19 July 2011.

[v] Fisch, Karl. ‘Shift Happens’. http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 21 January 2011.

[vi] Hirsch, E. D. ‘You Can Always Look It Up…Or Can You?’ American Educator, Spring 2000.

[vii] Kirschner, P. A., J. Sweller, and R. E. Clark, ‘Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching.’ Educational Psychologist (2006) 41:2, 75-86, p.76. http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf

[viii] Willingham, Daniel T., Why Don’t Our Students Like School?, San Francisco: Jossey-Bass, 2009, p.28.

Why you can’t just Google it

In my post here, I talked about the pervasive modern idea that Google renders memory irrelevant, and explained why that idea is false. I want to return to the point here with some further explanation.

The best explanation of why you can’t ‘Just Google It’ is by E.D. Hirsch here. Essentially, his point is this:

There is a consensus in cognitive psychology that it takes knowledge to gain knowledge. Those who repudiate a fact-filled curriculum on the grounds that kids can always look things up miss the paradox that de-emphasizing factual knowledge actually disables children from looking things up effectively. To stress process at the expense of factual knowledge actually hinders children from learning to learn. Yes, the Internet has placed a wealth of information at our fingertips. But to be able to use that information—to absorb it, to add to our knowledge—we must already possess a storehouse of knowledge. That is the paradox disclosed by cognitive research.

What I want to focus on here is one wonderful example he gives which proves the point. It is a summary of research done by the cognitive psychologist George Miller. Miller asked pupils to use dictionaries to look up word meanings and then to use those words in sentences. He got back sentences like these:

“Mrs. Morrow stimulated the soup.” (That is, she stirred it up.)

“Our family erodes a lot.” (That is, they eat out.)

“Me and my parents correlate, because without them I wouldn’t be here.”

“I was meticulous about falling off the cliff.”

“I relegated my pen pal’s letter to her house.”

I instantly recognised this phenomenon. I have read lots and lots of sentences like this, far too many to remember. Two I do remember are:

The weather outside was ingratiating. (After looking up ‘nice’ in a thesaurus.)

He was congenial at football. (After looking up ‘good’.)

What Miller and his fellow researchers did, however, was to extrapolate from this very common occurrence a profound and seemingly counter-intuitive insight: in order to use reference works such as dictionaries, thesauri and encyclopedias, you already need to know quite a lot about the thing you are looking up. As Hirsch says:

Of course, Professor Miller is in favor of dictionaries and encyclopedias in appropriate contexts where they can be used effectively by children and adults. But those contexts turn out to be the somewhat rare occasions when nuances of meaning can be confidently understood. Reference works including the Internet are immensely valuable in those constrained circumstances. But Miller has shown very well why, outside those circumstances, adults use reference resources so infrequently. His observations are well supported by other areas of cognitive psychology.

Hirsch’s whole article, and the article in which Miller explains these and other ideas, are both fascinating and well worth reading.

This is what I love about modern research into cognition. Reading it is like the moment at the end of a well-constructed detective story when you go ‘Ah! Of course! That’s how it all fits together.’ It offers a convincing theoretical framework that seems to me to make sense of so many of the apparently baffling things my students do.