Tag Archives: 21st century skills

Activities, projects and the American Educator

One of the most wonderful educational resources I have found on the internet is the archive of the journal American Educator. It is the journal of the American Federation of Teachers, America’s second largest teaching union. Although the AFT is very similar to unions in England in its stance on employment rights, pensions and pay, it is very different from the teaching unions in this country in its stance on pedagogy. For example, prominent unionists in England believe that school curricula should be reformed for the 21st century and should include lessons on how to walk. The AFT, on the other hand, are supporters of Core Knowledge, a kindergarten-to-eighth-grade curriculum that focuses on traditional subjects and content.

There are tons of good articles in the American Educator, but one I want to blog about now is by Gilbert Sewall. It’s about the way that activities and projects have come to take up more and more classroom time. Here’s an excerpt:

  • A third-grade social studies student in California builds an Endangered Species “portfolio.” For the entire year. This portfolio is given over to the demise of the toucan and the Galapagos tortoise. The portfolio is brightly colored, laminated and spiral bound, containing lots of glossy photographs clipped from magazines. Each page is thick with adhesive stick-ons and glitter. The portfolio contains many, many misspelled words and exhibits almost no understanding of the South American continent’s natural history…
  • A seventh-grade suburban Maryland student builds a shoebox-sized replica of the items in his school locker for Spanish class. The academic content: He then labels the items in Spanish. Total time for the project: approximately 20 hours.
  • Ninth-grade French class students in New York City scout cookbooks for crêpe suzette and omelet recipes. They create photo montages of the Eiffel Tower and Notre Dame, making posters for display on classroom walls.
  • Selected members of a 10th-grade world history class receive cookies. The rest of the class goes empty-handed. This creates a room of haves and have-nots. Students discuss how it feels to be left out, and how it feels to be the privileged few given the cookies to eat. The purpose: to prepare for the study of the French Revolution…
  • A third-grade math program devotes a week to the concept of 1,000. One lesson centers on “Thousand skits,” in which students figure out things the class can do cooperatively to accomplish 1,000 repetitions and then try to act them out. “Work in groups of four to make up your skit. Decide what you will do, how many people you will need, and how many repetitions each person will do. Write down the directions for your skit.” This lesson is taken from a textbook series the U.S. Department of Education recommended last year to school districts across the country.
  • A sixth-grade social studies textbook suggests: “Imagine you are a television reporter covering the Roman assault on Masada. Prepare a news report on this event.”
  • An “authentic assessment” in “integrated science” designed to replace ordinary tests asks students to write a poem about mitosis. A journal of chemical education encourages high school science students to construct a new periodic table of the elements as it might appear on some unspecified alien planet.

My first reaction was to cringe. I myself had taught lots of lessons like this. I had always thought there was something a bit wrong with them, but everyone else was doing it, and it was what we’d been taught at teacher training college. So I carried on teaching them and assumed that my uneasiness was down to me being a bit curmudgeonly and old-fashioned. But the way Sewall lists all the activities, one after another, makes you realise how absurd they are. I mean, they read like something out of Brass Eye or Monty Python. ‘Live at Masada’ in particular sounds like a scene from the Holy Grail that was left on the cutting-room floor.

My second reaction was to question my first reaction. After all, Sewall’s article is a convincing piece of rhetoric, but does it really offer any evidence as to why such projects aren’t valuable? Isn’t the problem that such projects might be taught badly, and that taught well they would be very good? It’s easy to mock ideas like ‘Live at Masada’, but they do actually engage the kids. And has he just cherry-picked the worst types of projects, attacking a straw man rather than a real problem? In short, is he just an old-fashioned curmudgeon whose ideas play to my prejudices?

My first reaction was correct, but it was another AFT writer, Daniel Willingham, who convinced me of this. Willingham offers some cognitive evidence about exactly why such projects and activities are not worthwhile. In his book ‘Why Don’t Students Like School?’, Willingham describes a similar project to the ones Sewall mentions.

‘A teacher once told me that for a fourth grade unit on the Underground Railroad he had his students bake biscuits, because this was a staple food for runaway slaves.’ (53)

Then, Willingham explains what the flaw is:

‘his students probably thought for about forty seconds about the relationship of biscuits to the Underground Railroad, and for forty minutes about measuring flour, mixing shortening and so on.’ (53)

And the reason why this is a problem is that:

‘Whatever students think about is what they will remember…memory is the residue of thought.’ (54)

And this, of course, is the problem with the projects Sewall lists. They get pupils to think about the wrong things. If we want pupils to learn Spanish, they need to think about Spanish vocabulary and sentence structures, not replicas of shoeboxes. If we want pupils to learn about Roman history, we need to get them to think about the events of Roman history, not about TV news reporters’ interviewing techniques.

Of course, there is another possible criticism of Sewall’s article, which is that we shouldn’t be bothering to teach pupils Spanish vocab or Roman history in the first place. I will deal with that in another post.


Shirley Valentine has the answers

Shirley Valentine is one of my favourite films. I watched it a lot with my parents when I was younger; I think they liked it because it made relationships between English women and Greek men temporarily fashionable. A scene from Shirley Valentine occurred to me when I was writing this post about 21st century skills. The scene is in the video below, up to 2:04.

In this scene, Shirley is in school assembly and the snobby headmistress asks ‘What is man’s most important invention?’ Sputnik, says one girl. The Hoover, says another. The automatic washing machine. The aeroplane. The internal combustion engine. All great inventions, but all wrong. The answer, as young Shirley says, is the wheel. She’s right, but of course, that doesn’t stop everyone laughing at her. It’s the laughter of hubris – and indeed, the idea that we, in the 20th and 21st centuries, could be dependent on anything as pathetic and simple as a wheel does seem laughable. But it is true. As Newton said, if I have seen a little further it is by standing on the shoulders of giants. Modern industry wouldn’t work without the wheel. The iPad wouldn’t work without the alphabet and the number system.

Anyway, it gets even better. The headmistress is outraged that young Shirley could have got the question right when the better-spoken girls in the school got it wrong. ‘Somebody must have told you,’ she snaps. And Shirley responds ‘Well how the bleeding hell else could I learn it?’ Quite. A more succinct demolition of discovery learning I have never heard.

Unfortunately, the headmistress’s dismissal of her answer leads to poor Shirley dropping out and becoming a rebel. There’s a moral there for you.

Why 21st century skills are not that 21st century

Whenever I hear anyone talk about preparing students for the 21st century, I am always sceptical. Partly this is because it is never made clear exactly what is so different about the 21st century that requires such different preparation. For the American organisation Partnership for 21st Century Skills (P21), which is sponsored by a number of multinational corporations, the four important 21st century skills are ‘critical thinking and problem solving; communication; collaboration; and creativity and innovation’.[i] For the Royal Society of Arts, the skills that are needed for the future are: ‘citizenship, learning, managing information, relating to people and managing situations’.[ii] For Sir Ken Robinson, in the 21st century people need to be able to ‘adapt, see connections, innovate, communicate and work with others’.[iii] Of course, I would agree that these skills are important. But I fail to see what is so uniquely 21st century about them. Mycenaean Greek craftsmen had to work with others, adapt and innovate. It is quite patronising to suggest that no-one before the year 2000 ever needed to think critically, solve problems, communicate, collaborate, create, innovate or read. Human beings have been doing most of these things for quite a long time. The alphabet, a fairly late development of civilisation, was invented in the 21st century BC.

It probably is true that the future will require more and more people to have these skills, and that there will be fewer economic opportunities for people who lack them. But that would suggest to me that we need to make sure that everyone gets the education that was in the past reserved for the elite. That’s not redefining education for the 21st century; it’s giving everyone the chance to get a traditional education.

And that is where my real problem with the concept of 21st century education lies. To the extent that it says that creativity and problem solving are important, it is merely banal and meaningless; to the extent that it says such skills are unique to the 21st century, it is false but harmless; to the extent that it proposes certain ways of achieving these aims, it is actually pernicious. This is because very often, the movement for ‘21st century skills’ is a codeword for an attack on knowledge. Of course, one way the 21st century really is different to other eras is in the incredible power of technology. But this difference, whilst real, tends to lead on to two further educational fallacies.

Firstly, it is used to support the idea that traditional bodies of knowledge are outmoded. There is just so much knowledge nowadays, and it is changing all the time, so there is no point learning any of it to begin with. The Association of Teachers and Lecturers argue, for example, that: ‘A twenty-first century curriculum cannot have the transfer of knowledge at its core for the simple reason that the selection of what is required has become problematic in an information rich age’.[iv] The popular YouTube video ‘Shift Happens’ tells us that 1.5 exabytes of unique new information are generated each year, and that the amount of new technical information is doubling each year.[v] It then concludes that this flow of new information means that for students starting a four-year college or technical degree, half of what they learn in their first year will be outdated by their third year of study. This is simply not true. Of course people make new discoveries all the time, but a lot of those new discoveries don’t disprove or supersede the old ones – in fact, they’re more likely to build on the old discoveries and require intimate knowledge of them. The fundamental foundations of most disciplines are rarely, if ever, completely disproved. Universities can turn out as many exabytes of information as they like – they are unlikely to disprove Pythagoras’s theorem or improve on Euripides’s tragedies.

And there are very many such ancient, fundamental ideas and inventions which have stood the test of time: perhaps more than we are willing to admit. The alphabet and the numbering system, for example, are two of the most valuable inventions we have. As far as we know, these were invented in about 2000 BC and 3000 BC respectively. So far they show no signs of wearing out or being superseded. All of the most modern and advanced technological devices depend on them in one way or another. Indeed, if anything the sheer proliferation of knowledge should make selective bodies of knowledge more important, as mechanisms for sorting the wheat from the vast amounts of chaff.

Secondly, advances in technology are used to do down knowledge because it is said that they remove the need for pupils to memorise anything. This is the ‘Just Google It’ fallacy which I dealt with briefly here and here, and which E.D. Hirsch deals with comprehensively here.[vi] Put simply, looking things up effectively on the internet requires a great deal of knowledge to begin with.

What I think you can see from this is that too often the idea of 21st century skills is just a codeword for an attack on knowledge and memory. This is ironic because, as I now want to explain, the message of late 20th century and 21st century science is that knowledge and memory are unbelievably important.

As Kirschner, Sweller and Clark put it:

‘our understanding of the role of long-term memory in human cognition has altered dramatically over the last few decades. It is no longer seen as a passive repository of discrete, isolated fragments of information that permit us to repeat what we have learned. Nor is it seen only as a component of human cognitive architecture that has merely peripheral influence on complex cognitive processes such as thinking and problem solving. Rather, long-term memory is now viewed as the central, dominant structure of human cognition. Everything we see, hear, and think about is critically dependent on and influenced by our long-term memory.’[vii]

You will see that Kirschner et al. say that our understanding of human cognition has altered dramatically over the last few decades. A large part of this is down to the work of artificial intelligence pioneers. In the 50s and 60s, scientists set out to create artificial intelligence in computers, and as they did so they realised that their understanding of real, human intelligence was incredibly hazy. The research they did to understand real intelligence is fascinating and has huge implications for the classroom. And as Kirschner et al. suggest, one of their strongest findings was that knowledge plays a central part in all human cognition. The evidence on this is solid. Much of the early research was done with chess players, including one fascinating experiment by Adriaan de Groot; the electronic chess games that can beat you are based on the research these AI pioneers did. And research in other fields confirms the finding. Dan Willingham sums it all up with this line:

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not just because you need something to think about. The very processes that teachers care about most – critical thinking processes such as reasoning and problem solving – are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).[viii]

And yet, as we have seen, many advocates of ‘21st century skills’ speak disparagingly of knowledge and want to marginalise its place in the curriculum, despite the fact that there is no research or evidence backing up their ideas. Indeed, the guilty secret of the 21st century skills advocates is that it is their ideas which are rather old hat. Diane Ravitch notes how, at the beginning of the 20th century, many educators wanted to throw away traditional knowledge and embrace ‘20th century skills’.[ix]

The most depressing thing about all of this, therefore, is that old ideas which are thoroughly discredited are being warmed over and presented as being at the cutting edge. And it is particularly ironic that the actual cutting edge science is telling us to do the complete opposite of what most of the ‘21st century skills’ advocates want.

 


[i]   ‘Shift Happens’, http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 19 July 2011.

[ii] Royal Society of Arts, Opening Minds. ‘What is RSA Opening Minds?’ http://www.rsaopeningminds.org.uk/about-rsa-openingminds/ Accessed 19 February 2011.

[iii] National Advisory Committee on Creative and Cultural Education. All Our Futures: Creativity, Culture and Education. 1999, p.14. <http://www.cypni.org.uk/downloads/alloutfutures.pdf> Accessed 19 February 2011.

[iv] Association of Teachers and Lecturers. Subject to Change: New Thinking on the Curriculum. London, 2006. http://www.atl.org.uk/Images/Subject%20to%20change%20-%20curriculum%20PS%202006.pdf Accessed 19 July 2011.

[v] Fisch, Karl. ‘Shift Happens’. http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 21 January 2011.

[vii] Kirschner, P. A., J. Sweller, and R. E. Clark. ‘Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching.’ Educational Psychologist (2006) 41:2, 75-86, p.76. http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf

[viii] Willingham, Daniel T. Why Don’t Students Like School? San Francisco: Jossey-Bass, 2009, p.28.

Why you can’t just Google it

In my post here, I talked about the pervasive modern idea that Google renders memory irrelevant, and explained why this idea is false. I want to return to this point here with some further explanations.

The best explanation of why you can’t ‘Just Google It’ is by E.D. Hirsch here. Essentially, his point is that:

There is a consensus in cognitive psychology that it takes knowledge to gain knowledge. Those who repudiate a fact-filled curriculum on the grounds that kids can always look things up miss the paradox that de-emphasizing factual knowledge actually disables children from looking things up effectively. To stress process at the expense of factual knowledge actually hinders children from learning to learn. Yes, the Internet has placed a wealth of information at our fingertips. But to be able to use that information—to absorb it, to add to our knowledge—we must already possess a storehouse of knowledge. That is the paradox disclosed by cognitive research.

What I want to focus on here is one wonderful example he gives which proves this point. It is a summary of research done by George Miller, a cognitive psychologist. Miller asked pupils to use dictionaries to look up word meanings and then to use those words in sentences. He got sentences like this:

“Mrs. Morrow stimulated the soup.” (That is, she stirred it up.)

“Our family erodes a lot.” (That is, they eat out.)

“Me and my parents correlate, because without them I wouldn’t be here.”

“I was meticulous about falling off the cliff.”

“I relegated my pen pal’s letter to her house.”

I instantly recognised this phenomenon. I have read lots and lots of sentences like this, far too many to remember. Two I do remember are:

The weather outside was ingratiating. (after looking up ‘nice’ in a thesaurus.)

He was congenial at football. (after looking up ‘good’.)

What Miller and his fellow researchers did, however, was to extrapolate from this very common occurrence a profound and seemingly counter-intuitive insight: in order to use reference works such as dictionaries, thesauri and encyclopedias, you already need to know quite a lot about the thing you are looking up. As Hirsch says:

Of course, Professor Miller is in favor of dictionaries and encyclopedias in appropriate contexts where they can be used effectively by children and adults. But those contexts turn out to be the somewhat rare occasions when nuances of meaning can be confidently understood. Reference works including the Internet are immensely valuable in those constrained circumstances. But Miller has shown very well why, outside those circumstances, adults use reference resources so infrequently. His observations are well supported by other areas of cognitive psychology.

The whole article by Hirsch, and the article by Miller where he explains these and other ideas, are fascinating and well worth reading.

This is what I love about modern research into cognition. Reading it is like the moment when you read the end of a well-constructed detective story and go ‘Ah! Of course! That’s how it all fits together.’ Modern research in cognitive psychology offers a convincing theoretical framework that seems to me to make sense of so many of the apparently baffling things my students do.

Skills and Knowledge

A very similar post to this was originally published on the Policy First website here.

In modern education, traditional knowledge often gets a bit of a hard time. Critics of knowledge-based curricula argue that modern technology has eliminated the need for pupils to memorise vast quantities of knowledge. Not only that, but the rate of modern development means that a lot of knowledge will quickly become obsolete. We need to ‘future-proof’ education by teaching transferable skills which can apply in a range of situations, not knowledge which may soon be irrelevant. You can find countless teachers, educationalists and organisations who all broadly subscribe to this view. I will give examples in a later post, but for now, this viewpoint is clearly expressed by a primary head teacher who was recently interviewed here by the Telegraph:

“Why teach them about the Battle of Hastings when they have got Google? For us, it is about teaching them how to learn.”

Unfortunately, as plausible as these arguments sound, they aren’t backed up by the facts. Firstly, the idea that we can outsource memory to the internet is simply not true. In the words of Dan Willingham, a cognitive scientist who has published a book explaining the latest cognitive research for an educational audience:

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts…critical thinking processes such as reasoning and problem solving…are intimately intertwined with factual knowledge that is stored in long-term memory.[1]

Skills and knowledge are bound up with each other, and any curriculum which marginalises knowledge is therefore doomed to fail.

Secondly, traditional bodies of knowledge are not nearly as obsolete as the theorists would like us to believe. Of course people make new discoveries all the time, but a lot of those new discoveries don’t disprove or supersede the old ones – in fact, they’re more likely to build on the old discoveries and require intimate knowledge of them. As we know from fashion, it’s the new that dates the soonest. The wheel was invented in the 4th millennium BC, but it’s set fair to outlast the microfiche, invented in the second millennium AD. Likewise, the knowledge and inventions from the distant past that have survived to the present day have done so because they are extraordinarily valuable. The Ancient Egyptian alphabet system, the Hindu-Arabic numbering system, Ancient Greek geometry and Renaissance art and literature are all still very relevant to today’s world. These traditional bodies of knowledge represent not only some of the highest peaks of human culture, but the foundations on which any true originality or creativity must build. We fail to teach them at our peril.

[1] Willingham, Daniel T. Why Don’t Students Like School? San Francisco: Jossey-Bass, 2009, p.28.