Tag Archives: critical thinking

Thinkers v. producers

In How Children Fail, John Holt makes the following distinction:

  • producers – students who are only interested in getting right answers, and who make more or less uncritical use of rules and formulae to get them
  • thinkers – students who try to think about the meaning, the reality, of whatever it is they are working on

A great question to ask ourselves: What is the ratio of thinkers to producers in our school(s)? In most schools, I’m guessing the ratio is fairly small, even for our high-achieving students.

Another great question to ask ourselves: What is an average school day like for those students in our school(s) who ARE thinkers?

Image credit: Think!, florriebassingbourn

‘World-class’ teacher preparation

When I work with educators, I get asked on a regular basis, “What about the universities? What are they doing to prepare educators who can facilitate technology-infused learning environments that emphasize deeper cognitive complexity and greater student agency?” Unfortunately, I don’t have much to offer them.

I’m not up on all of the thousands of preparation programs that are out there, but as I think about the shifts that we need to see in schools (and the new building blocks that we need to put in place), at a minimum any teacher preparation program that wanted to label itself ‘world-class’ would be able to affirmatively say the following…

Our graduates know…

Project- and inquiry-based learning

  • how to operate in student-driven, not just teacher-created, project-oriented learning environments
  • how to facilitate inquiry-based activities like ‘passion projects’ or ‘FedEx days’ or ’20% time’ or ‘genius hour’
  • how to facilitate students’ development as creators, designers, innovators, and entrepreneurs
  • how to integrate communication, collaboration, and critical thinking skills into these types of environments

Authentic, real-world work

  • how to organize student work around the big, important concepts central to their discipline
  • how real work gets done by real professionals in that discipline (practices, processes, tools, and technologies)
  • how to find, create, and implement robust, authentic simulations for their subject area
  • how to facilitate and assess authentic performances by students

Standards-based grading and competency-based education

  • how to write and implement a ‘competency’
  • how to help students thrive in a standards-based grading environment
  • how to facilitate learning-teaching systems that focus on mastery rather than seat time (or other dumb criteria)

1:1 computing

  • how to manage and support ubiquitous technology-infused learning spaces
  • how to facilitate student success with digital tools, online systems, and social networks
  • how to help students create appropriate AND empowered ‘digital footprints’

Digital, online, and open access

  • how to leverage digital and online open educational resources to full advantage
  • how to meaningfully curate digital materials in their subject area
  • how to helpfully contribute to our online global information commons (and have students do the same)

Online communities of interest

  • how to utilize online networks and communities of practice to further their professional learning and growth
  • how to meaningfully connect students to relevant online communities of interest for academic and personal development

Adaptive learning systems

  • how to integrate adaptive learning software into students’ learning and assessment
  • how to utilize blended learning environments to individualize and personalize students’ learning experiences (time, place, path, pace)

I think most teacher preparation programs probably fall short of the mark on these, but a program that could say these things about its preservice teachers would be INCREDIBLE.

What do you think? What would you add to this list? More importantly, does anyone know of a teacher preparation program that’s doing well in some / many / most of these areas?

‘Closed’ v. ‘open’ systems of knowing

I am rereading Teaching As a Subversive Activity, which is a phenomenal book if you haven’t read it. About halfway through the book, Postman and Weingartner discuss ‘closed’ versus ‘open’ systems of knowledge:

A closed system is one in which the knowables are fixed. Examples of this kind of system would include any in which most of its answers are either yes or no, right or wrong, clearly and without any other possibility. (p. 116)

Open systems may be thought of as situations in which there are degrees of ‘rightness,’ and in which a right answer today may well be a wrong answer tomorrow. (p. 117)

Most of what we do in school falls under the description of a ‘closed’ system. There is typically a right answer, the teacher (or the textbook or the learning software) knows it, and it’s up to the student to ‘learn’ it and then spit it back correctly: Describe the water cycle. If 4x² + 3 = 39, what is x? What is the capital of Delaware?

In life, however, much of what we do falls under the description of an ‘open’ system. We ask questions and make choices and devise solutions that seem right at the time given the particular context: What major should I choose? Should I look for a new job? Is she the one with whom I want to spend the rest of my life? Which car is best for our family? At another time, in another context, we might decide and act differently. This is true for both individual- and citizen-/policy-level decisions: Should we try to stop Russia from annexing Crimea? Are ethanol subsidies a good way to reduce our nation’s fuel dependence? Should I vote ‘yes’ for the school district referendum? When should we place limits on free speech?

Many argue that fixed knowledge items such as ’the water cycle’ or ‘4x² + 3 = 39’ or ‘the capital of Delaware’ are the necessary parts that form a foundation for deeper, more cognitively complex thinking. And that’s often true. But it’s a whole other matter to treat fixed items of knowledge as sacrosanct or to elevate them to the primary desired outcomes of schooling, particularly given the increasing presence of Internet-enabled learning contexts in which such items are easily and quickly accessible. Instead of treating content retention and procedural thinking as foundational floors from which we then build larger, more important edifices of learning, we have made them into almost-impermeable ceilings that drive teaching, curriculum, and assessment.

To fully prepare most students for life – and, arguably, to reengage many of them in the learning, not just social, aspects of their schooling – they need greater immersion in open systems of learning where questions are raised, answers aren’t fixed, and solutions are often contextual. This is true for all grade levels, not just secondary. So far most schools don’t do a great job with this. Instead, what schools usually do

in effect [is] to make closed systems of largely open ones. (p. 117)

We take areas of knowledge like science or government or language or health and we set them in stone – “yes or no, right or wrong, clearly and without any other possibility” – instead of bravely facing them – as they are in real life – as open opportunities for discussion, inquiry, problem-solving, and, yes, divergent learning and knowing.

A tremendous challenge for us as educators and policymakers is to stop reducing learning to convergent, ‘closed’ models of knowing and instead embrace the power and potential of more ‘open’ systems of knowledge and inquiry. This challenge is worth taking on because

very few problems of any great significance can be answered if they are approached from a ‘closed’-system point of view. (p. 117)

And goodness knows we have innumerable problems of great significance that would benefit from some fresh thinking…

Performance assessments may not be ‘reliable’ or ‘valid.’ So what?

In a comment on Dan Willingham’s recent post, I said

we have plenty of alternatives that have been offered, over and over again, to counteract our current over-reliance on – and unfounded belief in – the ‘magic’ of bubble sheet test scores. Such alternatives include portfolios, embedded assessments, essays, performance assessments, public exhibitions, greater use of formative assessments (in the sense of Black & Wiliam, not benchmark testing) instead of summative assessments, and so on. . . . We know how to do assessment better than low-level, fixed-response items. We just don’t want to pay for it…

Dan replied

I don’t think money is the problem. These alternatives are not, to my knowledge, reliable or valid, with the exception of essays.

And therein lies the problem… (with this issue in general, not with Dan in particular)

Most of us recognize that more of our students need to be doing deeper, more complex thinking work more often. But if we want students to be critical thinkers and problem solvers and effective communicators and collaborators, that cognitively-complex work is usually more divergent rather than convergent. It is more amorphous and fuzzy and personal. It is often multi-stage and multimodal. It is not easily reduced to a number or rating or score. However, this does NOT mean that kind of work is incapable of being assessed. When a student creates something – digital or physical (or both) – we have ways of determining the quality and contribution of that product or project. When a student gives a presentation that compels others to laugh, cry, and/or take action, we have ways of identifying what made that an excellent talk. When a student makes and exhibits a work of art – or sings, plays, or composes a musical selection – or displays athletic skill – or writes a computer program – we have ways of telling whether it was done well. When a student engages in a service learning project that benefits the community, we have ways of knowing whether that work is meaningful and worthwhile. When a student presents a portfolio of work over time, we have ways of judging that. And so on…

If there is anything that we’ve learned (often to our great dismay) over the last decade, it’s that assessment is the tail that wags the instructional, curricular, and educational dogs. If we continue to insist on judging performance assessments with the ‘validity’ and ‘reliability’ criteria traditionally used by statisticians and psychometricians, we will never – NEVER – move much beyond factual recall and procedural regurgitation to achieve the kinds of higher-level student work that we need more of.

The upper ends of Bloom’s taxonomy and/or Webb’s Depth of Knowledge levels probably cannot – and likely SHOULD not – be reduced to a scaled score, effect size, or regression model without sucking the very soul out of that work. As I said in another comment on Dan’s post, “What score should we give the Mona Lisa? And what would the ‘objective’ rating criteria be?” I’m willing to confess that I am unconcerned about the lack of statistical ‘validity’ and ‘reliability’ of authentic performance assessments if we are thoughtful assessors of those activities.

How about you? Dan (or others), what are your thoughts on this?

Image credit: Meh, Ken Murphy

Once kids realized that they were full partners in their learning…

I adopted Leonardo da Vinci’s 7 Principles as a guide and was especially attracted to Sfumato, usually translated as “Up in Smoke,” meaning to embrace ambiguity, paradox and uncertainty. Great things are produced and discovered when you open the door to possibilities and leave some things undefined. When I did that, there was difficulty adjusting as kids had been trained to give the right answers. My response was there may be none and that I was more interested in originality, creativity, and being able to explain and defend one’s thinking. En Garde!

However, once kids realized that they were full partners in their learning and that most anything was possible, they brought me to tears with their work.

Anonymous teacher via http://dianeravitch.net/2013/01/02/when-students-love-learning

Complex performance does not imply deep understanding and transfer

just because students are asked to do a complex performance does not mean that any real transfer is demanded. If the task is completely scripted by a teacher – say, memorizing a poem, performing a Chopin Prelude that one has practiced many times, with coaching, or writing a formulaic 5-paragraph essay – then there is no transfer of learning taking place. Transfer only is demanded and elicited when there is some element of novelty in the task and thus strategic thought and judgment is required by the performer.

If you only can recall and state something you don’t really understand it. You have to be able to explain and justify its meaning and applicability – a Meaning goal – and you also have to be able to apply it into settings where it is needed, without being prompted to do so or shown exactly how to do so – Transfer.

Grant Wiggins via http://grantwiggins.wordpress.com/2012/10/03/a-clarification-of-the-goal-of-transfer-and-how-it-relates-to-testing

Higher-order thinking is the exception rather than the norm for most classrooms

McREL has collected data from more than 27,000 classroom observations that offer a dismaying glimpse into the level of instruction that appears to be occurring in the nation’s classrooms. In well over half of these observations, student learning reflected the two lowest levels of Bloom’s taxonomy: remembering (25 percent) and understanding (32 percent). Meanwhile, students were developing the higher-order thinking skills of analysis (9 percent), evaluation (3 percent), and creation (4 percent) in less than one-sixth of the classrooms observed.

Certainly, not all learning can focus on higher-order thinking; teachers must develop students’ ability to recall and understand basic concepts before they can move on to more critical thinking. Nonetheless, the fact that so much of what goes on in classrooms appears to be focused on low-level thinking suggests that high expectations and challenging instruction may be the exception, rather than the norm, for most students.

Bryan Goodwin via http://www.ascd.org/publications/books/111038/chapters/Guaranteeing-Challenging,-Engaging,-and-Intentional-Instruction.aspx

[See also the data at Are 21st century skills a solution to a problem that may not exist?]

Teach students higher order or critical thinking skills? Not if the Texas Republicans have their way.

The Republican Party of Texas states in its official 2012 political platform:

We oppose the teaching of Higher Order Thinking Skills (HOTS) (values clarification), critical thinking skills and similar programs that are simply a relabeling of Outcome-Based Education (OBE) (mastery learning) which focus on behavior modification and have the purpose of challenging the student’s fixed beliefs and undermining parental authority.

This is astounding since most everyone else in America seems to understand that our educational graduates and our employees need more, not less, development of critical and higher-order thinking skills in order to be effective citizens, learners, and workers in our hyperconnected, hypercompetitive global information society. This political platform item is an absolutely stunning example of educational and economic cluelessness and is a surefire recipe for complete irrelevance in the 21st century.

In recent years, I don’t believe I’ve heard of any other groups officially opposed to teaching students critical thinking or higher-order thinking. Have you? Other thoughts?

Hat tip: Slate

Some Iowa students weigh in on what classes they need for 21st century jobs

Leslie Pralle Keehn, a Social Studies teacher here in Iowa, had her students read The World Is Flat and then asked them what ‘classes’ would help prepare them for 21st century jobs. Here are their responses:

[Image: Pralle Keehn’s students’ responses]

I love how none of these are disciplinary silos (which, I’m guessing, is how her students experience most of their school days). What do you think about her students’ responses?

