The 4 Shifts Protocol in Kentucky

The Commonwealth of Kentucky has leaned hard into the 4 Shifts Protocol to support its schools’ technology integration and instructional redesign work. Over 650(!) Digital Learning Coaches (DLCs) across the state have received a copy of Harnessing Technology for Deeper Learning and are working with their local educators to use the protocol to redesign lessons and units for deeper learning, greater student agency, more authentic work, and rich technology infusion.

Although many thousands of educators and schools across the globe are using the 4 Shifts Protocol, I believe that Kentucky currently is the largest single deployment of this redesign work. Kentucky also is investing heavily in project-based learning, and the protocol is a nice bridging mechanism and support for that complex work.

Julie Graber and I are grateful that the protocol has been useful to so many educators in so many places. Kentucky (and others), please let me know what I can do to support this work. Happy to chat or visit anytime!

[Images: 2022 KYSTE 01, 02, 03]

The 4 Shifts Protocol in Bismarck

It’s always gratifying to see your resources being used by educators. I’ve worked with the Bismarck Public Schools multiple times on leadership, vision, and instructional design for deeper learning (and we featured Legacy High School in Leadership for Deeper Learning). They’ve got an amazing group of educators there and I always love to see what they’re up to… Thanks for sharing, Tanna!

[Image: 2022 Bismarck]

Thinking about NAEP in Colorado and the Denver Public Schools

[this blog post is a follow-up response to the Twitter exchange with Van Schoales posted below]

[Images: Twitter exchange with Van Schoales, October 26, 2022 (01–03)]

Hi Van.

Although I’ve admired your work for years, you and I have never met, which means that we don’t have a relationship to lean into. Seth Godin reminds me that ‘if your audience isn’t listening, it’s your fault, not theirs.’ So please take this post as a very public apology for whatever anger or defensiveness my tweet sparked. I’m sure that it could have been worded more artfully, and I regret not phrasing it in a way that might have been received better. I offer this longer-form blog post as an attempt to bypass the lack of nuance available in 280 Twitter characters. I tend to do much of my thinking out loud in front of others because their feedback makes me smarter.

I don’t know if you had a chance to read my previous blog post, Much ado about NAEP, but I tried to make two key points. The first was that the timing of the March 2022 NAEP tests should give us pause when interpreting the results. Most educators in America probably would tell us that, as difficult as the 2020-2021 school year was, the 2021-2022 year was even tougher. Kids and families still were dealing with incredible trauma, children showed us daily in classrooms (or by their absences) that they needed more support, and we didn’t do a great job of effectively serving large numbers of our young people. Then we had another big COVID spike right before the NAEP administration, and we have absolutely no idea how that impacted student test-takers except that probably a whole bunch of them (and their families) were ill, absent, struggling, etc. during the months that immediately preceded the test. We also know that these things were most true for our least-resourced children. All of this together feels like a reason to take the 2022 NAEP results not just with a grain but a giant boulder of salt. It’s difficult enough in normal times to help our students feel motivated for standardized tests. It’s even tougher given the March 2022 context and when NAEP results don’t mean anything to students personally.

The second point that I tried to make was that ‘Proficient’ on NAEP doesn’t mean what most folks think it does. To quote Tom Loveless, former director of the Brown Center on Education Policy at the Brookings Institution, in most states the Proficient label is “significantly above” what state policymakers and assessments deem ‘grade level,’ and the NAEP level that more closely matches most states’ own standards is Basic. This is rarely if ever mentioned in media coverage of NAEP results. Accordingly, most people tend to interpret ‘proficient’ in the ordinary sense of the word rather than in the aspirational sense, which is why I mention it whenever I think it is relevant (e.g., in my tweet). As I said in my blog post, we can have some interesting discussions about whether Basic or Proficient is the right NAEP target for states, but we should at least recognize that Proficient is very aspirational in most parts of the country.

Here in Colorado, about 75% of 4th graders met the Basic NAEP standard in math for 2022, and about 36% met the Proficient standard. In 8th grade, about 63% of Colorado students met the Basic NAEP standard in math in 2022, and about 28% of students met the Proficient standard. Those results are essentially equivalent to the 2022 national NAEP averages for math. For reading, about 68% of Colorado 4th graders met the Basic NAEP standard in 2022, and about 38% met the Proficient standard. For 8th grade, about 73% of students met the Basic NAEP standard for 2022 in reading, and about 34% met the Proficient standard. Colorado students’ NAEP results in reading were a few percentage points higher than the 2022 national averages. As many have noted, all of the scores for both Colorado and the nation are down from 2019, which is to be expected.

In my blog post, I also quoted Loveless’ statement that, because the NAEP Proficient standard is so aspirational, “If high school students are required to meet NAEP proficient to graduate from high school, large numbers will fail. If middle and elementary school students are forced to repeat grades because they fall short of a standard anchored to NAEP proficient, vast numbers will repeat grades.” [emphasis added]. We also have evidence that students in nominally higher-performing countries would have similar trouble meeting the NAEP Proficient mark. NCES has done the work of mapping Colorado’s state standards for proficiency to NAEP equivalent scores. For math, Colorado’s standard is well above NAEP Proficient in 4th grade and close to Proficient in 8th grade. For reading, Colorado’s standard is much closer to Proficient than Basic in both 4th and 8th grade. Colorado’s standards clearly are more aspirational than those of most other states. Accordingly, fewer Colorado students will be deemed ‘at grade level’ than if our benchmarks were set closer to those elsewhere.

All of which brings us to the concerns that you note in the Denver Public Schools (DPS). DPS is one of 26 urban districts that were sampled in 2022 and, as you stated at DPS Boardhawk, results were worse than those for Colorado and for the nation as a whole. DPS’ 4th grade math, 8th grade math, and 8th grade reading results all were essentially equivalent to the large city averages. DPS’ 4th grade reading results were generally a few percentage points higher than the large city averages. Enormous equity gaps exist across student subcategories and, unfortunately, NAEP noted that the 2022 performance gaps are not significantly different from those of 2017. Here is a more detailed breakdown for DPS in a few categories (click on the links to see these tables):

  • 4th grade math
    • White students: 92% Basic, 62% Proficient
    • Black students: 49% Basic, 12% Proficient
    • Hispanic students: 48% Basic, 13% Proficient
    • School lunch not eligible: 77% Basic, 47% Proficient
    • School lunch eligible: 48% Basic, 10% Proficient
  • 8th grade math
    • White students: 81% Basic, 53% Proficient
    • Black students: 41% Basic, 11% Proficient
    • Hispanic students: 40% Basic, 10% Proficient
    • School lunch not eligible: 66% Basic, 35% Proficient
    • School lunch eligible: 38% Basic, 10% Proficient
  • 4th grade reading
    • White students: 84% Basic, 63% Proficient
    • Black students: 46% Basic, 14% Proficient
    • Hispanic students: 37% Basic, 14% Proficient
    • School lunch not eligible: 72% Basic, 48% Proficient
    • School lunch eligible: 36% Basic, 11% Proficient
  • 8th grade reading
    • White students: 88% Basic, 58% Proficient
    • Black students: 54% Basic, 15% Proficient
    • Hispanic students: 52% Basic, 16% Proficient
    • School lunch not eligible: 76% Basic, 42% Proficient
    • School lunch eligible: 51% Basic, 16% Proficient

Yikes! Those performance gaps are both troubling and persistent! They’re also similar to those of the other large city districts that were sampled, better than some and worse than others. Denver basically is in the middle of the pack for the 2022 NAEP sample of large city school districts.

To quote my own tweet, these performance “divides continue to be of concern.” Like you, I believe that DPS should be extremely transparent about those performance gaps. Also like you, I hope that DPS publicly identifies some concrete plans and actions to remedy its existing equity issues. Additionally, I’m cognizant of the difference between NAEP performance gaps and NAEP performance levels (which is what I was trying to say, albeit artlessly, in my tweet). If Colorado makes it harder for students to be deemed ‘proficient’ than in most other states, of course we’re going to say, “look, fewer kids are proficient!” That’s how we set up the system in the first place (and, once again, we can have a rich discussion about where the line should be set for proficiency). We’re also probably going to say that fewer students are ‘proficient’ in a large urban school system because, sadly, that’s basically the pattern that we see in big city school districts all across the country. Equity gaps are large and persistent in America for students of color, students who live in poverty, students whose primary language isn’t English, and students who have a disability, and the past couple of decades of school reform haven’t done much to alter those gaps. DPS isn’t doing great on these fronts, but it’s not an outlier either.

You said in your tweet that I should be “outraged by Black and Latinx proficiency levels.” Am I – and, without speaking for them, probably my colleagues at the University of Colorado Denver – outraged about proficiency gaps? Yes, of course. Just like you, we care about equity and we all are fighting for historically-marginalized children across a variety of fronts. Am I personally outraged about proficiency levels? Less so, given that Colorado decided to set a much higher bar than most other states. ‘Proficiency’ is a politically-determined label, not a context-free indicator. If DPS were in most states in America, we would say that 41% of its Black students were ‘at grade level’ in 8th grade math instead of 11%. That number is still terrible, particularly compared to their White student peers, but it’s not “1 in 10” either. Again, if Colorado sets the bar higher, by definition fewer students will be proficient. For me, the gaps are much more alarming than whatever label we apply to children’s performance. The concern, I think, is the inequity, not the label.

Van, this is a long post. You may disagree with much of what I said here, and who knows whether you even read through to the end. But if you did, let me close with this: I think that you and I have a similar passion for equity in schools, and I also think that we both have a passion for making school different, particularly for historically-marginalized children and families. I might be wrong, but I don’t think so. Given your work with DSST and The Odyssey School of Denver and my work around instructional redesign and leadership for deeper learning, I think that we might have a really interesting and productive conversation together. Let me know if you ever want to meet up. I’m happy to join you for lunch or coffee at whatever location is easy for you.

Thanks in advance if you actually read through all of this. Hope we get a chance to talk sometime.

SCOTT

Much ado about NAEP

Scores on the National Assessment of Educational Progress (NAEP) are down after the pandemic. Surprise!

Four big thoughts on all of this…

1. Below is the Centers for Disease Control and Prevention (CDC) graph of daily COVID cases in the U.S. Note the huge spike in January 2022 due to the Omicron variant. Also note that the National Center for Education Statistics (NCES) chose to administer the NAEP tests in March 2022, during the downswing of that huge spike in cases and after two years of COVID trauma (six weeks later America hit the 1 million dead mark). How many kids, families, and educators were ill, recovering from being ill, or still traumatized from loved ones’ deaths, illnesses, or long recoveries? We’ll never know.

[Image: CDC graph of daily COVID cases in the U.S.]

2. Always remember that the labels for NAEP ‘proficiency’ levels are confusing. Journalists (and others) are failing us when they don’t explain what the NAEP achievement levels actually mean. For instance, the New York Times today published this graph from NCES:

[Image: 2022 10 24 NCES NAEP scores]

“Appalling,” right?! That’s what the U.S. Secretary of Education, Miguel Cardona, said about these results. Just look at those low numbers in blue!

BUT… ‘Proficient’ on NAEP doesn’t mean what most folks assume it does. NAEP itself says that ‘Proficient’ does not mean ‘at grade level.’ Instead, the Proficient label is more aspirational. Indeed, it’s so aspirational that most states are not trying to reach that level with their annual assessments. See the map below from NCES (or make your own), which shows that most states set their own proficiency standards closer to NAEP’s Basic level, not Proficient:

[Image: 2019 Grade 4 Reading, NAEP and state standards]

Once again, in the words of Tom Loveless, former director of the Brown Center on Education Policy at the Brookings Institution, “Proficient on NAEP does not mean grade level performance. It’s significantly above that.” So essentially the New York Times and others are reporting that “only one-fourth of 8th graders performed significantly above grade level in math.” Does that result surprise anyone?

Loveless noted in 2016 that:

Equating NAEP proficiency with grade level is bogus. Indeed, the validity of the achievement levels themselves is questionable. They immediately came under fire in reviews by the U.S. Government Accountability Office, the National Academy of Sciences, and the National Academy of Education. The National Academy of Sciences report was particularly scathing, labeling NAEP’s achievement levels as “fundamentally flawed.”

Loveless also stated:

The National Center for Education Statistics warns that federal law requires that NAEP achievement levels be used on a trial basis until the Commissioner of Education Statistics determines that the achievement levels are “reasonable, valid, and informative to the public.” As the NCES website states, “So far, no Commissioner has made such a determination, and the achievement levels remain in a trial status. The achievement levels should continue to be interpreted and used with caution.”


Confounding NAEP proficient with grade-level is uninformed. Designating NAEP proficient as the achievement benchmark for accountability systems is certainly not cautious use. If high school students are required to meet NAEP proficient to graduate from high school, large numbers will fail. If middle and elementary school students are forced to repeat grades because they fall short of a standard anchored to NAEP proficient, vast numbers will repeat grades. [emphasis added]

In 2009, Gerald Bracey, one of our nation’s foremost experts on educational assessment, stated:

In its prescriptive aspect, the NAEP reports the percentage of students reaching various achievement levels—Basic, Proficient, and Advanced. The achievement levels have been roundly criticized by many, including the U.S. Government Accounting Office (1993), the National Academy of Sciences (Pellegrino, Jones, & Mitchell, 1999); and the National Academy of Education (Shepard, 1993). These critiques point out that the methods for constructing the levels are flawed, that the levels demand unreasonably high performance, and that they yield results that are not corroborated by other measures.


In spite of the criticisms, the U.S. Department of Education permitted the flawed levels to be used until something better was developed. Unfortunately, no one has ever worked on developing anything better—perhaps because the apparently low student performance indicated by the small percentage of test-takers reaching Proficient has proven too politically useful to school critics.


For instance, education reformers and politicians have lamented that only about one-third of 8th graders read at the Proficient level. On the surface, this does seem awful. Yet, if students in other nations took the NAEP, only about one-third of them would also score Proficient—even in the nations scoring highest on international reading comparisons (Rothstein, Jacobsen, & Wilder, 2006).

Similarly, James Harvey, executive director of the National Superintendents Roundtable (he also helped write A Nation at Risk), noted:

The NAEP benchmarks might be more convincing if most students elsewhere could handily meet them. But that’s a hard case to make, judging by a 2007 analysis from Gary Phillips, former acting commissioner of NCES. Phillips set out to map NAEP benchmarks onto international assessments in science and mathematics.


Only Taipei and Singapore have a significantly higher percentage of “proficient” students in eighth grade science (by the NAEP benchmark) than the United States. In math, the average performance of eighth-grade students could be classified as “proficient” in [only] six jurisdictions: Singapore, Korea, Taipei, Hong Kong, Japan, and Flemish Belgium. It seems that when average results by jurisdiction place typical students at the NAEP proficient level, the jurisdictions involved are typically wealthy.

We can argue about whether the correct benchmark is Basic or whether we should be striving for Proficient, and we can all agree that more kids need more support to reach desired academic benchmarks. But let’s not pretend that ‘Proficient’ on NAEP aligns with most people’s common understanding of that term. We should be especially wary of those educational ‘reformers’ who use the NAEP Proficient benchmark to cudgel schools and educators.

3. Lest we think that these NAEP results are new and surprising, it should be noted that scores on NAEP already were stagnant. Achievement gaps already were widening. After nearly two decades of the No Child Left Behind Act and standards-based, testing-oriented educational reform – and almost 40 years after the A Nation at Risk report – the 2018 and 2019 NAEP results showed that the bifurcation of American student performance remained “stubbornly wide.” We continue to do the same things while expecting different results, instead of fundamentally rethinking how we do school.

4. The pundits already are chiming in on the 2022 NAEP results. They’re blaming overly-cautious superintendents and school boards, “woke” educators, teacher unions, parents, online learning, video games, social media, screen addiction, “kids these days who don’t want to work,” state governors, and anything else they can point a finger at. As I said yesterday, it’s fascinating how many people were prescient and omniscient during unprecedented times, when extremely challenging decisions needed to be made with little historical guidance, in an environment of conflicting opinions about what was right. Despite the massive swirl of disagreement about what should have occurred during the pandemic, many folks are righteously certain that they have the correct answer and everyone else is wrong. The lack of grace, understanding, and humility is staggering. 

Also, look again at the graph above. One way for journalists, commentators, and policymakers to frame those results is to call them ‘appalling.’ Another way is to say:

Scores are down but, even during a deadly global pandemic that shut down schools and traumatized families, the math and reading achievement of about two-thirds of our students stayed at grade level or above. How do we help the rest?

Always consider how an issue is framed and whose interests it serves to frame it that way (and why).

We can whirl ourselves into a tizzy of righteous finger-pointing, which is what many folks will do because it serves their agenda to do so. Or we can use this moment to fundamentally rethink how we do school and how we support the students who need us most.

I think that it’s unlikely that many states, schools, and communities will actually do this because of the fragility and brittleness of our school structures. But I’m pretty sure that the path forward is not simply doubling down on more math, reading, and testing, and it sure isn’t uncritically accepting NAEP results.

Your thoughts?

[Image: 2022 10 23 McLeod tweet]

We’re back in school. Did we lean into care or compliance?

Most schools here in the U.S. now have been back for a month or two. And I’m hearing from educators that things are … ‘better.’ Which has me wondering, “How are we defining better?”

As we all know, the end of the 2019-20 school year and the entire 2020-21 school year were an incredible challenge. Schools shut down. People died. Everything was disrupted, and everyone was scared and anxious. Then, over the summer of 2021, we were much too optimistic about an allegedly ‘normal’ return to school. And it wasn’t. In many (most?) schools, the 2021-22 school year was somehow even tougher than the previous one as we experienced extremely high levels of student refusal and absenteeism, educator stress and burnout, and so on.

In a conversation with Catlin Tucker, I wondered how much better last school year could have been if we had leaned more into relationships and care. There was so much policy rhetoric around students’ ‘learning loss.’ Accordingly, many schools jumped much too fast into their traditional instructional processes without really addressing the trauma that children (and educators) still were carrying with them at the beginning of the school year. And it didn’t work.

I hypothesized in that discussion that if we had started the first few weeks with a significant focus on relationships and care and getting students and families the supports that they needed (say, 80% of our time and energy) and a lesser emphasis on the academic stuff (say, 20%), we could have laid the groundwork for a much smoother school year as we created a stable foundation that allowed us to transition back to ‘normal’ expectations. But many schools didn’t do that, at least not sufficiently to remedy the problem. It was as if we knew that our young people still were traumatized but didn’t want to address it genuinely, at the levels that our children deserved. Sure, we recognized and paid lip service to the issue, and maybe even halfheartedly implemented some new socio-emotional learning (SEL) program, but we didn’t really meet kids’ needs. The proof was obvious as we mostly tried to return to regular learning-teaching practices and then wondered why kids’ behavior, attendance, and academic performance were so terrible and why teachers were incredibly stressed and leaving the profession.

The past few years have shown that the rigidity of our school systems is also a brittle fragility, particularly during a time of dire need for young people and their families. The saddest part of last school year may have been that we could have hit the reset button at any time. We could have taken a pause from school as we know it, invested more deeply in kids rather than content, and built, together, toward where we needed to be. But we chose not to. We just kept on with the things that weren’t working, and children and educators paid the price.

All of which brings us to this school year, which supposedly is ‘better.’ And I’m wondering why. Did we finally transform how we interact with our children? Did we finally center their emotional and trauma needs and establish foundational structures of relationship and care that allow us to learn together in functional community? Or, as I suspect from the many educator discussion areas that I’m in, at the beginning of this year did we just lean more heavily into ‘expectations’ and ‘consequences’ that ignore underlying root causes and instead emphasize control and compliance? In other words, if one end of a continuum might be framed as ‘Kids are struggling so they need care’ and the other end might be framed as ‘Kids are struggling so they need control,’ which end of the continuum did our schools lean into? Did we create new, effective systems of care or did we just socialize and force our young people into submission (as we always seem to do)?

[Image: Control versus care]

How about your school? What did it lean into this year?

4 Shifts Protocol sessions at InnEdCO 2022

[Image: InnEdCO 2022]

This year there are not one… not two… but THREE 4 Shifts Protocol sessions at the annual InnEdCO conference!

I’m doing a basic introductory workshop on Monday. Gina and Robbi have created a fabulous workshop, and I can’t wait to see their session in action on Tuesday. Then I will try to extend all of this work even further during my Wednesday workshop. Descriptions are below…


Monday, June 13

Redesigning for deeper learning and student engagement [2-hour workshop]
Scott McLeod

Many schools have created future-ready vision statements and college- and career-ready profiles of a graduate. But most schools still are struggling to transition their day-to-day classroom instruction to include more critical thinking, problem-solving, creativity, and other ‘future-ready’ student competencies in ways that are substantive, meaningful, and aligned to those vision statements and graduate profiles.

This workshop focuses on how to redesign classroom instruction for future-ready learning. We will use the free 4 Shifts Protocol to redesign lessons, units, and other instructional activities together for deeper learning, greater student agency, more authentic work, and rich technology infusion. The protocol contains concrete, specific ‘look fors’ and ‘think abouts’ that allow educators, coaches, and instructional leaders to shift students’ instructional work in deeper, more robust directions. The protocol is a useful complement to SAMR, TPACK, Triple E, and other frameworks that schools may be using, and also is an excellent capacity-building bridge to more complex inquiry and PBL projects.

This active, hands-on workshop is intended for teachers, instructional / technology coaches, and school leaders who are prepared to roll up their sleeves and dive into this important instructional redesign work!


Tuesday, June 14

A permanent pivot: [Re]design your lessons [2-hour workshop]
Gina Francalancia-Cancienne & Robbi Makely

This session incorporates Dr. Scott McLeod’s 4 Shifts Protocol and is designed to introduce teachers to practical skills for (re)designing lessons that focus on deeper learning, greater student agency, more authentic work, and rich technology infusion. Teachers will learn to recognize the four shifts, evaluate ways to personalize them, (re)design a lesson, and make a permanent pivot by incorporating the shifts into future lessons.

This session is targeted at PK-12 teachers (including those in special education, literacy programs, and gifted and talented classrooms), instructional coaches, and administrators.


Wednesday, June 15

Using blended learning structures to facilitate deeper learning [2-hour workshop]
Scott McLeod

New technologies give us new possibilities. In this workshop we will identify several different blended learning structures and how they might be used to facilitate students’ deeper learning, greater student agency, and more authentic, real world work. Station rotations, genius hours, flipped classrooms, flex models, and other blended learning strategies can create powerful pathways for our children. Bring a computer and come prepared to roll up your sleeves and engage in some active (re)design discussions!

This active, hands-on workshop is intended for teachers, instructional / technology coaches, and school leaders who are prepared to roll up their sleeves and dive into this important instructional redesign work!


Hope you’ll join us for one or all of these sessions!