Sunday, December 5, 2010

Head Banging


This fall, work demands have put a serious crimp in my school meeting schedule -- and (to be honest) in my willingness to bang my head against the wall known as "public engagement" at Seattle Public Schools. But last Monday I decided it was time to get back into the ring -- or at least into the loop -- so after dinner (and a prophylactic rum cocktail) I headed down to South Lake High School to hear what Southeast Director Michael Tolley had to say about the District’s recently released School Reports.

These reports represent the District's effort to track each school’s progress on a variety of measures, from test scores to student absences to the teachers’ feelings about their school’s leadership. The schools have had annual reports before -- they’re available online going back to 1998 -- but these new ones go into considerably more detail. They also include a one-page Improvement Plan for each school -- goals to raise achievement, or attendance, or whatever -- and a description of what the school is doing in order to reach those goals: instructional coaches, individual tutoring, more collaborative staff time, and so on. And every school has now been ranked on a five-point scale based on overall student performance and improvement on standardized tests, and the achievement gap between poor kids (those who qualify for free and reduced-price lunches) and everyone else.

To no one’s surprise, the Level 1 and 2 schools -- with crappy test scores that won’t budge -- are concentrated in the South End, and more Level 4 and 5 schools -- high scores, more growth -- are in the North.  In fact, the Southeast Region contains six Level 1 schools, five Level 2 schools, and no Level 4 or 5 schools at all. Meanwhile, in the Northeast Region, there are eight Level 4 or 5 schools, and not a single school has a rating lower than 3. (It seems worth pointing out, as my friend Dan did, that if you ranked Seattle’s public schools according to their free-and-reduced-lunch populations, you’d get a pretty similar distribution -- much of the disparity between North and South may be due to the greater challenges involved in teaching poor kids. Cleveland High School's principal (whose name is, wonderfully, Princess Shareef) said last night that she was sure if her staff switched places with the staff at, say, Roosevelt, Roosevelt’s test scores would stay just as high. "Our kids just need more – more support, more resources – in order to succeed.")

The purpose of the meeting at South Lake was to get feedback from the community about the new reports, and about how SPS should go about improving schools -- across the board and in Southeast Seattle in particular. Now, I tend to process big questions more slowly than the typical public meeting structure allows. I’m rarely able to stand up and deliver my testimony to a crowd, or even scrawl it on those ubiquitous comment cards with any kind of coherence. But most of the time I do -- eventually -- have something to say.  So, SPS, since you asked, here is some feedback:

First of all: I feel like your relentless reliance on quantifiable data is obscuring some of the things that really matter to me. Now, I know it's important to have specific goals and to track measurable results. But in order to get data to tell you anything useful, you have to understand its context. You have to know where the numbers came from, and what their limitations are. You have to ask yourself how (and if) what they're measuring relates to what you think is important. You have to put the data in the service of human connections, experience, and judgment -- instead of the other way around.

My friend Bryan, for instance, has explained to me how as a math coach he helps teachers at South Shore analyze MAP results, unit tests, and "exit slips" to identify which kids need extra help, to decide which Everyday Math lessons they can skip, and to gauge kids' mastery of a range of tasks, from basic regurgitation ("find the perimeter of this rectangle") to higher level thinking ("when would you need to find the perimeter of a rectangle?"). The numbers inform, but do not dictate, the work. This seems like a fine -- though quite labor-intensive -- way to use the numbers.

Your school reports, on the other hand, are brimming over with authoritative-looking statistics, but offer very little in the way of context or explanation. What do these numbers mean? Not always what you think. Charlie Mas has pointed out the truly tangled math behind the “students making gains” figure: it apparently doesn’t measure whether the kids did better on the state tests than they did the year before, but whether they did better than at least a third of the kids who got the same score they did last year. Which of course two-thirds of them will, by definition. So unless that stat is above 67%, it’s actually bad news. (Do I have this right, Charlie?)
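If I've got Charlie's arithmetic right, you can sanity-check that built-in two-thirds floor with a toy simulation. (This is my own sketch of the logic as described above, not the district's actual formula; the student counts and score buckets are made up.)

```python
import random

random.seed(2010)

# Toy model of the "students making gains" statistic as described:
# group students by last year's score, then call a student a "gainer" if
# this year's score beats at least a third of the peers who started from
# the same prior score.
N_STUDENTS = 1000
N_SCORE_BUCKETS = 5  # hypothetical prior-year score groupings

# Each student: (last year's score bucket, this year's score).
students = [(random.randrange(N_SCORE_BUCKETS), random.random())
            for _ in range(N_STUDENTS)]

# Collect each prior-score bucket's current-year scores.
peers_by_bucket = {}
for bucket, score in students:
    peers_by_bucket.setdefault(bucket, []).append(score)

gainers = 0
for bucket, score in students:
    peers = peers_by_bucket[bucket]
    beaten = sum(1 for s in peers if s < score)  # peers this student outscored
    if beaten >= len(peers) / 3:
        gainers += 1

fraction = gainers / N_STUDENTS
print(fraction)  # lands near 2/3 regardless of how the kids actually did
```

However the students actually perform, roughly two-thirds of them clear this bar by construction -- which is exactly why a "making gains" figure below 67% is bad news, not good.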

Or how about this one: I was pretty shocked to read in Orca’s school report that 0% of last year’s 8th graders were ready for high school math. Zero? Seriously -- not one kid? (Well, maybe one -- there were fifty of them, I think.) And my children are going to this school? I learned last night that this big fat zero simply reflects the fact that Orca’s pass/fail "passes" got counted as “D”s by your computers.

I found this only slightly more reassuring than infuriating.

Orca chose to use a pass/fail system for the middle school in order to focus kids on their work instead of on their grades. (I won't bore you with the research that supports this decision.) Still, the teachers carefully entered each kid’s score on every assignment and test into your online database so we could all see how they were doing. And it seems to me that those percentages could easily have been converted to letter grades -- if your “data warehouse” hadn’t erased them all at the end of the semester. In Donte’s social studies class, my daughter actually had to get a 90% to pass -- no easy task, believe me -- and then, presto! She officially got a "D."

This year Orca has given in and is giving grades -- the ramifications of all those "D"s on everyone's transcripts were too serious, and the prospects for getting you to fix the system were too dim. (I long ago gave up on getting you to list Orca as both an alternative school and a K-8 on your web site.) I know this hardly rates among all the other bad news, but I find it pretty pathetic that we’ve had to compromise this aspect of our alternative pedagogy in order to accommodate your frakking computer system.

An even more egregious example of the damage that can be done by numbers taken out of context can be found at South Shore (formerly the New School). Test scores at South Shore went down, down, down last year, especially for poor kids and kids of color. Whoa, what happened? All that extra money from the New School Foundation and they can only get 11% of their fifth graders up to speed on the state science test? Did those teachers suddenly forget how to teach? Who's going to be held accountable for this fiasco?

Well, maybe we should string that whole staff up by their thumbs... Or maybe we should consider the fact that a week before school started, as South Shore was moving into a new building and hiring teachers for their new 7th grade, they were suddenly told that they'd be getting a whole bunch more kids -- including a whole third class of 6th graders -- from nearby schools that were closing due to budget cuts, or schools that hadn’t made "Adequate Yearly Progress" on standardized tests. Under the federal No Child Left Behind law, students at any school that fails to make AYP for two years in a row have the right to transfer to a school that has. And South Shore -- due no doubt in part to that extra foundation money -- was the only school in the South End that had made AYP.

Many of the kids who transferred from these officially failing schools had pretty intense academic needs, emotional and behavioral issues, and a deep distrust of authority -- particularly white teachers. The South Shore staff spent tons of time and energy working with those kids, and in many cases made a great deal of progress. (I wish I could say this didn't take away from the educational experience of the rest of the kids, but alas, the test scores indicate otherwise.)

Then it turned out that South Shore’s new building was full of toxic fumes that caused rashes, itchy eyes, and breathing problems -- conditions not conducive to learning, shall we say. In February the entire school had to be evacuated to two separate sites, and the staff and parents spent much of the rest of the year arguing with your facilities department and risk management lawyers about the causes and effects of the fumes.

Given all this -- an influx of at-risk, academically-behind kids; a mysterious, poisonous miasma; and the chaos of relocating, twice -- are we really surprised that South Shore’s test scores went down?

Do I sound like I'm making excuses? I'm not making excuses. Nor do I mean to imply that test scores at South Shore -- or anywhere else -- are predestined by the kids who walk in the door, or that schools with lots of poor kids and kids of color are doomed to failure. Not at all. Demography is not destiny.  It is absolutely the responsibility of Seattle Public Schools (and the rest of us too, if we know what’s good for us) to make sure every child gets a fabulous education, no matter what the color of their skin, the size of their parents’ bank account, or which end of town their school happens to be located in. There are schools out there that do a great job teaching all kinds of kids -- and South Shore is one of them. I'm just pointing out that sometimes factors outside the classroom get in the way of that happening. I had something of a front row seat to a lot of South Shore’s trauma last year, and I just can't stand to see those smug little numbers sitting there on that school report, totally failing to acknowledge any of it.

Now, these are just the two schools I happen to know the best. There may well be stories like these behind the other reports too -- both for Level 1 and 2 schools and the ones that are ostensibly succeeding. So I guess that's my first point: I'm not sure how you could incorporate more context into your school reports, but without it the numbers tell an incomplete and often misleading story -- damaging to you and to the schools themselves. 

I'm also concerned about the effect of all this quantifying, tracking, and reporting on your academic goals -- and more importantly, on your educational vision. When you measure achievement in this way, it makes learning seem like a linear process, a ladder of standards we are marching the children up, step by step. Remember that silly cartoon in "Waiting for Superman," where knowledge is this goopy substance the teacher is steadily ladling into the children's empty heads? If you think of education that way, then it's easy to set goals: just count backwards from graduation, and calculate how much goop you should be ladling in, year by year, to get there on time. The federal government's AYP goals were set this way: someone decided -- admirably -- that all kids should be meeting rigorous academic standards by 2014, looked at how woefully behind they were when the law took effect in 2002, and then calculated a graduated schedule of improvement -- this much by 2006, this much by 2010, etc. What they didn't do, it seems to me, is think about what kind of education we really want for our kids, and what it was really going to take to make it happen.

Some of the goals on your school reports read like this too -- incremental advances based on the (dismal) status quo, rather than on any kind of big-picture vision for our kids. For instance: Orca’s goal for next year is to have 38% of its 6th graders meeting state math standards. This would be a significant improvement over the 28% who met them last year, but surely the goal should be for all of them to meet the standards? I suppose that seemed too pie-in-the-sky to the tired person who filled out the form that produced these reports -- 38% is a more realistic target, I guess. But honestly, if less than a third of the kids are up to speed, maybe we need to rethink the whole enterprise, rather than sedulously plotting our plodding progress toward the halfway mark.

I dropped in on Orca's 6th grade Math and Science teacher yesterday to get his take on all this stuff, and came away both inspired and discouraged by what he had to say.

"What really motivates kids?" Jeff mused aloud. "I can sit here and tell them, 'If you learn these things, your life will be more beautiful,' or whatever. And they will just look at me. So then I say, 'Well, let's just say, what if it would,' and they look at me again. So then I say, 'Well, we're here anyway, we might as well do the work.'

"But what they really want, at this age, is to be engaged in the community, doing work that's real. Kids who are checked out in class, or goofing around -- if you put them in a situation where they're doing a real experiment, or a real project, with people counting on them to make something actually happen -- they snap to. They rise to the occasion. That's what gets their attention." (This was certainly my experience with the "No Place Like Home" project last year.)

I asked him if he thought data on student achievement could help him do a better job. "Well," he said, "within the context of a system where you set a standard, and the teachers work to teach it, and the students work to learn it, and then you move on to next standard and start again -- sure, data can help you push people up the ladder more efficiently. But is that the right way to do this?" He shrugged. "They walk in here excited and curious -- let's not kill that. Let's let them keep exploring, following their curiosity. And find a way for whoever shows up to move forward from where they started."

For better or worse, math is definitely a push-people-up-the-ladder proposition at SPS, so that's what Jeff does with the sixth graders under his care. And I suppose it makes some sense to make sure they've mastered place values before you teach them percents, and to tackle geometry before trig.

In science, however, he's got a little more leeway. SPS's science curriculum is based on a collection of "inquiry based" science kits. Sixth grade teachers choose two units from among four options: "Diversity of Life," "Magnets and Motors," "Solutions and Pollution," and "Truth about Science." (Am I the only one who finds that last title a little alarming? Are the other kits all full of lies?)

Jeff chose to take his kids down to Seward Park and have them do field work on water quality, native plants, and other things, using supplies and equipment from the district's "Truth about Science" kit. This meant walking down to the park once a week, rain or shine. It meant getting kids who had never spent time in the woods to feel comfortable there. It meant helping them develop scientific questions ("What can these tree rings tell me about climate change over the last century?") and figure out how to answer them ("Is this sample size big enough to tell me what I want to know?"). 

This approach -- getting the kids out into the world to do real experiments of their own -- may or may not prepare them for the state science test. (They won't actually take the test until 8th grade, and it could include questions about magnets or motors, solutions or pollution, astronomy or zoology, or anything in between: "Science" is a vast universe, after all.) But even if their adventures in the woods don't result in measurable academic achievement in the eyes of the state, would we really rather have those kids sitting in class swilling down quantifiable standards and filling out exit slips? Does that really strike you as a more likely way to inspire them to become scientists?

I really hope we can find a way to support teaching that starts with what's going on in the kids' heads, instead of with a giant cauldron of goop we want to spoon into them. But I'm afraid we're headed rather quickly in the other direction, both as a district and a nation.

Finally, SPS: Let’s be honest about what it's going to take to turn the schools of Southeast Seattle around, and what our prospects are for making it happen in our current context. When someone asked Mr. Tolley how the district plans to pay for all the extra coaches and collaborative time outlined in the School Improvement Plans, he talked about allocating resources more efficiently, targeting the needier schools with more support.

Of course, when we talk about "support" and "resources," what we really mean is M-O-N-E-Y -- precisely what the district lacks. When I asked a follow-up question about the budget gap already yawning before us, and the even deeper cuts coming down the pike, Mr. Tolley acknowledged that this was an issue, and suggested that we all fill out the district’s online survey asking us to prioritize different areas of the budget, so they can figure out where to focus the axe.

At about this point, a woman in the back stood up and said, "You know, every PowerPoint slide you've shown us has that blue banner across the bottom, 'Every student achieving, everyone accountable.' Given what you're telling us, I actually find that very offensive." Fionnuala, sitting behind me, was also frustrated by the lack of both achievement and accountability: "I want to see a plan, step by step: here's what we're going to do to fix this, and here are the consequences if it doesn’t get better.”

It seems to me that there's plenty of planning going on -- every school has identified the steps it intends to take to fix its failings --  but Fionnuala is correct in thinking that nothing in these reports (and nothing else we heard at the meeting) is going to prevent us from sitting here next year looking at the same damn numbers. As, of course, we have already been looking at them, for years.

Now, there may be no accountability here -- nobody to fire, nobody to sue, nothing to force us to improve the struggling schools that serve our neediest kids -- but there are certainly consequences if we fail to fix this. We can keep up this endless cycle of planning and implementing, monitoring and adjusting, responding and reporting -- and we can keep dancing clumsily around the fact that really, there just isn't enough money to do the job right.

But if we do, we’ll keep on losing rich white kids to private schools, and poor black kids to prison, and good teachers of all races to some other profession that doesn’t try to hold them "accountable" for the inequalities of our screwed-up society. Those are the consequences, it seems to me.

I don't know about you, Seattle Public Schools, but I find those consequences unacceptable. I got the sense they were unacceptable to pretty much everyone in that room last Monday, including Michael Tolley. If that's true, then all of us -- parents and teachers, principals and bureaucrats, legislators, business leaders, and students too -- we're all going to have to put our despair and cynicism aside, roll up our sleeves, and get our asses in gear. We owe it to Princess Shareef, who told Fionnuala, "I wouldn't be here if I thought it was hopeless." We owe it to Bryan and Jeff and all the other teachers who are out there using data and planning and imagination and compassion to teach our children the things we've decided they should know. Most importantly, we owe it to the kids who are getting up every morning and walking into class at Rainier Beach High and Hawthorne Elementary.

We don't have all the answers yet, that's clear. But we can at least support the things that are working -- and there are things that are working. We can find ways to spread those ideas throughout the system, hopefully without grinding the life out of them. We'll need to use data, of course -- but only if we're clear about what we're measuring, and clear also about the deeper educational vision behind those measurements. And somehow, somewhere, someday, we'll need to find the money to pay for it all.

11 comments:

Anonymous said...

Permission requested to share this with history blogger Historiann?
David Salmanson

Mikala said...

Dave -- Of course... Share away.

xo
mik

Susan Hayden said...

Oof, that opening graphic is a classic.

Mikala, as usual, this was eloquently expressed and hit the nail on the head. It sounds like what happened at that meeting that I missed was....nothing. Empty assurances and no real plan.

Susan Hayden said...

I should add about South Shore:

Our new principal held a meeting this week to address our dismal report card. Within her presentation she slapped up a slide with a graphic comparing scores of those who have been in the school for 3 years, versus 1 year. That graphic was very telling, and showed a very clear benefit for the students who have been in the program long-term. Too bad the SPS report card cannot communicate that benefit, or provide any analysis supporting or explaining the numbers. I'm afraid I've gotten into arguments with other bloggers who use our report to say, "see, New School program is a big fat failure". Aaargh.

Oh, and the Orca "pass" being counted as a "D"? That's the most ridiculous thing I've ever heard.

LG said...

Mikala,

Can you please send this to every school board member? Or I can do it for you if you want!

Also can I link to it from http://saveseattleschools.blogspot.com/ ?

-Laura

Mikala said...

I sent it to Betty Patu and Charlie Mas (of saveseattleschools) -- go ahead and forward or link, or whatever -- the more the merrier!

mik

Maureen said...

Thank you for writing this.

peonypower said...

Excellent post. I will say that the district is leaving all struggling students hanging out to dry, and location does not have much to do with it. Our school barely has any IA support and our sped. and ELL populations are nose diving due to lack of resources.

There is no planning on how to help these students. All district time and energy is spent on curriculum alignment rather than on how to engage and support students, and if an instructor resists all the nonsense and works on teaching students as opposed to meeting every "alignment requirement," they are dinged for not having a "purpose statement" on the board. So little of what is happening has anything to do with what students actually need, and it has got to stop.

dan dempsey said...

Great points about how the District uses data and how the District should be using data.

"To improve a System requires the intelligent application of relevant statistics."

The saddest part of this for me is that the data shows that the UW and the Central Administration are completely clueless as to how to bring about improvement. But from Page 9 of MGJ's Quarterly Update on Listening and Responding:

"We also feel strongly that our principals need ongoing support and training and so we provide them with ongoing opportunities to learn and grow as well. We established a new program, Superintendent Initiative for Leadership Development (SILD), which pairs central office leaders with principals in professional development sessions. As central office leaders we must find innovative ways to ensure all our schools, specifically struggling schools, have meaningful partnerships and support for high-quality teaching and learning."

It seems far more likely that solutions will be coming from those on learning's front lines than those at JSCEE.

These JSCEE folks need to give up on finding new innovative ways and try listening to parents and teachers. MGJ does not listen and does not intelligently and successfully apply either data or research. (In addition to being a very poor manager.)

TEAM MGJ has exhibited no feel for much of anything.

The Superintendent regularly violates laws and policies.

There are decisions that would produce positive results, but those would be based on peer-reviewed research and a knowledge of a school's community and students.

The Board and the Superintendent are thus far oblivious.

It often seems that any success at schools occurs in spite of those at the JSCEE.

The best comment on this is from Dr. Eric Anderson's original memo sent to the School Directors on 2-2-2010 in regard to the New Tech Network contract for Cleveland:

"Since the data is mixed, the primary question is whether Seattle Public Schools believes strongly in the research-based NTN learning model. Success will more than likely depend on the quality of the program implementation. Knowing ahead of time that the NTN model does not guarantee strong results only enhances the degree to which the burden falls on the district and the schools to achieve success."

Of course MGJ never sent that message to the Court nor is that the message she used in constructing the 3-12-2010 NTN action report. She said she used the memo sent to the Board but instead she used a draft memo from an earlier time.

That means she committed forgery. MGJ is filled with jargon and completely devoid of recommendations that are based on sound and thoughtful research.

She and her team have NO ability to intelligently apply relevant data in the making of thoughtful decisions.

This is so sad. The Attorney General should have already been investigating the Superintendent.

When $800,000 goes to NTN, millions for MAP, and tens of millions to close schools and reopen them, it is no wonder that there is little money left for interventions for struggling learners ... Oh right, Dr. Enfield said they are looking for outside funds for interventions. A core item in any instructional plan and they are hopeful of maybe finding outside funding.

Good Luck to struggling learners and families with this crew entrenched at the JSCEE.

dan dempsey said...

Thanks for the Cool Graphic at the Start of the posting.

Here is my take on that.

What is said about:
How we will achieve our goals:

(1).. Plan (2).. Implement (3).. Monitor and Adjust
(4).. Respond (5).. Report (6).. Plan

Here is what actually happens and why so few goals are achieved.

How we will NOT achieve our goals:

(1).. Plan poorly, neglecting most of the relevant data and peer-reviewed research.

(2).. Implement poorly in a rushed manner

(3).. Monitor and Adjust : rosily report on how it is going.

(4).. Respond by deflecting criticism and not changing much of anything.

(5).. Report by making up new and misleading ways to look at what is happening. 66% of students District wide are making gains on Standardized Tests.

(6).. As the above is perfect no corrections are needed, so MGJ and Staff can Poorly Plan some more stuff in the same defective manner as before.

ParentofThree said...

This comment really stood out to me:
"Our kids just need more – more support, more resources – in order to succeed."

Cleveland/STEM was just allocated a huge chunk of resources starting this fall. However...were they the "right" resources? Will the pricey NTN contract convert into student success? Or could that $600K have been better spent?

Since this comment came from the principal of this school, I have to wonder...