Wednesday, January 12, 2011

concerns with planning process

Open letter on concerns with the School District 57 District Planning Process from which the District Plan for Student Success (DPSS) and the School Plans for Student Success (SPSS) owe their origins.

The 2010-2011 DPSS, our annual achievement contract required by the Ministry of Education, has significant problems and needs to be challenged if the school district wants to take improvement plans seriously and if it actually wants staff at schools to consider the goal(s). The DPSS is not without merit -- many parts simply report what is happening around the district -- actions worth celebrating, as they involve the hard work of educators who are passionate about their subjects and care about the progress of their students. Other parts, such as the preamble, have been carried over from previous plans with few edits (see DPSS_2006-2007.pdf and DPSS_2007-2010.pdf). The core of the DPSS document (the goal and what is to be done with it) has some alarming deficits. As a backgrounder to these observations, I'd suggest taking a close look at the plan itself first (DPSS_2010-2011.pdf), and maybe one or two of the SPSS documents (probably archived; if not, some samples: PGSS, Duchess Park, Kelly Road).

The District Planning Process describes the planning cycle that involves school plans and district plans for student success. It was laid out in the previous DPSS but was not mentioned in the 2010-11 DPSS. What has happened -- has the process changed? Has it been dropped as part of cutbacks (a reduced capacity to follow the process)? Has anything replaced it? Will anything meaningful be done with the new SPSS? Is it fruitful to submit a plan for which the recipient has given no indication that the plan will be used as intended? Specifically, what happened to the feedback cycle for the SPSS and its use in building the DPSS, neither of which has occurred as described or scheduled? While it may be difficult to answer all these questions, doing so will help us understand the context for the plans and the problems with the planning cycle.

The SPSS on its own has not proven necessary to inform teacher practice or departmental collaboration, although it might serve other beneficial functions such as recording teacher practice and departmental collaboration. This was the direction taken in the last few plans at D.P. Todd and elsewhere -- to capture the dialogue among educators for the benefit of the plan's audience and the reflection of staff. These functions are not held in high esteem when they are not read or reviewed by the SBO that has asked for them as part of a Ministry requirement. Like those of most teachers I know, my own cycle of praxis draws from a deep well of professional literature and supportive colleagues at school and elsewhere; there is nothing explicit in the school plan that I need to complete this cycle. I believe there are strong possibilities for the power an SPSS can have, but when it becomes an obligation and is not useful either to the teacher or the school district, it is time to revisit individual department contributions that are a tradition rather than a requirement.

There was never any clear expectation that all departments would write a plan, nor, in the absence of department plans, that there should be one single school plan. The plans are also different from school to school. Some schools did not submit an SPSS last year -- Carney/ACS and Heather Park did not submit plans (understandably), but neither did CHSS. D.P. Todd, for example, started around 2004 with a few years of department-based plans; from 2006 to 2009 we had school-wide plans; and from 2009 to the present we have been back to department-based plans, so we have no standard model to follow. The only required aspect of the SPSS is that the principal must submit one, although administrators, too, may want to ask at the SBO why they should submit a plan that has limited usefulness for staff and appears to be of no use to the SBO.

Our understanding of professional learning communities and the role of legitimate performance standards has come too far for us to simply disengage from critical thinking when asked to complete perfunctory exercises. Does administration have plans to address concerns with the abandonment of the District Planning Process and the significant deficits of the DPSS? As teachers, we can offer a knowledgeable critique of plans and data, but administration is better placed to ask the difficult and necessary questions at the board office related to the DPSS. This is a great opportunity for administration to lead change by insisting that the SBO wake up on the DPSS/SPSS process and shorten the "knowing/doing gap" we hear about when theory does not meet with practice.

Here's what seems fairly clear to me having read the DPSS and the other SPSS documents that were posted last July, and having participated in and watched closely the district planning process for the last 9 years:

1. The DPSS contains a number of logical fallacies. The first is the inclusion of what should be two paths of inquiry in the same goal statement. Independent learning and formative assessment are both meaningful and complementary goals, but pursuing one does not assure the other. Logical fallacies in the document also include the choice of data, confusing reference for rationale, mistaking correlation for causality, mismatching of the goal to objectives and strategies, and an unclear focus. The cover suggests the focus is independent learning, the puzzle pieces suggest a 4-part focus, the second page suggests the focus is a paradigm shift from teaching strategies to improved learning, and the footer suggests the focus is personalized learning. Again, complementary, but not necessarily correlative. Additionally, there are some false statements in the District Strengths section, such as "this distributed responsibility has led to a great degree of staff and partner group engagement in all aspects of decision making." Many of these errors could have been cleared up at the editing phase.

2. The DPSS goal is incorrectly matched to data. How should the widespread use of formative assessment be measured? Uptake of concepts promoted through district pro-d (registration data)? Survey responses from teachers about their methods of instruction and assessment? External evaluation by experts in assessment? Anything qualitative or quantitative to do with formative assessment? None of the above -- the plan acknowledges how difficult this is to measure and instead displays the same statistics the SBO uses in all its reports: a panoply of summative assessments including success rates, grade transitions, FSAs, and provincial exam scores. It is ironic that while the district uses provincial exam scores to indicate the success of formative assessment, PGSS admin uses the same data to indicate the success of an attendance program, the Fraser Institute uses the same data to rank schools, and so on. Data can't be stretched like that and remain valid.

3. The desire to embed formative assessment everywhere is hollow. It is about as productive as saying "we want all of our teachers to be in the business of educating students and doing things that help students learn" -- that's not a goal, that's a condition of employment. Formative assessment has come to mean many things, though the DPSS connects it to 5 principles, a definition of AFL, and 6 strategies for AFL. Anyone who has spent time with these ideas will recognize that assessment and instruction are intertwined and that FA, AFL, and inquiry are all tools to examine classroom practice, steer away from stoic or rigid delivery, and focus on what and how students are learning -- these are very flexible ideas and are not new to the scene. When I started teaching in 1995 we called it "checking for understanding" and began our courses with "what do you know" assessments that we'd use to shape instruction. My dad Walt Thielmann talks about designing his English classes and curriculum at Connaught Jr. in the 1960s around the passions, problems, and questions of his students -- virtually all of the learning was formative and inquiry-based. Students contracted for grades based on the projects and inquiry they chose (very 21st century!). At most we can say that formative assessment is a gathering of various educational philosophies under a banner defined by its users. The choice of words in the goal is also of note: to "embed" is to lodge something firmly in place, to make it part of the habit or environment. This will look different in every classroom (user-defined), which makes the goal more of a mantra or vision than something practical. With a distinct area of inquiry thrown in ("create independent student learners"), this plan doesn't know what it is or what it wants.

4. The Objectives are largely unrelated to the goal itself. These include: address the unique needs of Aboriginal learners, increase play-based learning, use the "UDL" strategy with special-needs students, and offer joint teacher-admin pro-d (please tell me where and when this is happening; I have not seen one of these for many years). These are great objectives, but they do not depend on or flow from the goal -- without a context they appear quite random.

5. The Support Structures are not really support structures. The list includes "Families of Schools" -- simply a rebranding of Zones following last year's school closures -- this is not a support structure, it is a description of catchments. The suggestion that renaming zones as families will improve communication between schools is cynical, especially given the reduced capacity for district-wide communication in the wake of "right-sizing." The second structure listed is "Learning Teams," followed by a highly arguable narrative of how they came into being. The learning teams pre-date the goal and involve a small fraction of the district; they may be useful or positive, but they are not substantive instruments driving change towards the stated goal. The third structure, "Working Meetings," is mysterious, as it describes unknown presenters and unknown ideas and/or strategies. Maybe there will be snacks at these meetings.

6. The Strategies are simply a list of projects already underway in the district and largely independent of the SBO. Seven of the strategies relate to Aboriginal learning and inner-city schools, four relate to early childhood learning & literacy, three relate to special education resources, two relate to math education, one relates to writing, one relates to teacher mentorship, one to administrator pro-d, and one relates to AFL. So, only one of the twenty strategies is directly connected to the goal; the rest are projects, highly commendable, but they would probably exist no matter what the goal stated. Imagine if we set out to teach a learning outcome and chose twenty activities to do so, but only one related to the learning outcome.

7. The SBO's recent track record on implementing district-wide goals is not strong. To use a relevant past example, last year the SBO (superintendent, a trustee, and the tech support coordinator) publicly committed to having and following a real plan for supporting teachers in a changing technology service scenario that included a transition to single-platform PC. Almost a year later there is no plan, no district support mechanisms (e.g. in-service, replacement specs, timeline for transition), and no points of contact to even dialogue about the issue. This work has been left to schools -- perhaps as it should be -- but then why bother with the commitment for district-wide support? There is less collaboration and follow-through on tech planning and direction than at any time in the last 13 years. The disbanding of the District Tech Team was one of the final strokes, with impacts including the rejection of at least five project proposals this year involving "21st Century Learning" technologies. The lesson is that published goals are not useful if the walk doesn't follow the talk. This need not be seen as a criticism -- one of the consequences of the "right-sizing" at the SBO was surely some lack of capacity, or even a total hiatus, on goal-setting, decision-making, and follow-up. Perhaps we shouldn't expect more from the SBO unless we're willing to see more money taken from school allocations.

8. It is doubtful the SBO will take its part of the DPSS too seriously when it has not done the same for the School Plans for Student Success. They have apparently not been read by SBO staff, let alone assessed using their SPSS rubric or handled according to their own District Planning Process described in the previous DPSS. I've polled the staff reps at every elementary and secondary school and have yet to find one that has received feedback of any kind on its SPSS. If some schools are extracting value from their SPSS, fantastic, because the SBO is not. Although the SPSS exists as a school growth plan and accountability contract for submission to and review by the school board (this is in the School Act), it seems that SPSS feedback was a higher priority for the previous C&I department, and there are no known plans to review the current SPSS documents. There do not appear to be any plans to align the DPSS with the SPSSs, something required as part of the District Planning Process.

9. The SPSS/DPSS model is broken. Some of the SPSS documents contain their own contradictions and comical ironies, and some schools did not submit an SPSS at all. Around 2008, the director of school services told a meeting of "POSRs" that after 5 years of District Plans for Student Success, they had produced no measurable results -- no impact! I had to ask her to repeat that twice, and asked if I could quote her. She said that planning was still important as it provided a chance to discuss common goals, etc., but there was no illusion about these being anything but compliance documents. To her credit, she hoped that the SPSS would become a record of what teachers talked about in schools regarding student learning ("living documents"), and not so much a perfunctory collection of goals and data. The move towards inquiry-based SPSS documents was meant to address this, although many of the inquiries in the SPSS documents are indistinguishable from the old "data dumps" other than stating goals in the form of questions (like Jeopardy responses). A survey of school plans reveals many challenges to overcome: some look slapped together, confuse correlation with causality, mash up bits of educational ideas or data types in the hope that they are congruent, lack editing, and use backwards-engineered goals to describe ordinary activities in the school. This last characteristic is at least close to "recording the conversation about learning at your school." Again, those schools who take inquiry seriously (e.g. they leverage the best of what the Network of Performance Based Schools offers) should be commended; what they're doing is closer to what the SPSS could have been.

10. A report is a report. Having written a few SPSS documents, I probably feel more put out than I need to about the last SPSS being "shelved," but I should not be surprised. Didn't we have the same concerns about the Accreditation process and documents? Isn't this common in bureaucracies? We hear far too often that reports and plans are a waste of time, but it behooves us to move beyond derision and either abandon perfunctory exercises or redeem the process. I think we should take the DPSS for what it is worth: a compliment on the good work done by educators and students, and an encouragement to keep thinking about how your practice can improve. Actually, if one crossed out the whole goal section, the rest of the DPSS would make more sense as a living description of what is already happening in the district. I've read some excellent SPSS and school growth plans from our district and others, and many poor ones, but sensible, inspiring district-level growth plans are quite rare. Imagine how hard it is to build a tent over the diversity of teaching and learning that occurs across an entire district. We would be better off with a wiki, forum, or annual gathering in which to share successes and challenges than we are with the present format.

These are observations and, of course, opinions, but I believe they are factual, and documentary evidence for all ten of these points is widely available (as well as suggestions for improvement and alternate models). These are not blunt criticisms (which are perhaps not appropriate for a blog post), but they are nonetheless critical in nature, as in "critical inquiry." I believe we work in organizational contexts that produce these kinds of results regularly and perhaps inescapably, and so critical inquiry is needed if we wish to improve public education. Any one of these ten observations should be enough to raise questions about the District Planning Process; the fact that there are ten (which is where I chose to stop) tells me the problem is endemic to a culture for which we are all responsible as public educators. These observations centre on processes used by our SBO, but they should be owned by the whole district, as we are all asked to contribute to the SPSS/DPSS cycle and have many opportunities to stop the comical parts in their tracks if we so choose. I would recommend starting by ensuring that each of our own schools' SPSS has goals that are legitimate and logical, data that matches the problem, and inquiry that is worthwhile and engaging. I think we have ended up with reasonable SPSS documents at my school in the past, but our plan needs to change if the context in which it is received no longer complies with the District Planning Process. In particular, a survey of SPSS documents shows confusion over what constitutes valid data and inquiry. This is a widespread problem that requires attention.

Again, I applaud the schools who use their SPSS to truly reflect the best of what they do, and I applaud the parts of the DPSS that recognize success where it is due. My motivation for sharing these thoughts is that school and district plans are published on the internet; they reflect on all educators and can be linked to individual schools, administration, departments, and teaching staff. I consider myself responsible for a part of the "plan-writing culture" in our district, as I was paid some money and time for five years to be, among other things, a plan author. As it stands now, the contradictions in the DPSS are embarrassing. SBO staff have talked about data-driven decisions and the knowing/doing gap; the first place these become an issue is in their own published plans. I really hope that trustees, school administrators, or SBO staff can take the time and form the resolve to let the writers of our DPSS know that their plan-writing is in need of some formative assessment.