In the name of time and coherence, I am planning on cutting the last two paragraphs when I address the school committee.
My freshman year in high school, two men, Mr William Carroll and Mr Ronald Howland, co-taught my Humanities course. They, among others, shaped my teaching, and I think of them often. Tonight, I am picturing two handmade posters that hung in Mr Carroll's room. They simply read, "Says who?" and "How do you know?" These two phrases have stayed with me for twenty years. When I was teaching, I tried to share their sentiment with my students, and now I reflect on them in the face of education reform.
I do not accept information thrust at me without knowing "says who?" And I do not accept change without asking "how do you know" that it is a step in the right direction for students, teachers, and schools.
At the last school committee meeting, a teacher raised several technology-related concerns with the MEA. Since then, the district has applied to give paper tests to students using iPads if the problems are not resolved. I hear Mr Carroll saying, "Who will say the glitches are fixed? How will you know the problems are fixed?" And I answer, "Well, the people who said there were no problems in the first place will tell us they are fixed. And I have no idea if anyone in the district will be given proof that those problems are fixed and that no additional ones will arise."
But Mr Carroll's words echo from the past on more than just the technology issues associated with the test.
How do I know the test content contains no mistakes? I don't know. But Sarah Blaine's piece entitled "Pearson's Wrong Answer" raises serious concerns about small inaccuracies in tests, especially when dealing with non-released test items, for which the test writers face no accountability.
How do I know the open-answer portions of the test will be scored appropriately? I don't know how they are scored beyond the packet on Smarter Balanced's website. Who is doing this work? People hired off of Craigslist, with no education background and no interviews? Ludicrous, I know, but it's happened before.
How do I know the test content is appropriately rigorous? My personal experiences with the practice tests tell me it is not. The 11th grade exam is "hard," but not for the right reasons. The questions require the student to guess what the question writer wants, a practice about which standards and assessment expert Rick Wormeli would have much to say. (Blogger Peter Greene has already addressed it in his piece "Sampling the PARCC.") And, as I shared with the school committee in a unanimously unanswered email, the directions and interface for the 3rd grade exam are overwhelming and developmentally questionable. So, who says the test is rigorous? Smarter Balanced thinks it's the wave of the future. The DOE seems pretty pleased, too. But I have not heard a single parent or teacher say, "I think this is spot on, and it is what I want for my kids."
I am also a citizen and a taxpayer, and as such my questions are not frivolous. They are not bred out of fear or misunderstanding. They are the types of questions Mr Carroll taught me to ask, first as a student and then as a teacher. These are the college- and career-ready, 21st century thinking skills we should be teaching our students. They are the types of questions we should be purposefully and actively asking all of our teachers to raise. We should not accept what we are told without evaluating who is pushing it and whether it is valid. Doing so devalues our professionals and sets a poor example for our students. Mr Carroll would not be proud.
So, I ask them. But I have an awfully hard time finding the answers. The information is simply not out there about Smarter Balanced the way it is about PARCC. In fact, I read an excellent series of blog posts by Russell Walsh about the readability of the PARCC exam. I tweeted him, and he posted corresponding information about the Smarter Balanced exam. And guess what? It was all good news. But before that, when I saw a respected professional question the PARCC test and wanted to know the same information for our test, I couldn't find it.
Last week, I wrote to the US DOE asking for specific, detailed information about the consequences for Maine if testing participation rates were not met. I received a brief reply, written in legalese, that did not actually answer my questions at all. Lucky for me, acting Education Commissioner Tom Desjardin was cc'd on the email. Sunday morning, I awoke to a copy of the message he had sent to the US DOE, calling them out on their non-answer. It is the only time I can recall a person in a position of power advocating for me. Someone who said: this is unacceptable, and you deserve a real answer. That one email took me from banging my head against the wall in frustration to a renewed energy to insist that these questions be heard and answered.