NJDOE’s Declaration of War

I am an opinionated blogger, and I blog here in my personal capacity. Unlike some other bloggers doing excellent work in the world of education policy and beyond, I do not claim to be a citizen journalist objectively reporting the news. I’m just a mom with a keyboard and opinions. I occasionally manage to put my thoughts into words as I explore education policy from my perspective as a public school parent. And although I am an attorney, I do not pretend to be blogging in my professional capacity, and I certainly do not intend any of my musings here as legal advice.

That being said.

That being said.

That being said, the Acting Commissioner of New Jersey's Department of Education, David C. Hespe, appears to have declared war on parents and children who oppose his standardized testing policies.

Specifically, today the Acting Commissioner issued guidance to chief school administrators, charter school lead persons, school principals, and district and school test coordinators regarding “Student Participation in the Statewide Assessment Program.”  Go read it yourself.

But here’s my synopsis:

This. Means. War.

Acting Commissioner Hespe has declared war on us — and our children.

Acting Commissioner Hespe advocates punishing children for their parents’ political opposition to NJDOE’s destructive over-testing policies.

NJDOE has crossed the line.

Hespe says:

We have received a number of inquiries regarding the ability of parents and students to choose to not participate in the statewide assessment program, including the Partnership for Assessment of Readiness for College and Careers (PARCC) assessment. In an effort to clarify school district responsibility in this regard, the Department is providing the following guidance.

First, I want to take a moment to celebrate each and every person who has forced the Acting Commissioner of New Jersey’s Department of Education, David Hespe, to respond to the opt-out/refusal movement.

Thank you for fighting back against the over-testing of our children along with its predictable results: test-prep-focused classroom practices and narrowing of curriculum (just once, I’d like to see social studies instruction at my daughter’s school that is something more than map skills). There is something wrong when our 9 year old fourth graders are expected to sit for more testing than our state’s aspiring attorneys must take to become licensed to practice law. (A New Jersey fourth grader is expected to sit for 10 hours of PARCC testing plus 90 minutes of NJ ASK science testing, for a total of 11 hours and 30 minutes of testing. The New Jersey Bar Exam is a total of 11 hours and 15 minutes of testing.)

I am still in the process of educating my fourth grader about the pros and cons of the PARCC testing, and as a parent of a high-functioning and inquisitive fourth grader who does not suffer from test anxiety, I am letting her have input into the decision our family is going to make regarding whether she will be sitting for these tests this spring, rather than forcing my decision on her. After all, this is her education, and she is the one who will ultimately suffer any consequences. As much as I’d dearly love to just refuse on her behalf, I won’t do so unless she is on board. So for now, our family remains on the fence.

Second, Hespe’s key argument supporting testing is: “Federal funding of key education programs is dependent upon districts meeting [No Child Left Behind’s Adequate Yearly Progress] requirement.” Really? That’s the best you’ve got?

Note that Hespe does not identify which key education programs’ funding is contingent on our kids taking these tests. As FairTest.Org notes, “In a state with a waiver, a ‘priority’ school must set aside 5-15% of its federal Title I and II funding to use in state-approved programs in the school. The money is not ‘lost.’ It generally may be used for various school improvement efforts.” Here’s the link. New Jersey is a waiver state, so I’d love to know exactly what federal funding the Acting Commissioner believes my daughter’s school will lose if she and more than 5% of her peers refuse the test. I have not done the research myself, but from the limited reading I have done, it appears that this is a toothless threat.

Third, here is Acting Commissioner Hespe’s actual guidance — or, more accurately, his declaration of war:

In accordance with the above, State law and regulations require all students to take State assessments. For the 2014-2015 school year, the PARCC assessment will replace the prior statewide assessments – the NJASK in grades 3-8 and HSPA in high school; as such, all students shall take the PARCC assessment as scheduled. Since the PARCC assessment is part of the State required educational program, schools are not required to provide an alternative educational program for students who do not participate in the statewide assessment. We encourage all chief school administrators to review the district’s discipline and attendance policies to ensure that they address situations that may arise during days that statewide assessments, such as PARCC, are being administered.

In short, Hespe says:

  • State law requires all students to “take” State assessments;
  • PARCC is required by the State, so schools are not required to provide an alternative education program for students who do not participate in the statewide assessment; and
  • NJDOE “encourages” all chief school administrators to review district discipline and attendance policies “to ensure that they address situations that may arise during days that statewide assessments, such as PARCC, are being administered.”

Let’s take Hespe’s Declarations of War one at a time.

(1) State law requires all students to “take” State assessments and “all students shall take the PARCC assessment as scheduled.”

Or what?

Or what, Acting Commissioner Hespe?

At the end of the day, no one at my daughter’s school can force her to click a mouse, type on a keyboard, or pick up a pencil.

So I say, bring it. It doesn’t take a Ph.D. to realize that it is fundamentally wrong to base education policy on essays our fourth graders are required to type — when they’ve never taken typing classes. Last spring, when I began exploring these tests, I watched my daughter struggle for over 7 minutes to input the answer to a math question she’d solved in 30 seconds. I’m a mom and a former teacher, and I see no value whatsoever in tests that measure my 4th grader’s computer savvy rather than her academic skills.

(2) PARCC is required by the State, so schools are not required to provide an alternative education program for students who do not participate in the statewide assessment.

Last spring, as New York’s opt out movement alone grew to more than 60,000 students, a lot was written about so-called “sit and stare” policies. See, e.g., this piece from The Answer Sheet blog at The Washington Post.

But unless I’m fundamentally misreading this memo, Hespe appears to be encouraging districts to adopt sit and stare policies in an effort to intimidate parents into not opting their kids out.

Bring it on, Acting Commissioner Hespe. Bring it.

It appears to me that you’re taking a page from your boss’s playbook by telling those of us who disagree with you to “sit down and shut up.”

It appears to me, Acting Commissioner Hespe, that you’re trying to bully those of us who do not see the value in your precious PARCC tests by punishing our children.

That’s low, Acting Commissioner. Really low. And do you know what? You don’t intimidate me. All you’ve done is piss me off. And Acting Commissioner, I’ll tell you this: pissing off parents — and voters — like me is probably not the way to ensure the long-term success of your policies. You were just a faceless bureaucrat. Now I want to get you fired. You deserve no less for attempting to bully parents by punishing our children.

(3) NJDOE “encourages” all chief school administrators to review district discipline and attendance policies “to ensure that they address situations that may arise during days that statewide assessments, such as PARCC, are being administered.”

I’m not certain what Acting Commissioner Hespe is getting at here, but I suspect his purpose may be to suggest that districts should implement attendance and discipline policies that will impose punitive consequences on children whose parents opt them out of these tests.

Is Acting Commissioner Hespe really suggesting that parents who keep their children home during testing are risking their children’s promotion to the next grade as a result of too many absences?

Is Acting Commissioner Hespe really suggesting that school districts should implement discipline policies that will impose punishments on children who refuse testing?

Would Acting Commissioner Hespe attempt to link state funding to local districts with the local districts’ willingness to implement punitive measures against those children whose parents refuse PARCC on their behalf?

The next step for me will be to see how my district interprets this guidance. Will it opt to provide alternate educational experiences and keep its promise that no academic decisions will be made based on this year’s test results?

But one thing seems certain. Acting Commissioner Hespe is scared. Really scared. He’s scared that the PARCC consortium is coming apart at the seams.  He’s scared that his precious testing regime is about to implode before it gets started. He’s scared that his tests aren’t going to generate enough data about enough kids to satiate the data monsters.

And he’s terrified of the growing opt out movement. So Hespe’s doubled down on PARCC. First he linked high school graduation to PARCC testing. Now he’s threatening parents and children who refuse PARCC. Defensiveness is rarely a sign of strength.

So as awful as this guidance is, it tells me that we’re winning. In a post-Citizens United world, there’s still some hope for grassroots activism and organizing. We are winning the war to do away with excessive and punitive standardized testing. And, of course, the whole education reform movement relies on its standardized testing foundation.

All the Acting Commissioner did with this policy was to galvanize me, for one, to fight harder. Who’s with me?

P.S. For a terrific analysis of the Acting Commissioner’s magical thinking with respect to the supposed benefits of PARCC vs. the now apparently fatally flawed NJASK/HSPA (the portion of his letter I didn’t get around to analyzing), check out Peter Greene’s piece over at Curmudgucation. Will Hespe’s Magical PARCC promise my kids ponies? What about unicorns?

Questioning the Test (Or, My List of Skeptical but Respectful Questions Regarding PARCC)

My daughters’ school district is holding a series of “PARCC Family Presentations” over the next few weeks. The presentation targeted at parents of third through fifth grade students is set for this Thursday. In preparation for the presentation, the district has — to its credit — announced that it is soliciting questions regarding the PARCC assessments. So I sat down and generated a list of my current questions. Then I went to submit them via the District’s Google Docs form, and promptly discovered that the district’s form imposes a 500 character limit per subject.

As you will see below, after spending the past year or so educating myself about the PARCC, my questions far exceed 500 characters, so I emailed my questions directly to our district’s Chief Academic Officer. I really hope that we get some honest answers to these questions. Here are my questions, which I hope are skeptical but respectful (slightly edited to take out the district-specific language I used in my email), plus a few additional questions that I thought of after I sent my email. Please suggest additions to my list, comment on my list of questions, and let me know if your school districts are holding similar information sessions. If your district is holding similar sessions, please attend one so that you can learn what your district is saying and ask your own questions. Of course, if my concerns mirror yours, please feel free to adapt my questions for use in your own school district.

Most of all, even if your children are not third through eleventh graders, please educate yourselves about these tests, and think critically about where our schools are headed now that many states, including but not limited to my state, New Jersey, have doubled down on implementing high-stakes standardized testing for our students.

PARCC FAMILY PRESENTATIONS — QUESTIONS

I. Testing Administration

1. What will happen if I decide to have my child refuse PARCC testing? Will there be consequences for my child, his/her teacher, and/or his/her school? Will my child be forced to “sit and stare,” or will s/he be provided with an alternate educational experience?

2. How many hours of testing for 3rd graders? 4th graders? 5th graders? How much total time per school will be spent on a testing schedule given that all children in the grade level cannot test simultaneously? Will children miss their [elective] classes during PARCC administration, even if they themselves are not testing? What impact will testing have on the [elective] programs at [my daughter’s school]? Is [the technology teacher] teaching fewer technology electives than in the past due to PARCC preparation?

3. Why is it necessary (from a pedagogical perspective) for our students to be tested in both March and May?

4. What in-district adults are proctoring and reviewing the PARCC tests to ensure that the test questions are not poorly worded or ambiguous and that correct answer choices are provided for multiple choice tasks? Will those people be able to speak out if questions are poorly worded or if no correct answer choices are provided, or are they going to be required to agree to gag orders before they can administer the tests?

II. Scoring and Reporting

1. Will a school or schools in this school district face in-district consequences (e.g., steps taken to dismantle a school’s magnet theme) now or in the future as a result of its performance on PARCC?

2. I understand that although New York is not a PARCC state, it has been giving Common Core-aligned assessments for two years now, and the passing rate has dropped from over 60% to under 30%. What percentage of New Jersey/[our local district] children are expected to pass PARCC in 2014?

3. What data do you expect to receive from PARCC that will be available to classroom teachers to guide instruction? When will PARCC scores and results be available?

4. Who scores the subjective portions of the PARCC tests? What are those people’s qualifications?

5. Will PARCC results be part or all of the criteria used to identify [gifted and talented] students going forward? What happens if my child was previously identified as a [gifted and talented] student, but loses that designation because s/he lacks the technology skills to succeed on the PARCC assessments?

III. Technology Skills

1. What steps are you taking to ensure that our 8, 9, and 10 year old students have the typing skills necessary to compose essays with keyboards? How much time is being spent on preparing children to acquire the skills necessary to master the PARCC interface? Is the preparation process uniform throughout the district? If it is not, doesn’t this mean that we won’t be able to make apples-to-apples comparisons of student scores even across the district?

2. Have you done comparisons of the time on task necessary for students to answer PARCC sample questions with paper and pencil versus with computers? If so, what were the results?

3. What happens if computers break, internet service goes down, or the children encounter other technological difficulties during their testing windows?

IV. Content Areas

1. I have seen virtually no evidence of specific social studies instruction (stand alone ELA worksheets with “social studies themes” do not count in my book) and very little science instruction since [our district] started implementing Common Core and preparing for the PARCC assessments. What steps are you taking to ensure that our children are learning the history and civics necessary to become informed citizens and voters?

2. Will students lose points on math assessments if they do not use specific Common Core strategies to solve problems (e.g., performing multiplication the traditional way rather than drawing an array)? My child lost full credit on the following Envisions math test problem this year: “Write a multiplication sentence for 3 + 3 + 3 + 3 + 3 = 15” because she wrote 3 x 5 = 15 instead of 5 x 3 = 15. Will children be losing points on PARCC for failure to make meaningless distinctions such as this one?


V. Additional Questions I Should Have Asked

1. What effect do you expect the PARCC test to have on our district’s efforts to close the achievement gap? Given the wealth disparity — and resulting inequities in home access to technology — in our district, aren’t these assessments likely to magnify our district’s pre-existing achievement gap?

2. What preparations are you making to care for our children’s emotional and social health during these tests (and when the results become available), given the likelihood that far more students are going to struggle with — and fail — these tests than struggled with and failed the NJ ASK?

and finally

3.  How can it be developmentally appropriate for our 9 year old fourth grade students to spend 10 hours on PARCC testing when many adults cannot handle the stress of the 11 hour and 15 minute New Jersey Bar Exam?

UPDATE:

VI.  Additional Questions Suggested by Readers — Please also see the additional excellent questions in the Comments section, and feel free to add your own!

1.  What demographic information will be collected in connection with our students taking this test?  Who will this demographic data be shared with, and what controls are in place to make sure our students’ demographic data isn’t sold for marketing or other purposes?  

2.  Will some or all of the tests be made public after testing so that we, the community, can review the questions and the sample/model answers and so that our children’s teachers can actually use the assessment data to guide classroom instruction?  In the absence of such a release, what value does the assessment data provide to classroom teachers?

3.  What costs — in addition to the one million dollars the district allocated to capital spending this year to support technology upgrades — are associated with preparing our students and their teachers for the PARCC tests?  What portion of our personnel budget is attributable to time spent on preparing for and proctoring these exams?

Pearson’s Apology

For everyone who read and commented on my prior post, Pearson’s Wrong Answer, first of all, thank you.  The response has been overwhelming.  Second, I just wanted to take a moment to let you know that my post did eventually percolate its way to Pearson, and a Pearson representative named Brandon Pinette appears to have left a comment on the blog post today:

Pearson did make an error on the specific quiz question in a lesson in the Envision Math textbook and we sincerely apologize for this mistake. We corrected the error for future editions of Envision, but failed to adjust the question in editions currently in the field. We owe it to our students and teachers to ensure these types of errors do not happen in the future, and are committed to adapting new protocols to fix mistakes before they happen. Trust in our products and services is key and we have to earn it every day with students, teachers and parents.

Thank you,
Brandon Pinette
Pearson

It seems only fair to make sure that this specific apology for this specific mistake gets highlighted, rather than buried as one of almost a hundred comments to a blog post.

However, from the overwhelming responses and comments this blog post has received (here, on Facebook, and on Valerie Strauss’s blog at The Washington Post, The Answer Sheet) one thing seems clear: this is not an isolated problem (either for Pearson or for textbook and academic material publishers in general).  Because my child is slated to take the Pearson-developed PARCC tests this spring, my focus is on Pearson.  Mistakes in other textbooks are annoying, but my specific concern about Pearson is its vertical integration throughout the education world: i.e., Pearson writes the textbooks (mistakes and all), Pearson writes and grades the PARCC tests, Pearson provides remedial programs for students who fail the Pearson-generated tests, and Pearson writes the GED tests for those students who drop out of high school.

I encourage anyone who finds other mistakes in Pearson materials to take photos of the specific mistakes, and then Tweet them with the hashtag #PearsonsWrongAnswers.  

I am glad that Pearson is “committed to adapting new protocols to fix mistakes before they happen” and that Pearson recognizes that “Trust in our products and services is key and we have to earn it every day with students, teachers and parents.”  

But I still think that we need to continue to hold Pearson accountable.  

Many commenters have rightly pointed out that there is supposed to be statistical analysis of standardized test questions, and that mistaken questions on the standardized tests will be thrown out as invalid. I am sure that this does happen. However, with tests as high-stakes as these, I am not sure that this is a sufficient response.

For instance, imagine if this were a standardized test question. I could easily see a 9 or 10 year old test taker, who figures out that the correct answer is 546, struggle as she looks at multiple choice responses such as (a) 78 (b) 130 (c) 500 or (d) 63. And I think that some kids are more likely than others to be distracted by (and therefore waste time on) issues generated by mistakes such as this one. As a result, on a high-stakes timed standardized test, the time wasted on mistaken questions like this one may artificially deflate a child’s score. And similarly, the child who arrives at the intended but mistaken “correct” answer (in this case, 78 miles, which would be correct if Curtis walked 3 miles a day for 26 DAYS) may gain an artificial advantage because she isn’t bogged down by catching and mulling over the mistake. Throwing out the specific question will not address these issues.

And as long as we are addressing comments, for those commenters who think my post was an overreaction, so be it.  Perhaps it was.  But as noted above, Pearson has an awful lot of vertical integration throughout the education market, and Pearson’s employee himself admitted that “Trust in our products and services is key.” 

Pearson has to earn my trust.  And since its materials are at the heart of my children’s math education, I will be doing my best to look over its shoulder now, as much as anything as part of my decision-making process concerning whether I think I should join the movement to refuse or opt out of its standardized tests.  

Thank you all again.

Pearson’s Wrong Answer

Updated (Oct. 10): Pearson responded to this post in the comments section.  See Pearson’s Apology. 

Last Friday morning, my fourth grader handed me her “Thursday folder” shortly before we needed to head to the bus stop. I was glad to see a perfect spelling test, and a bunch of excellent math assignments and math tests. Time was short, however, so I flipped to the wrong answers. And sprinkled among the math tests, I came across two wrong answers that caused me concern.

The first problem was this:

Now, I looked at this problem before I’d had my morning coffee, and I wasn’t sure at first that I wasn’t just missing something. So I posted this picture to my Facebook feed, and asked my friends to confirm that I wasn’t crazy.

But my daughter was right: if Curtis walked three miles a day for 26 weeks, Curtis did in fact walk 546 miles.

3 miles/day x 7 days/week = 21 miles/week
21 miles/week x 26 weeks = 546 miles

I double, triple, and quadruple checked myself.  I pulled out a calculator.  
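For anyone who wants to check the arithmetic one more time, the whole calculation fits in a few lines of Python (purely illustrative; the numbers come straight from the word problem):

```python
# Sanity-checking Curtis's mileage: 3 miles a day, every day, for 26 weeks.
miles_per_day = 3
days_per_week = 7
weeks = 26

miles_per_week = miles_per_day * days_per_week  # 21 miles per week
total_miles = miles_per_week * weeks            # 21 * 26 = 546 miles
print(total_miles)  # prints 546
```

The only way to get the answer key's figure of 78 is to multiply 3 by 26 directly, i.e., to treat the 26 weeks as 26 days.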

My friends agreed: my initial reaction to this question wasn’t nuts. My daughter’s answer was correct. And they came up with some good theories for why the answer might have been marked wrong.

Perhaps the teacher was trying to teach children, especially girls, to be confident in their answers, and she’d been marked wrong due to the question mark.

Perhaps she’d been marked wrong because she failed to indicate the units.

Perhaps she’d been marked wrong because she hadn’t provided every step of her work (i.e., she’d figured out the first step (3 miles/day x 7 days/week = 21 miles/week) in her head), and therefore had paid what one of my friends memorably described as a “smart kid penalty.”

But they were all wrong.

My daughter is fortunate enough to attend an excellent public school and her responsive teacher both sent a note home and called me that afternoon to discuss (I’d scribbled a quick note asking what the deal was along with my required signature on the front of the paper).

It turned out that my daughter had been marked wrong for a very simple reason: the Pearson answer key was wrong.

Let me say that again: Pearson was wrong.

Pearson listed some totally different — and wrong — number as the answer. The teacher had missed it when reviewing the test with the morning class, but in the afternoon class she’d realized the problem. My daughter’s teacher apologized for forgetting to mention it again to the morning class (and for not having previously changed their grades, but to be honest, I really could not care less if my kid scored a 95% or 100% on a 4th grade in-class math test).

In the olden days, I’d have laughed it off. Once in a while, the textbook publisher screws up. In the olden days, that screw up was no big deal: it is mildly annoying to those of us who pay the taxes to buy the books, but it’s a pretty minor annoyance in the grand scheme of things.

However, these are not the olden days. These are the days of high stakes testing. These are the days in which our kids’ high school graduations hinge on tests created by the very same company — Pearson — that screwed up the answer to this question.

Tests we parents will never get to see.

Tests we parents will never get to review.

Tests we parents will never get to question.

So Pearson’s screw up on its fourth grade answer key doesn’t exactly inspire confidence.

Presumably, before the enVisions curriculum was published, Pearson checked and rechecked it. Presumably, its editors were well-paid to review problems and answer keys.

After all, Pearson itself describes this math curriculum as:

Written specifically to address the Common Core State Standards, enVisionMATH Common Core is based on critical foundational research and proven classroom results.

And yet… it was still dead wrong.

It seems that all of Pearson’s critical foundational research and proven classroom results in the world couldn’t get the question 3 x 7 x 26 correct.

To the uninitiated, I bet I sound nuts.  Who cares, right?  It’s just a question on a math test.  But if we are going to trust this company to get it right on high-stakes tests (where there is no public accountability), then the company better get it right all the time when it is operating within the public eye.  So this isn’t just about a fourth grade math test.  It’s all of the other Pearson-created tests my daughter is scheduled to take: in particular, the PARCC tests this spring, which are the ones that come with no public review, and no public accountability.  

Here, the test came home in my daughter’s backpack. As a result, there was an opportunity for public review and public accountability because I could review the test and question the wrong answer. The teacher could check the question and realize that the book was wrong, and substitute her own professional judgment for that of the textbook publisher.

And most importantly, the mistake was not a big deal, because the outcome of this test would not determine my daughter’s placement into an advanced math class or a particular school or even prevent her from graduating from the fourth grade. The outcome of this test would not determine her teacher’s future salary or employment. This test was nothing more than the kind of test our nine and ten year olds should be taking: a fourth grade in-class, teacher-graded chapter test. At most, this test will determine a small portion of my daughter’s report card grade.

But what about those tests that Pearson will be administering to our students this spring? We won’t be able to review the test questions, the answer keys, or our children’s answer sheets. We won’t be able to catch Pearson’s mistakes.

This spring, even if the answer really is 546 miles, Pearson will be able to write that Curtis traveled 1024 miles, or 678 miles, or 235 miles, or any other distance it wants. And we’ll never know that our kids weren’t wrong: Pearson was. But our kids’ futures — and their teachers’ careers — will be riding on the outcomes of those tests.

There has to be a better way.

In a low-stakes world, Pearson’s screw up was a low-stakes mistake. But now we’re forcing our kids — our eight, nine, and ten year olds — to live in a high-stakes world.

And in a high-stakes world, Pearson’s screw ups are high-stakes. So shame on you, Pearson, for undermining my daughter’s hard-earned (and easily eroded) math confidence with your careless error. I will parent my kid so that she learns not to second-guess herself with question marks after her answers. 

But Pearson, I will be second-guessing you. As publicly as possible.

And perhaps… just perhaps… I will start shorting your stock.