Pearson’s Apology

For everyone who read and commented on my prior post, Pearson’s Wrong Answer, first of all, thank you.  The response has been overwhelming.  Second, I just wanted to take a moment to let you know that my post did eventually percolate its way to Pearson, and a Pearson representative named Brandon Pinette appears to have left a comment on the blog post today:

Pearson did make an error on the specific quiz question in a lesson in the Envision Math textbook and we sincerely apologize for this mistake. We corrected the error for future editions of Envision, but failed to adjust the question in editions currently in the field. We owe it to our students and teachers to ensure these types of errors do not happen in the future, and are committed to adapting new protocols to fix mistakes before they happen. Trust in our products and services is key and we have to earn it every day with students, teachers and parents.

Thank you,
Brandon Pinette
Pearson

It seems only fair to make sure that this specific apology for this specific mistake gets highlighted more prominently than as just one of almost a hundred comments on a blog post.

However, from the overwhelming responses and comments this blog post has received (here, on Facebook, and on Valerie Strauss’s blog at The Washington Post, The Answer Sheet) one thing seems clear: this is not an isolated problem (either for Pearson or for textbook and academic material publishers in general).  Because my child is slated to take the Pearson-developed PARCC tests this spring, my focus is on Pearson.  Mistakes in other textbooks are annoying, but my specific concern about Pearson is its vertical integration throughout the education world: i.e., Pearson writes the textbooks (mistakes and all), Pearson writes and grades the PARCC tests, Pearson provides remedial programs for students who fail the Pearson-generated tests, and Pearson writes the GED tests for those students who drop out of high school.

I encourage anyone who finds other mistakes in Pearson materials to take photos of the specific mistakes, and then Tweet them with the hashtag #PearsonsWrongAnswers.  

I am glad that Pearson is “committed to adapting new protocols to fix mistakes before they happen” and that Pearson recognizes that “Trust in our products and services is key and we have to earn it every day with students, teachers and parents.”  

But I still think that we need to continue to hold Pearson accountable.  

Many commenters have pointed out, correctly, that standardized test questions are supposed to undergo statistical analysis, and that flawed questions will be thrown out as invalid.  I am sure they are right that this happens.  However, with tests as high-stakes as these, I am not sure that it is a sufficient response.

For instance, imagine if this were a standardized test question.  I could easily see a 9 or 10 year old test taker, who figures out that the correct answer is 546, struggle as she looks at multiple choice responses such as (a) 78 (b) 130 (c) 500 or (d) 63.  And I think that some kids are more likely than others to be distracted by (and therefore waste time on) issues generated by mistakes such as this one.  As a result, on a high-stakes timed standardized test, the time wasted on a flawed question like this one may artificially deflate a child’s score.  Similarly, the child who arrives at the intended but mistaken answer (in this case, 78 miles, which would be correct if Curtis had walked 3 miles a day for 26 DAYS) may gain an artificial advantage because she isn’t bogged down by catching and mulling over the mistake.  Throwing out the specific question will not address these issues.

And as long as we are addressing comments, for those commenters who think my post was an overreaction, so be it.  Perhaps it was.  But as noted above, Pearson has an awful lot of vertical integration throughout the education market, and Pearson’s employee himself admitted that “Trust in our products and services is key.” 

Pearson has to earn my trust.  And since its materials are at the heart of my children’s math education, I will be doing my best to look over its shoulder now, as much as anything as part of my decision-making process concerning whether I think I should join the movement to refuse or opt out of its standardized tests.  

Thank you all again.

Pearson’s Wrong Answer

Updated (Oct. 10): Pearson responded to this post in the comments section.  See Pearson’s Apology. 

Last Friday morning, my fourth grader handed me her “Thursday folder” shortly before we needed to head to the bus stop. I was glad to see a perfect spelling test, and a bunch of excellent math assignments and math tests. Time was short, however, so I flipped straight to the wrong answers. And sprinkled among the math tests, I came across two wrong answers that caused me concern.

The first problem was this:

Now, I looked at this problem before I’d had my morning coffee, and I wasn’t sure at first that I wasn’t just missing something. So I posted this picture to my Facebook feed, and asked my friends to confirm that I wasn’t crazy.

But my daughter was right: if Curtis walked three miles a day for 26 weeks, Curtis did in fact walk 546 miles.

3 miles/day x 7 days/week = 21 miles/week
21 miles/week x 26 weeks = 546 miles
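For anyone who wants to double-check without pulling out a calculator, a few lines of Python (my own quick sketch, not anything from the test materials) reproduce both the correct total and the number you get by misreading the weeks as days, which appears to be the source of the answer key’s 78:

```python
# Sanity check of the arithmetic above.
miles_per_day = 3
days_per_week = 7
weeks = 26

miles_per_week = miles_per_day * days_per_week  # 3 x 7 = 21 miles/week
total_miles = miles_per_week * weeks            # 21 x 26 = 546 miles
print(total_miles)  # 546

# The answer key's 78 is what you get by treating the 26 weeks as 26 days:
mistaken_total = miles_per_day * weeks          # 3 x 26 = 78
print(mistaken_total)  # 78
```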

I double, triple, and quadruple checked myself.  I pulled out a calculator.  

My friends agreed: my initial reaction to this question wasn’t nuts. My daughter’s answer was correct. And they came up with some good theories for why the answer might have been marked wrong.

Perhaps the teacher was trying to teach children, especially girls, to be confident in their answers, and my daughter had been marked wrong because of the question mark she’d written after her answer.

Perhaps she’d been marked wrong because she failed to indicate the units.

Perhaps she’d been marked wrong because she hadn’t shown every step of her work: i.e., she’d figured out the first step (3 miles/day x 7 days/week = 21 miles/week) in her head, and had therefore paid what one of my friends memorably described as a “smart kid penalty.”

But they were all wrong.

My daughter is fortunate enough to attend an excellent public school, and her responsive teacher both sent a note home and called me that afternoon to discuss it (I’d scribbled a quick note asking what the deal was along with my required signature on the front of the paper).

It turned out that my daughter had been marked wrong for a very simple reason: the Pearson answer key was wrong.

Let me say that again: Pearson was wrong.

Pearson listed some totally different — and wrong — number as the answer. The teacher had missed it when reviewing the test with the morning class, but in the afternoon class she’d realized the problem. My daughter’s teacher apologized for forgetting to mention it again to the morning class (and for not having previously changed their grades, but to be honest, I really could not care less if my kid scored a 95% or 100% on a 4th grade in-class math test).

In the olden days, I’d have laughed it off. Once in a while, the textbook publisher screws up. In the olden days, that screw up was no big deal: mildly annoying to those of us who pay the taxes that buy the books, but a pretty minor annoyance in the grand scheme of things.

However, these are not the olden days. These are the days of high stakes testing. These are the days in which our kids’ high school graduations hinge on tests created by the very same company — Pearson — that screwed up the answer to this question.

Tests we parents will never get to see.

Tests we parents will never get to review.

Tests we parents will never get to question.

So Pearson’s screw up on its fourth grade answer key doesn’t exactly inspire confidence.

Presumably, before the enVisions curriculum was published, Pearson checked and rechecked it. Presumably, its editors were well-paid to review problems and answer keys.

After all, Pearson itself describes this math curriculum as:

“Written specifically to address the Common Core State Standards, enVisionMATH Common Core is based on critical foundational research and proven classroom results.”

And yet… it was still dead wrong.

It seems that all of Pearson’s critical foundational research and proven classroom results in the world couldn’t get the question 3 x 7 x 26 correct.

To the uninitiated, I bet I sound nuts.  Who cares, right?  It’s just a question on a math test.  But if we are going to trust this company to get it right on high-stakes tests (where there is no public accountability), then the company better get it right all the time when it is operating within the public eye.  So this isn’t just about a fourth grade math test.  It’s all of the other Pearson-created tests my daughter is scheduled to take: in particular, the PARCC tests this spring, which are the ones that come with no public review, and no public accountability.  

Here, the test came home in my daughter’s backpack. As a result, there was an opportunity for public review and public accountability because I could review the test and question the wrong answer. The teacher could check the question and realize that the book was wrong, and substitute her own professional judgment for that of the textbook publisher.

And most importantly, the mistake was not a big deal, because the outcome of this test would not determine my daughter’s placement into an advanced math class or a particular school or even prevent her from graduating from the fourth grade. The outcome of this test would not determine her teacher’s future salary or employment. This test was nothing more than the kind of test our nine and ten year olds should be taking: a fourth grade in-class, teacher-graded chapter test. At most, this test will determine a small portion of my daughter’s report card grade.

But what about those tests that Pearson will be administering to our students this spring? We won’t be able to review the test questions, the answer keys, or our children’s answer sheets. We won’t be able to catch Pearson’s mistakes.

This spring, even if the answer really is 546 miles, Pearson will be able to write that Curtis traveled 1024 miles, or 678 miles, or 235 miles, or any other distance it wants. And we’ll never know that our kids weren’t wrong: Pearson was. But our kids’ futures — and their teachers’ careers — will be riding on the outcomes of those tests.

There has to be a better way.

In a low-stakes world, Pearson’s screw up was a low-stakes mistake. But now we’re forcing our kids — our eight, nine, and ten year olds — to live in a high-stakes world.

And in a high-stakes world, Pearson’s screw ups are high-stakes. So shame on you, Pearson, for undermining my daughter’s hard-earned (and easily eroded) math confidence with your careless error. I will parent my kid so that she learns not to second-guess herself with question marks after her answers. 

But Pearson, I will be second-guessing you. As publicly as possible.

And perhaps… just perhaps… I will start shorting your stock.