Are Teachers Professionals?

Peter Greene recently published a pair of pieces, here and here, on the quality of teacher education programs.  Reading his pieces — and the Ed Week blog post that inspired them — prompted me to share a few quick thoughts.  

A dozen years ago, as I sat in my Professional Ethics course one day, my ears perked up.  My professor was discussing what it means to be a professional, and was listing the traditional professions: law and medicine.  I spoke up: “What about teachers?  Aren’t teachers professionals?”  His response: “Absolutely not.”

As a former teacher, I was floored.  I think I had to reach down and physically pick my jaw up off the floor.  But in hindsight, as infuriating as I found my professor’s pronouncement at the time, his reasoning actually makes sense.  As my professor explained it, one belongs to a profession if current members of that profession take responsibility for controlling entry to that profession.  That is, lawyers — in law schools — educate future lawyers, and lawyers — through state bar examinations created and scored by lawyers — determine whether law school graduates are fit to enter the legal profession.   As I understand it, the same holds true for doctors, who are educated in medical schools, internship programs, and residency programs by doctors, and who must pass their medical boards — i.e., exams for future doctors created and scored by doctors — in order to practice medicine unsupervised.  

Superficially, traditional routes for entry into the teaching profession sound similar.  Those of us who have been licensed teachers completed a degree — either undergraduate or graduate — in a program taught by some combination of former and current teachers, and then most likely passed some iteration of the Pearson-produced Praxis test or other licensing tests required by our state departments of education.  The difference, however, is in those final words of the prior sentence: “required by our state departments of education.”  Teachers do not regulate entry into the teaching profession: rather, government bureaucrats and for-profit testing companies do.  That distinction makes a world of difference.

These days, we are constantly bombarded with attacks on teachers: by the media, by parents, by politicians, by members of the public, and sometimes by other teachers, who complain about the quality of their coworkers (I heard this from a couple of public school teachers just in the past few weeks).  We hear that teachers are lazy and that they’re lacking in content knowledge, and we parents are known to judge some of them pretty harshly ourselves.  I know that I have a habit of seeing red when teachers send home assignments that are riddled with spelling, grammar, and syntax errors.  

But take a moment, and imagine an alternate universe in which teachers are responsible for regulating their own profession.  Imagine communities where practicing teachers make the final determination of whether candidates for the teaching profession are ready to be granted professional licenses — with the knowledge that they themselves are responsible for the perceived quality of their profession.  Would a teacher agree to license a new colleague who appeared to lack a grasp of the conventions of written English?  Would a teacher agree to license a new colleague who did not have deep content-area knowledge?  Would a teacher agree to license a new colleague who had not proven himself capable of effective classroom management?  Would a teacher agree to license a new colleague who hadn’t proven himself knowledgeable about the latest theories of child development and the principles taught in educational psychology courses?   

Imagine teachers observing, mentoring, and evaluating candidates based on metrics they themselves developed for determining who merited a license to teach in a classroom filled with children.   Imagine the entrance exams that teachers — not Pearson — would create to ensure that those who are to follow in their footsteps are adequately prepared for the awesome task — and it truly is awesome — of ensuring that our country’s children are educated to be thoughtful, compassionate, and productive members of a society that embodies democratic values.  I truly believe that we humans tend to rise to a task when we are granted the autonomy necessary to take pride in our work, our colleagues, and our professions. Imagine, if you will, a public policy in which master teachers — like Peter Greene — truly have a say not only in what happens in the classroom, but in who is qualified to be counted among their colleagues.  Imagine teaching as a profession.  

Personally, I’d rather see these guys (included in these pictures from NPE are Jesse Hagopian, Jose Vilson, Anthony Cody, Stan Karp, and Peter Greene along with dozens of other teachers I didn’t get a chance to speak with):

EduShyster Jennifer Berkshire Interviews Jose Vilson and Peter Greene at NPE 2015

Jesse Hagopian speaks on Black Students Matter at NPE 2015

The room, packed with teachers, at Jesse Hagopian’s Black Students Matter presentation at NPE 2015

Geralyn Bywater McLaughlin and Nancy Carlsson-Paige of Defending the Early Years Present at NPE 2015

determining entry into the teaching profession rather than people like these guys:

 

Chris Christie

After all, our kids deserve teachers selected by professionals who know what they’re doing.  I, for one, place my faith in the teachers, not the bureaucrats and politicians.

P.S. Obviously, we lawyers could also do a far better job at self-regulation than we do.  I certainly count myself among those attorneys who have had the experience of wondering how, exactly, an adversary managed to graduate from law school and pass the bar exam.  But at least we only have ourselves to blame.

 

Pearson’s Yellow Brick Road

I have never been happier that we refused to allow my fourth grader to take the PARCC. Yesterday, I asked her what she’s heard at school about the PARCC tests her peers have been taking. Although she has never sat for a PARCC test herself, she was able to tell me that some of the 4th grade PARCC reading passages were from The Wizard of Oz (apparently one passage was about the Emerald City, and another told the story of the Tin Man). So in theory, if your child has not yet sat for the 4th grade PARCC, you could embark on an L. Frank Baum marathon this weekend to give your child a leg up on his or her upcoming PARCC test.

This is one of the many logistical issues that have never made sense to me about the PARCC test security protocol: especially in the age of social media, how could the state departments of education and Pearson possibly have expected their testing materials to remain secret when one 4th grader might take the test as early as March 2nd, but that child’s cousin in another district might not be scheduled to take the same test until March 20th?

Well, now we know. As you have probably heard, yesterday afternoon blogger (and former Star Ledger education reporter) Bob Braun reported that Pearson is monitoring children’s social media accounts to look for Tweets and other social media posts that allegedly compromise the security of its PARCC tests. Here in New Jersey, at least, when Pearson finds what it believes to be test-security infractions, it then tracks down those students’ personal data to figure out what schools they attend. Then Pearson reports the alleged infractions to the New Jersey Department of Education (“NJDOE”). As of yet, we parents have no idea if the NJDOE stores the report of this alleged infraction in its NJSMART database. What we do know is that the NJDOE has been notifying individual districts’ test coordinators of their students’ alleged infractions. Furthermore, we know that NJDOE has requested that the individual districts punish students for writing about test questions on social media.

Today, Valerie Strauss of The Washington Post (yes, the same Valerie Strauss who graciously publishes many of the pieces I’ve written for this blog) confirmed the story, and obtained additional information both from the Watchung Hills Regional High School superintendent who expressed her concerns about the practice in the email published on Braun’s blog and from a Pearson spokesperson.

So what does it all mean? Pearson thinks its monitoring (like Peter Greene, I think that Braun’s term, “spying,” misses the mark, as there is, at the very least, no expectation of privacy when you post a tweet) is hunky-dory:

The security of a test is critical to ensure fairness for all students and teachers and to ensure that the results of any assessment are trustworthy and valid.

We welcome debate and a variety of opinions. But when test questions or elements are posted publicly to the Internet, we are obligated to alert PARCC states. Any contact with students or decisions about student discipline are handled at the local level.

We believe that a secure test maintains fairness for every student and the validity, integrity of the test results.

I think that Pearson, however, has missed the mark. First of all, what, exactly, is an “element” of a test question? Are the Pearson Police going to be at my doorstep tomorrow because I mentioned that I heard from my kid (who herself has not and will not sit for the test) that the 4th grade PARCC includes excerpts from L. Frank Baum’s work? And if I can’t — as I can’t — be held accountable for posting the tip above, why is it okay for Pearson, through its patsy, the NJDOE, to seek to impose disciplinary action against students who allegedly shared “test elements” (although not, according to the student’s superintendent, a tweet containing a photograph of the test itself)?

Second, what does it mean for a test to be “secure”? Does Pearson really think that kids are not talking about these tests among themselves? Does Pearson really think it can bind our kids to secrecy? The last time I checked, our kids were minors — and therefore they, unlike their teachers, cannot be bound by a non-disclosure agreement, even if it could be argued that our kids receive consideration in connection with these tests (and, my lawyer friends, even if there were consideration and such a contract could bind a minor, it sounds an awful lot like a contract of adhesion, anyway…). Along those lines: are our kids being instructed to keep the test materials secret? As a practical matter, it strikes me as pathologically naive to think that such instructions to kids could actually work.

But more importantly, as a parent, I vehemently object to adults in our schools instructing children to keep secrets from their parents. In this day and age, we parents work hard to make sure that our kids know that they can talk to us about anything — and that openness is how we parents try to thwart possible predators and bullies, because we know that predators and bullies use children’s shame and fear to hide their abuse of children. Indeed, the American Academy of Pediatrics recommends:

Teach children early and often that there are no secrets between children and their parents, and that they should feel comfortable talking with their parent about anything — good or bad, fun or sad, easy or difficult.

So if you think I’m mad about these tests now, Education Commissioner Hespe and Pearson’s Brandon Pinette, I better not hear that our schools are sending our children mixed messages by telling kids to be open with parents and trusted adults — except when it comes to testing.

Third, what sort of people has Pearson hired to track children’s social media presences, and what steps has Pearson taken to ensure that those employees are properly vetted before it directs them to obtain personally identifiable information about our kids? Have they been required to submit to background checks? It seems an odd person who would choose to make his or her living by delving into an individual child’s social media use deeply enough to figure out which school that child attends. What steps has Pearson taken to safeguard our children? And should a private company, which is not subject to public oversight through OPRA, really be tasked with obtaining this sort of information about our children?

Finally, what does it mean that the NJDOE (and also, apparently, other state education departments, such as Maryland’s) has so cavalierly agreed to a scheme that encourages a for-profit corporation to hire adults to monitor children’s social media for supposed test security breaches? The Twitter hashtags that have sprung up in response to this scandal seem largely on point:

#Pearsoniswatching
#PeepingPearson
#PearsonisBigBrother
#Pearson1984

Yes, it doesn’t shock me that Pearson, a for-profit corporation, is scanning our children’s social media, but it is disturbing that Pearson is reporting the results of that monitoring to government agencies (presumably in return for the $108 million that NJDOE is paying Pearson to administer these tests). In particular, Pearson is reporting kids’ alleged test-security infractions to a government agency — the NJDOE — that maintains a database with a personal identification number for each and every public school student in New Jersey. Assuming for the sake of argument that our children are somehow precluded from freely sharing — either in conversation or via social media — the reading passages and questions asked of them on the PARCC, are we parents really okay with the NJDOE possibly noting our children’s poor judgment on the records of their academic careers? I have no idea if NJDOE is tracking this information, of course — but then again, until yesterday I didn’t know that NJDOE was regularly receiving reports from Pearson about New Jersey children’s social media use either.

It’s been a long time since I read Orwell’s 1984, but it really does feel like Pearson and the NJDOE expect parents to be okay with this kind of corporate and government intrusion into their children’s lives. Maybe my opposition is naive, but I, for one, find this level of Orwellian intrusion into our children’s lives downright creepy. And I think that the New Jersey Department of Education officials who condoned it without full advance disclosure to the public should be summarily fired. David Hespe, that means you.

Parents, if, until now, you’ve let your kids take these tests, remember: you can still choose to refuse.

It’s a Matter of Trust – My Testimony to the Governor’s PARCC Study Commission

Today, I joined a group of about 40 amazing parents, teachers, school board members (although all of them gave the caveat that they were testifying in their personal capacities), and even an amazing superintendent to testify publicly before Governor Christie’s PARCC and Assessment Task Force. Every single person who testified did a terrific job, and I’ll note that at this public hearing, not a single member of the public spoke in favor of the PARCC test. I’ll see if I can do a better write-up of the meeting later. But in the meantime, here’s a link to a video of me testifying.

And here is the text of what I said (I’m starting to feel like something of a broken record):

My name is Sarah Blaine. I’m an attorney and the mother of two daughters, ages 6 and 10, both of whom attend public schools in Montclair. I’ve been practicing law for almost a decade now, but before I went to law school, I earned a master of arts in teaching degree and taught high school English for a couple of years in rural Maine. Over the past year, as I saw the changes in my older daughter’s curriculum, I began looking into the Common Core State Standards, and the effects they were having on our kids’ schools. As part of that inquiry, I also began looking at the new tests to measure our kids’ achievement, the PARCC tests created by the Partnership for Assessment of Readiness for College and Career. The more I learned, the more concerned I grew. I began writing about my concerns on a personal blog, parentingthecore.com, and quite a few of my pieces – including one this morning – have been picked up by The Washington Post’s Answer Sheet blog.

I stand here today to tell you that the PARCC tests are not going to improve education or help our students.

It’s all really a matter of trust. That’s what a lot of our debate about PARCC and Common Core boils down to: trust – and, where opinions and plans for public education diverge, whom we, as parents, can trust to have our kids’ best interests at heart. Do we trust our children’s teachers and principals: the people we can hold accountable on a personal level – by looking them in the eyes and raising our concerns – and on a policy level, by publicly bringing any concerns we may have to democratically constituted school boards that are elected solely to deal with education issues? Or do we trust state and federal bureaucrats and elected officials who are accountable to a broad swath of voters for far more than just education policy, and who therefore, as a practical matter, outsource a great deal of their education policy-making to private corporations, such as Pearson, which sells curricula and designs tests, all to benefit its own bottom line?

In an ideal world, we could trust both, but unfortunately, our world is far from ideal. I, for one, start out more skeptical of the for-profit corporation than of our local teachers, principals, and school boards.

I was recently engaged in a discussion about the Common Core and the PARCC with a parent who is a strong proponent of the Common Core and the PARCC. He shared a little bit of his story, which illuminated the trust issue for me. In short, his oldest child began his education at a private school. The parents eventually learned that their child was lagging far behind his peers in the public schools, and moved the child to our public schools. But because those parents’ trust was broken early on by that private school experience, they’ve become huge proponents of testing to ensure that they receive up-to-date information regarding their children’s progress.

I hear and can relate to those concerns, although I find it ironic that it was a private school’s failure that triggered them.

My take, however, is different. As a parent, I start the school year each year trusting my kids’ teachers. Now, that doesn’t mean that such trust can’t be eroded over the course of an academic year. But over the years, in preschool and then in our public schools, of the 15 classroom teachers my kids have had, I’ve come across exactly one teacher who broke that trust. That’s not perfect, but those are still pretty good odds. So I’m willing to trust our schools and our teachers with our kids, and to trust that they have our kids’ best interests at heart. If I find that my trust is misplaced, I’m confident that I can go up our chain of command, and ultimately publicly air my concerns at a Board of Education meeting.

What those parents I told you about, whose trust in schools was broken by a bad private school experience, have effectively done is to substitute trust in a test for trust in their children’s teachers, under the mistaken belief that a test will ensure real accountability. And perhaps in a world where the tests are trustworthy, there’s no harm in trusting tests. But unfortunately, given the amount of time and energy I’ve spent inquiring into the PARCC tests, I think those parents are going to discover that their trust in the PARCC is at least as misplaced as their earlier trust in their child’s private school.

The PARCC consortium recently published its sample “Performance Based Assessments,” which New Jersey’s public school kids will be attempting starting just over a month from now. I have a fourth grader, so I took a quick look at the fourth grade math PBA this morning. Here’s a sample question:

Jian’s family sells honey from beehives. They collected 3,311 ounces of honey from the beehives this season. They will use the honey to completely fill 4-ounce jars or 6-ounce jars.

Jian’s family will sell 4-ounce jars for $5 each or 6-ounce jars for $8 each.

Jian says if they use only 4-ounce jars, they could make $4,140 because 3,311 divided by 4 = 827 R 3. That rounds up to 828, and 828 multiplied by $5 is $4,140.

Part A

Explain the error that Jian made when finding the amount of money his family could make if they use only 4-ounce jars.

Enter your explanation in the space provided.

Take a moment. Can you tell me the error?

Here’s the error: Jian’s arithmetic was correct, but his reasoning was mistaken. He rounded up to 828 jars instead of down to 827, even though the 828th jar won’t be completely full, so his family can’t sell it for $5. Since the family can only sell 827 full jars of honey, the answer should have been 827 jars x $5/jar = $4,135.
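For anyone who wants to verify the arithmetic for themselves, here is a quick sanity check in Python, using the numbers straight from the sample question above:

```python
# Sanity check of the PARCC honey-jar question.
total_ounces = 3311
jar_size = 4       # ounces per jar
price = 5          # dollars per 4-ounce jar

# 3,311 divided by 4 is 827 with a remainder of 3 ounces.
full_jars, leftover = divmod(total_ounces, jar_size)
print(full_jars, leftover)   # 827 3

# Jian rounded the quotient up, treating the partial 828th jar as sellable:
jians_total = (full_jars + 1) * price
print(jians_total)           # 4140

# Only completely full jars can be sold, so the correct total is:
correct_total = full_jars * price
print(correct_total)         # 4135
```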

What is this question really testing? Is it, as the PARCC’s proponents would claim, testing whether kids “really understand” remainders and have the “higher-order thinking” skills necessary to pick up on the “trick” in this question? Or is it, as is obvious to any parent, testing whether our 9 and 10 year olds (a) are picking up on the word “completely” and (b) have the life and business savvy to make the connection – under the pressure of a high-stakes test – that in this one case, they should be ignoring the remainder instead of rounding to the nearest whole number?

My concern is compounded by the emphasis that the fourth grade Common Core math standards – and the Pearson-produced Envisions math curriculum purportedly aligned to those standards – place on estimating, rounding, and reasonableness. This is a copy of one of my daughter’s Envisions math workbooks. The flagged pages are all pages that ask the kids to rely on estimating, rounding, or reasonableness to arrive at their answers. It’s not every page, but as you can see, the emphasis on those skills, which I’m not denigrating, is heavy. Given that emphasis, not only is this a trick question, but the kids themselves are already primed to miss the trick by the weight their standards and curricula place on estimating, rounding, and reasonableness.

Trick – and other unfair – questions like this are endemic throughout the PARCC practice tests I’ve reviewed. The English Language Arts questions are even worse. It is unfair by any measure to ask 9 and 10 year olds to type thematic essays about a Maya Angelou poem and a Mathangi Subramanian story. In addition, the essay prompt itself is ambiguously worded: the prompt asks the kids to “Identify a theme in ‘Just Like Home’ and a theme in ‘Life Doesn’t Frighten Me.’ Write an essay that explains how the theme of the story is shown through the characters and how the theme of the poem is shown through the speaker.”

What’s unclear from the language of the essay prompt is whether the kids should be identifying one theme common to both the story and the poem, or whether the kids should be identifying separate themes for the story and the poem.

One trend I’m sure you’re aware of across the state is that many of our high schools have been eliminating midterms and finals to make room for PARCC. The ambiguity of the essay prompt – and the unfair trick of the math question – illustrate why eliminating local, teacher-created tests, midterms, and finals in favor of standardized tests is a bad idea. All teachers make mistakes – both in inadvertently drafting ambiguous questions and in incorrectly grading answers – but kids are savvy and can point those mistakes out to their teachers. That’s built-in accountability.

Pearson and PARCC, however, are accountable to no one. Our teachers won’t even be allowed to look at – much less discuss – the PARCC tests. They won’t see the kids’ results, or where the kids might have been tripped up by a trick. So there will be no one to tell us which questions were unfair, ambiguous, or otherwise problematic. That is, there’s no accountability for Pearson – and as I’ve discovered this year, there is no question that even the Pearson materials the public does have access to, be they PARCC practice tests or Envisions, contain plenty of mistakes.

If the stakes for these tests were low – and the time they sucked away from “real” teaching was minimal – then I doubt the outcry would be so intense. Instead, however, these are the tests that even this year, unproven as they are, will make up 10% of teachers’ evaluations, and they will continue to take on an increasingly central role in evaluating our teachers, schools, and children. Not surprisingly, for those kids who don’t make the cut, Pearson now controls the G.E.D. test.

I urge you to carefully review the PARCC sample tests yourselves, keeping in mind the age level of the students who are intended to take them. Please recommend to the Governor that we join the dozen or more other states, including, most recently, Mississippi, in rejecting the PARCC as a valid means of assessing our kids. New Jersey’s kids deserve better.

Our New Jersey teachers and principals are accountable to our local communities. Where individual districts have ongoing problems with ensuring educational equity, the state can step in through its Quality Review system to address those problems on a case-by-case basis. But the PARCC test is setting up a narrative that all of our schools are failing. After spending significant time reviewing what’s happening in our schools and what’s contained on a for-profit corporation’s PARCC tests, I, for one, am confident that the problem is with the test, and not with our kids, our teachers, or our schools.

Thank you.

Pearson’s Apology

For everyone who read and commented on my prior post, Pearson’s Wrong Answer, first of all, thank you.  The response has been overwhelming.  Second, I just wanted to take a moment to let you know that my post did eventually percolate its way to Pearson, and a Pearson representative named Brandon Pinette appears to have left a comment on the blog post today:

Pearson did make an error on the specific quiz question in a lesson in the Envision Math textbook and we sincerely apologize for this mistake. We corrected the error for future editions of Envision, but failed to adjust the question in editions currently in the field. We owe it to our students and teachers to ensure these types of errors do not happen in the future, and are committed to adapting new protocols to fix mistakes before they happen. Trust in our products and services is key and we have to earn it every day with students, teachers and parents.

Thank you,
Brandon Pinette
Pearson

It seems only fair to make sure that this specific apology for this specific mistake gets highlighted more than as one of almost a hundred comments to a blog post.  

However, from the overwhelming responses and comments this blog post has received (here, on Facebook, and on Valerie Strauss’s blog at The Washington Post, The Answer Sheet) one thing seems clear: this is not an isolated problem (either for Pearson or for textbook and academic material publishers in general).  Because my child is slated to take the Pearson-developed PARCC tests this spring, my focus is on Pearson.  Mistakes in other textbooks are annoying, but my specific concern about Pearson is its vertical integration throughout the education world: i.e., Pearson writes the textbooks (mistakes and all), Pearson writes and grades the PARCC tests, Pearson provides remedial programs for students who fail the Pearson-generated tests, and Pearson writes the GED tests for those students who drop out of high school.

I encourage anyone who finds other mistakes in Pearson materials to take photos of the specific mistakes, and then Tweet them with the hashtag #PearsonsWrongAnswers.  

I am glad that Pearson is “committed to adapting new protocols to fix mistakes before they happen” and that Pearson recognizes that “Trust in our products and services is key and we have to earn it every day with students, teachers and parents.”  

But I still think that we need to continue to hold Pearson accountable.  

Many commenters have rightly pointed out that there is supposed to be statistical analysis of standardized test questions, and that mistaken questions on the standardized tests will be thrown out as invalid.  I am sure that they are correct that this does happen.  However, with tests as high-stakes as these, I am not sure that this is a sufficient response.  

For instance, imagine if this were a standardized test question.  I could easily see a 9 or 10 year old test taker, who figures out that the correct answer is 546, struggle as she looks at multiple choice responses such as (a) 78, (b) 130, (c) 500, or (d) 63.  And I think that some kids are more likely than others to be distracted by (and therefore waste time on) issues generated by mistakes such as this one.  As a result, on a high-stakes timed standardized test, the time wasted on flawed questions like this one may artificially deflate a child’s score.  And similarly, the child who arrives at the intended but mistaken “correct” answer (in this case, 78 miles, which would be correct if Curtis walked 3 miles a day for 26 DAYS) may obtain an artificial advantage because she isn’t bogged down by catching and mulling over the mistake.  Throwing out the specific question will not address these issues.  

And as long as we are addressing comments, for those commenters who think my post was an overreaction, so be it.  Perhaps it was.  But as noted above, Pearson has an awful lot of vertical integration throughout the education market, and Pearson’s employee himself admitted that “Trust in our products and services is key.” 

Pearson has to earn my trust.  And since its materials are at the heart of my children’s math education, I will be doing my best to look over its shoulder now, as much as anything as part of my decision-making process concerning whether I think I should join the movement to refuse or opt out of its standardized tests.  

Thank you all again.

Pearson’s Wrong Answer

Updated (Oct. 10): Pearson responded to this post in the comments section.  See Pearson’s Apology. 

Last Friday morning, my fourth grader handed me her “Thursday folder” shortly before we needed to head to the bus stop. I was glad to see a perfect spelling test, and a bunch of excellent math assignments and math tests. Time was short, however, so I flipped to the wrong answers. And sprinkled among the math tests, I came across two wrong answers that caused me concern.

The first problem was this:

Now, I looked at this problem before I’d had my morning coffee, and I wasn’t sure at first that I wasn’t just missing something. So I posted this picture to my Facebook feed, and asked my friends to confirm that I wasn’t crazy.

But my daughter was right: if Curtis walked three miles a day for 26 weeks, Curtis did in fact walk 546 miles.

3 miles/day x 7 days/week = 21 miles/week
21 miles/week x 26 weeks = 546 miles
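The same check takes two lines of Python. (The 78-mile figure below is the one discussed in “Pearson’s Apology” above: the number you get if you treat the 26 weeks as 26 days.)

```python
# Curtis walks 3 miles a day, every day, for 26 weeks.
miles_per_day = 3
days_per_week = 7
weeks = 26

correct = miles_per_day * days_per_week * weeks
print(correct)   # 546 -- my daughter's answer

# Treating the 26 weeks as if they were 26 days gives the mistaken figure:
mistaken = miles_per_day * weeks
print(mistaken)  # 78
```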

I double, triple, and quadruple checked myself.  I pulled out a calculator.  

My friends agreed: my initial reaction to this question wasn’t nuts. My daughter’s answer was correct. And they came up with some good theories for why the answer might have been marked wrong.

Perhaps the teacher was trying to teach children, especially girls, to be confident in their answers, and she’d been marked wrong due to the question mark.

Perhaps she’d been marked wrong because she failed to indicate the units.

Perhaps she’d been marked wrong because she hadn’t provided every step of her work (i.e., she’d figured out the first step, 3 miles/day x 7 days/week = 21 miles/week, in her head, and therefore had paid what one of my friends memorably described as a “smart kid penalty”).

But they were all wrong.

My daughter is fortunate enough to attend an excellent public school and her responsive teacher both sent a note home and called me that afternoon to discuss (I’d scribbled a quick note asking what the deal was along with my required signature on the front of the paper).

It turned out that my daughter had been marked wrong for a very simple reason: the Pearson answer key was wrong.

Let me say that again: Pearson was wrong.

Pearson listed some totally different — and wrong — number as the answer. The teacher had missed it when reviewing the test with the morning class, but in the afternoon class she’d realized the problem. My daughter’s teacher apologized for forgetting to mention it again to the morning class (and for not having previously changed their grades, but to be honest, I really could not care less if my kid scored a 95% or 100% on a 4th grade in-class math test).

In the olden days, I’d have laughed it off. Once in a while, the textbook publisher screws up. In the olden days, that screw-up was no big deal: mildly annoying to those of us who pay the taxes to buy the books, but a pretty minor annoyance in the grand scheme of things.

However, these are not the olden days. These are the days of high stakes testing. These are the days in which our kids’ high school graduations hinge on tests created by the very same company — Pearson — that screwed up the answer to this question.

Tests we parents will never get to see.

Tests we parents will never get to review.

Tests we parents will never get to question.

So Pearson’s screw-up on its fourth grade answer key doesn’t exactly inspire confidence.

Presumably, before the enVisions curriculum was published, Pearson checked and rechecked it. Presumably, its editors were well-paid to review problems and answer keys.

After all, Pearson itself describes this math curriculum as:

“Written specifically to address the Common Core State Standards, enVisionMATH Common Core is based on critical foundational research and proven classroom results.”

And yet… it was still dead wrong.

It seems that all of Pearson’s critical foundational research and proven classroom results in the world couldn’t get the question 3 x 7 x 26 correct.

To the uninitiated, I bet I sound nuts.  Who cares, right?  It’s just a question on a math test.  But if we are going to trust this company to get it right on high-stakes tests (where there is no public accountability), then the company better get it right all the time when it is operating within the public eye.  So this isn’t just about a fourth grade math test.  It’s all of the other Pearson-created tests my daughter is scheduled to take: in particular, the PARCC tests this spring, which are the ones that come with no public review, and no public accountability.  

Here, the test came home in my daughter’s backpack. As a result, there was an opportunity for public review and public accountability because I could review the test and question the wrong answer. The teacher could check the question and realize that the book was wrong, and substitute her own professional judgment for that of the textbook publisher.

And most importantly, the mistake was not a big deal, because the outcome of this test would not determine my daughter’s placement into an advanced math class or a particular school, or even prevent her from graduating from the fourth grade. The outcome of this test would not determine her teacher’s future salary or employment. This test was nothing more than the kind of test our nine- and ten-year-olds should be taking: a fourth grade in-class, teacher-graded chapter test. At most, this test will determine a small portion of my daughter’s report card grade.

But what about those tests that Pearson will be administering to our students this spring? We won’t be able to review the test questions, the answer keys, or our children’s answer sheets. We won’t be able to catch Pearson’s mistakes.

This spring, even if the answer really is 546 miles, Pearson will be able to write that Curtis traveled 1024 miles, or 678 miles, or 235 miles, or any other distance it wants. And we’ll never know that our kids weren’t wrong: Pearson was. But our kids’ futures — and their teachers’ careers — will be riding on the outcomes of those tests.

There has to be a better way.

In a low-stakes world, Pearson’s screw-up was a low-stakes mistake. But now we’re forcing our kids — our eight-, nine-, and ten-year-olds — to live in a high-stakes world.

And in a high-stakes world, Pearson’s screw-ups are high-stakes. So shame on you, Pearson, for undermining my daughter’s hard-earned (and easily eroded) math confidence with your careless error. I will parent my kid so that she learns not to second-guess herself with question marks after her answers. 

But Pearson, I will be second-guessing you. As publicly as possible.

And perhaps… just perhaps… I will start shorting your stock.