Initiative error rates vary from year to year

A new Elections Division posting shows a wide range of error rates over the past two decades. The average has been about 18.5 percent.

Twenty of the 57 initiatives and referenda checked since 1990 had invalidation rates of 20 percent or more. The highest rates belonged to the tribal gaming measures of 1995 and 1996, I-651 and I-671, with error rates of 27.46 percent and 26.61 percent, respectively. More recently, I-336, a 2005 medical malpractice initiative to the Legislature sponsored by trial lawyers, had an invalidation rate of 26.10 percent, and I-937, a renewable energy measure, posted a 24.33 percent rejection rate in 2006. Of those four measures, only the energy initiative was approved by the voters.

At the other end of the scale were a handful of measures with error rates between 10 and 13 percent. The lowest or “cleanest” of the bunch was I-694 in 1998, dealing with a late-term abortion procedure, with an error rate of just 10.53 percent. Close behind: I-593 in 1993, the “three strikes and you’re out” law, at 11.33 percent; and R-55 in 2004, authorizing charter schools, at 11.69 percent. Several were in the 12 percent range, including this year’s I-1033, capping state, county and city general fund revenue growth, at 12.14 percent; last year’s I-1000, dealing with the terminally ill, at 12.27 percent; I-601 in 1993, limiting state spending growth to inflation plus population growth, at 12.5 percent; and I-607 in 1994, dealing with denturists, at 12.72 percent. The abortion restriction and the charter school measure were turned down by voters; the others were approved, except for I-1033, which is pending for November.

The error rate for the other potential ballot measure for this year, R-71, dealing with benefits for state-registered domestic partners, has been running around 11 to 12 percent.

One footnote: It’s very unusual to have an every-signature check, as is occurring for R-71. In the two decades covered by this study, full checks were needed only for I-655 in 1996, dealing with bear baiting; I-917 in 2006, dealing with car tabs; and I-534 in 1990, dealing with pornography. Only I-655 was validated to the ballot; it passed.
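
For readers who want to reproduce the arithmetic behind these percentages, here is a minimal Python sketch. The signature totals and the qualification threshold in the example are hypothetical placeholders for illustration, not official figures.

```python
# A minimal sketch of the invalidation-rate arithmetic discussed above.
# The submitted and required counts are hypothetical, for illustration only.

def invalidation_rate(rejected: int, checked: int) -> float:
    """Percent of checked signatures that were invalidated."""
    return 100.0 * rejected / checked

def survives_check(submitted: int, rate_pct: float, required: int) -> bool:
    """Rough test of whether a measure clears the qualification bar."""
    valid = submitted * (1.0 - rate_pct / 100.0)
    return valid >= required

# At the historical average error rate of 18.5 percent, a hypothetical
# 150,000-signature submission against a 120,000-signature requirement
# survives (about 122,250 valid signatures remain)...
print(survives_check(150_000, 18.5, 120_000))   # True
# ...but not at a 27.46 percent rate like I-651's (about 108,810 remain).
print(survives_check(150_000, 27.46, 120_000))  # False
```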

14 thoughts on “Initiative error rates vary from year to year”

  1. (Re-posting this here since it somehow didn’t get posted on the last thread) I’ve noticed that the occasional volume has over 300 signatures. Most notably, Vol 165 has 320 signatures and Vol 31 has 315 signatures. The petition formatting rules mandate no more than 20 spaces per petition sheet for signatures (http://www.secstate.wa.gov/_assets/elections/Initiative%20and%20Referenda%20Manual.pdf). Multiplying this by the 15 petition sheets per volume, the maximum number of signatures we should see is 300 per volume. Apparently, some petition sheets have more than 20 signatures on them. What is the fate of signatures in excess of 20 on a sheet?

  2. Bound Volume refers to petition sheets bound together into a volume. The petition sheets are bound into volumes to conduct the verification process. Each volume contains approximately 15 petition sheets. Each bound volume is numbered. Referendum 71 has approximately 623 bound volumes.—Verifying Signatures for Referendum 71

  3. Lurleen– this is from Elections Division in reply to yours: “Volume 165 has 16 pages. That happens very occasionally while we’re assembling the volumes, either an extra page is included, or one is left out.

    Also, it is our normal procedure to accept the signature if a person writes in their name at the bottom of a filled petition with all the necessary information and a matching signature. So, some volumes may have 21 signatures, I’ve also seen a couple with 22 or 23 signatures (the extra ones are written in at the bottom).

    We’ve also seen a couple sign on one line…each person signs, (one in the box for printed, one in the box for signature) and then the address. If the checkers match what we have on the database, we’ll accept them.”

  4. Clearly there is no reason whatsoever for the petitioners to complain about this process, as it gives them every benefit of the doubt. Both sides are being served in this case: every signature is being checked, one by one, and we will all know by exactly how many signatures the petition either stands or falls. The SoS is using objective and transparent methods that any petitioner should be more than satisfied with.

  5. David:

    Not to nag, but many of us are interested in hearing about the results of the master check of the micro-sample of 200 initially accepted signatures. Any estimate of when that might be disclosed?

    Thanks, and I hope you all are not too stressed from all this scrutiny and comment.

    Susan

  6. D. Ammons,

    The cartoon images and photos chosen for each posting are simple but pretty creative and professional looking. Do you do it all yourself or do you have a team helping you?

    Bill W.

  7. Susan and others who asked about the special reviews … here is the Elections Division report …

    Shane Hamlin
    Assistant Director of Elections
    Referendum 71 Petition Check
    August 19, 2009

    Special review of selected rejected and accepted signatures on Referendum 71

    Request for a special review from Protect Marriage Washington (opposed to SB 5688)

    On August 7, 2009, Roy and Valerie Hartwell, observers working on behalf of referendum sponsor Protect Marriage Washington, raised concerns regarding search methods being employed by two master checkers. Mr. and Mrs. Hartwell specified a number of concerns regarding how Nick and Marc Pharris, the two master checkers in question, did or did not use various data points (first name, last name, middle initial, partial name, etc.) to search for a matching record, as well as the amount of time spent searching for a matching record.

    The Hartwells requested that all of the rejected signatures reviewed by Marc and Nick be rechecked. Alternatively, I agreed to have a third master checker conduct a special review on a sample of rejected signatures Marc and Nick reviewed and did not change to accepted. I asked Mr. Hartwell to provide the volume number of three volumes for each master checker that included rejected signatures Mr. Hartwell believed should have been accepted during the master check review.

    Mr. Hartwell submitted volumes 47, 48, 70, 73, 87 and 89 for this special review. Nick Pharris conducted the master check on volumes 47, 73 and 89. Marc Pharris conducted the master check on volumes 48 and 70. Mr. Hartwell believed Marc also reviewed volume 87 but, as it turns out, Mr. Hartwell was mistaken.

    David Valiant conducted the special review of Nick and Marc Pharris’ work on volumes 47, 48, 70, 73 and 89. David Valiant did not conduct a special review for volume 87 because he had actually conducted the master check on volume 87. Another master checker, Nancy Jo Armstrong, conducted the special review of volume 87.

    The six volumes were originally checked by first checkers, prior to the master check process. The six volumes included a total of 225 signatures that were rejected following the first and master checks. Of these 225 rejected signatures, 11 that were previously not found in the voter registration rolls were found and accepted during the special review. Another two signatures that were previously not found in the rolls were located during the special review, but the image of the signature was missing or of poor quality, so they were changed to Pending status. None of the signatures that had been rejected because the signature on the petition did not match the signature on the voter registration file were accepted during the special review.

    Volume 47: 300 signatures, 42 rejected; changed 2 from Not Found to Accept (+2)
    Volume 48: 300 signatures, 63 rejected; changed 6 from Not Found to Accept and 2 from Not Found to Pending (+8)
    Volume 70: 299 signatures, 49 rejected; changed 1 from Not Found to Accept (+1)
    Volume 73: 196 signatures, 27 rejected; no changes (+0)
    Volume 87: 300 signatures, 17 rejected; changed 1 from Not Found to Accept (+1)
    Volume 89: 299 signatures, 27 rejected; changed 1 from Not Found to Accept (+1)
    Total: 225 rejected signatures reviewed; net change +13
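
    A quick cross-check: the totals reported above can be recomputed from the per-volume figures, as in this small Python sketch (all numbers are taken from the list above).

    ```python
    # Recompute the special-review totals from the per-volume figures above.
    # Tuples: (volume, signatures, rejected after first and master checks,
    #          changed to Accept, changed to Pending).
    volumes = [
        (47, 300, 42, 2, 0),
        (48, 300, 63, 6, 2),
        (70, 299, 49, 1, 0),
        (73, 196, 27, 0, 0),
        (87, 300, 17, 1, 0),
        (89, 299, 27, 1, 0),
    ]

    rejected = sum(v[2] for v in volumes)  # 225 rejected signatures reviewed
    accepted = sum(v[3] for v in volumes)  # 11 changed from Not Found to Accept
    pending  = sum(v[4] for v in volumes)  # 2 changed from Not Found to Pending
    print(rejected, accepted + pending)    # 225, net change of +13
    ```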

    Request for a special review from Washington Families Standing Together (supports SB 5688)

    The same day that the Hartwells expressed concern about the master checking conducted by the Pharris brothers, I spoke with the lead observer for Washington Families Standing Together, Mona Smith.

    I asked Ms. Smith if she had any concerns about the signature verification process or checkers, including the master checkers. She offered careful, light praise of the process and stated that several of the checkers appeared to be moving quickly. When I explored this concern with her a bit more, she mentioned Marc Pharris as an example of someone moving quickly. Ms. Smith also mentioned concerns that accepted signatures were not getting a second review. During this conversation she asked if we would consider reviewing all or many of the accepted signatures.

    Later that day, Friday, August 7, I informed the Hartwells that we would grant their request, but would limit our review to six volumes, chosen by them. That same day, or possibly Monday, August 10, I informed Ms. Smith of our decision to review six volumes for the Protect Marriage observers. She again requested we review the accepted signatures. I told her that, out of fairness, I would agree to her request in a limited manner by conducting a special review on a number of accepted signatures equal to the number of rejected signatures in the six volumes submitted by the other side.

    Ms. Smith provided a list of accepted signatures, by referencing the volume number, page number, and line number of the accepted signatures for which her observer team had concerns. I believe I took these from her when I did not yet have a total number of rejected signatures for the other side. Eventually, I was able to compile the number of rejected signatures under special review; at that time I believed that number to be 199. I passed this number on to Ms. Smith. Later, I learned that there were actually 225 rejected signatures in the six volumes the Hartwells asked to have reviewed.

    On Tuesday, August 11, Ms. Smith gave me a second list of accepted signatures for the special review. This list included 226 signatures in 50 volumes, referenced by volume number, page number, and line number. I believe I again told Ms. Smith that we would only check the same number of accepted signatures as rejected signatures checked for the other side.

    The special review of these signatures was conducted by Zach Tobin and Cooper Hjelm.

    Of the 226 accepted signatures submitted for the special review, four could not be found, likely due to incorrect volume, sheet and line references. The special review was conducted on 222 signatures.
    Of the 222 accepted signatures reviewed, 14 (6.3 percent) were changed from Accept to No Match, and two were changed from Accept to Signature Pending. Fifteen were designated as Reject because a registration record could not be found for the signer; these 15 will be investigated further, and their status may or may not change depending on further research.
    Due to the number of volumes included in this special review, the results are not displayed in a table. A copy of the signature references submitted, and the notations made to the list by the master checkers who reviewed the accepted signatures, accompanies this document.
    The reported numbers of signatures accepted and rejected were adjusted to reflect the results of the two special checks conducted at the request of the opposing parties.

    On Monday, August 17, I provided each side with a summary of the results of the special review conducted at each side’s request.

    No further special reviews will be conducted

    Both sides of Referendum 71 have asked for special reviews of signatures they felt, if reversed, would aid their cause. Repeated requests were made throughout the first week of the signature check.
    The Elections Division accommodated these requests out of a desire to foster a fair process. The Elections Division has conducted those reviews and has now decided that no further special reviews will be conducted for the following reasons.

    The results of these special reviews have no statistical validity, and carry different meanings for different parties. The special reviews are not statistically valid because they were not conducted using random samples. Rather, the 450 signatures were specifically selected by the opposing observers.

    The opponents of SB 5688 may believe that the results indicate that the special review triggered changes in their favor.

    The supporters of SB 5688 may believe that the results indicate that the special review triggered changes in their favor.

    However, the Elections Division believes that the results reflect an expected result in a review or recount environment. That is, changes will occur if additional research or review is done, particularly in a process like signature checking that is, in and of itself, an imprecise and sometimes subjective process. But these changes tend to be offsetting. At the end of the special reviews, 13 signatures were added to the accept pile and 14 signatures were added to the reject pile. This is a net change of 1 more rejected signature out of a pool of 450 signatures reviewed. This is the expected or typical result of a recount or review.
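
    The arithmetic behind that conclusion, as a one-glance Python sketch (figures taken from the report above):

    ```python
    # Net change across the two special reviews, per the figures above.
    moved_to_accept = 13  # previously rejected: 11 to Accept plus 2 to Pending
    moved_to_reject = 14  # previously accepted: changed to No Match

    net = moved_to_reject - moved_to_accept
    print(net)  # 1 more rejected signature out of the roughly 450 reviewed
    ```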

    The special reviews took more than three days of staff time to conduct.

    The Elections Division is now devoting all resources toward completing the regular signature verification process, including the recent registration check. The Division is operating two shifts of checkers, master checkers, and recent registration checkers. The shifts operate 15 hours a day. The staff is growing fatigued, particularly under what appears to be ever-increasing scrutiny by the observers, and the pressures of a close signature check.

    Further, litigation may occur over the outcome of the signature verification. Both sides have inquired about an appeals process, and our schedule allows time for this.

    As a result of these factors, the Elections Division has determined that it will not conduct any further special reviews. Elections Division supervisors will continue to be available to the lead observers on both sides to ensure both sides have an opportunity to express concerns regarding the check, and to review particular situations, but, absent special circumstances, this will not include further broad scale reviews.

  8. Bill– the artwork, cartoons, layouts, photos, etc., are the work of Christina Siderius, deputy communications director for new media. The blog, From Our Corner, is only 7 or 8 months old. We also do Facebook apps, YouTube and Twitter. Fun stuff, and it allows us to have a “conversation” with people, and not just one-way communication from Olympia. We also have the State Library, the State Archives, the Digital Archives, the Corporations and Charities Division, international trade promotion, and various programs, such as oral history, the Heritage Center, and the address confidentiality program for battered women. We even have a “State Seal” store and sell state and U.S. flags! Quite a busy agency.

  9. Mr. Ammons, thank you to you and your staff for reviewing this. The fact that haste seems to reduce the error rate by 0.22 percent should dull the rather mob-like call for slowing down, since slowing down would ostensibly increase the error rate by a similar margin.

    It should also temper claims that you’re “in the tank” for any one side: statistically, the difference could raise the error rate by an amount that might decide the measure’s fate, but it is not so far afield as to suggest a deliberate and concerted effort on the part of the checkers.

  10. Thanks for these stats. Just curious: for the three measures where all the names were checked, was it because the number of signatures turned in was so close to the over/under mark? Was it because the spot check typically done was inconclusive? It hardly seems like it was done because they were controversial issues (however, bears might have different opinions on I-655).

    Thanks for all your blogging!

  11. Would it be fair to summarize the results of the three days of special checks as follows:

    Of 225 rejected signatures re-checked, 13 signatures previously rejected (because they were not found in the database during the initial and master checks) were found in the database by the third check. It’s not clear if the same database was used for the master check and the third check–this could represent error on the master checker’s part or it could represent updates in the database. In any case, it’s about 5.8% of the signatures checked.

    Of 222 accepted signatures rechecked, 14 (6.3%) were rejected as “no match” (meaning, I assume, that the signature did not match the signature in the database), 2 (1%) were made “pending” because the signature image in the database wasn’t good enough to match, and 15 (6.8%) weren’t found in the database. These last 15 will be investigated further, but my guess is that they won’t be found because they (along with the other 16) represent the “false accepted” error I’ve been talking about–perhaps the initial checker simply clicked/pressed the wrong button. Totalled up, there were 31 errors found, or 14%.

    This is MORE THAN DOUBLE the 13 errors found in the other direction, not different by 1. The net change will likely be 16 more rejected signatures, or 3.6% of the sample. 1% represents a little under 1400 signatures.

    This is a specially selected sample, certainly not a random one and therefore NOT statistically valid, but I’m rather surprised at the error rates nonetheless (a short sketch after this comment reproduces the arithmetic). The difference between the rates could imply that the master check process reduces errors for rejected signatures but not for initially accepted signatures, which, if true, introduces a bias into the process.

    I noticed that no further special checks will be done in order to expedite the basic process. I agree with that decision. I would like to ask that, as soon as the standard process is done, you do a statistically valid random check of the signatures accepted at the first pass in order to establish the expected error rate range for that process. That information will take time to create, but it will be useful in the event a judge needs to decide what to do.
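
    A short Python sketch reproducing the tally in the comment above; treating the 15 “not found” signatures as final rejections is the commenter’s assumption, not the Division’s reported result.

    ```python
    # Reproduce the commenter's error tally for the accepted-signature review.
    checked   = 222  # accepted signatures given a special review
    no_match  = 14   # changed from Accept to No Match
    pending   = 2    # changed from Accept to Signature Pending
    not_found = 15   # no registration record found; still being investigated

    errors = no_match + pending + not_found
    print(errors, round(100 * errors / checked, 1))  # 31 errors, 14.0 percent

    # Net change if the 15 "not found" stay rejected, versus the 13 reversals
    # among the 225 re-checked rejections:
    net = (no_match + not_found) - 13
    print(net, round(100 * net / (checked + 225), 1))  # 16, about 3.6 percent
    ```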

  12. In thinking about the decision not to establish expected error ranges before finishing the process, I find I need to change my mind. I really WOULD like the error ranges established before the count is finished–and then used to decide whether to reexamine those signatures initially accepted as valid.

    The reason has to do with public perception of the process. If there’s a press release announcing that R-71 has made it on to the ballot, it will be difficult to change that.

    I’m remembering the effect that Fox News’s announcing the 2000 Presidential election for George W. Bush had in making it difficult for the Democrats to establish that Gore actually won Florida (which the press recount later on found to be the case). Instead, the U. S. Supreme Court stopped the recount process and declared Bush the winner.

    We have a very different Supreme Court in Washington, and a much different history with recounts–I’m thinking of Gov. Gregoire’s 2004 victory on the final recount. It’s a history where the courts and the election officials really did their job well. So I’m not as concerned about this as I might be. But I’m still concerned–and if R-71 qualifies narrowly, I’d like the public to know how many false valids there might be.

  13. With regard to the re-checked samples, perhaps it would be a wash if an equal number of accepted and final rejected signatures were re-checked, but this doesn’t take into account two simple facts: (1) accepted signatures outnumber rejected ones by a factor of more than 4 to 1; (2) as stated recently, the vast majority of “final” rejected signatures (those where no registration was found) ARE being given another check against a live registration DB. Does anyone not see the problem with this? Not to mention that I see no indication whatsoever that any consideration is being given to the date on which one registered. I would think that, at the very least, no one who registered after July 25, 2009 should have their signature accepted. It says right on the petition that the signer is stating that they ARE a registered voter. Anyone who wasn’t registered as of July 25 could not possibly have been a registered voter at the time they signed.

  14. The only problem with cutting the registration date off at July 25 is that existing common practice is to get people to fill out registration forms as they sign. For those forms, the registration date is the date the forms are delivered to a county auditor or the Secretary of State. The forms are required to be delivered in one week at most, so that would give a latest date of about July 31, depending on how you count seven days and weekends.

    Given this current practice, that’s appropriate. I’m not clear that the practice is legal, however, unless the registration forms are delivered the same day.

Comments are closed.