Guest Post: Fast Examiners; Slow Examiners; and Patent Allowance

Prof. Shine Tu (WVU Law) has been doing interesting work studying patent prosecution and how differences between patent examiners impact the process.  I asked him to provide a guest post to help readers get started on his work. – DC

by Shine Tu

Although we know that individual patent examiners can greatly affect an inventor’s chance to (1) get a patent at all and (2) get it in a timely way, there has been very little work determining how examiners are able to either delay or compact prosecution while still maintaining their quotas via the count system.  Understanding how examiners work the quota system with very different outcomes can be critical for practitioners trying to understand what sort of responses or claim narrowing they should make. It also has significance for those looking to understand and improve the very process intended to spur invention.

In a previous study, I showed that there are extreme variations in allowance rates between examiners.  For example, in analyzing 10 years of patents from Technology Center 3700, I found approximately 200 examiners who had issued over 120,000 patents (approximately 51% of the patents from this Technology Center). In contrast, a group of approximately 300 examiners had issued fewer than 800 patents (less than 1% of the patents from this Technology Center). [See https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1939508].  In the current dataset, I find that not only is there a difference in allowance rates, but there is also a significant difference in prosecution times. Fast examiners allow applications in approximately 1.64 years, average examiners in 3.07 years, and slow examiners in 5.85 years.  This delay of over four years (fast versus slow examiners) increases direct costs to applicants in the form of PTO and attorney fees, as well as indirect costs such as reduced growth, sales, and follow-on innovation.
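The four-year figure is simply the difference between the reported group averages. A quick sketch (illustrative only, using the post's rounded numbers rather than the study's underlying data):

```python
# Prosecution-time gap between examiner groups, using the average
# time-to-allowance figures reported in the post (in years).
fast_years = 1.64
average_years = 3.07
slow_years = 5.85

gap = slow_years - fast_years
print(f"slow vs. fast gap: {gap:.2f} years")  # just over four years
```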

In a set of two articles, I explored how examiners can either (1) slow down the patent prosecution process by using a strategy of constant rejections [https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3539731] or (2) speed up the patent prosecution process by using a strategy of fast allowances [https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3546944]. To create a sample large enough to be statistically meaningful, I coded the patent prosecution histories of 300 patents, reviewing 100 patents each from slow, average, and fast examiners in workgroup 1610.  Every rejection issued by the examiner and every response and traversal argument by the applicant was recorded.

As an initial matter, these data show that examiners in each group have similar amounts of experience at the PTO and similar average current docket sizes. However, the allowance rates of these examiner groups vary dramatically: 79.55%, 61.65%, and 27.7% for fast, average, and slow examiners, respectively.  The Office Action to Grant (OGR) score shows that fast examiners grant a patent for every 1.5 Office Actions written, while it takes average examiners roughly 4 Office Actions and slow examiners a stunning 10.5 Office Actions to grant one patent.
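The OGR metric can be made concrete with a short sketch. The docket counts below are hypothetical, scaled to reproduce the approximate scores reported above; they are not the study's data:

```python
# Office Action to Grant (OGR) score: the average number of Office
# Actions an examiner writes per patent granted. Lower = faster path
# to allowance.

def ogr(office_actions: int, grants: int) -> float:
    """Office Actions written per patent granted."""
    if grants == 0:
        raise ValueError("OGR is undefined for an examiner with no grants")
    return office_actions / grants

# Hypothetical dockets chosen to match the post's approximate scores.
examiners = {
    "fast":    {"office_actions": 150,  "grants": 100},   # OGR = 1.5
    "average": {"office_actions": 400,  "grants": 100},   # OGR = 4.0
    "slow":    {"office_actions": 1050, "grants": 100},   # OGR = 10.5
}

for group, d in examiners.items():
    print(f"{group}: OGR = {ogr(d['office_actions'], d['grants']):.1f}")
```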

Fast examiners seem to be using a count maximization strategy based on allowances. A typical applicant who gets a fast examiner will usually have one or two Office Actions before an allowance.  Fast examiners do not use many prior art rejections.  Additionally, the rejections employed by fast examiners rely heavily on Obviousness-type Double Patenting (ODP) and/or 35 USC 112 rejections. Fast examiners have four times as many ODP rejections compared to slow examiners.  Most applicants can (and do) traverse these ODP rejections by simply filing a terminal disclaimer.  Interestingly, use of the ODP rejection is a super-efficient way to employ a count maximization strategy. This is because little work is needed to find an ODP rejection, due to the closed universe of patents, and an ODP rejection is relatively easy for the applicant to traverse. Thus, an ODP rejection followed by a terminal disclaimer gets the examiner to maximum counts with minimal effort.

In contrast, slow examiners seem to be using a strategy based on rejections.  First, slow examiners have a much higher restriction rate (almost twice) and encounter three times as many traversals to these restriction requirements.  These data are consistent with a rejection strategy because examiners can create a large patent family and cycle through rejections with less work, especially since they should already be familiar with the specification from the other restricted family members.  Furthermore, slow examiners may not be able to avail themselves of the ODP rejection strategy employed by fast examiners because of the safe harbor created by 35 USC 121.

Not only do slow examiners use more prior art, but the sources of prior art also differ for slow examiners versus fast and average examiners. Slow examiners employ a rejection strategy based on prior art, with five times as many 102(a/e) rejections and six times as many 103 rejections compared to fast examiners. For 102(a/e) rejections, slow examiners rely on both US patents and printed publications, while fast and average examiners rely on US patent applications.  Interestingly, for 102(b) rejections, all examiners rely primarily on printed publications and secondarily on US patents. With 103 rejections, all examiners likewise rely mainly on US patents and secondarily on printed publications. Thus, all examiners search and employ prior art from different databases; however, they use the prior art that they find in different ways.

Unsurprisingly, applicants traverse prior art rejections from slow examiners at a much higher rate than those from fast examiners. Specifically, with 102 and 103 rejections, applicants most commonly push back against slow examiners with a missing-elements argument.  In contrast, most applicants respond to fast examiners' 102 and 103 prior art rejections by simply filing claim amendments.  Interestingly, applicants will also push back against 103 rejections from slow examiners by making a "no motivation to combine" argument. This may be because slow examiners use seven times as many references as fast examiners.

Slow examiners also put the brakes on prosecution by filing multiple 112 rejections.  Specifically, slow examiners utilize three times as many 112 second rejections, four times as many enablement rejections and seven times as many written description rejections. With slow examiners, applicants use arguments to traverse enablement and written description rejections. In contrast, applicants with fast examiners usually only make claim amendments to traverse enablement or written description rejections.

Practitioners need to understand what type of examiner they have.  Understanding and using these data is paramount to help manage client expectations as well as to create a rational prosecution strategy.  I note that all of these data can be accessed through services such as LexisNexis PatentAdvisor® to help determine which examiner you may encounter.  This is also important for prosecution strategy: slow examiners may require a strategy that involves an appeal, while fast examiners may require a strategy that involves fewer amendments and more arguments.

Although I do not make any definitive judgments about the quality of the claims passed by different examiners, nor about whether there is an "optimal" or "ideal" allowance rate, these varying trends indicate a wide discrepancy in examiners' methodology that may be affecting the overall quality and number of patents created. By analyzing the differences, my studies suggest how the counts system might be modified to ensure a more efficient and balanced process where all examiners apply the rules of patentability fairly and consistently. One possible solution, for example, would be to review applications from both fast and slow examiners at a higher rate. Another solution may be to deduct counts from examiners who make too many erroneous rejections.  Conversely, adding counts for examiners who dealt with difficult applicants could also be in order.  Alternatively, we could completely reform the count system and create an examiner incentive structure that focuses more on quality and less on quantity.  Only by looking in-depth at examiner behaviors will we be able to (1) better understand and navigate the current system and (2) make reforms to the current process that will truly encourage innovation.

81 thoughts on “Guest Post: Fast Examiners; Slow Examiners; and Patent Allowance”

  1. 19

    In other words there are some “fast” examiners doing a roughshod quick word search and just allowing, allowing, allowing (maybe with an examiner’s amendment to put a few more limitations in, idk, he doesn’t say). Then that leads to a bunch of continuations, which then get ODP rejections. Somehow the author believes that sending out ODP rejections helps get counts faster (pro tip: it doesn’t). Then there are some examiners spending more time on the search, finding a whole sht ton of references which then have to be sorted through, used in rejections, and overcome. And these same slow examiners aren’t making all that great rejections, it sounds like. Sounds like the slower ones are, in general, just not that great at finding, recognizing, and applying all that great art, and are then forced to find more of it in the art specifically that was being looked at. They don’t address the middle-time examiners all that much.

    “Slow examiners also put the brakes on prosecution by filing multiple 112 rejections. Specifically, slow examiners utilize three times as many 112 second rejections, four times as many enablement rejections and seven times as many written description rejections.”

    In other words they’re probably not understanding enablement all that well, and they’re actually looking for WD problems, and likely not understanding that all that well either, whereas the fast examiners mostly just skip WD problems and let them sail, and if they’re going to make a WD rejection it’ll be a good one. I’ve known about that latter strategy of mostly skipping for quite awhile; I don’t use it myself but it will hurry things up. I myself practically never get args on WD problems, as I only send a rejection when there is one.

    “By analyzing the differences, my studies suggest how the counts system might be modified to ensure a more efficient and balanced process where all examiners apply the rules of patentability fairly and consistently.”

    Don’t think the problem is “fairly”, as all examiners are probably already trying their best more or less. The problem is that different people get different impressions from trainings (and their impromptu trainings from different primaries and spes etc.) that then cause them to go do different things.

    “One possible solution, for example, would be to review applications from both fast and slow examiners at a higher rate. ”

    That costs money (time), presumes you have good reviewers (pro tip you probably don’t), and assumes that such would change anything (pro tip it mostly doesn’t or it’d be happening already just at a slower rate).

    “Another solution may be to deduct counts from examiners who make too many erroneous rejections. ”

    Would just make it easier to fire someone for lack of production; no real point unless you just want to fire people (pro tip: they generally don’t want to just fire people they sunk 150k+ into training). And it will just take more time away from the next guy in the examination queue, causing still more problems.

    “Conversely, adding counts for examiners who dealt with difficult applicants could also be in order.”

    Not really sure how you would detect that.

    “Alternatively, we could completely reform the count system and create an examiner incentive structure that focuses more on quality and less on quantity.”

    I wish, but the corporate overlords don’t want that. And they never will. The $$$ rules here now more than ever and it ain’t goin away now that the agency is well captured by corps, and quality ain’t the name of that game.

    “Only by looking in-depth at examiner behaviors will we be able to (1) better understand and navigate the current system and (2) make reforms to the current process that will truly encourage innovation.”

    I agree with that. But I disagree with much of your analysis. The part about ODP’s being a “strategy to get counts” is the most lolable part. ODP’s do nothing in 90%+ of cases but delay the getting of counts.

  2. 18

    When you mention that ‘slow’ examiners issue 112 rejections much more than ‘fast’ examiners, did you control for the number of typos in the patents from each group?

    Regarding every allowed patent undergoing SPOE, I find that very suspect – in the past, QR has always been a sampling of allowed applications.

    To add an anecdote, I know of a retired examiner who had much more than two allowed cases bounce back from QR. The QR examiners have production, too, and some will issue ridiculous supposed errors to get their own counts, and then the examiner and their supervisor have to defend the decision to allow the application.

    One other anecdote: many applicants file an application without having done as thorough a prior art search as an examiner (or a STIC searcher) might do, and it is not outside the realm of possibility that more than one examiner has been assigned applications from subclasses where this is common and (sadly) ends up not allowing any applications.

  3. 17

    You mention at p. 21 that the PTO can delay prosecution for twenty years so that it is DOA when finally granted, yet you mention PTA in the footnote, so not really sure why you would include this paragraph?

    As for p. 22, Jerry Lemelson might disagree. 😀

  4. 16

    Another variation I don’t see controlled for is, within a particular technology (classification), are some examiners fast/slow when dealing with a particular company/organization/law firm/attorney?

    Some attorneys are highly motivated to get applications granted and are readily amenable to examiner indications of allowable subject matter.

  5. 15

    I was glad to see that you had at least tried to factor to the Art Unit, however, you surely could have controlled for perhaps the issued classification, as that will likely be quite a bit more narrow than an Art Unit’s breadth of technology covered. I have not read the papers in depth, but a quick search found only a mention that junior examiners have greater amount of time (‘expectancy’) per count than senior examiners, but apparently no controlling for variations in expectancy between ‘fast’ and ‘slow’ examiners, which can be an incredibly large variation, even within a particular art unit. The issued classification may be a good proxy for this variation, since as I recall that is the basis for determining how much time examiners are given per count.

  6. 14

    Does anyone here sincerely believe that a majority of the “fast” examiners are doing their jobs well?

    1. 14.1

      Conversely, does anyone here sincerely believe that a majority of the ‘slow’ examiners are doing their jobs well?

      1. 14.1.1

        If you write a claim that forces the examiner to find the actual best prior art then it doesn’t matter whether the examiner is “slow” or “fast.” By the second OA I can tell whether the examiner is a professional or a game playing hack. There are plenty of “fast” examiners who are terrible examiners and plenty of “fast” examiners who are professionals. Same with “slow” examiners.

        1. 14.1.1.1

          +1

          (Yes, many of us – quite opposite the assertions of the former blight such as Malcolm – WANT excellent examination (slow OR fast).)

          Hence, my common missive to the examiners that post here of “Do your F N job.”

          1. 14.1.1.1.1

            I want them to do the job correctly. But I realize they have no incentive to do so. And in fact have much greater incentives to not do it correctly. So I do my best to “help” them do the job correctly. I always write a claim that is a “test” for them. Many, maybe even most, fail it. At least the first time. Those that fail it the second time get taken to the woodshed.

            1. 14.1.1.1.1.1

              And in fact have much greater incentives to not do it correctly

              And THAT is a problem that needs to be eliminated.

        2. 14.1.1.2

          “By the second OA I can tell whether the examiner is a professional or a game playing hack. There are plenty of “fast” examiners who are terrible examiners and plenty of “fast” examiners who are professionals. Same with “slow” examiners.”

          That seems about right.

  7. 13

    35 USC 134(a): An applicant for a patent, any of whose claims has been twice rejected, may appeal from the decision of the primary examiner to the Board of Patent Appeals and Interferences, having once paid the fee for such appeal.

    On the topic of “fast” and “slow” examiners, has anyone experienced any examiners delaying prosecution by repeatedly withdrawing a case from appeal by issuing a new ground of rejection on other prior art instead of answering the appellant’s brief? Of course, this requires permission from a supervising examiner, but if the supervisor is/was also a “slow” examiner, that tactic is not impossible and our firm has seen one such case with multiple rounds of withdrawals from appeal. Has anyone else?

    1. 13.2

      Saw an interesting case, 13/617,320, on another site. Applicant petitioned. The petition was dismissed but it got the job done (of preventing the examiner from continuing to re-open).

  8. 12

    Thank you for the article. One error noted:

    While Applicants can traverse an ODP with appropriate words stating such, we “obviate” the ODP by filing the Terminal Disclaimer. We don’t “traverse” the ODP by filing the TD.

  9. 11

    This is one of the most useful articles posted. The conclusions are very much consistent with our daily prosecution experiences.

    Yes, there are also fast practitioners and slow practitioners, and fast applicants and slow applicants. Many have business and economic reasons for their behaviors.

  10. 10

    How about examiners who cheat?

    I had one examiner who misquoted one of my claims in order to reject it.

    I looked at the prosecution history of the last 20 patents he had examined and he had done that to several others but they hadn’t noticed it. (How could they not have noticed it?)

    I filed a Response which he ignored and did a copy and paste of his first rejection.

    He did some other dishonest things, too.

    And after he allowed it and I paid the issue fee, somehow the patent wasn’t getting issued. They kept telling me to wait.

    The only way I was able to make them issue it was to tell the guy in the Commissioner’s Office that I was going to file a complaint with the Department of Commerce IG.

    If they issued the patent before the IG people came over to investigate the examiner then all they would have to do is investigate the examiner. But if it still hadn’t been issued they would have to go up the chain of command to see how far the misconduct went.

    They issued the patent the next week.

    1. 10.1

      Would you mind sending me the name of the examiner? You can send it to my e-mail at shine.tu@mail.wvu.edu if you don’t feel comfortable posting it here. I would be interested in looking into his or her docket.

  11. 9

    I tried at # 7 but have not yet had an answer to my question: whether at the USPTO an Examiner has any discretion in choosing the order in which she or he picks up a file for continued examination, after Applicant responds to the FAOM.

    Why is this relevant in a thread about fast and slow Examiners? What I am driving at is the notion that what determines whether an Examiner is fast or slow, in any particular case, is Applicant behaviour.

    OK, the notion might be unimaginable. Nevertheless, at least at the EPO, with the same Examiner, you might get a second Action on the Merits 2 months after you reply to the first Action on the Merits. Or it might be two years later. It all depends.

    And often I get the feeling that it might depend on whether the response to that FAOM does or does not offer a serious basis for a Notice of Allowance. Which can only mean that, at the EPO, Examining Divisions have discretion which file, out of all the ones allocated to them, an Examiner may pick up next, when starting their work shift. Is it like that at the USPTO too?

    1. 9.1

      You are right, Applicant behavior can definitely also delay prosecution. The delay in picking up the case can add to prosecution time as well. However, I account for this by using the Office Action as my denominator metric, so hopefully that helps normalize for some of the delays. I go deeper into this analysis in a paper entitled “Three New Metrics for Patent Examiner Activity,” 100 J. Pat. & Trademark Off. Soc’y 277 (2018). I hope this answers at least part of your question…

      1. 9.1.1

        I note that I coded for all of the arguments that the applicant made in response to the rejections. This gives you some insight into the validity of the rejection / arguments against the rejection, albeit not a very deep insight.

        I note that I also separately coded for whether I thought the arguments were legitimate or bogus (on the applicant side) and whether I thought the rejection was legitimate or bogus (on the examiner side). These questions go to the substance of the rejections and responses. I did not address this in these two papers; they are only descriptive in nature and make no real normative judgments. The normative elements that I coded for will be the basis for a separate paper…

    2. 9.3

      At the USPTO, each examiner has a docket and has discretion in choosing an application for FAOM. How deep that docket is depends on management.
      But when an applicant responds to a FAOM, a clock starts ticking and the examiner must answer within a limited time (or suffer the consequences).

      1. 9.3.1

        Thank you, Morse. That was exactly what I suspected, and reveals what is for me a very significant difference between the two Offices.

        AFAIK, there are no such “consequences” for an Examining Division at the EPO, which is therefore free to choose for itself the order in which it addresses the various replies to FAOM’s in its In-tray. After all, every Applicant is free to put in a request for expedited handling, whenever the need arises, and that request will be acted upon promptly.

        You might not think that Examiners should be allowed such discretion but as far as I can see, it works to the benefit of everybody.

        Somebody once said, the essence of skilful management lies in the expression, not of things one can measure, but in the things that are not measurable.

        1. 9.3.1.1

          Someone also said, “You get what you measure.”

          If your aim is the unmeasurable, how then do you get there?

  12. 8

    Prof. Tu, did you limit the data set to original applications? Continuations, divisionals, and perhaps foreign priority applications will affect timing.

    1. 8.1

      No, I randomly chose applications from these examiners. I did choose patents that were all pre-AIA because I didn’t want a change in law to affect my results. I also chose patents associated with examiners who had at least 5 years of experience and primary signatory authority when the application was taken up (again, to make sure the results were consistent).

      1. 8.1.1

        It would be interesting to know if the fast examiners had more continuation applications. These cases are the most likely to receive ODP rejections.

        1. 8.1.1.1

          See Appendix 1 in the paper. Fast and Average examiners have slightly more priority applications. However, probably not enough to explain the large differences in allowance rates and prosecution times.

    2. 8.2

      “Continuations … will affect timing.”

      I agree.

      Now, I suspect that this is only one component in the difference between “fast” examiners and the average, but I’d like it to be eliminated so that the remainder can be attributed to genuinely bad examination.

  13. 7

    Can anybody comment on the possibility that the degree of expedition delivered by any given Examiner depends on the conduct of Applicant following issue of the FAOM?

    In other words, might the speed of delivery by the USPTO depend not on which Examiner you get but how you treat the particular Examiner who has been tasked with your case.

    I ask because this is my impression, how one manages most intelligently the prosecution process at the EPO. It’s a sort of “You scratch my back and I’ll scratch yours” world or, as it is said in the German language “One hand washes the other”.

    Perhaps at the EPO Examiners have some discretion, in which order they take up files for study. Is this not so, also at the USPTO?

    1. 7.2

      Examiners are people, so the way one treats an examiner after the first action can in most cases affect how the examiner proceeds. But it’s clear that many examiners think their job is to deny patents, or at least do not view their job as helping applicants identify patentable subject matter, and how one responds is not going to change that outlook. To the contrary: in those cases, going easy on an examiner can be disadvantageous to the applicant if an appeal is filed later, because some issues may be deemed to be waived.

      Even those examiners who aren’t necessarily adverse may be lazy, and inclined to take whatever route leads to a “count” with the least amount of effort. That may include cheating, i.e. short-cutting the rules, because they know they can get away with such cheating: it is usually more expensive to force the PTO to correct a procedural error than it is to file a continuation or RCE.

      1. 7.2.1

        Of course, being professional is a must – but that does NOT mean that one spares the rod.

        There seems to be a general societal disconnect between being able to be stern (and maintaining professionalism) and this obsequious fawning to a bureaucrat that seems to be being advanced (not by you, AM).

    2. 7.3

      “Can anybody comment on the possibility that the degree of expedition delivered by any given Examiner depends on the conduct of Applicant following issue of the FAOM?”

      Seems unlikely that the discrepancy in references could be attributable to Examiner response to prosecutor conduct.

      I’m not denying that a friendly and relenting prosecutor might get more help in identifying allowable material. But I think it’s extremely doubtful that search would be curtailed, or references left uncited, based on such conduct.

      1. 7.3.1

        You do realize that ALL examiners (regardless of the applicant side) are directed to be helpful in identifying allowable material, eh?

  14. 6

    Dr. Tu apparently spent two or three years as an associate at a large firm. I’m glad he discovered for himself what experienced patent practitioners already know, and was so excited about his “findings” that he felt the urge to share them. And I really appreciate the advice he gives to patent practitioners in the second half of the piece, because, being a mere scrivener who prosecutes patent applications day in and day out, I would never have deduced that advice on my own.

    I’ll bet if I study 300 law review articles about patent law, I’ll find that articles written by law school professors tend to cite to writings by other law professors, whereas articles by practitioners tend to contain few to no citations to writings by law school professors.

    I’m not even going to begin to discuss what my high school English teacher might have said about this piece.

    (Maybe it’s time to change my moniker to “Old Fart”.)

  15. 5

    (2) make reforms to the current process that will truly encourage innovation.

    I wonder if the contingent of the usual examiners that post here recognize the basis of this statement.

  16. 4

    I do not understand why “the sources of prior art differ for slow examiners versus fast and average examiners… for 102(b) rejections all examiners rely more on printed publications and secondarily on US patents” matters at all.

    For examination purposes, they are basically the same document (a shared disclosure and priority date). A PG-PUB will be preferentially used because it has citable paragraph numbers while an issued US patent does not.

    The only place the PG-PUB vs. issued Patent choice should be brought up is with regard to forward and backward citation information (an important element of the search process). In the antiquated computer system used at the USPTO, the two patent family members have different citation data associated with each member and there is no way to merge this information as has been done in every other major patent database.

    1. 4.1

      Printed publications are usually journal articles, not PG-PUBs. The question I wanted to get at here is: are examiners searching the same databases? I mainly wanted to see if slow examiners were searching more broadly than fast examiners. The answer is that all examiner groups seem to be using more than just the US patent / US patent application database. However, they are using these references in slightly different ways.

    2. 4.2

      Thank you, PS, for some insight into the workings of the PTO’s antiquated computer systems.

      As to the statement you quote from the piece, I assumed what the author meant by “printed publications” was *non-patent* printed publications, although it’s not clear if by “US patents” he meant only issued patents or all US patent publications, including published applications. But your take is certainly plausible. Either way, a bit more precision on the author’s part would lend more than a little clarity to the piece.

      1. 4.2.1

        I think most people think of “printed publications” as non-patent publications due to the reading of 102(a): … (1) the claimed invention was patented, described in a printed publication, …

        Technically, PG Pubs would be printed publications, since, like journal articles, they are not patents.

    3. 4.3

      There are at least two good reasons to cite the PGPub instead of the issued patent: (1) it will almost always have an earlier publication date, and (2) the paragraph numbers are visible in the text window that is searchable via Ctrl+F. In other words, if an examiner does a word search for “titanium” in a document, it is much easier to cite the portion of the spec that has this disclosure when you are searching a PGPub, because the paragraph numbers appear in the searchable text and the pdf version of the document. When you are searching a US Patent, you have to switch over to the pdf version and spend time trying to find the appropriate paragraph so you can cite the column and line numbers.

      PS DIP is also correct, that if you forward/backward search a PG Pub, you will only get the references that were in the file before it was examined. When you forward/backward search a US Patent, you also get the references cited during examination.

      1. 4.3.1

        Granted US patents are available as prior art as early as the priority date or application date, which are the same dates as for the corresponding PG-Pub.

        I do like the idea of being able to cite paragraph numbers vs. col/line, though! However, US examiners now have access to PatentPak, which provides text searchable PDFs of granted patents (from over 35 patent authorities).

        The tools available to examiners, such as SciFinder-n and STN, allow comprehensive forward/backward patent family citation searching.

    4. 4.4

      All examiners at the USPTO have access to STN (including CAPlus, WPINDEX, INPADOC, SciSearch, DPCI – Derwent Patents Citation Index™) and SciFinder-n for very thorough patent searching and citation analysis.

  17. 3

    Hard documentation? No, but maybe a FOIA request would turn something up. But anecdotally, knowing a few judges, yeah, they get some counts for the institution decision (either way), and then a similar amount for the final written decision, with a few hours credit here and there for miscellaneous that come up during the trial phase. So, yeah, basically, if they can come up with a reason to institute, then they can double their credit by making that reasoning into the final decision as well. Essentially, denying the petition cuts their counts in half (and cuts their work a bit, too, but I don’t know if it’s proportional).

    -this is not the official position of the PTO, any law firm, or any lawyer.

    1. 3.1

      “But anecdotally, knowing a few judges, yeah, they get some counts for the institution decision (either way), and then a similar amount for the final written decision, with a few hours credit here and there for miscellaneous that come up during the trial phase”

      No offense intended, but I’d need more than an anecdote to accept this as accurate.

      The reason why is that the examining side of the PTO is very deliberately structured so that “decisions which end the matter” get full credit, and “decisions which lead to more process” and “decisions that rely mostly on prior decisions” get fractional credits. There’s no reason to expect APJ work to be substantially different, and the PTO overlords are not exactly known for creativity and trying new management techniques. That prior outweighs an anonymous anecdote.

        1. 3.1.1.1

          … does the examiner union cover the administrative judges…?

          If not, why in the world would anyone even begin to think that the same mechanisms ‘must’ apply?

            1. 3.1.1.1.1.1

              POPA thinks APJs are management and not employees. The Treasury union for Trademark employees could not muster the votes from the APJs to unionize (i.e., make them part of a “professional organization”).

              1. 3.1.1.1.1.1.2

                APJs are not management in the sense of that word in Union contexts.

                POPA probably should expand its vocabulary.

                1. “APJs are not management in the sense of that word in Union contexts.” I think that’s why the union got to try to gather votes from them, but the APJs chose not to unionize, so no biggie.

      1. 3.1.2

        “Essentially, denying the petition cuts their counts in half”

        Not really. He is correct that APJs get the same credit whether granting or denying a petition. But the Final Written Decision is based on a different record and on different standards of proof from the Decision to Institute. Not based on the “same reasoning.”

  18. 2

    “Additionally, the rejections employed by fast examiners rely heavily on Obviousness-type Double Patenting (ODP)”

    “…rejections employed by fast examiners rely heavily on Obviousness-type Double Patenting (ODP) and/or 35 USC 112 rejections. Fast examiners have four times as many ODP rejections compared to slow examiners.”

    I wonder if these results could be revealing a structural characteristic of the system rather than a characteristic of the examiners. A seemingly significant fraction of applicants file continuations of allowed applications with claims that are closely related to the previously allowed claims. Such applications will normally go to the same examiner. Those examiners “already know” the prior art and “have an idea of what is obvious,” resulting in fewer prior art rejections and more ODP rejections (because the case is a continuation of an allowed application). Examiners who allow more applications in general will go through this process more often.

    So do “fast” examiners actually rely on ODP rejections more, or are “fast” examiners more likely to be docketed an application which plausibly only requires an ODP rejection because it’s a continuation of an allowed claim?

    1. 2.1

      Excellent question. See Appendix 1 of the paper. The answer is that all examiners have about the same number of priority documents. Average and Fast examiners do have slightly more but probably not enough to explain the differences in time and allowance rates that I see.

      1. 2.1.1

        I don’t think Appendix 1 is relevant to my question. Provisional applications would not have the structural effect I’m trying to describe, nor would applications examined by another examiner.

        Luckily, Appendix 2 seems very relevant.

        It looks like “fast” examiners have roughly 40% more follow-up applications (i.e., an application with priority to an application examined by the same examiner)! That seems like a pretty big difference?

        1. 2.1.1.1

          Yes, sorry Appendix 2 is the relevant graph. The difference would be about 30% for slow examiners versus 50% for fast examiners. So there is a 20% increase for fast examiners versus slow examiners. You are correct, this is a significant increase in priority documents. However, the 20% increase probably cannot account for the 50% increase in allowance rates or 4-5 years in extra prosecution time.

          1. 2.1.1.1.1

            Why not?

            Why not more?

            After all, all the hard work is supposedly already done (we are not talking CIPs I take it, and the ‘no new matter’ rule is in effect).

          2. 2.1.1.1.2

            “So there is a 20% increase for fast examiners versus slow examiners.”

            No, that is a “twenty percentage point” difference.

            I think the relevant metric is the percent difference/change. The framing matters a bit, but either way, it is significantly more than the percentage point difference.

            For example, slow examiners have 40% fewer of these follow-up applications than fast examiners [(30%-50%)/50%*100%].
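            To make the arithmetic explicit, here is a short sketch using the 30% and 50% shares quoted in this thread (the variable names are my own, purely for illustration):

            ```python
            # Percentage-point difference vs. percent difference, using the
            # follow-up-application shares discussed above (illustrative values).
            slow = 30.0  # % of slow examiners' applications that are follow-ups
            fast = 50.0  # % of fast examiners' applications that are follow-ups

            # Percentage-point difference: simple subtraction of the two shares.
            pp_diff = fast - slow  # 20 percentage points

            # Percent difference: the gap measured relative to the fast
            # examiners' share.
            pct_diff = (slow - fast) / fast * 100  # -40%, i.e., slow examiners
                                                   # have 40% fewer follow-ups

            print(pp_diff, pct_diff)  # prints 20.0 -40.0
            ```

            The same 20-point gap reads as a 40% shortfall when framed relative to the fast examiners’ 50% share, which is why the choice of baseline matters.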

            1. 2.1.1.1.2.1

              OK…we are still talking a difference of 20 out of 100 patents reviewed. I can tell you that even if I removed the 20 longest prosecuted patents from the slow examiner mix, you would still see these trends. The fact is that ALL of the patents that I reviewed from slow examiners had tons of rejections and took a long time. It was not just 10 or 20 or 30 patents that were creating this disparity.

    2. 2.2

      An obviousness-type double patenting rejection can only be made if there is another application or patent by the same inventors with a claim to essentially the same claimed invention. If the claims do not run afoul of 101’s restriction to one patent per invention, or the judicially created ODP extension of it, the rejection must be a regular 103 rejection. Thus I am surprised that there would be enough ODP rejections available, across a broad enough sampling of examiners, to make that big a difference?
      Isn’t misuse of “restriction requirements” as a “first action” [instead of a proper first-action search and examination] a bigger problem?

      1. 2.2.1

        Restriction practice (and the vagaries therein) is a whole nother animal.

        There are pros and cons for the client with improper restriction practice, which is probably why there is not open revolt against the arbitrary nature of the current practice.

      2. 2.2.2

        “Thus I am surprised that there would be enough ODP rejections available to examiners in a broad enough sampling of examiners to make that big a difference?”

        Once you start popping out patents you start getting a lot of CONs and then you start doing a lot of ODPs (unless you just call them for a TD).

  19. 1

    I can confirm that it has no impact on the “quality” of the patents. In fact most of the patents that are invalidated in IPRs are the most carefully examined and reexamined ones.

    Getting a patent is not the problem. Keeping it is.

    There is no point in examining applications. Look at the APJ point system. If they need a point, they will institute a trial. If they institute a trial, they will invalidate. Points, bonuses, and job security. In at least one case, a high-paying job with stock options with the petitioner afterward (see Matt Clements, now at Apple after delivering a 96% invalidation rate).

    1. 1.1

      “Look at the APJ point system. If they need a point, they will institute a trial. If they institute a trial, they will invalidate. Points, bonuses, and job security.”

      This is an interesting claim. Do we actually have any documentation of an APJ count system confirming this assertion?

      1. 1.1.2

        I can state for a fact that an IPR denial accounts for the same production credit as an IPR institution in the APJ production system.
