Laboratory Administration & Management of Pathology Practices

Quality management

Peer (case) review in anatomic pathology



Last author update: 1 May 2014
Last staff update: 3 March 2021

Copyright: 2002-2024, PathologyOutlines.com, Inc.


Mark Priebe, M.T. (A.S.C.P.), S.B.B. (A.S.C.P.)
Cite this page: Priebe M. Peer (case) review in anatomic pathology. PathologyOutlines.com website. https://www.pathologyoutlines.com/topic/managementlabAPQA.html. Accessed March 29th, 2024.
Relevance
Formal and informal programs: case review quality assurance
  • For most laboratories, the quality strategy is made up of multiple QA / QC programs that best fit the institution's patient mix, staff experience and specialty status
  • QA programs can be:
    • Formal: programs that are scheduled, predictable (in volume and timing) and under your control
    • Informal: programs that serve a QA function but have no formal schedule or frequency and are not under your control (table 1)
  • For this discussion we will focus on the formal QA programs, although the informal programs can offer a wealth of quality information and should be tracked and documented as part of your overall quality program
  • In a CAP Q-Probe (Arch Pathol Lab Med 2014;138:602) with 73 labs responding, of those reporting (56), 45% of the laboratories reported using post sign out (retrospective) case review as the primary means of detecting defects, followed by "don't know" (29%), clinician request (21%) and tumor conference (5%); table 2 breaks down the current formal QA programs
Diagrams / tables

Table 1:
  • Formal quality assurance programs: retrospective case review, proficiency testing, prospective case review
  • Informal quality assurance programs: autopsy, diagnostic consult, patient referral


Table 2: Current formal quality assurance programs for AP
  • Programs compared: proficiency testing; internal case review (retrospective); internal case review (prospective); external peer case review by subspecialist (retrospective)
  • Characteristics compared: adds to the pathologist workload; peer reviewed; standardized; false negative & positive cases; QA of the total process; benchmarking; ability to influence the diagnosis in real / near real time
  • Key positive feature(s):
    • Proficiency testing: established minimum quality tool
    • Internal case review (retrospective): most common QA practice
    • Internal case review (prospective): real time
    • External peer case review by subspecialist: external subspecialist review, does not use pathologist time
  • Negative consideration:
    • Proficiency testing: does not QA the full case detail from gross to report
    • Internal case review (retrospective): demanding on pathologist and technologist time
    • Internal case review (prospective): most demanding on pathologist and technologist time
    • External peer case review by subspecialist: program needs to be double blinded for confidentiality
  • Best demonstrated practice:
    • Proficiency testing: CAP and ASCP proficiency programs
    • Internal case review (retrospective): ADASP guidelines on QC & QA in AP
    • Internal case review (prospective): UPMC quality assurance program
    • External peer case review by subspecialist: QualityStar™ external QA case review by subspecialist
Proficiency testing (PT) or external quality assurance (EQA)
  • This compares a laboratory's test results using unknown specimens (usually digital images) to results from other laboratories
  • It is the most established QA program and should be considered the minimum requirement for AP laboratory quality assurance
  • Clinical feedback and referral to subspecialists are provided, and standardization allows for national benchmarking
  • PT programs from CAP, ASCP and others are approved by the American Board of Pathology and meet level IV requirements for Maintenance of Certification (MOC) (see American Board of Pathology website for a complete listing of PT programs that are level IV compliant)
  • Drawbacks: adds to pathologist workload, does not offer full case review from gross to clinical report and is not representative of pathologist caseload
Internal case review (retrospective)
  • Random selection of 1 - 10% (or more) of cases for secondary QA review, also referred to as peer review
  • This is the most common QA case review practice today; it allows complete case review and is representative of the pathologist's workload
  • However, it is subject to onsite biases and personnel conflicts
  • It is not standardized, so benchmarking between institutions is difficult, and it adds to the pathologist's workload
  • Most laboratories also lack true peer review from subspecialists in all tissue types
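The random 1 - 10% selection described above can be sketched in a few lines. This is a minimal illustration, not a production LIS feature; the accession number format and the `select_cases_for_review` helper are assumptions for the example.

```python
import random

def select_cases_for_review(accession_numbers, fraction=0.05, seed=None):
    """Randomly select a fraction of signed out cases for secondary (peer) review.

    fraction: proportion of cases to pull, e.g. 0.01 - 0.10 for the
    1 - 10% range typical of retrospective QA case review.
    seed: optional, for a reproducible pull (useful when documenting QA activity).
    """
    if not 0 < fraction <= 1:
        raise ValueError("fraction must be in (0, 1]")
    rng = random.Random(seed)
    # Review at least one case even for very small caseloads
    n = max(1, round(len(accession_numbers) * fraction))
    return sorted(rng.sample(list(accession_numbers), n))

# Example: pull 5% of a month's 400 cases for peer review
cases = [f"S24-{i:04d}" for i in range(1, 401)]
review_list = select_cases_for_review(cases, fraction=0.05, seed=42)
```

Seeding the generator is optional; a documented seed lets the lab reproduce the selection later, while omitting it keeps the pull unpredictable to the reviewing pathologists.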
Internal case review (prospective)
  • Case review as above but performed prior to sign out, in real time, so that findings can influence the final diagnosis and additional comments can be added that may enhance patient care
  • An elegant example was presented by the University of Pittsburgh Medical Center (UPMC): Professional QA / QC Surgical Pathology Program (Yousem, 2010)
    • The presentation demonstrated similar error rates pre and post sign out, with no effect on case turnaround time
    • The program does require a depth of pathology staffing, software (AP / LIS) and development support not found in most AP laboratories
    • Because the program is not standardized, it is difficult to realize the benefits of benchmarking against similar programs nationally
External (peer) case review by subspecialist (retrospective)
  • This is a newer AP QA program built around case review outside the institution (interlaboratory), adding a level of granularity, with reported error detection rates of 1 - 45% (Patient Safety & Quality Healthcare: Quality, Assurance, Diagnosis, Treatment and Patient Care [Accessed 17 November 2017])
  • It offers a significant enhancement in the ability to provide quality feedback for guidance and continuous improvement
  • Two characteristics stand out when comparing the sensitivity of error detection between intra and interlaboratory case review:
    • Difference in the ability to gain incremental case scrutiny by using subspecialists for review (when compared to using generalist pathologists)
    • Difference in moving the review outside the institution to reduce onsite bias and feedback confrontation
  • It is very difficult for a general pathologist to stay current in all organ systems and cancer types
  • As with all disciplines, frequency of interactions builds confidence and skills and helps keep practitioners current with evolving diagnostic tools such as molecular assays and immunohistochemical stains
  • With the current scarcity of pathologists and expected continued demand for pathology services resulting from an aging population, having subspecialists onsite is rare in the average hospital setting (3 - 4 pathologists) and having multiple subspecialists to provide quality assurance peer review is extremely rare
  • Cases can be submitted via glass slides or digital images (cases are de-identified prior to submission and cases with digital images are uploaded to a secure cloud)
  • Academic medical centers provide blinded subspecialist case review
  • The benefit is a standardized program that allows benchmarking at an increased level of granularity without adding to the pathologist's workload
  • The program is also ABP approved for MOC level IV
  • However, it does require additional effort to blind each case prior to submission, and uploading multiple whole slide images (WSI) takes time and may need to be coordinated within the lab
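The blinding step above (de-identifying each case before external submission) can be sketched as follows. This is a simplified illustration only: the field names, the `deidentify_case` helper and the salted hash scheme are assumptions for the example, and a real workflow must also follow your institution's privacy requirements (e.g. HIPAA de-identification rules) and scrub any identifiers burned into slide labels or images.

```python
import hashlib

# Hypothetical PHI fields; actual field names depend on your AP / LIS export
PHI_FIELDS = {"patient_name", "mrn", "date_of_birth", "accession_number"}

def deidentify_case(case, salt):
    """Return a copy of the case record with PHI removed and a blinded ID added.

    The blinded ID is a salted hash of the accession number, so the submitting
    lab can relink the external reviewer's findings to the original case
    without exposing identifiers to the reviewer.
    """
    digest = hashlib.sha256((salt + case["accession_number"]).encode())
    clean = {k: v for k, v in case.items() if k not in PHI_FIELDS}
    clean["blinded_id"] = digest.hexdigest()[:12]
    return clean

case = {
    "accession_number": "S24-0137",
    "patient_name": "Jane Doe",
    "mrn": "000123",
    "date_of_birth": "1955-02-01",
    "specimen": "breast, left, core biopsy",
    "diagnosis": "invasive ductal carcinoma",
}
submission = deidentify_case(case, salt="lab-secret-salt")
```

Keeping the salt secret within the lab means only the submitting institution can map a blinded ID back to an accession number, which supports the double blinded confidentiality noted in table 2.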
Conclusion
  • As professionals in the healthcare system, we realize that focusing on quality is imperative
  • In 2014, for the first time, the National Patient Safety Foundation listed diagnostic errors as a top patient safety challenge
  • That ranks it alongside initiatives such as healthcare acquired infections and readmissions, indicating that focus on this area will build over the coming months and years
  • Case (peer) review is just one component of a good quality program but it plays a major role in continuous improvement and is the closest to patient outcomes

  • Each of the programs reviewed in this article contributes to your quality initiatives and goals
  • It is not about choosing which one to implement but about mastering one and moving to the next
  • The goal is to build your quality tool set to close the gap on diagnostic errors in anatomic pathology